Build a tiny Atlas-backed RAG chat flow using local nano embeddings and Ollama for generation.
Prerequisites
• Ollama is installed and running locally.
• The llama3.2:3b model is already pulled in Ollama.
• vai nano setup has already completed successfully.
• MongoDB Atlas is configured through MONGODB_URI or vai config set mongodb-uri.
See the exact VAI command, the matching Voyage AI layer, and the MongoDB query shape behind the demo.
VAI command
vai chat --db "$DEMO_DB" --collection "$DEMO_COLLECTION" --local --llm-provider ollama --llm-model "$OLLAMA_MODEL" --llm-base-url http://localhost:11434 --no-history --no-stream
This is the high-level chat entrypoint shown in the demo GIF. Behind this single command, VAI handles query embedding, Atlas retrieval, prompt assembly, and Ollama generation.
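A minimal Python sketch of that flow, for orientation only: the embed_query() stand-in, the nomic-embed-text model, and the vector_index / embedding names are illustrative assumptions, not VAI internals.

import os
import requests
from pymongo import MongoClient

OLLAMA = "http://localhost:11434"

def embed_query(text: str) -> list[float]:
    # Stand-in for VAI's local nano embedding layer (its internal API isn't
    # public); "nomic-embed-text" is an assumed model. Retrieval only works if
    # the query is embedded with the same model as the stored documents.
    r = requests.post(f"{OLLAMA}/api/embeddings",
                      json={"model": "nomic-embed-text", "prompt": text})
    r.raise_for_status()
    return r.json()["embedding"]

client = MongoClient(os.environ["MONGODB_URI"])
coll = client[os.environ["DEMO_DB"]][os.environ["DEMO_COLLECTION"]]

question = "how does vector search work?"

# Atlas retrieval: $vectorSearch must be the first stage of the aggregation.
# "vector_index" and "embedding" are assumed index/field names.
docs = list(coll.aggregate([
    {"$vectorSearch": {
        "index": "vector_index",
        "path": "embedding",
        "queryVector": embed_query(question),
        "numCandidates": 100,
        "limit": 5,
    }},
    {"$project": {"_id": 0, "text": 1}},
]))

# Prompt assembly: retrieved chunks go in front of the question.
context = "\n\n".join(d["text"] for d in docs)
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# Ollama generation, non-streaming to mirror --no-stream.
r = requests.post(f"{OLLAMA}/api/generate",
                  json={"model": "llama3.2:3b", "prompt": prompt, "stream": False})
print(r.json()["response"])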
The full walkthrough is included here so anyone can replay the demo exactly as published.
More shareable workflows from the same VAI demo library.
Run the full workflow in one command: create sample docs, chunk them, embed them, store them in Atlas, and auto-create the vector index.
VAI command
vai pipeline /tmp/vai-demo-docs/ --db vai_demo --collection knowledge --create-index
Prerequisites
• A valid VOYAGE_API_KEY is set in the environment.
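Under the hood, the pipeline roughly amounts to the sketch below. The fixed-size chunker and the embedding / vector_index names are illustrative assumptions, and it presumes the sample docs already exist under /tmp/vai-demo-docs/; VAI's actual chunking and naming may differ.

import os
from pathlib import Path

import voyageai
from pymongo import MongoClient
from pymongo.operations import SearchIndexModel

def chunk(text: str, size: int = 800) -> list[str]:
    # Naive fixed-size chunking; VAI's actual chunker may differ.
    return [text[i:i + size] for i in range(0, len(text), size)]

chunks = [c for p in Path("/tmp/vai-demo-docs/").glob("*") if p.is_file()
          for c in chunk(p.read_text())]

vo = voyageai.Client()  # reads VOYAGE_API_KEY from the environment
embeddings = vo.embed(chunks, model="voyage-4-lite",
                      input_type="document").embeddings

coll = MongoClient(os.environ["MONGODB_URI"])["vai_demo"]["knowledge"]
coll.insert_many([{"text": t, "embedding": e}
                  for t, e in zip(chunks, embeddings)])

# --create-index equivalent: an Atlas vector search index on the embedding field.
coll.create_search_index(SearchIndexModel(
    name="vector_index",
    type="vectorSearch",
    definition={"fields": [{
        "type": "vector",
        "path": "embedding",
        "numDimensions": len(embeddings[0]),  # match the embedding model's output
        "similarity": "cosine",
    }]},
))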
Walk through the classic retrieval stack: embed the query, run Atlas vector search, rerank the candidates, then compare the result to a vector-only pass.
VAI command
vai query 'how does vector search work?' --db vai_demo --collection knowledge --model voyage-4-lite
Prerequisites
• A valid VOYAGE_API_KEY is set in the environment.
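A minimal sketch of that retrieval stack, reusing the vector_index / embedding names assumed above; the rerank-2 model choice is an assumption, not necessarily what VAI selects.

import os

import voyageai
from pymongo import MongoClient

vo = voyageai.Client()  # reads VOYAGE_API_KEY
coll = MongoClient(os.environ["MONGODB_URI"])["vai_demo"]["knowledge"]

query = "how does vector search work?"
qvec = vo.embed([query], model="voyage-4-lite",
                input_type="query").embeddings[0]

# Vector-only pass: top candidates ranked by embedding similarity alone.
candidates = list(coll.aggregate([
    {"$vectorSearch": {"index": "vector_index", "path": "embedding",
                       "queryVector": qvec, "numCandidates": 100, "limit": 10}},
    {"$project": {"_id": 0, "text": 1,
                  "score": {"$meta": "vectorSearchScore"}}},
]))

# Rerank pass: a cross-encoder reorders the same candidates.
reranked = vo.rerank(query, [c["text"] for c in candidates],
                     model="rerank-2", top_k=3)

print("vector-only top hit:", candidates[0]["text"][:80])
print("reranked top hit:   ", reranked.results[0].document[:80])

When the two top hits differ, the reranker's cross-encoder scoring is doing real work; when they agree, the embedding space alone was already sufficient for this query.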
Run a local CLI workflow with Ollama generation and local embeddings, without a Voyage API key.
VAI command
vai embed "Local inference keeps retrieval private, fast, and API-key free." --local --dimensions 256
Prerequisites
• Ollama is installed and running locally.
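VAI's nano embedding internals aren't shown here. As a rough stand-in under the same "local, no API key" constraint, this sketch calls Ollama's embeddings endpoint and caps the vector at 256 dimensions to mirror --dimensions 256; the nomic-embed-text model is an assumption, and plain truncation only preserves quality for Matryoshka-style models.

import requests

def embed_local(text: str, dimensions: int = 256) -> list[float]:
    # Local embedding: no Voyage API key, only a local Ollama server.
    resp = requests.post(
        "http://localhost:11434/api/embeddings",
        json={"model": "nomic-embed-text", "prompt": text},
    )
    resp.raise_for_status()
    vector = resp.json()["embedding"]
    # Crude dimension cap; only Matryoshka-trained models degrade gracefully here.
    return vector[:dimensions]

vec = embed_local("Local inference keeps retrieval private, fast, and API-key free.")
print(len(vec), vec[:5])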