Compare fixed, sentence, and markdown chunking on the same sample document before any embedding or storage layer is introduced.
• The `vai` CLI is installed locally. No API key is required for chunking-only workflows.
See the exact VAI command, the matching Voyage AI layer, and the MongoDB query shape behind the demo.
```shell
vai chunk /tmp/sample.md --strategy markdown
```
This is a purely local preprocessing demo. The important thing is not the command syntax itself, but how each strategy changes the units of meaning that later get embedded.
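The difference between the three strategies is easiest to see in a toy sketch. This is not vai's implementation; the function names, chunk sizes, and regexes below are illustrative assumptions:

```python
import re

def chunk_fixed(text, size=200):
    """Fixed chunking: cut every `size` characters, ignoring structure."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def chunk_sentences(text, per_chunk=3):
    """Sentence chunking: group whole sentences so no chunk splits mid-thought."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return [" ".join(sentences[i:i + per_chunk])
            for i in range(0, len(sentences), per_chunk)]

def chunk_markdown(text):
    """Markdown chunking: split on headings so each chunk is one section."""
    parts = re.split(r"(?m)^(?=#{1,6} )", text)
    return [p.strip() for p in parts if p.strip()]

doc = ("# Intro\nRetrieval works on chunks. Chunks carry meaning.\n\n"
       "## Details\nSmaller chunks are more precise. "
       "Larger chunks carry more context.")
print(len(chunk_fixed(doc, 40)), len(chunk_sentences(doc)), len(chunk_markdown(doc)))
```

Fixed chunking can slice a sentence in half; sentence chunking preserves thoughts but ignores document structure; markdown chunking keeps each heading's section together, which is usually what you want for prose documentation.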
Run the full workflow in one command: create sample docs, chunk them, embed them, store them in Atlas, and auto-create the vector index.
VAI command

```shell
vai pipeline /tmp/vai-demo-docs/ --db vai_demo --collection knowledge --create-index
```
Prerequisites
A valid VOYAGE_API_KEY is set in the environment.
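Once the pipeline has stored embeddings and created the index, retrieval in Atlas is an aggregation with a `$vectorSearch` stage. A minimal sketch of that query shape, where the index name `vector_index`, the field name `embedding`, and the 1024-dimension query vector are all assumptions rather than documented vai defaults:

```python
# Assumed placeholder for a query embedding (voyage-3 models commonly use 1024 dims).
query_vector = [0.01] * 1024

pipeline = [
    {
        "$vectorSearch": {
            "index": "vector_index",   # assumed index name from --create-index
            "path": "embedding",       # assumed field holding the stored vector
            "queryVector": query_vector,
            "numCandidates": 100,      # ANN candidate pool searched
            "limit": 5,                # top-k documents returned
        }
    },
    # Keep only the chunk text plus the similarity score.
    {"$project": {"text": 1, "score": {"$meta": "vectorSearchScore"}}},
]
# With pymongo this would run as: db.knowledge.aggregate(pipeline)
```

`numCandidates` trades recall for latency: the larger the candidate pool relative to `limit`, the closer the approximate search gets to an exact nearest-neighbor scan.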
Run a local CLI workflow with Ollama generation and local embeddings, without a Voyage API key.
VAI command

```shell
vai embed "Local inference keeps retrieval private, fast, and API-key free." --local --dimensions 256
```
Prerequisites
Ollama is installed and running locally.
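Under `--local`, the embedding request stays on your machine. Ollama exposes a `POST /api/embeddings` endpoint; the sketch below only builds the request payload (it sends nothing), and the model name `nomic-embed-text` is an assumption, not necessarily what vai uses:

```python
import json

payload = {
    "model": "nomic-embed-text",  # assumed local embedding model
    "prompt": "Local inference keeps retrieval private, fast, and API-key free.",
}
request = {
    "url": "http://localhost:11434/api/embeddings",  # Ollama's default local port
    "body": json.dumps(payload),
}
# Equivalent call with Ollama running:
#   curl http://localhost:11434/api/embeddings -d @payload.json
# The response is {"embedding": [...]}; truncating or projecting that vector
# down to the requested --dimensions 256 is assumed to happen on the vai side.
print(request["url"])
```

Because both the model and the data stay local, nothing in this flow requires a `VOYAGE_API_KEY`.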