Chonkie provides several installation options to match your specific needs:
```shell
# Basic installation (TokenChunker, SentenceChunker, RecursiveChunker)
pip install chonkie

# For Hugging Face Hub support
pip install "chonkie[hub]"

# For visualization support (e.g., rich text output)
pip install "chonkie[viz]"

# For the default semantic provider support (includes Model2Vec)
pip install "chonkie[semantic]"

# For OpenAI embeddings support
pip install "chonkie[openai]"

# For Cohere embeddings support
pip install "chonkie[cohere]"

# For Jina embeddings support
pip install "chonkie[jina]"

# For SentenceTransformer embeddings support (required by LateChunker)
pip install "chonkie[st]"

# For CodeChunker support
pip install "chonkie[code]"

# For NeuralChunker support (BERT-based)
pip install "chonkie[neural]"

# For SlumberChunker support (Genie/LLM interface)
pip install "chonkie[genie]"

# For Groq Genie support (fast inference)
pip install "chonkie[groq]"

# For Cerebras Genie support (fastest inference)
pip install "chonkie[cerebras]"

# For installing multiple features together
pip install "chonkie[st, code, genie]"

# For all features
pip install "chonkie[all]"
```
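After installing, you can verify that an extra's dependencies actually landed in your environment before relying on the corresponding chunker. A minimal sketch using only the standard library; the module name checked here (`sentence_transformers` for the `st` extra) is an assumption about which package that extra pulls in:

```python
import importlib.util


def has_optional_dep(module_name: str) -> bool:
    """Return True if the named module is importable in this environment."""
    return importlib.util.find_spec(module_name) is not None


# Assumption: the [st] extra installs the "sentence_transformers" package.
if has_optional_dep("sentence_transformers"):
    print("SentenceTransformer embeddings are available")
else:
    print('Missing dependency; run: pip install "chonkie[st]"')
```

The same check works for any other extra once you know the module it provides (e.g. `openai` for the `openai` extra).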
The pre-packaged `semantic` and `all` installs overlap with some of the individual installation options, so a degree of redundancy is expected. This redundancy is intentional: it gives users the freedom to choose whichever installation method best fits their workflow.
The contents of the `semantic` and `all` extras may change in future versions, so the dependencies they install today may differ from those in a later release.
Installing either the `semantic` or the `openai` extra enables SemanticChunker, since it can work with any embeddings provider. The difference lies in which embedding providers are available for use with the chunker.