Configuration
Customize storage, embeddings, and advanced settings for your Cerebro deployment.
Environment Variables
All settings can be configured via environment variables. Set them in your shell profile, a .env file, or directly in your MCP client configuration; a minimal example follows the table below.
| Variable | Default | Description |
|---|---|---|
| CEREBRO_STORAGE_PATH | ~/.cerebro | Root directory for all Cerebro data including the SQLite database, FAISS index, and configuration files. |
| CEREBRO_EMBEDDING_MODEL | all-MiniLM-L6-v2 | The sentence-transformers model used for semantic search embeddings. Changing this requires rebuilding the FAISS index. |
| CEREBRO_EMBEDDING_DIMENSION | 384 | Dimension of the embedding vectors. Must match the chosen model: 384 for MiniLM, 768 for larger models. |
| CEREBRO_LOG_LEVEL | INFO | Logging verbosity. Options: DEBUG, INFO, WARNING, ERROR, CRITICAL. |
| CEREBRO_MAX_RESULTS | 20 | Default maximum number of results returned by search operations. |
| CEREBRO_DECAY_ENABLED | true | Enable or disable the memory decay system that gradually reduces relevance of unused memories. |
| CEREBRO_LICENSE_KEY | (none) | License key for Pro/Pro+ features. Only required for paid tier features. Free tier works without it. |
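For shell-based setups, exports like the following in a shell profile (or the same lines without export in a .env file) are one way to apply these settings; the values shown are simply the defaults from the table:
```bash
# Example shell profile entries (drop "export" for a .env file)
export CEREBRO_STORAGE_PATH=~/.cerebro
export CEREBRO_LOG_LEVEL=INFO
export CEREBRO_MAX_RESULTS=20
export CEREBRO_DECAY_ENABLED=true
```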
Storage Options
Cerebro stores all data locally. You can use the default location, a custom directory, or a network-attached storage (NAS) path.
Local Filesystem (Default)
The default storage path is ~/.cerebro. All data stays on your machine.
```bash
CEREBRO_STORAGE_PATH=~/.cerebro
```
Custom Directory
Point to any directory with sufficient storage:
```bash
CEREBRO_STORAGE_PATH=/data/cerebro
```
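If the target directory does not exist yet, it may help to create it and confirm the filesystem has enough free space before starting Cerebro (using the /data/cerebro path from the example above):
```bash
mkdir -p /data/cerebro   # create the storage directory
df -h /data/cerebro      # check free space on the underlying filesystem
```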
Network-Attached Storage (NAS)
Mount your NAS and point Cerebro to it for centralized storage accessible from multiple machines:
Linux/macOS:
```bash
# Mount your NAS first
sudo mount -t nfs nas-ip:/share /mnt/nas
# Point Cerebro to the NAS path
CEREBRO_STORAGE_PATH=/mnt/nas/cerebro
```
Windows:
```
# Map network drive first
net use Z: \\NAS-IP\share
# Point Cerebro to the mapped drive
CEREBRO_STORAGE_PATH=Z:\cerebro
```
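Before pointing Cerebro at the share, it is worth confirming that the mount is live and writable; one quick check against the example mount point above:
```bash
mount | grep /mnt/nas                                                  # confirm the NFS share is mounted
touch /mnt/nas/.cerebro-write-test && rm /mnt/nas/.cerebro-write-test  # confirm it is writable
```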
Embedding Configuration
Cerebro uses sentence-transformers for semantic search. The default model provides a good balance of speed and accuracy.
Default Model (Recommended)
```bash
CEREBRO_EMBEDDING_MODEL=all-MiniLM-L6-v2
CEREBRO_EMBEDDING_DIMENSION=384
```
Higher Accuracy Model
Use a larger model for better semantic accuracy at the cost of more memory and slower indexing:
```bash
CEREBRO_EMBEDDING_MODEL=all-mpnet-base-v2
CEREBRO_EMBEDDING_DIMENSION=768
```
Warning: Changing the embedding model requires rebuilding the FAISS index. Run cerebro rebuild-index after changing the model.
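If you switch to another sentence-transformers model, the correct CEREBRO_EMBEDDING_DIMENSION can be read from the model itself. A quick way to check it (assuming the sentence-transformers Python package is installed locally):
```bash
# Prints the embedding dimension of a sentence-transformers model (768 for all-mpnet-base-v2)
python -c "from sentence_transformers import SentenceTransformer; print(SentenceTransformer('all-mpnet-base-v2').get_sentence_embedding_dimension())"
```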
Full Example Configuration
A complete MCP client config with all common options:
```json
{
  "mcpServers": {
    "cerebro": {
      "command": "cerebro",
      "args": ["serve"],
      "env": {
        "CEREBRO_STORAGE_PATH": "/path/to/storage",
        "CEREBRO_EMBEDDING_MODEL": "all-MiniLM-L6-v2",
        "CEREBRO_EMBEDDING_DIMENSION": "384",
        "CEREBRO_LOG_LEVEL": "INFO",
        "CEREBRO_MAX_RESULTS": "20",
        "CEREBRO_DECAY_ENABLED": "true"
      }
    }
  }
}
```
Docker Deployment
For Pro and Pro+ users, Cerebro can run as a Docker container:
```yaml
services:
  cerebro:
    image: professorlow/cerebro:latest
    environment:
      - CEREBRO_LICENSE_KEY=your-license-key
      - CEREBRO_STORAGE_PATH=/data
    volumes:
      - cerebro-data:/data
    ports:
      - "8420:8420"
    restart: unless-stopped

volumes:
  cerebro-data:
```
Start with docker compose up -d. The container exposes port 8420 for the MCP server.
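Once the stack is running, standard Docker Compose commands can be used to confirm the service is up and to inspect its output (the cerebro service name comes from the compose file above):
```bash
docker compose up -d             # start the container in the background
docker compose ps                # confirm the cerebro service is running
docker compose logs -f cerebro   # follow the server logs
```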