Chat-Ollama is an open-source AI chatbot platform that prioritizes data privacy and security while providing the capabilities of cutting-edge language models. With 3.3k+ stars on GitHub and active development, the project offers a complete solution for secure AI conversations in local environments.
# Check Node.js installation (v18+ recommended)
node --version
# Install pnpm
npm install -g pnpm
# Check Git installation
git --version
# Docker installation (optional, for easy deployment)
docker --version
docker-compose --version
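If you want to verify the Node.js requirement in a script rather than by eye, a small helper can pull the major version out of the `node --version` output. This is only a sketch; the `node_major` function name is ours, not part of Chat-Ollama:

```shell
#!/usr/bin/env bash
# Extract the major version number from a "vX.Y.Z" string,
# e.g. the output of `node --version`.
node_major() {
  local v="${1#v}"          # strip the leading "v"
  printf '%s\n' "${v%%.*}"  # keep everything before the first dot
}

# Example: warn early if Node.js is older than v18.
if command -v node >/dev/null 2>&1; then
  major="$(node_major "$(node --version)")"
  [ "$major" -ge 18 ] || echo "Node.js v18+ is required (found v$major)" >&2
fi
```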
# Clone the project
git clone https://github.com/sugarforever/chat-ollama.git
cd chat-ollama
# Install dependencies
pnpm install
# Setup environment variables
cp .env.example .env
Edit the .env file to add the necessary configuration:
# Database settings
DATABASE_URL="file:../../chatollama.sqlite"
# Server settings
PORT=3000
HOST=
# Vector database settings
VECTOR_STORE=chroma
CHROMADB_URL=http://localhost:8000
# AI model API keys (optional)
OPENAI_API_KEY=your_openai_key
ANTHROPIC_API_KEY=your_anthropic_key
GOOGLE_API_KEY=your_gemini_key
GROQ_API_KEY=your_groq_key
# Feature flags
MCP_ENABLED=true
KNOWLEDGE_BASE_ENABLED=true
REALTIME_CHAT_ENABLED=true
MODELS_MANAGEMENT_ENABLED=true
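Before starting the server, it can help to fail fast when a required variable is missing instead of debugging a half-configured instance. A minimal sketch (the `require_env` helper is ours, not part of Chat-Ollama):

```shell
#!/usr/bin/env bash
# Return 0 if every named environment variable is set and non-empty,
# otherwise print the missing names and return 1.
require_env() {
  local missing=0 name
  for name in "$@"; do
    if [ -z "${!name}" ]; then
      echo "missing required variable: $name" >&2
      missing=1
    fi
  done
  return "$missing"
}

# Example: the database URL and vector store must be configured.
# require_env DATABASE_URL VECTOR_STORE || exit 1
```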
# Generate Prisma client
pnpm prisma-generate
# Run database migrations
pnpm prisma-migrate
# Run ChromaDB Docker container
docker run -d -p 8000:8000 --name chromadb chromadb/chroma
# Verify container is running
curl http://localhost:8000/api/v1/version
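ChromaDB can take a few seconds to come up after the container starts, so a single curl may fail spuriously. A generic retry loop is more robust; this is a sketch, and the `wait_for` name is ours:

```shell
#!/usr/bin/env bash
# Retry a command up to $1 times with a 1-second pause between tries,
# returning 0 on the first success and 1 if every attempt fails.
wait_for() {
  local attempts="$1"; shift
  local i
  for ((i = 0; i < attempts; i++)); do
    if "$@" >/dev/null 2>&1; then
      return 0
    fi
    sleep 1
  done
  return 1
}

# Example: wait up to 30 seconds for ChromaDB to answer.
# wait_for 30 curl -fsS http://localhost:8000/api/v1/version
```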
# Start server in development mode
pnpm dev
# Access http://localhost:3000 in browser
Docker allows you to run Chat-Ollama without complex configuration.
# Run from project directory
docker-compose up -d
# Check service status
docker-compose ps
# View logs
docker-compose logs chatollama
Configure environment variables in the docker-compose.yml file:
services:
  chatollama:
    environment:
      - NUXT_MCP_ENABLED=true
      - NUXT_KNOWLEDGE_BASE_ENABLED=true
      - NUXT_REALTIME_CHAT_ENABLED=true
      - NUXT_MODELS_MANAGEMENT_ENABLED=true
      - OPENAI_API_KEY=your_key_here
      - ANTHROPIC_API_KEY=your_key_here
In Docker deployment, data is stored as follows: the SQLite database at ~/.chatollama/chatollama.sqlite, and ChromaDB vector data in the chromadb_volume Docker volume.
# Install Ollama on macOS
curl -fsSL https://ollama.com/install.sh | sh
# Or use Homebrew
brew install ollama
# Start Ollama service
ollama serve
# Install popular models
ollama pull llama3.1:8b
ollama pull codellama:13b
ollama pull mistral:7b
ollama pull qwen2.5:14b
# Check installed models
ollama list
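If you script model installation, checking `ollama list` before pulling keeps the script idempotent. A sketch (the `has_model` helper is ours; it reads the listing on stdin so it can be tested without Ollama installed):

```shell
#!/usr/bin/env bash
# Read `ollama list` output on stdin and return 0 if the named
# model appears in the first column.
has_model() {
  awk -v m="$1" '$1 == m { found = 1 } END { exit !found }'
}

# Example: pull a model only if it is not installed yet.
# ollama list | has_model llama3.1:8b || ollama pull llama3.1:8b
```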
# Test a model
ollama run llama3.1:8b "Hello, please respond in English"
In the web interface, open Settings, set the Ollama endpoint to http://localhost:11434, select a model such as llama3.1:8b, and start a conversation from the Chat page.
MCP enables AI models to access external tools and data sources.
Navigate to Settings → MCP in the web interface to add servers:
Filesystem server:
  Name: Filesystem Tools
  Transport: stdio
  Command: uvx
  Args: mcp-server-filesystem
  Environment Variables:
    PATH: ${PATH}

Git server:
  Name: Git Tools
  Transport: stdio
  Command: uvx
  Args: mcp-server-git
  Environment Variables:
    PATH: ${PATH}
# File system manipulation
uvx mcp-server-filesystem
# Git repository management
uvx mcp-server-git
# SQLite database queries
uvx mcp-server-sqlite
# Web search (Brave Search)
uvx mcp-server-brave-search
Once MCP is activated, AI models can automatically use tools during conversations:
When creating a knowledge base, specify a name (e.g., Company Documents), a chunk size (e.g., 1000 characters), and a chunk overlap (e.g., 200 characters).
Supported file formats include plain text, PDF, and other common document types.
# Create sample document (for testing)
echo "Chat-Ollama is an open-source AI chatbot platform.
Built on Nuxt 3 and Vue 3,
it supports various AI models." > sample_doc.txt
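Knowledge bases split documents into overlapping windows (e.g., 1000-character chunks with a 200-character overlap) before embedding them. Chat-Ollama's actual splitter lives in its Node.js code; the toy bash sketch below only illustrates the sliding-window idea:

```shell
#!/usr/bin/env bash
# Split a string into fixed-size chunks where consecutive chunks
# share `overlap` characters, printing one chunk per line.
chunk_text() {
  local text="$1" size="$2" overlap="$3"
  local step=$((size - overlap)) i
  for ((i = 0; i < ${#text}; i += step)); do
    printf '%s\n' "${text:i:size}"
    # Stop once the window has reached the end of the text.
    if ((i + size >= ${#text})); then break; fi
  done
}

# Example with a chunk size of 4 and an overlap of 2:
chunk_text "abcdefghij" 4 2
# abcd
# cdef
# efgh
# ghij
```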
When documents are uploaded through the web interface, they are automatically chunked, embedded, and stored in the vector database.
Once knowledge bases are created, conversations reference relevant document content:
User: Please explain Chat-Ollama's technology stack.
AI: Based on the uploaded documents, Chat-Ollama is built on Nuxt 3 and Vue 3... [Document-based response]
# Add Google API key to .env file
GOOGLE_API_KEY=your_google_api_key_here
Access the /realtime page in the browser to use real-time voice chat.
For network environments that require a proxy:
# Proxy settings in .env file
NUXT_PUBLIC_MODEL_PROXY_ENABLED=true
NUXT_MODEL_PROXY_URL=http://127.0.0.1:1080
Use the Cohere Rerank API to improve search result accuracy:
# Add Cohere API key to .env file
COHERE_API_KEY=your_cohere_key
Selectively enable specific features:
# Development environment (.env)
MCP_ENABLED=true
KNOWLEDGE_BASE_ENABLED=true
REALTIME_CHAT_ENABLED=false
MODELS_MANAGEMENT_ENABLED=true
# Docker environment (docker-compose.yml)
NUXT_MCP_ENABLED=true
NUXT_KNOWLEDGE_BASE_ENABLED=true
NUXT_REALTIME_CHAT_ENABLED=false
NUXT_MODELS_MANAGEMENT_ENABLED=true
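Note that the Docker variables are the same flags with a NUXT_ prefix. If you keep one canonical flag list, a trivial helper can derive the Docker names so the two stay in sync (a sketch; the `to_nuxt_flag` name is ours):

```shell
#!/usr/bin/env bash
# Convert a plain feature-flag assignment (as used in .env) to the
# NUXT_-prefixed form used in docker-compose.yml.
to_nuxt_flag() {
  printf 'NUXT_%s\n' "$1"
}

# Example:
to_nuxt_flag "MCP_ENABLED=true"              # NUXT_MCP_ENABLED=true
to_nuxt_flag "REALTIME_CHAT_ENABLED=false"   # NUXT_REALTIME_CHAT_ENABLED=false
```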
PostgreSQL is recommended for production environments:
# Install PostgreSQL (macOS)
brew install postgresql
brew services start postgresql
# Create database and user
psql postgres
CREATE DATABASE chatollama;
CREATE USER chatollama WITH PASSWORD 'secure_password';
GRANT ALL PRIVILEGES ON DATABASE chatollama TO chatollama;
\q
# Update .env file
DATABASE_URL="postgresql://chatollama:secure_password@localhost:5432/chatollama"
# Run migrations
pnpm prisma migrate deploy
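Before running migrations, it can be worth sanity-checking the connection string. A sketch using bash parameter expansion to pull out the host and database name (the helper names are ours):

```shell
#!/usr/bin/env bash
# Extract the host and database name from a postgresql:// URL of the
# form postgresql://user:password@host:port/dbname
db_host() {
  local rest="${1#*@}"          # drop the scheme and credentials
  printf '%s\n' "${rest%%[:/]*}"  # keep everything before ":port" or "/db"
}
db_name() {
  printf '%s\n' "${1##*/}"      # everything after the last slash
}

url="postgresql://chatollama:secure_password@localhost:5432/chatollama"
db_host "$url"   # localhost
db_name "$url"   # chatollama
```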
# Backup existing SQLite data
cp chatollama.sqlite chatollama.sqlite.backup
# Run migration
pnpm migrate:sqlite-to-postgres
# Build for production
pnpm build
# Test in preview mode
pnpm preview
# Create plist file for LaunchDaemon
sudo tee /Library/LaunchDaemons/com.chatollama.plist << EOF
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.chatollama</string>
    <key>ProgramArguments</key>
    <array>
        <string>/usr/local/bin/node</string>
        <string>/path/to/chat-ollama/.output/server/index.mjs</string>
    </array>
    <key>WorkingDirectory</key>
    <string>/path/to/chat-ollama</string>
    <key>KeepAlive</key>
    <true/>
    <key>StandardOutPath</key>
    <string>/var/log/chatollama.log</string>
    <key>StandardErrorPath</key>
    <string>/var/log/chatollama.error.log</string>
</dict>
</plist>
EOF
# Register and start service
sudo launchctl load /Library/LaunchDaemons/com.chatollama.plist
sudo launchctl start com.chatollama
# Check processes using port
lsof -i :3000
# Run on different port
PORT=3001 pnpm dev
# Check ChromaDB container status
docker ps | grep chroma
# Restart container
docker restart chromadb
# Manually run container
docker run -d -p 8000:8000 --name chromadb chromadb/chroma
# Reset database
rm chatollama.sqlite
pnpm prisma migrate reset
# Create new migration
pnpm prisma migrate dev --name init
# Set Node.js memory limit
NODE_OPTIONS="--max_old_space_size=4096" pnpm dev
# Run ChromaDB with optimized settings
docker run -d -p 8000:8000 \
-e CHROMA_SERVER_HOST=0.0.0.0 \
-e CHROMA_SERVER_HTTP_PORT=8000 \
-v chroma-data:/chroma/chroma \
--name chromadb chromadb/chroma
# Set .env file permissions
chmod 600 .env
# Manage sensitive information via environment variables
export OPENAI_API_KEY="your-secret-key"
export ANTHROPIC_API_KEY="your-secret-key"
# Allow local access only
HOST=127.0.0.1 pnpm dev
# HTTPS configuration (production)
NUXT_PUBLIC_SITE_URL=https://your-domain.com
# Regular backup script
#!/bin/bash
BACKUP_DIR="/path/to/backups"
DATE=$(date +%Y%m%d_%H%M%S)
# SQLite backup
cp chatollama.sqlite "$BACKUP_DIR/chatollama_$DATE.sqlite"
# ChromaDB volume backup
docker run --rm -v chromadb_volume:/data -v "$BACKUP_DIR":/backup busybox tar czf "/backup/chromadb_$DATE.tar.gz" /data
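A backup script like this grows without bound, so pairing it with a retention step keeps disk usage flat. A sketch (the `prune_backups` helper and the 14-day retention period are our assumptions; adjust to taste):

```shell
#!/usr/bin/env bash
# Delete SQLite backups in a directory that are older than
# the given number of days.
prune_backups() {
  local dir="$1" days="$2"
  find "$dir" -name 'chatollama_*.sqlite' -type f -mtime +"$days" -delete
}

# Example: keep the last 14 days of backups.
# prune_backups /path/to/backups 14
```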
Chat-Ollama provides a complete solution that prioritizes privacy while offering powerful AI capabilities. This guide has covered everything from installation to advanced feature usage, with enough detail to tailor the setup to your own environment and requirements.
Build a secure and powerful AI assistant using Chat-Ollama. For questions or issues, please utilize GitHub Issues or the Discord community.