# Self-Hosting
ThinkHive can be self-hosted on your own infrastructure for maximum control over data residency, compliance, and customization.
Self-hosting is recommended for organizations with strict data sovereignty requirements. For most users, the managed service at app.thinkhive.ai is the easiest option.
## Requirements
| Component | Minimum | Recommended |
|---|---|---|
| Node.js | 20+ | 22 LTS |
| PostgreSQL | 15+ | 16 (Neon serverless recommended) |
| Memory | 512 MB | 1 GB |
| CPU | 1 vCPU | 2 vCPU |
| Storage | 10 GB | 50 GB+ (depends on trace volume) |
### Optional Services
| Service | Purpose | Required? |
|---|---|---|
| Redis | Background job processing (BullMQ) | Optional |
| Pinecone | Vector search for semantic features | Optional (pgvector alternative) |
| Stripe | Credit-based billing | Optional |
| Auth0 | Enterprise SSO authentication | Optional (DEMO_MODE for local) |
## Docker Deployment

### Pull or build the image

```bash
# Build from source
git clone https://github.com/thinkhive/thinkhivemind.git
cd thinkhivemind
docker build -t thinkhive .
```

### Configure environment
Create a .env file with your configuration:
```bash
# Required
DATABASE_URL=postgresql://user:password@your-db-host:5432/thinkhive?sslmode=require
PORT=5001
NODE_ENV=production
SESSION_SECRET=your-secure-session-secret-min-32-chars
API_KEY_SECRET=your-secure-api-key-secret-min-32-chars

# Authentication (choose one)
DEMO_MODE=false
AUTH0_DOMAIN=your-tenant.auth0.com
AUTH0_CLIENT_ID=your_client_id
AUTH0_CLIENT_SECRET=your_client_secret
AUTH0_AUDIENCE=https://your-api-audience

# Optional: AI Services
OPENAI_API_KEY=sk-your-openai-key
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key

# Optional: Redis for background jobs
REDIS_URL=redis://your-redis-host:6379
```

Generate unique secrets for SESSION_SECRET and API_KEY_SECRET. Never reuse secrets across environments. Use a cryptographically secure generator (e.g., openssl rand -hex 32).
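For example, the two secrets can be generated and appended to your `.env` in one step (a minimal sketch; it assumes `openssl` is installed and that `.env` is in the current directory):

```shell
# Generate two independent 64-character hex secrets
SESSION_SECRET=$(openssl rand -hex 32)
API_KEY_SECRET=$(openssl rand -hex 32)

# Append them to the .env file
printf 'SESSION_SECRET=%s\nAPI_KEY_SECRET=%s\n' \
  "$SESSION_SECRET" "$API_KEY_SECRET" >> .env
```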
### Run the container

```bash
docker run -d \
  --name thinkhive \
  -p 5001:5001 \
  --env-file .env \
  --restart unless-stopped \
  thinkhive
```

### Initialize the database
```bash
# Push schema
docker exec thinkhive npm run db:push

# Seed demo data (optional)
docker exec thinkhive npm run db:seed
```

### Verify
```bash
# Health check
curl http://localhost:5001/health/ready

# Should return:
# { "status": "ok", "checks": { "database": "ok" } }
```

## Docker Compose
For a complete setup with PostgreSQL and Redis:
```yaml
# docker-compose.yml
version: '3.8'

services:
  thinkhive:
    build: .
    ports:
      - "5001:5001"
    environment:
      DATABASE_URL: postgresql://thinkhive:password@postgres:5432/thinkhive
      PORT: "5001"
      NODE_ENV: production
      SESSION_SECRET: ${SESSION_SECRET}
      API_KEY_SECRET: ${API_KEY_SECRET}
      REDIS_URL: redis://redis:6379
      DEMO_MODE: "true"
    depends_on:
      - postgres
      - redis
    restart: unless-stopped

  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_DB: thinkhive
      POSTGRES_USER: thinkhive
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    volumes:
      - postgres_data:/var/lib/postgresql/data
    restart: unless-stopped

  redis:
    image: redis:7-alpine
    restart: unless-stopped

volumes:
  postgres_data:
```

```bash
# Start all services
docker compose up -d

# Initialize database
docker compose exec thinkhive npm run db:push
```

## Cloud Deployment
### Google Cloud Run
ThinkHive is optimized for Google Cloud Run:
```bash
# Submit build to Cloud Build
gcloud builds submit --config=cloudbuild.yaml

# Or deploy directly
gcloud run deploy thinkhive \
  --image gcr.io/YOUR_PROJECT/thinkhive:latest \
  --region us-central1 \
  --platform managed \
  --allow-unauthenticated \
  --memory 512Mi \
  --cpu 1 \
  --port 5001
```

Set non-sensitive configuration as Cloud Run environment variables, and store sensitive values such as SESSION_SECRET and API_KEY_SECRET as secrets.
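One way to wire in a sensitive value is via Secret Manager; a sketch (the secret name `thinkhive-session-secret` is illustrative, and the service account running the revision needs the `roles/secretmanager.secretAccessor` role):

```bash
# Store a freshly generated session secret in Secret Manager
openssl rand -hex 32 | gcloud secrets create thinkhive-session-secret --data-file=-

# Expose it to the running service as an environment variable
gcloud run services update thinkhive \
  --region us-central1 \
  --set-secrets SESSION_SECRET=thinkhive-session-secret:latest
```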
### Railway
Deploy directly from your GitHub repository through the Railway dashboard. Railway auto-detects the Dockerfile and handles deployment.
## Database Setup
ThinkHive requires PostgreSQL 15+. Options:
| Provider | Type | Best For |
|---|---|---|
| Neon | Serverless | Cloud deployments, auto-scaling |
| Local PostgreSQL | Self-managed | Full control, on-premise |
| AWS RDS | Managed | AWS infrastructure |
| Google Cloud SQL | Managed | GCP infrastructure |
After setting DATABASE_URL, initialize the schema:
```bash
npm run db:push      # Push schema (development)
npm run db:migrate   # Run migrations (production)
```

## Health Checks
Configure your load balancer or orchestrator to use these endpoints:
| Endpoint | Purpose | Checks |
|---|---|---|
| `/health/live` | Liveness probe | Process is running |
| `/health/ready` | Readiness probe | Database connected, services ready |
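With the Docker Compose setup above, for instance, the liveness endpoint could back a container healthcheck like the following sketch (intervals are illustrative, and it assumes `curl` is available inside the image):

```yaml
services:
  thinkhive:
    healthcheck:
      test: ["CMD", "curl", "-fsS", "http://localhost:5001/health/live"]
      interval: 30s
      timeout: 5s
      retries: 3
      start_period: 15s
```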
## Updating
To update a self-hosted instance:
```bash
# Pull latest changes
git pull origin main

# Rebuild
docker build -t thinkhive .

# Run migrations
docker exec thinkhive npm run db:migrate

# Restart
docker restart thinkhive
```

## Best Practices
- Use managed databases (Neon, RDS, Cloud SQL) for automatic backups and failover
- Set up health check monitoring to detect issues early
- Configure log aggregation — ThinkHive uses Winston for structured logging
- Run database migrations before deploying new versions
- Back up your database regularly — see the Deployment guide for backup strategies
- Keep secrets in a secret manager; avoid `.env` files in production
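For self-managed databases, a simple logical backup can be scripted along these lines (a sketch only; it assumes `pg_dump` is installed, `DATABASE_URL` is set, and the paths and 14-day retention are illustrative — see the Deployment guide for full strategies):

```bash
# Daily logical backup, compressed and dated
BACKUP_FILE="thinkhive-$(date +%Y%m%d).sql.gz"
pg_dump "$DATABASE_URL" | gzip > "$BACKUP_FILE"

# Prune backups older than 14 days
find . -name 'thinkhive-*.sql.gz' -mtime +14 -delete
```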
## Next Steps
- Deployment — Production deployment best practices
- API Key Management — Configure API access for your instance
- PII & Compliance — Data handling configuration