
Self-Hosting

ThinkHive can be self-hosted on your own infrastructure for maximum control over data residency, compliance, and customization.

Self-hosting is recommended for organizations with strict data sovereignty requirements. For most users, the managed service at app.thinkhive.ai is the easiest option.

Requirements

| Component  | Minimum | Recommended                      |
| ---------- | ------- | -------------------------------- |
| Node.js    | 20+     | 22 LTS                           |
| PostgreSQL | 15+     | 16 (Neon serverless recommended) |
| Memory     | 512 MB  | 1 GB                             |
| CPU        | 1 vCPU  | 2 vCPU                           |
| Storage    | 10 GB   | 50 GB+ (depends on trace volume) |

Optional Services

| Service  | Purpose                            | Required?                      |
| -------- | ---------------------------------- | ------------------------------ |
| Redis    | Background job processing (BullMQ) | Optional                       |
| Pinecone | Vector search for semantic features | Optional (pgvector alternative) |
| Stripe   | Credit-based billing               | Optional                       |
| Auth0    | Enterprise SSO authentication      | Optional (DEMO_MODE for local) |

Docker Deployment

Pull or build the image

# Build from source
git clone https://github.com/thinkhive/thinkhivemind.git
cd thinkhivemind
docker build -t thinkhive .

Configure environment

Create a .env file with your configuration:

# Required
DATABASE_URL=postgresql://user:password@your-db-host:5432/thinkhive?sslmode=require
PORT=5001
NODE_ENV=production
SESSION_SECRET=your-secure-session-secret-min-32-chars
API_KEY_SECRET=your-secure-api-key-secret-min-32-chars
 
# Authentication (choose one)
DEMO_MODE=false
AUTH0_DOMAIN=your-tenant.auth0.com
AUTH0_CLIENT_ID=your_client_id
AUTH0_CLIENT_SECRET=your_client_secret
AUTH0_AUDIENCE=https://your-api-audience
 
# Optional: AI Services
OPENAI_API_KEY=sk-your-openai-key
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key
 
# Optional: Redis for background jobs
REDIS_URL=redis://your-redis-host:6379

Generate unique secrets for SESSION_SECRET and API_KEY_SECRET. Never reuse secrets across environments. Use a cryptographically secure generator (e.g., openssl rand -hex 32).
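For example, a minimal sketch that generates two independent 256-bit secrets with openssl and appends them to your .env file:

```shell
# Generate two independent 256-bit (64 hex char) secrets
SESSION_SECRET=$(openssl rand -hex 32)
API_KEY_SECRET=$(openssl rand -hex 32)

# Append them to .env (run once; remove any existing entries first)
printf 'SESSION_SECRET=%s\nAPI_KEY_SECRET=%s\n' "$SESSION_SECRET" "$API_KEY_SECRET" >> .env
```

Because each secret is generated independently, the two values will differ, and re-running the commands for another environment produces fresh ones.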

Run the container

docker run -d \
  --name thinkhive \
  -p 5001:5001 \
  --env-file .env \
  --restart unless-stopped \
  thinkhive

Initialize the database

# Push schema
docker exec thinkhive npm run db:push
 
# Seed demo data (optional)
docker exec thinkhive npm run db:seed

Verify

# Health check
curl http://localhost:5001/health/ready
 
# Should return:
# { "status": "ok", "checks": { "database": "ok" } }

Docker Compose

For a complete setup with PostgreSQL and Redis:

# docker-compose.yml
 
services:
  thinkhive:
    build: .
    ports:
      - "5001:5001"
    environment:
      DATABASE_URL: postgresql://thinkhive:password@postgres:5432/thinkhive
      PORT: "5001"
      NODE_ENV: production
      SESSION_SECRET: ${SESSION_SECRET}
      API_KEY_SECRET: ${API_KEY_SECRET}
      REDIS_URL: redis://redis:6379
      DEMO_MODE: "true"
    depends_on:
      - postgres
      - redis
    restart: unless-stopped
 
  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_DB: thinkhive
      POSTGRES_USER: thinkhive
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    volumes:
      - postgres_data:/var/lib/postgresql/data
    restart: unless-stopped
 
  redis:
    image: redis:7-alpine
    restart: unless-stopped
 
volumes:
  postgres_data:

# Start all services
docker compose up -d
 
# Initialize database
docker compose exec thinkhive npm run db:push

Cloud Deployment

Google Cloud Run

ThinkHive is optimized for Google Cloud Run:

# Submit build to Cloud Build
gcloud builds submit --config=cloudbuild.yaml
 
# Or deploy directly
gcloud run deploy thinkhive \
  --image gcr.io/YOUR_PROJECT/thinkhive:latest \
  --region us-central1 \
  --platform managed \
  --allow-unauthenticated \
  --memory 512Mi \
  --cpu 1 \
  --port 5001

Set environment variables as Cloud Run secrets or environment variables.
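As a sketch, a secret can be stored in Secret Manager and injected as an environment variable (the secret name thinkhive-session-secret is an assumption; adjust to your own naming scheme):

```shell
# Create the secret in Secret Manager (name is a placeholder)
openssl rand -hex 32 | gcloud secrets create thinkhive-session-secret --data-file=-

# Wire it into the Cloud Run service as SESSION_SECRET
gcloud run services update thinkhive \
  --region us-central1 \
  --update-secrets SESSION_SECRET=thinkhive-session-secret:latest
```

The Cloud Run service account needs the Secret Manager Secret Accessor role on the secret for this to work.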

Railway

Deploy directly from your GitHub repository through the Railway dashboard. Railway auto-detects the Dockerfile and handles deployment.

Database Setup

ThinkHive requires PostgreSQL 15+. Options:

| Provider         | Type         | Best For                        |
| ---------------- | ------------ | ------------------------------- |
| Neon             | Serverless   | Cloud deployments, auto-scaling |
| Local PostgreSQL | Self-managed | Full control, on-premise        |
| AWS RDS          | Managed      | AWS infrastructure              |
| Google Cloud SQL | Managed      | GCP infrastructure              |

After setting DATABASE_URL, initialize the schema:

npm run db:push     # Push schema (development)
npm run db:migrate  # Run migrations (production)

Health Checks

Configure your load balancer or orchestrator to use these endpoints:

| Endpoint      | Purpose         | Checks                            |
| ------------- | --------------- | --------------------------------- |
| /health/live  | Liveness probe  | Process is running                |
| /health/ready | Readiness probe | Database connected, services ready |
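On Kubernetes, for example, the two endpoints might be wired up like this (a sketch; the port matches the default above, but tune the delays and periods to your environment):

```yaml
# Probe configuration for a ThinkHive container (sketch)
livenessProbe:
  httpGet:
    path: /health/live
    port: 5001
  initialDelaySeconds: 10
  periodSeconds: 15
readinessProbe:
  httpGet:
    path: /health/ready
    port: 5001
  initialDelaySeconds: 5
  periodSeconds: 10
```

Keeping the readiness probe stricter than liveness means a lost database connection takes the instance out of rotation without restarting the process.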

Updating

To update a self-hosted instance:

# Pull latest changes
git pull origin main
 
# Rebuild the image
docker build -t thinkhive .
 
# Run migrations from the new image before swapping traffic
docker run --rm --env-file .env thinkhive npm run db:migrate
 
# Recreate the container (a plain `docker restart` would keep the old image)
docker stop thinkhive && docker rm thinkhive
docker run -d \
  --name thinkhive \
  -p 5001:5001 \
  --env-file .env \
  --restart unless-stopped \
  thinkhive

Best Practices

  • Use managed databases (Neon, RDS, Cloud SQL) for automatic backups and failover
  • Set up health check monitoring to detect issues early
  • Configure log aggregation — ThinkHive uses Winston for structured logging
  • Run database migrations before deploying new versions
  • Back up your database regularly — see the Deployment guide for backup strategies
  • Keep secrets in a secret manager — avoid .env files in production
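The backup bullet above can be sketched as a crontab entry, assuming pg_dump is installed and a /backups directory exists (both are placeholders):

```shell
# crontab -e — nightly 02:00 logical backup with a date-stamped filename
0 2 * * * pg_dump "$DATABASE_URL" --format=custom --file=/backups/thinkhive-$(date +\%F).dump
```

The custom format allows selective restores with pg_restore; the % in the date format must be escaped in crontab, as shown.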

Next Steps