simstudioai

    simstudioai/sim

    Build, deploy, and orchestrate AI agents. Sim is the central intelligence layer for your AI workforce.

    ai
    automation
    low-code
    web
    llm
    frontend
    agent-workflow
    agentic-workflow
    agents
    aiagents
    anthropic
    artificial-intelligence
    chatbot
    deepseek
    gemini
    nextjs
    no-code
    openai
    rag
    react
    typescript
    TypeScript
    Apache-2.0
    26.6K stars
    3.3K forks
    Updated 2/27/2026


    About sim

    Sim Logo

    Build and deploy AI agent workflows in minutes.

    Sim.ai Discord Twitter Documentation

    Sim Demo

    Quickstart

    Cloud-hosted: sim.ai


    Self-hosted: NPM Package

    npx simstudio
    

    http://localhost:3000

    Note

    Docker must be installed and running on your machine.

    Options

    Flag | Description
    -p, --port <port> | Port to run Sim on (default: 3000)
    --no-pull | Skip pulling the latest Docker images
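    For example, to run on a different port while reusing images already on your machine, the two flags above can be combined (a sketch; assumes Docker is running and the Sim images are already present):

```shell
# Run Sim on port 3001 and skip pulling newer Docker images
npx simstudio --port 3001 --no-pull
```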

    Self-hosted: Docker Compose

    # Clone the repository
    git clone https://github.com/simstudioai/sim.git
    
    # Navigate to the project directory
    cd sim
    
    # Start Sim
    docker compose -f docker-compose.prod.yml up -d
    

    Access the application at http://localhost:3000/
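    Once the stack is up, the usual Docker Compose commands manage it. These are standard Compose subcommands, not Sim-specific tooling:

```shell
# Tail logs from all Sim services
docker compose -f docker-compose.prod.yml logs -f

# Stop and remove the containers (named volumes, including the database, are kept)
docker compose -f docker-compose.prod.yml down
```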

    Using Local Models with Ollama

    Run Sim with local AI models using Ollama - no external APIs required:

    # Start with GPU support (automatically downloads gemma3:4b model)
    docker compose -f docker-compose.ollama.yml --profile setup up -d
    
    # For CPU-only systems:
    docker compose -f docker-compose.ollama.yml --profile cpu --profile setup up -d
    

    Wait for the model to download, then visit http://localhost:3000. Add more models with:

    docker compose -f docker-compose.ollama.yml exec ollama ollama pull llama3.1:8b
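    To check which models are already available to Sim, list what the Ollama container has downloaded (a standard ollama CLI subcommand):

```shell
# Show models currently pulled into the Ollama container
docker compose -f docker-compose.ollama.yml exec ollama ollama list
```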
    

    Self-hosted: Dev Containers

    1. Open VS Code with the Remote - Containers extension
    2. Open the project and click "Reopen in Container" when prompted
    3. Run bun run dev:full in the terminal or use the sim-start alias
      • This starts both the main application and the realtime socket server

    Self-hosted: Manual Setup

    Requirements:

    • Bun runtime
    • PostgreSQL with the pgvector extension (the Docker option below provides this)

    Note: Sim uses vector embeddings for AI features such as knowledge bases and semantic search, which require the pgvector PostgreSQL extension.

    1. Clone and install dependencies:
    git clone https://github.com/simstudioai/sim.git
    cd sim
    bun install
    
    2. Set up PostgreSQL with pgvector:

    You need PostgreSQL with the vector extension for embedding support. Choose one option:

    Option A: Using Docker (Recommended)

    # Start PostgreSQL with pgvector extension
    docker run --name simstudio-db \
      -e POSTGRES_PASSWORD=your_password \
      -e POSTGRES_DB=simstudio \
      -p 5432:5432 -d \
      pgvector/pgvector:pg17
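    Before pointing Sim at the database, you can confirm the extension is available. A sketch using the container name from above and standard psql flags:

```shell
# Enable pgvector (idempotent) and print its version
docker exec simstudio-db psql -U postgres -d simstudio \
  -c "CREATE EXTENSION IF NOT EXISTS vector;" \
  -c "SELECT extversion FROM pg_extension WHERE extname = 'vector';"
```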
    

    Option B: Manual Installation

    Install PostgreSQL and the pgvector extension using your platform's package manager, then create a database named simstudio.

    3. Set up environment:
    cd apps/sim
    cp .env.example .env  # Configure with required variables (DATABASE_URL, BETTER_AUTH_SECRET, BETTER_AUTH_URL)
    

    Update your .env file with the database URL:

    DATABASE_URL="postgresql://postgres:your_password@localhost:5432/simstudio"
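    A quick sanity check before running migrations can catch a malformed connection string early. A minimal sketch; the password is a placeholder:

```shell
# Placeholder credentials; substitute your own
export DATABASE_URL="postgresql://postgres:your_password@localhost:5432/simstudio"

# Verify the URL has the expected scheme, host, port, and database parts
case "$DATABASE_URL" in
  postgresql://*@*:*/*) echo "DATABASE_URL format looks OK" ;;
  *) echo "DATABASE_URL is malformed" >&2; exit 1 ;;
esac
```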
    
    4. Set up the database:
    bunx drizzle-kit migrate
    
    5. Start the development servers:

    Recommended approach - run both servers together (from project root):

    bun run dev:full
    

    This starts both the main Next.js application and the realtime socket server required for full functionality.

    Alternative - run servers separately:

    Next.js app (from project root):

    bun run dev
    

    Realtime socket server (from apps/sim directory in a separate terminal):

    cd apps/sim
    bun run dev:sockets
    

    Copilot API Keys

    Copilot is a Sim-managed service. To use Copilot on a self-hosted instance:

    • Go to https://sim.ai → Settings → Copilot and generate a Copilot API key
    • Set COPILOT_API_KEY in your self-hosted environment to that value
    • Host Sim at a publicly reachable URL (for example, via an ngrok tunnel) and set NEXT_PUBLIC_APP_URL and BETTER_AUTH_URL to that URL
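    The settings above can be expressed as environment variables. A sketch with placeholder values (the key and domain are hypothetical):

```shell
# Placeholder values; generate the real key at https://sim.ai under Settings -> Copilot
export COPILOT_API_KEY="sk-sim-your-copilot-key"
export NEXT_PUBLIC_APP_URL="https://sim.example.com"
export BETTER_AUTH_URL="https://sim.example.com"

# Both URLs must point at the same publicly reachable host
if [ "$NEXT_PUBLIC_APP_URL" != "$BETTER_AUTH_URL" ]; then
  echo "NEXT_PUBLIC_APP_URL and BETTER_AUTH_URL must match" >&2
  exit 1
fi
echo "Copilot environment configured"
```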

    Tech Stack

    • Next.js (TypeScript)
    • Bun runtime and package manager
    • PostgreSQL with the pgvector extension
    • Drizzle ORM (drizzle-kit migrations)
    • Better Auth
    • Realtime socket server
    • Docker / Docker Compose for deployment

    Contributing

    We welcome contributions! Please see our Contributing Guide for details.

    License

    This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

    Made with ❤️ by the Sim Team
