Sim Studio is an open-source AI agent workflow builder. Its lightweight, intuitive interface lets you quickly build and deploy LLM-powered agent workflows that connect with your favorite tools.
The easiest way to run Sim Studio locally is using our NPM package:
npx simstudio
After running this command, open http://localhost:3000/ in your browser.
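If port 3000 is already in use, you can pass a different one via the `--port` option documented below (8080 here is just an example):

```shell
npx simstudio --port 8080
```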
Options:

-p, --port <port>: Specify the port to run Sim Studio on (default: 3000)
--no-pull: Skip pulling the latest Docker images

To run Sim Studio with Docker Compose instead:

# Clone the repository
git clone https://github.com/simstudioai/sim.git
# Navigate to the project directory
cd sim
# Start Sim Studio
docker compose -f docker-compose.prod.yml up -d
Access the application at http://localhost:3000/
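If the page does not load, the container status and logs are the first place to look. This sketch assumes you are still in the repository root:

```shell
# Check that the containers are running
docker compose -f docker-compose.prod.yml ps

# Follow the application logs
docker compose -f docker-compose.prod.yml logs -f
```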
To use local models with Sim Studio:
./apps/sim/scripts/ollama_docker.sh pull <model_name>
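For example, to pull a specific model before starting the containers (the model name below is illustrative; any model from the Ollama library should work):

```shell
./apps/sim/scripts/ollama_docker.sh pull llama3.1:8b
```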
# With NVIDIA GPU support
docker compose --profile local-gpu -f docker-compose.ollama.yml up -d
# Without GPU (CPU only)
docker compose --profile local-cpu -f docker-compose.ollama.yml up -d
# If hosting on a server, set OLLAMA_URL in docker-compose.prod.yml to the server's public IP (e.g. OLLAMA_URL=http://1.1.1.1:11434), then start again
docker compose -f docker-compose.prod.yml up -d
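To verify that the Ollama container is reachable before pointing Sim Studio at it, you can query its API directly (this assumes Ollama's default port 11434; `/api/tags` lists the locally available models):

```shell
curl http://localhost:11434/api/tags
```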
Run bun run dev in the terminal, or use the sim-start alias.

To set up Sim Studio manually:

git clone https://github.com/simstudioai/sim.git
cd sim
bun install
cd apps/sim
cp .env.example .env # Configure with required variables (DATABASE_URL, BETTER_AUTH_SECRET, BETTER_AUTH_URL)
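A minimal .env might look like the following; the values are placeholders, not working credentials, and must be replaced with your own:

```shell
# PostgreSQL connection string (placeholder credentials)
DATABASE_URL=postgresql://user:password@localhost:5432/simstudio

# Secret used by Better Auth; generate your own long random value
BETTER_AUTH_SECRET=replace-with-a-long-random-string

# Base URL the auth layer should use in local development
BETTER_AUTH_URL=http://localhost:3000
```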
bunx drizzle-kit push # Apply the database schema
bun run dev
We welcome contributions! Please see our Contributing Guide for details.
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
Made with ❤️ by the Sim Studio Team