Connect 20+ LLM providers with your data through one API.
ORBIT is a self-hosted gateway that unifies LLMs, files, databases, and APIs behind one MCP endpoint, letting teams standardize on a single integration surface while keeping security, compliance, and operational controls in their own hands.
Try the Sandbox | API Reference | Docker Guide
(Demo video: orbit-demo-web.mp4)
Star ORBIT on GitHub to follow new adapters, releases, and production features.
Officially backed by Schmitech, the ORBIT service provider for enterprise deployment and support.
Real-world example: PoliceStats.ca uses ORBIT to power a public chat over Canadian municipal police open data. Users ask in plain language about auto theft, break-ins, crime by neighbourhood, and cross-city comparisons.
A) Try the hosted API now
curl -X POST https://orbit.schmitech.ai/v1/chat \
-H 'Content-Type: application/json' \
-H 'X-API-Key: default-key' \
-H 'X-Session-ID: test-session' \
-d '{
"messages": [{"role": "user", "content": "What is ORBIT?"}],
"stream": false
  }'

B) Run ORBIT locally with Docker Compose (recommended - includes Ollama)
git clone https://github.com/schmitech/orbit.git && cd orbit/docker
docker compose up -d
# Wait for services to start, then test
curl -X POST http://localhost:3000/v1/chat \
-H 'Content-Type: application/json' \
-H 'X-API-Key: default-key' \
-H 'X-Session-ID: local-test' \
-d '{
"messages": [{"role": "user", "content": "Summarize ORBIT in one sentence."}],
"stream": false
  }'

For GPU acceleration (NVIDIA): docker compose -f docker-compose.yml -f docker-compose.gpu.yml up -d
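The same request can be issued from Python using nothing but the standard library. The helper below builds the exact payload and headers the curl examples above send (the endpoint path, X-API-Key, and X-Session-ID headers come from those examples; the function name and defaults are illustrative):

```python
import json
import urllib.request

def build_chat_request(message, api_key="default-key", session_id="local-test",
                       base_url="http://localhost:3000"):
    """Build an urllib Request mirroring the curl examples above."""
    body = json.dumps({
        "messages": [{"role": "user", "content": message}],
        "stream": False,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/v1/chat",
        data=body,
        headers={
            "Content-Type": "application/json",
            "X-API-Key": api_key,
            "X-Session-ID": session_id,
        },
        method="POST",
    )

if __name__ == "__main__":
    # Requires a running ORBIT instance (option B above).
    req = build_chat_request("Summarize ORBIT in one sentence.")
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp))
```

Because the API is plain JSON over HTTP, the same shape works from any language or from an OpenAI-compatible SDK pointed at the gateway.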
C) Run ORBIT from the pre-built image (server only; point it at your own Ollama)
docker pull schmitech/orbit:basic
docker run -d --name orbit-basic -p 3000:3000 schmitech/orbit:basic

If Ollama runs on your host (e.g. port 11434), add -e OLLAMA_HOST=host.docker.internal:11434 so the container can reach it. The image includes simple-chat only; for the full stack (Ollama + models), use option B or the Docker Guide.
- Unified API: Switch between OpenAI, Anthropic, Gemini, Groq, and local models (Ollama/vLLM) with a config change.
- Agentic AI & MCP: Compatible with Model Context Protocol (MCP) for tool-enabled agent workflows.
- Native RAG: Connect Postgres, MongoDB, Elasticsearch, or Pinecone for natural-language data access.
- Voice-First: Real-time, full-duplex speech-to-speech with interruption handling via PersonaPlex.
- Governance Built In: RBAC, rate limiting, audit logging, and circuit breakers.
- Privacy First: Self-host on your own infrastructure for full data control.
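Governance features such as rate limiting are enforced at the gateway before a request ever reaches a provider. As a rough illustration of the concept (this is a generic token-bucket sketch, not ORBIT's implementation), a per-key limiter can be as small as:

```python
import time

class TokenBucket:
    """Token-bucket limiter: sustains `rate` requests/sec, bursts up to `capacity`."""

    def __init__(self, rate, capacity, now=time.monotonic):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.now = now          # injectable clock, handy for testing
        self.last = now()

    def allow(self):
        t = self.now()
        # Refill tokens for elapsed time, capped at the bucket capacity.
        self.tokens = min(self.capacity, self.tokens + (t - self.last) * self.rate)
        self.last = t
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

A gateway typically keeps one bucket per API key and answers HTTP 429 when `allow()` returns false; circuit breakers follow a similar pre-flight pattern.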
| If you use... | You often get... | ORBIT gives you... |
|---|---|---|
| Single-provider SDKs | Vendor lock-in and provider-specific rewrites | One OpenAI-compatible API across providers |
| Basic LLM proxy only | Model routing, but no data connectivity | Unified model + retrieval + tooling gateway |
| RAG-only framework | Strong retrieval, weak multi-provider inference control | Native RAG with multi-provider and policy controls |
| In-house glue scripts | Fragile integrations and high ops cost | A production-ready gateway with RBAC, limits, and logs |
- Deployment Flexibility: Run ORBIT in your own environment for strict data-boundary requirements.
- Operational Control: Standardize access, traffic policies, and audit trails behind one gateway.
- Architecture Fit: Integrates with existing data systems, identity patterns, and model providers.
- Service Backing: Schmitech provides enterprise onboarding, deployment support, and ongoing operations guidance.
- Enterprise RAG: Query SQL, NoSQL, and vector stores with one natural-language API.
- Provider Failover: Route between Ollama, vLLM, OpenAI, Anthropic, Gemini, Groq, etc. without rewrites.
- Voice Agents: Build full-duplex speech-to-speech experiences with interruption handling.
- MCP Tooling Layer: Expose data and actions to agentic apps through MCP compatibility.
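Provider failover boils down to trying providers in a configured order and moving on when one errors. A hedged sketch of that control flow (the provider names and call signature below are illustrative, not ORBIT's API):

```python
def chat_with_failover(providers, prompt):
    """Try each (name, call) provider in order; return the first success.

    `call` is any function prompt -> reply that raises on failure.
    """
    errors = {}
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as exc:  # a real gateway would distinguish retryable errors
            errors[name] = exc
    raise RuntimeError(f"all providers failed: {errors}")
```

With the routing behind the gateway, clients keep the same request shape regardless of which backend ultimately answers.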
| Client | Link | Description |
|---|---|---|
| Web Chat | ORBIT Chat | React UI. |
| CLI | pip install schmitech-orbit-client | Chat directly from your terminal. |
| Mobile | ORBIT Mobile | iOS & Android app built with Expo. |
| SDKs | Node SDK | Or use any standard OpenAI-compatible SDK. |
Docker Compose (Fastest Path)
git clone https://github.com/schmitech/orbit.git && cd orbit/docker
docker compose up -d

This starts ORBIT + Ollama with SmolLM2, auto-pulls models, and exposes the API on port 3000. Connect orbitchat from your host: ORBIT_ADAPTER_KEYS='{"simple-chat":"default-key"}' npx orbitchat
Pre-built image only (server + your own Ollama): docker pull schmitech/orbit:basic then docker run -d --name orbit-basic -p 3000:3000 -e OLLAMA_HOST=host.docker.internal:11434 schmitech/orbit:basic if Ollama runs on the host.
See the full Docker Guide for GPU mode, volumes, single-container run, and configuration.
Stable Release (Recommended for Production)
curl -L https://github.com/schmitech/orbit/releases/download/v2.6.3/orbit-2.6.3.tar.gz -o orbit-2.6.3.tar.gz
tar -xzf orbit-2.6.3.tar.gz && cd orbit-2.6.3
cp env.example .env && ./install/setup.sh
source venv/bin/activate
./bin/orbit.sh start && cat ./logs/orbit.log

- Frequent releases: Releases
- Active roadmap and Q&A: Discussions
- Feature requests and bugs: Issues
- Technical writeups: Cookbook - recipes and how-tos
- Enterprise services: Official ORBIT provider (Schmitech)
Inference: OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, Mistral, AWS Bedrock, Azure, Together, Ollama, vLLM, llama.cpp.
Data Adapters: PostgreSQL, MySQL, MongoDB, Elasticsearch, DuckDB, Chroma, Qdrant, Pinecone, Milvus, Weaviate.
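At its core, a RAG request against any of these adapters retrieves matching records and folds them into the model prompt. A minimal, adapter-agnostic sketch of that flow (the in-memory list stands in for Postgres/Elasticsearch/a vector store, and none of the names below are ORBIT's internal API):

```python
def retrieve(store, question, k=2):
    """Naive keyword retrieval: rank rows by word overlap with the question."""
    words = set(question.lower().split())
    ranked = sorted(store, key=lambda row: -len(words & set(row.lower().split())))
    return ranked[:k]

def build_rag_prompt(store, question):
    """Fold the top-k retrieved rows into a grounded prompt for the LLM."""
    context = "\n".join(retrieve(store, question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"
```

A production adapter replaces `retrieve` with a SQL query, full-text search, or vector similarity lookup, but the prompt-assembly step stays the same.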
- Step-by-Step Tutorial - Learn how to chat with your own data in minutes.
- Cookbook - Recipes and how-tos for configuration and real-world use cases.
- Documentation - Full architecture and setup guides.
- GitHub Issues - Bug reports and feature requests.
- Discussions - Community help and roadmap.
- Enterprise Services - Backed by Schmitech for onboarding, deployment, and production support.
- Good First Issues - Starter tasks for new contributors.
- Help Wanted - High-impact tasks where contributions are needed.
Help ORBIT grow: Star the repo to support the project and get notified of new adapters!
Apache 2.0 - see LICENSE.