Synapse is a unified AI router for LLM, embeddings, image generation, MCP, STT, and TTS. Configure your provider keys once, then point any OpenAI or Anthropic SDK at your Synapse instance. Synapse handles provider selection, automatic failover, smart routing, rate limiting, MCP aggregation, and usage metering across LLM, embedding, image, STT, and TTS providers.
| Platform | Channel | Command / Link |
|---|---|---|
| All | GitHub Releases | Download from releases page |
| All | crates.io | `cargo install synapse` |
| macOS / Linux | Homebrew | `brew install omnidotdev/tap/synapse` |
| Arch Linux | AUR / AUR (bin) | `paru -S omnidotdev-synapse` or `paru -S omnidotdev-synapse-bin` |
| Docker | GHCR | `docker pull ghcr.io/omnidotdev/synapse:latest` |
```shell
git clone https://github.com/omnidotdev/synapse
cd synapse
cargo build --release
# Binary will be at target/release/synapse
```

Create `synapse.toml`:
```toml
[server]
listen_address = "0.0.0.0:6000"

[llm.providers.anthropic]
type = "anthropic"
api_key = "{{ env.ANTHROPIC_API_KEY }}"

[llm.providers.openai]
type = "openai"
api_key = "{{ env.OPENAI_API_KEY }}"
```

API keys support `{{ env.VAR }}` template syntax for environment variable substitution. See `config/synapse.dev.toml` for a full development configuration.
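To illustrate the templating behavior described above, here is a hypothetical Python sketch of `{{ env.VAR }}` substitution (this is not Synapse's actual implementation, just the idea):

```python
import os
import re

def substitute_env(text: str) -> str:
    """Replace {{ env.NAME }} placeholders with environment variable values.

    Hypothetical sketch of the templating described above; missing
    variables resolve to an empty string here.
    """
    return re.sub(
        r"\{\{\s*env\.([A-Za-z_][A-Za-z0-9_]*)\s*\}\}",
        lambda m: os.environ.get(m.group(1), ""),
        text,
    )

os.environ["ANTHROPIC_API_KEY"] = "sk-test"
print(substitute_env('api_key = "{{ env.ANTHROPIC_API_KEY }}"'))  # → api_key = "sk-test"
```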
```shell
synapse --config synapse.toml
```

```shell
# OpenAI-compatible
curl http://localhost:6000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-20250514",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

Synapse works with existing OpenAI and Anthropic SDKs; just point `base_url` at your Synapse instance.
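The same request can be built from Python's standard library; this sketch assumes a Synapse instance listening on port 6000 as configured above (the network call is commented out so the snippet stands alone):

```python
import json
import urllib.request

# Assumed: a local Synapse instance per the synapse.toml above.
BASE_URL = "http://localhost:6000/v1"

payload = {
    "model": "claude-sonnet-4-20250514",
    "messages": [{"role": "user", "content": "Hello!"}],
}
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)

# With a running server:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
print(req.full_url)
```

An SDK works the same way: construct the client with `base_url="http://localhost:6000/v1"` and leave the rest of your code unchanged.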
```shell
cargo build         # Build
cargo run -- --help # Run
cargo test          # Test
cargo clippy        # Lint
```

This project uses a dual-package setup (Rust crate + npm package) with automated version synchronization:
- Source of truth: `package.json` holds the canonical version and is used for Changesets
- Sync script: `scripts/syncVersion.ts` propagates the version to `Cargo.toml`
- Changesets: manages version bumps and changelog generation
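A hypothetical Python equivalent of what `scripts/syncVersion.ts` does (read the canonical version from `package.json`, rewrite the `version` field in `Cargo.toml`):

```python
import json
import re
import tempfile
from pathlib import Path

def sync_version(package_json: Path, cargo_toml: Path) -> str:
    """Copy the canonical version from package.json into Cargo.toml.

    Sketch of the sync idea, not the actual TypeScript script.
    """
    version = json.loads(package_json.read_text())["version"]
    toml = cargo_toml.read_text()
    # Rewrite only the first `version = "..."` line (the [package] version).
    toml = re.sub(r'^version\s*=\s*".*"', f'version = "{version}"',
                  toml, count=1, flags=re.M)
    cargo_toml.write_text(toml)
    return version

# Demo with throwaway files:
tmp = Path(tempfile.mkdtemp())
(tmp / "package.json").write_text('{"name": "synapse", "version": "1.2.3"}')
(tmp / "Cargo.toml").write_text('[package]\nname = "synapse"\nversion = "0.0.0"\n')
sync_version(tmp / "package.json", tmp / "Cargo.toml")
print((tmp / "Cargo.toml").read_text())
```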
The sync script runs automatically during the release process via the `version` npm script:

```shell
bun run version # syncs `package.json` version -> `Cargo.toml`
```

| Workflow | Trigger | Purpose |
|---|---|---|
| `test.yml` | Push/PR to `master` | Runs fmt, clippy, and tests |
| `sync.yml` | PR to `master` | Validates version sync, fmt, clippy, test, build |
| `release.yml` | Push to `master` | Creates releases via Changesets, builds multi-platform binaries |
- Create a changeset: `bun changeset`
- Push to `master`
- Changesets action creates a "Version Packages" PR
- Merge the PR to trigger a release with binaries for:
  - `x86_64-unknown-linux-gnu`
  - `aarch64-unknown-linux-gnu`
  - `x86_64-apple-darwin`
  - `aarch64-apple-darwin`
- Manually publish to crates.io: `cargo publish`
Build and run with Docker:

```shell
docker build -t synapse .
docker run -p 6000:6000 synapse
```

- Beacon routes its LLM, STT/TTS, and tool execution through Synapse
- Omni CLI: Agentic CLI for the Omni ecosystem
- Omni Terminal: GPU-accelerated terminal emulator built to run everywhere
- Aether handles billing metering for managed and credit billing modes
- Warden enforces authorization on management endpoints
The code in this repository is licensed under Apache 2.0, © Omni LLC. See LICENSE.md for more information.