OpenAI-compatible HTTP proxy for LLM providers. Implemented as a Plug that can be mounted in any Phoenix or Plug-based application. Routes requests to multiple providers through ReqLLM.
| Metric | Value |
| --- | --- |
| Version | 0.1.0 |
| Elixir | >= 1.14 |
| Runtime deps | 10 |
| Dev/test deps | 5 |
| Modules (.ex) | 12 |
| Test files | 5 |
| CI | No dedicated CI workflow (last run was a devcontainer update, failed) |
| Hex published | No (package metadata is configured in mix.exs) |
- Exposes an OpenAI-compatible `/v1/chat/completions` endpoint.
- Delegates to ReqLLM for provider routing. The provider is selected by model string prefix (e.g., `openai:gpt-4`, `anthropic:claude-3-sonnet`).
- Tracks usage in ETS (no database required).
- Emits telemetry events for the request lifecycle.
- Includes a LiveDashboard page for usage stats.
- Rate limiting via Hammer.
- Model name parsing and cost estimation via built-in pricing tables.
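For illustration, the `provider:model` prefix scheme can be sketched as below. The module and function names here are hypothetical, not the gateway's actual API; see `ReqLLMGateway.ModelParser` for the real implementation.

```elixir
# Hypothetical sketch: split a "provider:model" string into its parts.
defmodule ModelParserSketch do
  def parse(model_string) do
    case String.split(model_string, ":", parts: 2) do
      [provider, model] -> {:ok, provider, model}
      [_no_prefix] -> {:error, :missing_provider_prefix}
    end
  end
end

ModelParserSketch.parse("anthropic:claude-3-sonnet")
# => {:ok, "anthropic", "claude-3-sonnet"}
```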
```shell
git clone https://github.com/jmanhype/req_llm_gateway
cd req_llm_gateway
mix deps.get
mix test --no-start
```
```elixir
# router.ex
scope "/v1" do
  forward "/chat/completions", ReqLLMGateway.Plug
end

# Optional: LiveDashboard page
live_dashboard "/dashboard",
  additional_pages: [req_llm: ReqLLMGateway.LiveDashboard]
```
```shell
export OPENAI_API_KEY="sk-..."
mix run -e "ReqLLMGateway.Application.start(:normal, [])"
# or
mix demo
```
| Module | Purpose |
| --- | --- |
| `ReqLLMGateway.Plug` | Plug endpoint, request handling |
| `ReqLLMGateway.LLMClient` | ReqLLM wrapper |
| `ReqLLMGateway.ModelParser` | Parses `provider:model` strings |
| `ReqLLMGateway.Pricing` | Cost calculation per model |
| `ReqLLMGateway.Usage` | ETS-based usage tracking |
| `ReqLLMGateway.Telemetry` | Telemetry event definitions |
| `ReqLLMGateway.LiveDashboard` | Phoenix LiveDashboard page |
| `ReqLLMGateway.Application` | OTP application |
| `ReqLLMGateway.DemoEndpoint` | Standalone demo server |
| `ReqLLMGateway.DemoRouter` | Demo routing |
| `Mix.Tasks.Demo` | `mix demo` task |
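As a sketch of consuming the gateway's telemetry, a handler can be attached with the standard `:telemetry.attach/4`. The event name below is an assumption for illustration; check `ReqLLMGateway.Telemetry` for the actual event definitions.

```elixir
# Event name [:req_llm_gateway, :request, :stop] is assumed, not confirmed.
:telemetry.attach(
  "log-gateway-requests",
  [:req_llm_gateway, :request, :stop],
  fn _event, measurements, metadata, _config ->
    # Log model and duration for each completed request.
    IO.inspect({metadata[:model], measurements[:duration]}, label: "llm request")
  end,
  nil
)
```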
| Dependency | Purpose |
| --- | --- |
| `plug ~> 1.14` | HTTP interface |
| `jason ~> 1.4` | JSON codec |
| `plug_cowboy ~> 2.6` | HTTP server |
| `telemetry ~> 1.2` | Event emission |
| `phoenix_live_dashboard ~> 0.8` | Dashboard UI |
| `req_llm ~> 1.0.0-rc.6` | LLM provider routing |
| `decimal ~> 2.1` | Cost arithmetic |
| `hammer ~> 6.1` | Rate limiting |
| `telemetry_metrics ~> 0.6` | Metric definitions |
| `telemetry_poller ~> 1.0` | Periodic measurements |
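`decimal` keeps cost math exact instead of accumulating float rounding error. A sketch with made-up per-1k-token prices (the real values live in `ReqLLMGateway.Pricing` and will differ):

```elixir
# Illustrative prices only, not the gateway's actual pricing tables.
input_per_1k = Decimal.new("0.030")
output_per_1k = Decimal.new("0.060")

# Cost for 1_200 input tokens and 400 output tokens.
input_cost = Decimal.mult(input_per_1k, Decimal.div(Decimal.new(1_200), 1_000))
output_cost = Decimal.mult(output_per_1k, Decimal.div(Decimal.new(400), 1_000))

Decimal.add(input_cost, output_cost)
# 0.030 * 1.2 + 0.060 * 0.4 = 0.060
```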
- No CI workflow for tests. The only GitHub Action is a devcontainer build.
- `req_llm` is at a release candidate (1.0.0-rc.6); its API may change.
- Usage tracking is ETS-only. Data is lost on restart.
- Pricing tables are static and will drift from actual provider pricing.
- 5 test files for 12 modules. Coverage is partial.
- Not published to Hex despite having package metadata configured in `mix.exs`.
MIT