
# ReqLLMGateway

An OpenAI-compatible HTTP proxy for LLM providers, implemented as a Plug that can be mounted in any Phoenix or Plug-based application. Requests are routed to multiple providers through ReqLLM.

## Status

| Metric | Value |
|---|---|
| Version | 0.1.0 |
| Elixir | >= 1.14 |
| Runtime deps | 10 |
| Dev/test deps | 5 |
| Modules (`.ex`) | 12 |
| Test files | 5 |
| CI | No dedicated CI workflow (last run was a devcontainer update, which failed) |
| Hex published | No (package metadata is configured in `mix.exs`) |

## What it does

- Exposes an OpenAI-compatible `/v1/chat/completions` endpoint.
- Delegates to ReqLLM for provider routing; the provider is selected by model-string prefix (e.g., `openai:gpt-4`, `anthropic:claude-3-sonnet`).
- Tracks usage in ETS (no database required).
- Emits telemetry events for the request lifecycle.
- Includes a LiveDashboard page for usage stats.
- Rate limiting via Hammer.
- Model-name parsing and cost estimation via built-in pricing tables.
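A request to the gateway looks like a standard OpenAI chat-completions call, with the provider encoded in the model prefix. The host and port below are placeholders for wherever the Plug is mounted; whether an `Authorization` header is required depends on your configuration:

```sh
curl http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "anthropic:claude-3-sonnet",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```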

## Setup

```sh
git clone https://github.com/jmanhype/req_llm_gateway
cd req_llm_gateway
mix deps.get
mix test --no-start
```

## Mount in a Phoenix app

```elixir
# router.ex
scope "/v1" do
  forward "/chat/completions", ReqLLMGateway.Plug
end

# Optional: LiveDashboard page
live_dashboard "/dashboard",
  additional_pages: [req_llm: ReqLLMGateway.LiveDashboard]
```
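The gateway emits telemetry events for the request lifecycle, so a host app can attach its own handlers. A minimal sketch, assuming a `[:req_llm_gateway, :request, :stop]` event name (the actual names are defined in `ReqLLMGateway.Telemetry`; check that module before copying this):

```elixir
# Sketch: log each completed gateway request.
# The event name below is an assumption, not confirmed by the source.
:telemetry.attach(
  "log-gateway-requests",
  [:req_llm_gateway, :request, :stop],
  fn _event, measurements, metadata, _config ->
    IO.inspect({measurements, metadata}, label: "gateway request")
  end,
  nil
)
```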

## Standalone demo

```sh
export OPENAI_API_KEY="sk-..."
mix run -e "ReqLLMGateway.Application.start(:normal, [])"
# or
mix demo
```

## Modules

| Module | Purpose |
|---|---|
| `ReqLLMGateway.Plug` | Plug endpoint, request handling |
| `ReqLLMGateway.LLMClient` | ReqLLM wrapper |
| `ReqLLMGateway.ModelParser` | Parses `provider:model` strings |
| `ReqLLMGateway.Pricing` | Cost calculation per model |
| `ReqLLMGateway.Usage` | ETS-based usage tracking |
| `ReqLLMGateway.Telemetry` | Telemetry event definitions |
| `ReqLLMGateway.LiveDashboard` | Phoenix LiveDashboard page |
| `ReqLLMGateway.Application` | OTP application |
| `ReqLLMGateway.DemoEndpoint` | Standalone demo server |
| `ReqLLMGateway.DemoRouter` | Demo routing |
| `Mix.Tasks.Demo` | `mix demo` task |
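The `provider:model` convention handled by `ReqLLMGateway.ModelParser` can be illustrated with a minimal sketch; the function name and return shape below are assumptions for illustration, not the module's actual API:

```elixir
# Illustrative only: splits "openai:gpt-4" into a provider atom and model name.
defmodule ModelParserSketch do
  def parse(model_string) do
    case String.split(model_string, ":", parts: 2) do
      [provider, model] -> {:ok, String.to_atom(provider), model}
      [_no_prefix] -> {:error, :missing_provider_prefix}
    end
  end
end

# ModelParserSketch.parse("anthropic:claude-3-sonnet")
# => {:ok, :anthropic, "claude-3-sonnet"}
```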

## Dependencies

| Dependency | Purpose |
|---|---|
| `plug ~> 1.14` | HTTP interface |
| `jason ~> 1.4` | JSON codec |
| `plug_cowboy ~> 2.6` | HTTP server |
| `telemetry ~> 1.2` | Event emission |
| `phoenix_live_dashboard ~> 0.8` | Dashboard UI |
| `req_llm ~> 1.0.0-rc.6` | LLM provider routing |
| `decimal ~> 2.1` | Cost arithmetic |
| `hammer ~> 6.1` | Rate limiting |
| `telemetry_metrics ~> 0.6` | Metric definitions |
| `telemetry_poller ~> 1.0` | Periodic measurements |
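Hammer 6.x exposes `Hammer.check_rate/3` for fixed-window rate limiting. A sketch of how a per-key limit inside the Plug might look; the bucket name, the limit values, and the bound `api_key` variable are assumptions, not the gateway's actual configuration:

```elixir
# Allow at most 60 requests per minute per API key (illustrative values).
case Hammer.check_rate("chat:" <> api_key, 60_000, 60) do
  {:allow, _count} ->
    # Proceed with the request.
    :ok

  {:deny, _limit} ->
    # Caller should respond with 429 Too Many Requests.
    {:error, :rate_limited}
end
```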

## Limitations

- No CI workflow for tests; the only GitHub Action is a devcontainer build.
- `req_llm` is at a release candidate (1.0.0-rc.6); its API may change.
- Usage tracking is ETS-only, so data is lost on restart.
- Pricing tables are static and will drift from actual provider pricing.
- 5 test files for 12 modules; coverage is partial.
- Not published to Hex, despite having package metadata configured.

## License

MIT

Generated from github/spark-template