Add per-rollout auth to interception servers #1122
teilomillet wants to merge 6 commits into PrimeIntellect-ai:main from
Conversation
The interception server accepts unauthenticated requests from any origin, allowing cross-rollout request injection during training. This adds a per-rollout token generated at setup time: the `InterceptionServer` and `RLMEnv` handlers verify `Authorization: Bearer <token>` on each request, and the token is passed to sandboxes via `OPENAI_API_KEY` (CliAgentEnv) or `RLM_AUTH_TOKEN` (RLMEnv). Rollouts registered without a token skip the check for backwards compatibility.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
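As a rough sketch of this scheme (the helper name `generate_interception_token` comes from this PR's change list; the token length and the setup variables here are illustrative assumptions, not the PR's actual code):

```python
import secrets


def generate_interception_token() -> str:
    # One fresh, unguessable token per rollout; 32 bytes of entropy
    # is an assumption, not the value used by the PR.
    return secrets.token_urlsafe(32)


# At rollout setup (illustrative): the same token is registered with the
# interception server and handed to the sandbox, where the OpenAI SDK
# picks up OPENAI_API_KEY and sends it as "Authorization: Bearer <token>".
token = generate_interception_token()
sandbox_env = {
    "OPENAI_API_KEY": token,  # CliAgentEnv path
    # "RLM_AUTH_TOKEN": token,  # RLMEnv path
}
```

Because the token is minted per rollout, a sibling sandbox that extracts the tunnel URL still cannot authenticate against another rollout's endpoint.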
Cursor Bugbot has reviewed your changes and found 1 potential issue.
There are 2 total unresolved issues (including 1 from previous review).
Reviewed by Cursor Bugbot for commit de9ee3f.
```python
bearer_token = (
    auth_header.removeprefix("Bearer ") if auth_header.startswith("Bearer ") else ""
)
return secrets.compare_digest(bearer_token, expected_token)
```
Empty-string token silently bypasses auth check
Low Severity
`has_valid_bearer_auth` treats an empty-string `expected_token` as satisfied by any request lacking an Authorization header, because `secrets.compare_digest("", "")` returns True. While `None` correctly means "skip auth", an empty string `""` passed as `expected_token` silently disables auth instead of rejecting all requests. Current callers always use `generate_interception_token()` (non-empty) or `None`, so this isn't triggered today, but the public helper has an unsafe contract for any future caller that accidentally passes `""`.
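The pitfall is easy to reproduce. Assuming a guard that only special-cases `None`, an empty expected token matches the empty bearer string extracted from a missing header:

```python
import secrets

# A request with no Authorization header yields an empty bearer token,
# and comparing it against an empty expected token "succeeds":
print(secrets.compare_digest("", ""))  # True

# So a guard of the form `if expected_token is None: skip` lets ""
# fall through to the comparison and accept every unauthenticated request.
```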
The OpenCode config hardcoded `"apiKey": "intercepted"`, which overrides the `OPENAI_API_KEY` env var set by the new per-rollout auth. The AI SDK uses the explicit config value for the Bearer header, so authenticated rollouts sent `Bearer intercepted` instead of the real token.

Change to `"${OPENAI_API_KEY:-intercepted}"` so auth-enabled rollouts use the real token while legacy paths without `OPENAI_API_KEY` fall back to the old dummy value.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
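In Python terms (a sketch, not the actual OpenCode config loader), the `${OPENAI_API_KEY:-intercepted}` expansion behaves like an `or` fallback, since shell `:-` substitutes the default when the variable is unset *or* empty:

```python
def opencode_api_key(env: dict[str, str]) -> str:
    # Mirrors shell "${OPENAI_API_KEY:-intercepted}": use the per-rollout
    # token when set and non-empty, else the legacy dummy value.
    return env.get("OPENAI_API_KEY") or "intercepted"
```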
`secrets.compare_digest("", "")` returns True, so an empty `expected_token` would silently accept unauthenticated requests. Use `not expected_token` instead of `is None` to handle both `None` and `""` as "no auth configured".
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>


Summary
The interception servers (`CliAgentEnv`, `OpenCodeEnv`, `OpenCodeRLMEnv`) and the RLM inline server accept unauthenticated HTTP requests routed only by rollout ID. Model-generated code running inside a sandbox can read its `OPENAI_BASE_URL` env var, extract the tunnel base URL, and POST to a sibling rollout's endpoint, injecting a request into another rollout's training loop with no auth barrier. A per-rollout token is now generated at setup, passed into each sandbox (`OPENAI_API_KEY` for CliAgentEnv / `RLM_AUTH_TOKEN` for RLMEnv), and verified via `Authorization: Bearer <token>` on every request using constant-time comparison.

Changes
- `interception_utils.py`: `register_rollout()` accepts an optional `auth_token`; the handler checks the bearer token with `secrets.compare_digest`; new `generate_interception_token()` helper
- `cli_agent_env.py`: generates a token, passes it to `register_rollout()`, sets `OPENAI_API_KEY` in sandbox env vars (the OpenAI SDK sends it as `Authorization: Bearer` automatically), adds `OPENAI_API_KEY` to `PROTECTED_ENV_VARS`
- `rlm_env.py`: generates a token, stores it in `active_rollouts`, checks it in both `_handle_sub_llm_request` and `_handle_root_tool_request`, passes it to the worker via the `RLM_AUTH_TOKEN` env var; the worker includes it in `requests.post()` headers

Test plan
- `tests/test_interception_auth.py` adds 6 new tests: valid token accepted, missing token rejected (401), wrong token rejected (401), unknown rollout (404), graceful fallback (no token = no check), cross-rollout token rejected
- Existing suites (`interception`, `composable`, `rlm`, `opencode_rlm`, `xml_parser`) pass
- `ruff check`, `ruff format`, `pre-commit run --all-files` all pass

🤖 Generated with Claude Code
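The scenarios in the test plan can be condensed into a small dispatch sketch (the handler shape and the 404-before-auth ordering are assumptions for illustration, not the PR's actual code):

```python
import secrets


def _bearer_ok(auth_header: str, expected: str) -> bool:
    token = auth_header.removeprefix("Bearer ") if auth_header.startswith("Bearer ") else ""
    return secrets.compare_digest(token, expected)


def dispatch(rollout_id: str, auth_header: str, rollouts: dict) -> int:
    # 404 for unknown rollouts, 401 for missing/wrong tokens, 200 otherwise;
    # rollouts registered without a token (None) skip the check entirely.
    if rollout_id not in rollouts:
        return 404
    expected = rollouts[rollout_id]
    if expected and not _bearer_ok(auth_header, expected):
        return 401
    return 200
```

Note how a token minted for one rollout fails the check on another, which is exactly the cross-rollout injection the PR closes off.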
Note
High Risk
Adds authentication to internal interception and RLM HTTP endpoints and changes how sandboxes/agents pass credentials, which is security-sensitive and could break agent traffic if tokens are mishandled.
Overview
Introduces per-rollout bearer-token authentication for interception endpoints to prevent cross-rollout request injection. Rollouts can now be registered with an `auth_token`; requests are validated via constant-time comparison, and missing/invalid tokens return 401 while unknown rollouts return 404.

Updates `CliAgentEnv` and `RLMEnv` to generate a fresh token per rollout, store it in state/server context, and pass it into sandbox/worker processes (`OPENAI_API_KEY` / `RLM_AUTH_TOKEN`) so outbound calls include `Authorization: Bearer ...`; `OpenCodeEnv` is updated to read `OPENAI_API_KEY` when present. Adds stricter request-body validation on the interception server plus new unit tests covering auth success/failure, cross-rollout blocking, and bad payload handling.

Reviewed by Cursor Bugbot for commit 71a7585.
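On the RLM worker side, the header construction described above can be sketched as a pure helper (names hypothetical apart from the `RLM_AUTH_TOKEN` env var named in this PR):

```python
def auth_headers(env: dict[str, str]) -> dict[str, str]:
    # Build the Authorization header from RLM_AUTH_TOKEN when present;
    # legacy workers without the env var send no auth header at all,
    # matching the backwards-compatible "no token = no check" path.
    token = env.get("RLM_AUTH_TOKEN")
    return {"Authorization": f"Bearer {token}"} if token else {}


# Usage (illustrative):
#   requests.post(url, json=payload, headers=auth_headers(dict(os.environ)))
```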