Overall Result: ❌ FAIL
Run: https://github.com/github/gh-aw/actions/runs/25877495682
All three checks failed. The local span was emitted and recorded in the OTEL mirror for this run, but OTLP export errors indicate the spans never reached the remote backends: Grafana Tempo shows no registered services, and the Sentry MCP lacks span-query tooling, so ingestion there cannot be verified.
## Checklist

## Evidence
### Step 1 · Local emission (`send_status = fail`)
| Check | Result |
| --- | --- |
| `OTEL_EXPORTER_OTLP_ENDPOINT` | set ✅ |
| `OTEL_EXPORTER_OTLP_HEADERS` | set ✅ |
| `GH_AW_OTLP_ENDPOINTS` | set ✅ |
| `OTEL_SERVICE_NAME` | `gh-aw` ✅ |
| `COPILOT_OTEL_FILE_EXPORTER_PATH` | `/tmp/gh-aw/copilot-otel.jsonl` ✅ |
| `/tmp/gh-aw/otel.jsonl` exists | yes (1 line) ✅ |
| Span for run 25877495682 | `gh-aw.agent.setup`, traceId `705145a696ba9acf997de0b682a9ad33` ✅ |
| OTLP export error count | 2 ❌ |
The local mirror confirms the span was written. The 2 export errors mean at least one OTLP push to a remote endpoint failed.
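A check like the one above can be scripted directly against the mirror file. A minimal sketch, assuming the mirror holds one JSON object per line with the span name and trace ID appearing verbatim in that line; the `check_mirror` helper and the demo file are illustrative, not part of the workflow:

```shell
# Sketch: verify the local OTEL mirror contains the expected span.
# The JSONL field layout is an assumption; adjust patterns to the real schema.
MIRROR="${MIRROR:-/tmp/gh-aw/otel.jsonl}"
TRACE_ID="705145a696ba9acf997de0b682a9ad33"

check_mirror() {
  # One JSON object per line; fail if the span name or trace ID is missing.
  grep -q "gh-aw.agent.setup" "$1" || { echo "span missing"; return 1; }
  grep -q "$TRACE_ID" "$1"         || { echo "trace ID missing"; return 1; }
  echo "span present"
}

# Demo against a synthetic mirror line (stands in for the real file):
printf '{"name":"gh-aw.agent.setup","traceId":"%s"}\n' "$TRACE_ID" > /tmp/demo-otel.jsonl
check_mirror /tmp/demo-otel.jsonl
```

The same helper pointed at `$MIRROR` reproduces the "span for run" row of the table.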
### Step 2 · Sentry (`sentry_status = fail`)
| Check | Result |
| --- | --- |
| MCP connection | ✅ working |
| Organization found | `github` (github.sentry.io/redacted) ✅ |
| Project found | `gh-aw` ✅ |
| Current-run spans queried | ❌ not possible: Sentry MCP exposes only `find_organizations`, `find_projects`, `whoami`; no span/trace/event query tool is available |
| Recent `gh-aw` spans | ❌ cannot determine |
The Sentry MCP does not expose a tool to query traces or events, so whether spans are being ingested cannot be determined from this workflow.
### Step 3 · Grafana (`grafana_status = fail`)
| Check | Result |
| --- | --- |
| MCP connection | ✅ working |
| Tempo datasource | `grafanacloud-mnkiefer-traces` (uid: `grafanacloud-traces`) ✅ |
| TraceQL query `{resource.service.name="gh-aw"}` | `{"traces":[],"metrics":{"completedJobs":3,"totalJobs":3}}` ❌ |
| TraceQL query `{rootServiceName="gh-aw"}` | empty ❌ |
| `tempo_get-attribute-values` for `rootServiceName` | `{"tagValues":{}}` (no services registered) ❌ |
Grafana Tempo has no traces from gh-aw. The Tempo datasource is reachable but empty.
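The empty results can also be cross-checked outside the MCP against Tempo's HTTP API. A hedged sketch: `/api/search?q=<traceql>` is Tempo's documented search route, but the base URL, credentials, and the `encode` helper here are assumptions; substitute the real Grafana Cloud Tempo URL:

```shell
# Sketch: query Tempo's search API directly with the same TraceQL used above.
# TEMPO_URL and the auth in the commented curl are placeholders.
TEMPO_URL="${TEMPO_URL:-https://tempo.example.com}"   # hypothetical base URL

# Percent-encode the characters in a TraceQL expression that are unsafe
# in a URL query parameter.
encode() {
  printf '%s' "$1" | sed -e 's/{/%7B/g' -e 's/}/%7D/g' -e 's/"/%22/g' -e 's/=/%3D/g'
}

QUERY='{resource.service.name="gh-aw"}'
SEARCH_URL="$TEMPO_URL/api/search?q=$(encode "$QUERY")"
echo "$SEARCH_URL"
# curl -s -u "$TEMPO_USER:$TEMPO_TOKEN" "$SEARCH_URL"
# expect a non-empty "traces" array once export is fixed
```

Seeing `"traces":[]` from this route too would confirm the gap is ingestion, not the MCP tooling.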
## Blockers
**1. OTLP export errors (emit-side)**

`/tmp/gh-aw/otlp-export-errors.count` = 2. Spans are written to the local mirror, but at least 2 OTLP pushes failed. Check:
- Whether `OTEL_EXPORTER_OTLP_ENDPOINT` / `GH_AW_OTLP_ENDPOINTS` point to an endpoint reachable from the runner network.
- Whether the endpoint's TLS certificate and auth headers are valid.
- Add logging/stdout capture from the OTLP exporter to surface the HTTP error code (likely 401, 403, or connection refused).
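Short of exporter changes, the HTTP status can be surfaced by probing the endpoint from the runner with `curl`. A sketch under assumptions: `/v1/traces` is the standard OTLP/HTTP traces path and `OTEL_EXPORTER_OTLP_HEADERS` uses the OTel spec's comma-separated `key=value` list; the `header_flags` helper and example values are illustrative:

```shell
# Sketch: probe the OTLP endpoint from the runner to surface the HTTP status
# (401/403/connection refused). The endpoint/header values are stand-ins.
OTEL_EXPORTER_OTLP_ENDPOINT="${OTEL_EXPORTER_OTLP_ENDPOINT:-https://otlp.example.com}"
OTEL_EXPORTER_OTLP_HEADERS="${OTEL_EXPORTER_OTLP_HEADERS:-api-key=REDACTED}"

# Turn the spec's comma-separated "k=v,k=v" header list into curl -H flags
# (HTTP headers use "Name: value" syntax).
header_flags() {
  printf '%s\n' "$1" | tr ',' '\n' | while IFS= read -r kv; do
    printf -- "-H '%s: %s' " "${kv%%=*}" "${kv#*=}"
  done
}

PROBE="curl -s -o /dev/null -w '%{http_code}' $(header_flags "$OTEL_EXPORTER_OTLP_HEADERS") \
-X POST -H 'content-type: application/x-protobuf' --data-binary '' \
$OTEL_EXPORTER_OTLP_ENDPOINT/v1/traces"
echo "$PROBE"   # run this on the runner; anything but 2xx explains the export errors
```

An empty POST body will likely draw a 4xx even from a healthy endpoint, but the status code still distinguishes auth failures from connectivity failures.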
**2. Sentry MCP missing span-query tool (read-side tooling gap)**

The Sentry MCP server (`sentry` CLI) only provides 3 tools. A span or event query tool is needed to verify backend ingestion. Options:
- Extend the Sentry MCP server with a `search_events` or `get_traces` tool.
- Use `GH_AW_OTEL_SENTRY_ENDPOINT` + `SENTRY_ACCESS_TOKEN` directly via `curl` to query the Sentry API.
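The `curl` option can be sketched as follows, assuming Sentry's org-level events API (`/api/0/organizations/{org}/events/`, the documented Discover endpoint); the host, query parameters, and `jq` filter are placeholders to adapt to the redacted endpoint:

```shell
# Sketch of the curl-based fallback: query Sentry's org events API for recent
# gh-aw events. Host and query params are assumptions, not verified values.
SENTRY_HOST="${SENTRY_HOST:-sentry.io}"   # real host is redacted above
SENTRY_ORG="github"
SENTRY_PROJECT="gh-aw"

EVENTS_URL="https://$SENTRY_HOST/api/0/organizations/$SENTRY_ORG/events/?project=$SENTRY_PROJECT&statsPeriod=1h"
echo "$EVENTS_URL"
# curl -s -H "Authorization: Bearer $SENTRY_ACCESS_TOKEN" "$EVENTS_URL" \
#   | jq '.data | length'   # non-zero once ingestion works
```

Note that some Sentry routes expect a numeric project ID rather than a slug for `project`, so the parameter may need adjusting against the API docs.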
**3. Grafana Tempo has no gh-aw spans (backend ingestion gap)**

The Tempo datasource is accessible but contains zero traces for any service. This is consistent with the OTLP export errors: spans are not being delivered to Grafana. Once the export-error blocker (1) is resolved, re-run to confirm ingestion.
Generated by Smoke OTEL Backends