
Commit 9be38ec

Merge pull request #566 from future-agi/fix/rewrite-tracing-annotation-spans

rewrite manual tracing adding annotation spans

2 parents 81232bf + bb4db06

1 file changed: 12 additions & 40 deletions

src/pages/docs/observe/features/manual-tracing/annotating-using-api.mdx
@@ -7,46 +7,26 @@ description: "Label spans with custom tags, human feedback, and notes using the
 Looking for the new unified Annotations system? Check out the [Annotations documentation](/docs/annotations) for annotation queues, managed workflows, and the Scores API.
 </Tip>
 
-## What it is
+## About
 
-**Annotations** let you label spans with additional information — custom tags, criteria, human feedback, or notes — directly on your trace data. You create annotation labels in the Future AGI UI and then add, update, and retrieve annotations programmatically using the `/tracer/bulk-annotation/` API endpoint.
+Traces show what happened but not whether the result was correct, helpful, or safe. Annotations close that gap by attaching labels, scores, notes, and human feedback directly to spans. The `/tracer/bulk-annotation/` API lets this be done programmatically, at scale, across hundreds of spans in a single request. Annotated spans can then be filtered by quality, exported as golden datasets, or used in RLHF workflows.
 
-## Use cases
+---
+
+## When to use
 
-- **Label your data** with custom tags and criteria for filtering and analysis.
-- **Add custom events** to your spans for richer trace context.
-- **Create a golden dataset** for AI training by annotating high-quality examples.
-- **Add human feedback** to spans for RLHF and evaluation workflows.
+- **Label data for filtering and analysis**: Tag spans with custom criteria so they can be searched and grouped in the dashboard.
+- **Build golden datasets**: Annotate high-quality examples for AI training and fine-tuning.
+- **Add human feedback**: Attach scores, thumbs up/down, or notes to spans for RLHF and evaluation workflows.
+- **Enrich trace context**: Add custom events and notes to spans for richer debugging.
 
 ---
 
 ## How to
 
 <Steps>
 <Step title="Create an annotation label">
-Annotation labels must be created in the UI before you can use them in the API.
-
-- Go to your project in Observe/Prototype.
-- Click on any Trace or Span to open the Trace Details page.
-- Click on the "Annotations" tab.
-- Click on the "Create Annotation Label" button.
-- Fill in the form with the following information:
-  - Name: The name of the annotation label.
-  - Description: The description of the annotation label.
-  - Type: The type of the annotation label.
-    - `Text`: this type is used for free text annotations.
-    - `Numeric`: this type is used for numeric annotations.
-    - `Categorical`: this type is used for categorical annotations.
-    - `Star`: this type is used for star rating annotations.
-    - `Thumbs up/down`: this type is used for thumbs up/down annotations.
-  - Write the necessary configuration for the annotation label.
-- Click on the "Create" button.
-- You will be redirected to the Annotation Labels page.
-- You can see the created annotation label in the list.
-- You can edit the annotation label by clicking on the "Edit" button.
-- You can delete the annotation label by clicking on the "Delete" button.
-
-![Annotations Tab](/images/docs/observe/features/manual-tracing/annotations_tab.png)
+Annotation labels must be created before using the API. See the [Labels guide](/docs/annotations/features/labels) for how to create and configure labels (text, numeric, categorical, star, thumbs up/down).
 </Step>
 
 <Step title="Fetch your annotation label ID">
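The rewritten paragraph in the hunk above describes the `/tracer/bulk-annotation/` endpoint, which accepts annotations for many spans in one request. As an illustrative sketch only: the payload field names below (`records`, `spanId`, `annotations`, `annotationLabelId`, `value`, `annotatorId`) and the base URL are assumptions, not confirmed by this diff; consult the full docs page for the actual request shape.

```python
# Hypothetical sketch of building a bulk-annotation request body.
# All field names here are assumptions for illustration, not the
# confirmed API schema from the docs page this commit edits.
import json


def build_bulk_annotation_payload(span_ids, label_id, value,
                                  annotator="human_annotator_1"):
    """Group one annotation per span into a single bulk request body."""
    return {
        "records": [
            {
                "spanId": span_id,
                "annotations": [
                    {
                        "annotationLabelId": label_id,
                        "value": value,
                        "annotatorId": annotator,
                    }
                ],
            }
            for span_id in span_ids
        ]
    }


payload = build_bulk_annotation_payload(["span-1", "span-2"],
                                        "label-123", "thumbs_up")
print(json.dumps(payload, indent=2))

# Sending it would look roughly like this (base URL and auth headers
# are assumptions; requires valid credentials):
# import requests
# resp = requests.post(
#     "https://api.example.com/tracer/bulk-annotation/",
#     json=payload,
#     headers={"X-Api-Key": "...", "X-Secret-Key": "..."},
# )
```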
@@ -205,7 +185,7 @@ Looking for the new unified Annotations system? Check out the [Annotations docum
 pretty(resp.json())
 ```
 
-```typescript Typescript
+```javascript JS/TS
 #!/usr/bin/env ts-node
 import axios from "axios";
 
@@ -322,17 +302,9 @@ Each element in `result.errors` contains:
 | annotationError | string | "Annotation label \"axdf\" does not belong to span's project" | Error message for the annotation operation (optional). |
 | noteError | string | "Duplicate note" | Error message for the note operation (optional). |
 
-**Best practices**
-
-- **Immutable labels** — avoid changing the meaning of an existing label; create a new one instead.
-- **Consistent annotator IDs** — use stable identifiers (`"human_annotator_1"`, `"model_v1"`, …).
-- **Batch updates** — updating many spans? Group 100–500 records per request to minimize network overhead.
-- **Idempotency** — sending the same note text twice in a record skips duplicates, keeping data clean.
-- **Monitor quotas** — large annotation operations count toward your project's API usage.
-
 ---
 
-## What you can do next
+## Next Steps
 
 <CardGroup cols={2}>
 <Card title="Set Up Tracing" icon="gear" href="/docs/observe/features/manual-tracing/set-up-tracing">