src/pages/docs/observe/features/manual-tracing/annotating-using-api.mdx
Looking for the new unified Annotations system? Check out the [Annotations documentation](/docs/annotations) for annotation queues, managed workflows, and the Scores API.
</Tip>
## About
Traces show what happened, but not whether the result was correct, helpful, or safe. Annotations close that gap by attaching labels, scores, notes, and human feedback directly to spans. The `/tracer/bulk-annotation/` API lets you do this programmatically, annotating hundreds of spans in a single request. Annotated spans can then be filtered by quality, exported as golden datasets, or used in RLHF workflows.
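As a rough sketch of such a bulk call: the endpoint path comes from this guide, but the base URL, auth header, and record field names below are illustrative assumptions rather than the documented schema, so check the API reference for the exact shapes.

```python
import os
import requests

def build_bulk_annotation_payload(records):
    """Wrap annotation records in an assumed {"records": [...]} request body."""
    return {"records": list(records)}

def annotate_spans(records, api_key, base_url="https://api.futureagi.com"):
    """POST a batch of annotation records to the bulk-annotation endpoint.

    The base URL, bearer-token header, and record fields are assumptions
    for illustration only; the endpoint path is from this guide.
    """
    resp = requests.post(
        f"{base_url}/tracer/bulk-annotation/",
        json=build_bulk_annotation_payload(records),
        headers={"Authorization": f"Bearer {api_key}"},
    )
    resp.raise_for_status()
    return resp.json()
```

In this sketch, each record would carry the span ID, an annotation label ID, a value matching the label's type, and an annotator ID, with the API key read from your environment (for example `os.environ["FI_API_KEY"]`).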
---
## When to use
- **Label data for filtering and analysis**: Tag spans with custom criteria so they can be searched and grouped in the dashboard.
- **Build golden datasets**: Annotate high-quality examples for AI training and fine-tuning.
- **Add human feedback**: Attach scores, thumbs up/down, or notes to spans for RLHF and evaluation workflows.
- **Enrich trace context**: Add custom events and notes to spans for richer debugging.
---
## How to
<Steps>
<Step title="Create an annotation label">
Annotation labels must be created before using the API. See the [Labels guide](/docs/annotations/features/labels) for how to create and configure labels (text, numeric, categorical, star, thumbs up/down).
</Step>
<Step title="Fetch your annotation label ID">
pretty(resp.json())
```
```javascript JS/TS
#!/usr/bin/env ts-node
import axios from "axios";
```

Each element in `result.errors` contains:
| annotationError | string | "Annotation label \"axdf\" does not belong to span's project" | Error message for the annotation operation (optional). |
| noteError | string | "Duplicate note" | Error message for the note operation (optional). |
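Since both error fields are optional on each element, a response handler can split them apart. A minimal sketch, assuming the response JSON nests the array under `result.errors` as the field path above suggests:

```python
def collect_annotation_errors(response_json):
    """Split per-record errors into annotation failures and note failures.

    Assumes the response body nests errors under result.errors; both
    error fields are optional on each element.
    """
    annotation_errors, note_errors = [], []
    for err in response_json.get("result", {}).get("errors", []):
        if "annotationError" in err:
            annotation_errors.append(err["annotationError"])
        if "noteError" in err:
            note_errors.append(err["noteError"])
    return annotation_errors, note_errors
```

This lets you log or retry the two failure kinds separately, since an annotation can succeed while its note is rejected (or vice versa).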
**Best practices**

- **Immutable labels** — avoid changing the meaning of an existing label; create a new one instead.
- **Consistent annotator IDs** — use stable identifiers (`"human_annotator_1"`, `"model_v1"`, …).
- **Batch updates** — when updating many spans, group 100–500 records per request to minimize network overhead.
- **Idempotency** — sending the same note text twice in a record skips duplicates, keeping data clean.
- **Monitor quotas** — large annotation operations count toward your project's API usage.
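The batching guideline above can be sketched as a simple chunking helper (the function name and default size are illustrative, within the 100–500 range this guide recommends):

```python
def chunk_records(records, batch_size=200):
    """Yield successive batches of records, one batch per API request."""
    for i in range(0, len(records), batch_size):
        yield records[i:i + batch_size]
```

Each yielded batch would then be sent as one request to the bulk-annotation endpoint, keeping request counts low without building oversized payloads.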
---
## Next Steps
<CardGroup cols={2}>
<Card title="Set Up Tracing" icon="gear" href="/docs/observe/features/manual-tracing/set-up-tracing">