
Upgrade server AI Chat API protocol for Vercel compatibility#1034

Merged
hotlong merged 6 commits into main from copilot/update-ai-chat-api-protocol on Apr 1, 2026

Conversation

Contributor

Copilot AI commented Apr 1, 2026

Upgrades the /api/v1/ai/chat endpoint and service-ai infrastructure to fully support the Vercel AI Data Stream Protocol, enabling direct consumption by @ai-sdk/react/useChat without custom adapters.
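For illustration, the Data Stream wire format the endpoint now emits is a sequence of newline-delimited `<type-code>:<json>` frames. A simplified client-side sketch (not part of this PR) shows how text deltas could be recovered from a response body:

```typescript
// Simplified sketch of decoding the Vercel Data Stream wire format.
// Each frame is `<type-code>:<json>\n`; code `0` carries a text delta.
// Illustrative only; useChat performs this parsing internally.
function extractText(body: string): string {
  let out = "";
  for (const line of body.split("\n")) {
    if (line.startsWith("0:")) {
      // The payload after `0:` is a JSON-encoded string chunk.
      out += JSON.parse(line.slice(2)) as string;
    }
  }
  return out;
}
```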

Changes Made

  • VercelLLMAdapter — New production adapter wrapping Vercel AI SDK's generateText/streamText for any compatible model provider (OpenAI, Anthropic, Google, Ollama, etc.)
  • vercel-stream-encoder.ts — New utilities (encodeStreamPart, encodeVercelDataStream) to convert AsyncIterable<TextStreamPart<ToolSet>> into Vercel Data Stream wire-format frames (0: text, 9: tool-call, b: tool-call-start, c: tool-call-delta, a: tool-result, d: finish, e: step-finish)
  • Dual-mode /api/v1/ai/chat — Defaults to Vercel Data Stream Protocol (SSE); falls back to JSON when stream: false. Accepts Vercel useChat flat body format (system/systemPrompt, model, temperature, maxTokens as top-level fields) alongside the legacy { messages, options } format
  • RouteResponse.vercelDataStream — New flag for HTTP server layer to encode events using Vercel frame-format SSE instead of generic data: lines
  • Message validation — Now accepts both string content (legacy) and Vercel multi-part array content
  • CHANGELOG.md — Updated with migration notes for the Data Stream Protocol upgrade
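As a rough illustration of the frame format these utilities target, a minimal encoder might look like the following. The names and part shapes here are assumptions for the sketch; the PR's actual encodeStreamPart covers more frame types (b:, c:, a:, e:) and edge cases:

```typescript
// Hypothetical sketch of Data Stream frame encoding; not the PR's
// actual encodeStreamPart implementation.
type Part =
  | { type: "text-delta"; textDelta: string }
  | { type: "tool-call"; toolCallId: string; toolName: string; args: unknown }
  | { type: "finish"; finishReason: string };

function encodeFrame(part: Part): string {
  switch (part.type) {
    case "text-delta":
      // `0:` frames carry a JSON-encoded text chunk.
      return `0:${JSON.stringify(part.textDelta)}\n`;
    case "tool-call":
      // `9:` frames carry the complete tool call.
      return `9:${JSON.stringify({
        toolCallId: part.toolCallId,
        toolName: part.toolName,
        args: part.args,
      })}\n`;
    case "finish":
      // `d:` frames terminate the stream with a finish reason.
      return `d:${JSON.stringify({ finishReason: part.finishReason })}\n`;
  }
}
```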

Testing

  • 176 tests passing across 5 test files (18 new tests added)
  • New vercel-stream-encoder.test.ts with 13 tests covering all frame types, round-trip decode, empty streams, and tool-call flows
  • Updated ai-service.test.ts with tests for Data Stream default mode, systemPrompt prepending, flat Vercel-style options, array content acceptance, and expanded content validation
  • CodeQL security scan: 0 alerts

@vercel

vercel bot commented Apr 1, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project | Deployment | Actions | Updated (UTC)
objectstack-play | Ready | Preview, Comment | Apr 1, 2026 8:07am
spec | Ready | Preview, Comment | Apr 1, 2026 8:07am


Agent-Logs-Url: https://github.com/objectstack-ai/spec/sessions/74cfdaa5-9d61-426c-ba7c-ecaba5f480f2

Co-authored-by: hotlong <50353452+hotlong@users.noreply.github.com>
@xuyushun441-sys
Contributor

@copilot continue

- Fix POST /api/v1/ai/chat test: now defaults to streaming mode, legacy
  JSON mode requires stream: false
- Add 18 new tests covering vercel-stream-encoder (encodeStreamPart,
  encodeVercelDataStream), route dual-mode, systemPrompt, flat options,
  and array content validation
- Update message validation to accept array content for Vercel multi-part
- Update CHANGELOG.md with migration notes for Data Stream Protocol
- All 176 tests passing (up from 158)

Agent-Logs-Url: https://github.com/objectstack-ai/spec/sessions/fc1d7caa-ef41-4b0b-9ce4-267c262596b4

Co-authored-by: xuyushun441-sys <255036401+xuyushun441-sys@users.noreply.github.com>
…validation tests

Agent-Logs-Url: https://github.com/objectstack-ai/spec/sessions/fc1d7caa-ef41-4b0b-9ce4-267c262596b4

Co-authored-by: xuyushun441-sys <255036401+xuyushun441-sys@users.noreply.github.com>
Contributor Author

Copilot AI commented Apr 1, 2026

> @copilot continue

All checklist items are now complete in commits 68c79e0 and 50ec638:

  • VercelLLMAdapter wrapping generateText/streamText for any Vercel AI SDK model provider
  • vercel-stream-encoder.ts with encodeStreamPart/encodeVercelDataStream for Data Stream Protocol frame encoding (0:, 9:, d:, etc.)
  • /api/v1/ai/chat now defaults to Vercel Data Stream Protocol (SSE), with stream: false fallback for legacy JSON mode. Supports flat body (system, model, temperature, maxTokens) and systemPrompt prepending.
  • RouteResponse.vercelDataStream flag for HTTP server layer
  • 176 tests passing (18 new), CHANGELOG.md updated

Copilot AI changed the title from "[WIP] Upgrade server AI Chat API protocol for Vercel compatibility" to "Upgrade server AI Chat API protocol for Vercel compatibility" on Apr 1, 2026
Copilot AI requested a review from xuyushun441-sys April 1, 2026 06:03
@hotlong hotlong marked this pull request as ready for review April 1, 2026 06:06
Copilot AI review requested due to automatic review settings April 1, 2026 06:06
Contributor

Copilot AI left a comment


Pull request overview

This PR upgrades POST /api/v1/ai/chat and the service-ai streaming stack to support the Vercel AI Data Stream Protocol so it can be consumed directly by @ai-sdk/react/useChat (default streaming), while retaining a legacy JSON response mode via stream: false.

Changes:

  • Added a Vercel Data Stream encoder (encodeStreamPart, encodeVercelDataStream) and exports for it.
  • Introduced VercelLLMAdapter using Vercel AI SDK (generateText/streamText) and exported it publicly.
  • Updated /api/v1/ai/chat to accept both legacy { messages, options } and Vercel-style flat request fields, defaulting to streaming with a vercelDataStream response flag.
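The two accepted request shapes can be illustrated as follows. Field names come from the description above; the concrete values (model id, prompt text) are placeholders, and the authoritative schema lives in ai-routes.ts:

```typescript
// Illustrative request bodies for the dual-mode /api/v1/ai/chat route.

// Legacy shape: nested options, explicit stream: false for JSON mode.
const legacyBody = {
  messages: [{ role: "user", content: "Hello" }],
  options: { temperature: 0.2, maxTokens: 256 },
  stream: false,
};

// Vercel useChat flat shape: top-level fields, multi-part array content.
// `stream` defaults to true, producing Data Stream Protocol SSE.
const vercelFlatBody = {
  messages: [{ role: "user", content: [{ type: "text", text: "Hello" }] }],
  system: "You are a helpful assistant.",
  model: "gpt-4o-mini", // placeholder model id
  temperature: 0.2,
  maxTokens: 256,
};
```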

Reviewed changes

Copilot reviewed 9 out of 9 changed files in this pull request and generated 4 comments.

Show a summary per file
File Description
packages/services/service-ai/src/stream/vercel-stream-encoder.ts Implements Vercel Data Stream Protocol frame encoding for TextStreamPart events.
packages/services/service-ai/src/stream/index.ts Re-exports stream encoder utilities.
packages/services/service-ai/src/routes/ai-routes.ts Adds vercelDataStream response flag, relaxes message validation to accept array content, and makes /chat dual-mode (stream default, JSON when stream:false).
packages/services/service-ai/src/index.ts Public exports for VercelLLMAdapter and stream encoder utilities.
packages/services/service-ai/src/adapters/vercel-adapter.ts Adds a production adapter backed by Vercel AI SDK.
packages/services/service-ai/src/adapters/index.ts Re-exports the new adapter and its config type.
packages/services/service-ai/src/tests/vercel-stream-encoder.test.ts Adds unit tests for all encoded frame types and stream transformation behavior.
packages/services/service-ai/src/tests/ai-service.test.ts Updates/extends route tests for dual-mode behavior and new input formats.
CHANGELOG.md Documents the protocol/migration changes and new adapter/encoder.

Comment on lines 95 to 98
```typescript
if (typeof msg.content !== 'string' && !Array.isArray(msg.content)) {
  return 'message.content must be a string or an array';
}
return null;
```

Copilot AI Apr 1, 2026


validateMessage now accepts message.content as an array, but it does not validate the array’s element shape (e.g. requiring objects with a type field, and for type: 'text' requiring a text string). As-is, callers can pass arbitrary arrays that will be forwarded to aiService/LLM adapters and may trigger runtime errors or provider SDK validation failures. Consider adding minimal structural validation for array content (at least: array of non-null objects with string type) and returning 400 on invalid shapes.

Suggested change:

```typescript
const content = msg.content;
if (typeof content === 'string') {
  return null;
}
if (Array.isArray(content)) {
  const parts = content as unknown[];
  for (const part of parts) {
    if (typeof part !== 'object' || part === null) {
      return 'message.content array elements must be non-null objects';
    }
    const partObj = part as Record<string, unknown>;
    if (typeof partObj.type !== 'string') {
      return 'each message.content array element must have a string "type" property';
    }
    if (partObj.type === 'text' && typeof partObj.text !== 'string') {
      return 'message.content elements with type "text" must have a string "text" property';
    }
  }
  return null;
}
return 'message.content must be a string or an array';
```

Comment on lines +167 to +176
```typescript
// ── Prepend system prompt ────────────────────────────
// Vercel useChat sends `system` (or the deprecated `systemPrompt`)
// as a top-level field. We prepend it as a system message.
const systemPrompt = (body.system ?? body.systemPrompt) as string | undefined;
const finalMessages: ModelMessage[] = [
  ...(systemPrompt
    ? [{ role: 'system' as const, content: systemPrompt }]
    : []),
  ...(messages as ModelMessage[]),
];
```

Copilot AI Apr 1, 2026


systemPrompt is read from body.system ?? body.systemPrompt and cast to string | undefined without runtime validation. If a client sends a non-string value here, it will be inserted into finalMessages as content and may cause downstream adapter/provider errors. Recommend validating that system/systemPrompt is a string when present (otherwise return 400).
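A minimal guard in the spirit of this suggestion might look as follows. The helper name is illustrative, not the route's actual code; in the route, the thrown error would map to a 400 response:

```typescript
// Validate that system/systemPrompt, when present, is a string before
// it is prepended as a system message. Illustrative sketch only.
function resolveSystemPrompt(body: Record<string, unknown>): string | undefined {
  const raw = body.system ?? body.systemPrompt;
  if (raw === undefined || raw === null) return undefined;
  if (typeof raw !== "string") {
    // The route would return HTTP 400 here instead of throwing.
    throw new TypeError("system/systemPrompt must be a string");
  }
  return raw;
}
```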

Comment on lines +38 to +44
```typescript
async chat(messages: ModelMessage[], options?: AIRequestOptions): Promise<AIResult> {
  const result = await generateText({
    model: this.model,
    messages,
    temperature: options?.temperature,
    maxTokens: options?.maxTokens,
  });
```

Copilot AI Apr 1, 2026


VercelLLMAdapter.chat() forwards only temperature and maxTokens from AIRequestOptions. Other supported request options (notably stop, and tool-calling related fields like tools/toolChoice used by AIService.chatWithTools) are currently ignored, which will prevent those features from working when this adapter is used in production. Please plumb through the remaining relevant options (and/or explicitly document that this adapter does not support tool definitions).

Comment on lines +77 to +86
```typescript
async *streamChat(
  messages: ModelMessage[],
  options?: AIRequestOptions,
): AsyncIterable<TextStreamPart<ToolSet>> {
  const result = streamText({
    model: this.model,
    messages,
    temperature: options?.temperature,
    maxTokens: options?.maxTokens,
  });
```

Copilot AI Apr 1, 2026


VercelLLMAdapter.streamChat() also ignores most of AIRequestOptions (e.g. stop, and tool-calling related fields like tools/toolChoice). If streaming is meant to be feature-parity with chat(), the adapter should forward the same option set to streamText(); otherwise it should clearly error/guard when unsupported options are provided to avoid silent misconfiguration.

@hotlong
Contributor

hotlong commented Apr 1, 2026

@copilot fix all

@hotlong
Contributor

hotlong commented Apr 1, 2026

@copilot continue

…ystemPrompt type, forward all AIRequestOptions to Vercel adapter

- validateMessage: structural validation for array content elements (non-null objects with string type, text property for type:text)
- systemPrompt: runtime string validation, returns 400 for non-string values
- VercelLLMAdapter: buildVercelOptions helper forwards stop, tools, toolChoice to chat/complete/streamChat

Agent-Logs-Url: https://github.com/objectstack-ai/spec/sessions/ac262d54-6f0d-47a1-967b-c1f4b7636378

Co-authored-by: hotlong <50353452+hotlong@users.noreply.github.com>
Contributor Author

Copilot AI commented Apr 1, 2026

> @copilot continue

All 4 review comments addressed in aff2159:

  1. validateMessage array content validation — Now validates each array element is a non-null object with a string type property, and that type: 'text' elements have a string text property. Returns 400 on invalid shapes.
  2. systemPrompt type validation — Added runtime check that system/systemPrompt is a string when present; returns 400 for non-string values.
  3. VercelLLMAdapter.chat() option forwarding — New buildVercelOptions() helper plumbs stop (→ stopSequences), tools (converted via jsonSchema), and toolChoice through to generateText.
  4. VercelLLMAdapter.streamChat() option forwarding — Same buildVercelOptions() helper used for streamText, ensuring feature-parity with chat().
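A sketch of what such an option-forwarding helper might look like. Everything beyond the fields named above (stop → stopSequences, toolChoice) is an assumption of this sketch, not the PR's actual buildVercelOptions:

```typescript
// Illustrative option-forwarding helper in the spirit of the described
// buildVercelOptions; the stop → stopSequences rename matches the
// summary above, the rest is assumed for the sketch.
interface AIRequestOptions {
  temperature?: number;
  maxTokens?: number;
  stop?: string[];
  toolChoice?: "auto" | "none" | "required";
}

function buildVercelOptions(options?: AIRequestOptions) {
  return {
    temperature: options?.temperature,
    maxTokens: options?.maxTokens,
    stopSequences: options?.stop, // `stop` maps to the SDK's stopSequences
    toolChoice: options?.toolChoice,
  };
}
```

Using one helper for both generateText and streamText is what keeps chat() and streamChat() at feature parity.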

@hotlong hotlong merged commit 3098747 into main Apr 1, 2026
7 of 9 checks passed
@github-actions github-actions bot added the documentation, tests, and size/l labels Apr 1, 2026
Copilot stopped work on behalf of hotlong due to an error April 1, 2026 07:59


Development

Successfully merging this pull request may close these issues.

Server-side AI Chat API protocol upgrade: align with the Vercel/ai Data Stream Protocol to support the objectui frontend

4 participants