Upgrade server AI Chat API protocol for Vercel compatibility#1034
Agent-Logs-Url: https://github.com/objectstack-ai/spec/sessions/74cfdaa5-9d61-426c-ba7c-ecaba5f480f2
Co-authored-by: hotlong <50353452+hotlong@users.noreply.github.com>
@copilot continue
- Fix POST `/api/v1/ai/chat` test: now defaults to streaming mode; legacy JSON mode requires `stream: false`
- Add 18 new tests covering vercel-stream-encoder (`encodeStreamPart`, `encodeVercelDataStream`), route dual-mode, systemPrompt, flat options, and array content validation
- Update message validation to accept array content for Vercel multi-part messages
- Update CHANGELOG.md with migration notes for the Data Stream Protocol
- All 176 tests passing (up from 158)

Agent-Logs-Url: https://github.com/objectstack-ai/spec/sessions/fc1d7caa-ef41-4b0b-9ce4-267c262596b4
Co-authored-by: xuyushun441-sys <255036401+xuyushun441-sys@users.noreply.github.com>
…validation tests

Agent-Logs-Url: https://github.com/objectstack-ai/spec/sessions/fc1d7caa-ef41-4b0b-9ce4-267c262596b4
Co-authored-by: xuyushun441-sys <255036401+xuyushun441-sys@users.noreply.github.com>
All checklist items are now complete in commits
Pull request overview
This PR upgrades POST /api/v1/ai/chat and the service-ai streaming stack to support the Vercel AI Data Stream Protocol so it can be consumed directly by @ai-sdk/react/useChat (default streaming), while retaining a legacy JSON response mode via stream: false.
Changes:
- Added a Vercel Data Stream encoder (`encodeStreamPart`, `encodeVercelDataStream`) and exported it.
- Introduced `VercelLLMAdapter` using the Vercel AI SDK (`generateText`/`streamText`) and exported it publicly.
- Updated `/api/v1/ai/chat` to accept both the legacy `{ messages, options }` format and Vercel-style flat request fields, defaulting to streaming with a `vercelDataStream` response flag.
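For illustration, the two accepted request shapes described above look roughly like this (field names are taken from this PR's description; the exact route schema may differ):

```ts
// Legacy shape: nested options, explicit opt-out of streaming.
const legacyBody = {
  messages: [{ role: 'user', content: 'Hi' }],
  options: { temperature: 0.2 },
  stream: false, // opt back into the legacy JSON response
};

// Vercel useChat flat shape: top-level fields, streaming by default.
const vercelFlatBody = {
  messages: [{ role: 'user', content: 'Hi' }],
  system: 'You are terse.',
  temperature: 0.2,
  // no `stream` flag: the Data Stream Protocol response is the default
};
```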
Reviewed changes
Copilot reviewed 9 out of 9 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| packages/services/service-ai/src/stream/vercel-stream-encoder.ts | Implements Vercel Data Stream Protocol frame encoding for TextStreamPart events. |
| packages/services/service-ai/src/stream/index.ts | Re-exports stream encoder utilities. |
| packages/services/service-ai/src/routes/ai-routes.ts | Adds vercelDataStream response flag, relaxes message validation to accept array content, and makes /chat dual-mode (stream default, JSON when stream:false). |
| packages/services/service-ai/src/index.ts | Public exports for VercelLLMAdapter and stream encoder utilities. |
| packages/services/service-ai/src/adapters/vercel-adapter.ts | Adds a production adapter backed by Vercel AI SDK. |
| packages/services/service-ai/src/adapters/index.ts | Re-exports the new adapter and its config type. |
| packages/services/service-ai/src/tests/vercel-stream-encoder.test.ts | Adds unit tests for all encoded frame types and stream transformation behavior. |
| packages/services/service-ai/src/tests/ai-service.test.ts | Updates/extends route tests for dual-mode behavior and new input formats. |
| CHANGELOG.md | Documents the protocol/migration changes and new adapter/encoder. |
```ts
if (typeof msg.content !== 'string' && !Array.isArray(msg.content)) {
  return 'message.content must be a string or an array';
}
return null;
```
validateMessage now accepts message.content as an array, but it does not validate the array’s element shape (e.g. requiring objects with a type field, and for type: 'text' requiring a text string). As-is, callers can pass arbitrary arrays that will be forwarded to aiService/LLM adapters and may trigger runtime errors or provider SDK validation failures. Consider adding minimal structural validation for array content (at least: array of non-null objects with string type) and returning 400 on invalid shapes.
Suggested change — replace:

```ts
if (typeof msg.content !== 'string' && !Array.isArray(msg.content)) {
  return 'message.content must be a string or an array';
}
return null;
```

with:

```ts
const content = msg.content;
if (typeof content === 'string') {
  return null;
}
if (Array.isArray(content)) {
  const parts = content as unknown[];
  for (const part of parts) {
    if (typeof part !== 'object' || part === null) {
      return 'message.content array elements must be non-null objects';
    }
    const partObj = part as Record<string, unknown>;
    if (typeof partObj.type !== 'string') {
      return 'each message.content array element must have a string "type" property';
    }
    if (partObj.type === 'text' && typeof partObj.text !== 'string') {
      return 'message.content elements with type "text" must have a string "text" property';
    }
  }
  return null;
}
return 'message.content must be a string or an array';
```
```ts
// ── Prepend system prompt ────────────────────────────
// Vercel useChat sends `system` (or the deprecated `systemPrompt`)
// as a top-level field. We prepend it as a system message.
const systemPrompt = (body.system ?? body.systemPrompt) as string | undefined;
const finalMessages: ModelMessage[] = [
  ...(systemPrompt
    ? [{ role: 'system' as const, content: systemPrompt }]
    : []),
  ...(messages as ModelMessage[]),
];
```
systemPrompt is read from body.system ?? body.systemPrompt and cast to string | undefined without runtime validation. If a client sends a non-string value here, it will be inserted into finalMessages as content and may cause downstream adapter/provider errors. Recommend validating that system/systemPrompt is a string when present (otherwise return 400).
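A minimal guard along these lines would reject non-string values before the cast. This is a sketch, not the PR's actual code: `validateSystemPrompt` and its error-string return convention are hypothetical, modeled on the route's existing `validateMessage` style.

```ts
// Hypothetical sketch: validate the top-level system/systemPrompt field
// before it is cast to `string | undefined`. A non-null error string
// would be translated by the caller into a 400 response.
function validateSystemPrompt(body: Record<string, unknown>): string | null {
  const raw = body.system ?? body.systemPrompt;
  if (raw !== undefined && typeof raw !== 'string') {
    return 'system/systemPrompt must be a string when present';
  }
  return null;
}
```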
```ts
async chat(messages: ModelMessage[], options?: AIRequestOptions): Promise<AIResult> {
  const result = await generateText({
    model: this.model,
    messages,
    temperature: options?.temperature,
    maxTokens: options?.maxTokens,
  });
```
VercelLLMAdapter.chat() forwards only temperature and maxTokens from AIRequestOptions. Other supported request options (notably stop, and tool-calling related fields like tools/toolChoice used by AIService.chatWithTools) are currently ignored, which will prevent those features from working when this adapter is used in production. Please plumb through the remaining relevant options (and/or explicitly document that this adapter does not support tool definitions).
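One way to plumb the remaining options through is a small forwarding helper shared by `chat()` and `streamChat()`. The sketch below is illustrative only: the local `AIRequestOptions` shape and the `stop` → `stopSequences` field mapping are assumptions, not the adapter's actual code.

```ts
// Simplified stand-in for the service's request options type.
interface AIRequestOptions {
  temperature?: number;
  maxTokens?: number;
  stop?: string[];
  tools?: Record<string, unknown>;
  toolChoice?: unknown;
}

// Hypothetical helper: copy only defined fields so SDK defaults are
// not clobbered with `undefined`, and forward tool-calling options
// alongside the sampling parameters.
function buildVercelOptions(options?: AIRequestOptions): Record<string, unknown> {
  if (!options) return {};
  const out: Record<string, unknown> = {};
  if (options.temperature !== undefined) out.temperature = options.temperature;
  if (options.maxTokens !== undefined) out.maxTokens = options.maxTokens;
  if (options.stop !== undefined) out.stopSequences = options.stop; // assumed field name
  if (options.tools !== undefined) out.tools = options.tools;
  if (options.toolChoice !== undefined) out.toolChoice = options.toolChoice;
  return out;
}
```

Both `generateText` and `streamText` calls could then spread `...buildVercelOptions(options)` instead of listing fields by hand.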
```ts
async *streamChat(
  messages: ModelMessage[],
  options?: AIRequestOptions,
): AsyncIterable<TextStreamPart<ToolSet>> {
  const result = streamText({
    model: this.model,
    messages,
    temperature: options?.temperature,
    maxTokens: options?.maxTokens,
  });
```
VercelLLMAdapter.streamChat() also ignores most of AIRequestOptions (e.g. stop, and tool-calling related fields like tools/toolChoice). If streaming is meant to be feature-parity with chat(), the adapter should forward the same option set to streamText(); otherwise it should clearly error/guard when unsupported options are provided to avoid silent misconfiguration.
@copilot fix all

@copilot continue
…ystemPrompt type, forward all AIRequestOptions to Vercel adapter

- validateMessage: structural validation for array content elements (non-null objects with a string `type`; a `text` string for `type: 'text'`)
- systemPrompt: runtime string validation; returns 400 for non-string values
- VercelLLMAdapter: `buildVercelOptions` helper forwards stop, tools, and toolChoice to chat/complete/streamChat

Agent-Logs-Url: https://github.com/objectstack-ai/spec/sessions/ac262d54-6f0d-47a1-967b-c1f4b7636378
Co-authored-by: hotlong <50353452+hotlong@users.noreply.github.com>
All 4 review comments addressed in
Upgrades the `/api/v1/ai/chat` endpoint and service-ai infrastructure to fully support the Vercel AI Data Stream Protocol, enabling direct consumption by `@ai-sdk/react`'s `useChat` without custom adapters.

Changes Made

- `VercelLLMAdapter` — New production adapter wrapping the Vercel AI SDK's `generateText`/`streamText` for any compatible model provider (OpenAI, Anthropic, Google, Ollama, etc.)
- `vercel-stream-encoder.ts` — New utilities (`encodeStreamPart`, `encodeVercelDataStream`) to convert `AsyncIterable<TextStreamPart<ToolSet>>` into Vercel Data Stream wire-format frames (`0:` text, `9:` tool-call, `b:` tool-call-start, `c:` tool-call-delta, `a:` tool-result, `d:` finish, `e:` step-finish)
- `/api/v1/ai/chat` — Defaults to the Vercel Data Stream Protocol (SSE); falls back to JSON when `stream: false`. Accepts the Vercel useChat flat body format (`system`/`systemPrompt`, `model`, `temperature`, `maxTokens` as top-level fields) alongside the legacy `{ messages, options }` format
- `RouteResponse.vercelDataStream` — New flag telling the HTTP server layer to encode events as Vercel frame-format SSE instead of generic `data:` lines

Testing

- `vercel-stream-encoder.test.ts` with 13 tests covering all frame types, round-trip decode, empty streams, and tool-call flows
- `ai-service.test.ts` with tests for Data Stream default mode, `systemPrompt` prepending, flat Vercel-style options, array content acceptance, and expanded content validation
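For readers unfamiliar with the wire format: each Data Stream frame is a type-code prefix plus a JSON payload, terminated by a newline. This simplified `encodeStreamPart` stand-in (not the package's real implementation) shows the shape using the frame codes listed above:

```ts
// Illustrative only: minimal Data Stream frame encoding.
// code  — frame type prefix ('0' text, '9' tool-call, 'd' finish, …)
// value — frame payload, serialized as JSON
function encodeStreamPart(code: string, value: unknown): string {
  return `${code}:${JSON.stringify(value)}\n`;
}

encodeStreamPart('0', 'Hello');                   // '0:"Hello"\n' — text delta
encodeStreamPart('d', { finishReason: 'stop' });  // 'd:{"finishReason":"stop"}\n' — finish frame
```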