TanStack AI version
0.14.0
Framework/Library version
TanStack Start RC and React 19.2
Describe the bug and the steps to reproduce it
When a tool with `needsApproval: true` completes execution during a continuation flow, `TextEngine.buildToolResultChunks()` emits a `TOOL_CALL_START` chunk with only `toolName` and omits the required `toolCallName` field:
```ts
// src/activities/chat/index.ts ~L1217
chunks.push({
  type: 'TOOL_CALL_START',
  timestamp: Date.now(),
  model: finishEvent.model,
  toolCallId: result.toolCallId,
  toolName: result.toolName, // ← present
  // toolCallName is missing! // ← @ag-ui/core requires this
} as StreamChunk)
```
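For reference, a hedged sketch of the chunk shape the client expects, reconstructed from the fields named in this report (the actual `@ag-ui/core` type may differ; the literal values below are placeholders):

```typescript
// Sketch of the TOOL_CALL_START chunk shape the client reads.
// Field names come from this report; the real @ag-ui/core type may differ.
interface ToolCallStartChunk {
  type: 'TOOL_CALL_START'
  timestamp: number
  model: string
  toolCallId: string
  toolCallName: string // the field the server currently omits
  toolName: string
}

// The server-side `as StreamChunk` cast hides the omission at compile time,
// so the chunk arrives at the client with toolCallName === undefined.
const buggyChunk = {
  type: 'TOOL_CALL_START',
  timestamp: Date.now(),
  model: 'gpt-4o',        // placeholder
  toolCallId: 'call_123', // placeholder
  toolName: 'getWeather', // placeholder
} as Partial<ToolCallStartChunk>

console.log(buggyChunk.toolCallName) // undefined
```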
Impact:
The client `StreamProcessor.handleToolCallStartEvent` reads `chunk.toolCallName` (which is `undefined`), creating a ghost `ToolCallPart` with `name: ""` in a new assistant message. On the next user message, `convertMessagesToInput` in `@tanstack/ai-openai` serializes this as a `function_call` input item without the required `name` field, causing:

```
400 Missing required parameter: 'input[N].name'
```
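To make the failure path concrete, a simplified sketch of the two steps above (the real `StreamProcessor` and `convertMessagesToInput` logic is more involved; names like `getWeather` and `call_123` are placeholders):

```typescript
// Simplified illustration of the client-side failure path; placeholders only.

// 1. The client reads the missing field; falling back to an empty string
//    produces a ghost ToolCallPart in the new assistant message:
const chunk = { type: 'TOOL_CALL_START', toolCallId: 'call_123' } as {
  type: string
  toolCallId: string
  toolCallName?: string
}
const ghostPart = {
  type: 'tool-call',
  id: chunk.toolCallId,
  name: chunk.toolCallName ?? '', // undefined -> ""
}

// 2. On the next request, the ghost part is serialized into a
//    function_call input item whose name is empty, which OpenAI
//    rejects with "Missing required parameter: 'input[N].name'":
const inputItem = {
  type: 'function_call',
  call_id: ghostPart.id,
  name: ghostPart.name, // ""
  arguments: '{}',
}
console.log(inputItem.name) // ""
```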
Reproduction:
1. Define a server tool with `needsApproval: true`
2. Have the LLM call the tool
3. Approve the tool via `addToolApprovalResponse`
4. The tool executes successfully (the continuation flow triggers `buildToolResultChunks` with `argsMap`)
5. Send any follow-up message → 400 from OpenAI
Expected fix:
Add `toolCallName: result.toolName` to the `TOOL_CALL_START` chunk in `buildToolResultChunks`, just as the `TOOL_CALL_END` chunk already does:
```ts
chunks.push({
  type: 'TOOL_CALL_START',
  timestamp: Date.now(),
  model: finishEvent.model,
  toolCallId: result.toolCallId,
  toolCallName: result.toolName, // ← add this
  toolName: result.toolName,
} as StreamChunk)
```
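A minimal regression check for the fix could look like the following sketch (the `result` values are placeholders; the real chunk is built inside `buildToolResultChunks`):

```typescript
// Sketch of a regression check: the TOOL_CALL_START chunk emitted by
// buildToolResultChunks should carry both toolName and toolCallName,
// mirroring what the TOOL_CALL_END chunk already does.
const result = { toolCallId: 'call_123', toolName: 'getWeather' } // placeholders

const fixedChunk = {
  type: 'TOOL_CALL_START',
  timestamp: Date.now(),
  model: 'gpt-4o', // placeholder
  toolCallId: result.toolCallId,
  toolCallName: result.toolName, // the added field
  toolName: result.toolName,
}

// With both fields present, the client's handleToolCallStartEvent no longer
// sees undefined and no ghost ToolCallPart is created.
console.assert(fixedChunk.toolCallName === fixedChunk.toolName)
```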
Your Minimal, Reproducible Example - (Sandbox Highly Recommended)
Screenshots or Videos (Optional)
No response
Do you intend to try to help solve this bug with your own PR?
None
Terms & Code of Conduct