
ci: Version Packages#533

Open
github-actions[bot] wants to merge 1 commit into main from changeset-release/main

Conversation


github-actions[bot] commented May 6, 2026

This PR was opened by the Changesets release GitHub action. When you're ready to do a release, you can merge this and the packages will be published to npm automatically. If you're not ready to do a release yet, that's fine; whenever you add more changesets to main, this PR will be updated.

Releases

@tanstack/ai@0.15.0

Minor Changes

  • Fix thinking blocks getting merged across steps and lost on turn 2+ of Anthropic tool loops. (#391)

    Each thinking step emitted by the adapter now produces its own ThinkingPart on the UIMessage instead of being merged into a single part, and thinking content + Anthropic signatures are preserved in server-side message history so multi-turn tool flows with extended thinking work correctly.

    This includes a public callback signature change: StreamProcessorEvents.onThinkingUpdate now receives (messageId, stepId, content) instead of (messageId, content). ChatClient has been updated to handle the new stepId argument internally, but consumers implementing StreamProcessorEvents directly need to add the new parameter.

    @tanstack/ai:

    • ThinkingPart gains optional stepId and signature fields.
    • ModelMessage gains an optional thinking?: Array<{ content; signature? }> field so prior thinking can be replayed in subsequent turns.
    • StepFinishedEvent gains an optional signature field for provider-supplied thinking signatures.
    • StreamProcessor tracks thinking per-step via stepId and keeps step ordering. getState().thinking / getResult().thinking concatenate step contents in order.
    • The onThinkingUpdate callback on StreamProcessorEvents now receives (messageId, stepId, content) — consumers implementing it directly must add the stepId parameter.
    • TextEngine accumulates thinking + signatures per iteration and includes them in assistant messages with tool calls so the next turn can replay them.
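    For consumers implementing `StreamProcessorEvents` directly, the migration is a one-parameter addition, and tracking thinking per step rather than as one merged blob falls out naturally. A minimal sketch (the handler type here is an assumption inferred from the changelog, not the library's actual declaration):

    ```typescript
    // Hypothetical consumer of the updated callback signature. The type
    // below is an assumption based on the changelog; it is not copied
    // from @tanstack/ai's own declarations.
    type ThinkingUpdateHandler = (
      messageId: string,
      stepId: string, // new in 0.15.0 — identifies which thinking step updated
      content: string,
    ) => void

    // Track each step separately so thinking blocks are no longer merged.
    const thinkingByStep = new Map<string, string>()

    const onThinkingUpdate: ThinkingUpdateHandler = (messageId, stepId, content) => {
      thinkingByStep.set(`${messageId}:${stepId}`, content)
    }

    // Simulated stream: two thinking steps within one assistant message.
    onThinkingUpdate('msg-1', 'step-1', 'First reasoning pass')
    onThinkingUpdate('msg-1', 'step-2', 'Second reasoning pass')

    // Concatenating step contents in insertion order mirrors what the
    // changelog describes for getState().thinking / getResult().thinking.
    const combined = [...thinkingByStep.values()].join('\n')
    ```

    The key point is that `stepId` is now part of the map key: updates to step 2 can no longer overwrite or merge into step 1.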

    @tanstack/ai-anthropic:

    • Captures signature_delta stream events and emits the final STEP_FINISHED with the signature on content_block_stop.
    • Includes thinking blocks with signatures in formatMessages for multi-turn history.
    • Passes betas: ['interleaved-thinking-2025-05-14'] to the beta.messages.create call site when a thinking budget is configured. The beta flag is scoped to the streaming path only, so structuredOutput (which uses the non-beta messages.create endpoint) is unaffected.
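    The replayed history described above can be pictured roughly as follows. This is a sketch of the shape implied by the changelog (`thinking?: Array<{ content; signature? }>` on the message); the surrounding interface names and fields are assumptions for illustration, not the library's actual types:

    ```typescript
    // Hypothetical assistant turn carrying replayable thinking, modeled on
    // the changelog's ModelMessage.thinking field. Names other than
    // `thinking`, `content`, and `signature` are illustrative assumptions.
    interface ThinkingEntry {
      content: string
      signature?: string // provider-supplied signature, preserved for replay
    }

    interface AssistantTurn {
      role: 'assistant'
      content: string
      thinking?: Array<ThinkingEntry>
    }

    const priorTurn: AssistantTurn = {
      role: 'assistant',
      content: 'Calling the weather tool.',
      thinking: [
        { content: 'Need current conditions first.', signature: 'sig-abc' },
        { content: 'Then format the answer.' }, // signature is optional
      ],
    }

    // On turn 2+ of a tool loop, entries that carry a signature are the
    // ones a provider like Anthropic can verify when history is replayed.
    const replayable = (priorTurn.thinking ?? []).filter((t) => t.signature)
    ```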

    @tanstack/ai-client:

    • ChatClient's internal onThinkingUpdate wiring is updated for the new stepId parameter.

Patch Changes

  • Updated dependencies []:
    • @tanstack/ai-event-client@0.2.9

@tanstack/ai-client@0.9.0

Minor Changes

  • Fix thinking blocks getting merged across steps and lost on turn 2+ of Anthropic tool loops. (#391)

    Each thinking step emitted by the adapter now produces its own ThinkingPart on the UIMessage instead of being merged into a single part, and thinking content + Anthropic signatures are preserved in server-side message history so multi-turn tool flows with extended thinking work correctly.

    This includes a public callback signature change: StreamProcessorEvents.onThinkingUpdate now receives (messageId, stepId, content) instead of (messageId, content). ChatClient has been updated to handle the new stepId argument internally, but consumers implementing StreamProcessorEvents directly need to add the new parameter.

    @tanstack/ai:

    • ThinkingPart gains optional stepId and signature fields.
    • ModelMessage gains an optional thinking?: Array<{ content; signature? }> field so prior thinking can be replayed in subsequent turns.
    • StepFinishedEvent gains an optional signature field for provider-supplied thinking signatures.
    • StreamProcessor tracks thinking per-step via stepId and keeps step ordering. getState().thinking / getResult().thinking concatenate step contents in order.
    • The onThinkingUpdate callback on StreamProcessorEvents now receives (messageId, stepId, content) — consumers implementing it directly must add the stepId parameter.
    • TextEngine accumulates thinking + signatures per iteration and includes them in assistant messages with tool calls so the next turn can replay them.

    @tanstack/ai-anthropic:

    • Captures signature_delta stream events and emits the final STEP_FINISHED with the signature on content_block_stop.
    • Includes thinking blocks with signatures in formatMessages for multi-turn history.
    • Passes betas: ['interleaved-thinking-2025-05-14'] to the beta.messages.create call site when a thinking budget is configured. The beta flag is scoped to the streaming path only, so structuredOutput (which uses the non-beta messages.create endpoint) is unaffected.

    @tanstack/ai-client:

    • ChatClient's internal onThinkingUpdate wiring is updated for the new stepId parameter.

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0
    • @tanstack/ai-event-client@0.2.9

@tanstack/ai-anthropic@0.8.4

Patch Changes

  • Fix thinking blocks getting merged across steps and lost on turn 2+ of Anthropic tool loops. (#391)

    Each thinking step emitted by the adapter now produces its own ThinkingPart on the UIMessage instead of being merged into a single part, and thinking content + Anthropic signatures are preserved in server-side message history so multi-turn tool flows with extended thinking work correctly.

    This includes a public callback signature change: StreamProcessorEvents.onThinkingUpdate now receives (messageId, stepId, content) instead of (messageId, content). ChatClient has been updated to handle the new stepId argument internally, but consumers implementing StreamProcessorEvents directly need to add the new parameter.

    @tanstack/ai:

    • ThinkingPart gains optional stepId and signature fields.
    • ModelMessage gains an optional thinking?: Array<{ content; signature? }> field so prior thinking can be replayed in subsequent turns.
    • StepFinishedEvent gains an optional signature field for provider-supplied thinking signatures.
    • StreamProcessor tracks thinking per-step via stepId and keeps step ordering. getState().thinking / getResult().thinking concatenate step contents in order.
    • The onThinkingUpdate callback on StreamProcessorEvents now receives (messageId, stepId, content) — consumers implementing it directly must add the stepId parameter.
    • TextEngine accumulates thinking + signatures per iteration and includes them in assistant messages with tool calls so the next turn can replay them.

    @tanstack/ai-anthropic:

    • Captures signature_delta stream events and emits the final STEP_FINISHED with the signature on content_block_stop.
    • Includes thinking blocks with signatures in formatMessages for multi-turn history.
    • Passes betas: ['interleaved-thinking-2025-05-14'] to the beta.messages.create call site when a thinking budget is configured. The beta flag is scoped to the streaming path only, so structuredOutput (which uses the non-beta messages.create endpoint) is unaffected.

    @tanstack/ai-client:

    • ChatClient's internal onThinkingUpdate wiring is updated for the new stepId parameter.
  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0

@tanstack/ai-code-mode@0.1.9

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0

@tanstack/ai-code-mode-skills@0.1.9

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0
    • @tanstack/ai-code-mode@0.1.9

@tanstack/ai-devtools-core@0.3.26

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0
    • @tanstack/ai-event-client@0.2.9

@tanstack/ai-elevenlabs@0.2.1

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0
    • @tanstack/ai-client@0.9.0

@tanstack/ai-event-client@0.2.9

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0

@tanstack/ai-fal@0.7.1

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0

@tanstack/ai-gemini@0.10.1

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0

@tanstack/ai-grok@0.7.1

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0

@tanstack/ai-groq@0.1.9

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0

@tanstack/ai-isolate-cloudflare@0.1.9

Patch Changes

  • Updated dependencies []:
    • @tanstack/ai-code-mode@0.1.9

@tanstack/ai-isolate-node@0.1.9

Patch Changes

  • Updated dependencies []:
    • @tanstack/ai-code-mode@0.1.9

@tanstack/ai-isolate-quickjs@0.1.9

Patch Changes

  • Updated dependencies []:
    • @tanstack/ai-code-mode@0.1.9

@tanstack/ai-ollama@0.6.11

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0

@tanstack/ai-openai@0.8.3

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0
    • @tanstack/ai-client@0.9.0

@tanstack/ai-openrouter@0.8.3

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0

@tanstack/ai-preact@0.6.21

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0
    • @tanstack/ai-client@0.9.0

@tanstack/ai-react@0.8.1

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0
    • @tanstack/ai-client@0.9.0

@tanstack/ai-react-ui@0.6.3

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai-client@0.9.0
    • @tanstack/ai-react@0.8.1

@tanstack/ai-solid@0.7.1

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0
    • @tanstack/ai-client@0.9.0

@tanstack/ai-solid-ui@0.6.3

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai-client@0.9.0
    • @tanstack/ai-solid@0.7.1

@tanstack/ai-svelte@0.7.1

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0
    • @tanstack/ai-client@0.9.0

@tanstack/ai-vue@0.7.1

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0
    • @tanstack/ai-client@0.9.0

@tanstack/ai-vue-ui@0.1.32

Patch Changes

  • Updated dependencies []:
    • @tanstack/ai-vue@0.7.1

@tanstack/preact-ai-devtools@0.1.30

Patch Changes

  • Updated dependencies []:
    • @tanstack/ai-devtools-core@0.3.26

@tanstack/react-ai-devtools@0.2.30

Patch Changes

  • Updated dependencies []:
    • @tanstack/ai-devtools-core@0.3.26

@tanstack/solid-ai-devtools@0.2.30

Patch Changes

  • Updated dependencies []:
    • @tanstack/ai-devtools-core@0.3.26

ts-svelte-chat@0.1.39

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0
    • @tanstack/ai-anthropic@0.8.4
    • @tanstack/ai-client@0.9.0
    • @tanstack/ai-gemini@0.10.1
    • @tanstack/ai-ollama@0.6.11
    • @tanstack/ai-openai@0.8.3
    • @tanstack/ai-svelte@0.7.1

ts-vue-chat@0.1.39

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0
    • @tanstack/ai-anthropic@0.8.4
    • @tanstack/ai-client@0.9.0
    • @tanstack/ai-gemini@0.10.1
    • @tanstack/ai-ollama@0.6.11
    • @tanstack/ai-openai@0.8.3
    • @tanstack/ai-vue@0.7.1
    • @tanstack/ai-vue-ui@0.1.32

vanilla-chat@0.0.36

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai-client@0.9.0

@tanstack/ai-code-mode-models-eval@0.0.13

Patch Changes

  • Updated dependencies [b2d3cc1]:
    • @tanstack/ai@0.15.0
    • @tanstack/ai-anthropic@0.8.4
    • @tanstack/ai-code-mode@0.1.9
    • @tanstack/ai-gemini@0.10.1
    • @tanstack/ai-grok@0.7.1
    • @tanstack/ai-groq@0.1.9
    • @tanstack/ai-ollama@0.6.11
    • @tanstack/ai-openai@0.8.3
    • @tanstack/ai-isolate-node@0.1.9
