
fix(provider): add missing index field to streaming tool_call deltas (#6661) #6692

Open

Yaohua-Leo wants to merge 2 commits into AstrBotDevs:master from Yaohua-Leo:fix/6661-streaming-tool-call-index

Conversation

@Yaohua-Leo (Contributor) commented Mar 20, 2026

Summary

Fixes #6661 - Streaming tool_call arguments lost when OpenAI-compatible proxy omits index field (e.g. Gemini)

Changes

  • Add missing index field to tool_call deltas before passing to ChatCompletionStreamState.handle_chunk()
  • Gemini and some OpenAI-compatible proxies (e.g. Continue) omit the index field
  • This caused handle_chunk() to reject the chunk and silently drop tool_call arguments
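The normalization step described above can be sketched as follows, using stand-in objects instead of the real OpenAI SDK types; the helper name `normalize_tool_call_indices` is illustrative, not from the PR:

```python
# Minimal sketch of the fix, with stand-in objects in place of the
# OpenAI SDK's chunk/delta types. Illustrative only.
from types import SimpleNamespace

def normalize_tool_call_indices(chunk):
    """Give tool_call deltas a default index so the stream-state
    accumulator does not reject the chunk and drop their arguments."""
    for choice in getattr(chunk, "choices", None) or []:
        delta = getattr(choice, "delta", None)
        if delta and getattr(delta, "tool_calls", None):
            for tc in delta.tool_calls:
                if getattr(tc, "index", None) is None:
                    tc.index = 0  # default when the proxy omits the field

# Demo: a chunk whose single tool_call delta arrives without an index.
tc = SimpleNamespace(index=None, id="call_1")
chunk = SimpleNamespace(
    choices=[SimpleNamespace(delta=SimpleNamespace(tool_calls=[tc]))]
)
normalize_tool_call_indices(chunk)
print(tc.index)  # → 0
```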

Testing

✅ Docker sandbox test passed

  • Tool call without index: index=0 assigned
  • Tool call with existing index: original value preserved
  • Empty tool_calls handled
  • Empty choices handled

Summary by Sourcery

Bug Fixes:

  • Prevent loss of tool_call arguments in streamed responses when upstream providers omit the index field on tool_call deltas.

- Fix AstrBotDevs#6661: Streaming tool_call arguments lost when OpenAI-compatible proxy omits index field
- Gemini and some proxies (e.g. Continue) don't include index field in tool_call deltas
- Add default index=0 when missing to prevent ChatCompletionStreamState.handle_chunk() from rejecting chunks

Fixes AstrBotDevs#6661
@dosubot dosubot bot added the size:XS This PR changes 0-9 lines, ignoring generated files. label Mar 20, 2026
@gemini-code-assist bot commented

Summary of Changes

Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request addresses a critical bug affecting the streaming of tool_call arguments, particularly when interacting with OpenAI-compatible proxies that do not consistently provide an index field. By proactively injecting a default index into tool_call deltas, the system now robustly processes all streaming tool call data, preventing silent data loss and enhancing the reliability of tool-use capabilities.

Highlights

  • Bug Fix for Streaming Tool Calls: Resolved an issue where streaming tool_call arguments were lost when OpenAI-compatible proxies (like Gemini) omitted the index field, which caused ChatCompletionStreamState.handle_chunk() to reject the chunk and silently drop tool_call arguments.
  • Index Field Handling: Implemented logic to add a missing index field with a default value of 0 to tool_call deltas before they are processed by ChatCompletionStreamState.handle_chunk(), ensuring proper handling of all streaming tool call arguments.

@sourcery-ai bot left a comment

Hey - I've found 1 issue, and left some high level feedback:

  • The fix currently only inspects chunk.choices[0]; if a provider ever sends tool_call deltas on multiple choices in a single chunk, those additional choices will still be dropped—consider iterating over all chunk.choices instead of just the first.
  • You always set a missing tool_call.index to 0, which might conflict with multiple tool calls in the same choice; it may be safer to derive the index from the tool_calls list position or preserve any existing ordering assumptions in handle_chunk().
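To see why a shared fallback index is risky, here is an illustrative accumulator keyed by `index`, as streaming mergers typically are — this is not the library's actual `ChatCompletionStreamState` implementation, just a demonstration of the failure mode:

```python
# Illustrative index-keyed accumulator (NOT the real
# ChatCompletionStreamState): two distinct tool calls that both fall
# back to index 0 collapse into a single, corrupted entry.
def accumulate(deltas):
    calls = {}
    for d in deltas:
        entry = calls.setdefault(d["index"], {"name": d["name"], "arguments": ""})
        entry["arguments"] += d["arguments"]
    return calls

deltas = [
    {"index": 0, "name": "get_weather", "arguments": '{"city":"NYC"}'},
    # A different tool call, but assigned the same fallback index:
    {"index": 0, "name": "get_time", "arguments": '{"tz":"EST"}'},
]
merged = accumulate(deltas)
print(len(merged))  # → 1  (two tool calls merged into one entry)
```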
## Individual Comments

### Comment 1

Location: astrbot/core/provider/sources/openai_source.py, lines 310-315

```diff
         state = ChatCompletionStreamState()

         async for chunk in stream:
+            # Fix for #6661: Add missing 'index' field to tool_call deltas
+            # Gemini and some OpenAI-compatible proxies omit this field
+            if chunk.choices:
+                choice = chunk.choices[0]
+                if choice.delta and choice.delta.tool_calls:
+                    for tc in choice.delta.tool_calls:
+                        if not hasattr(tc, "index") or tc.index is None:
+                            tc.index = 0
```

**issue (bug_risk):** Setting all missing tool_call indices to 0 could break multi-tool-call streams.

This change sets `tc.index = 0` for every tool_call delta missing an index. If a choice yields multiple tool_calls, they'll all share index 0, which can break any logic that uses indices to track or merge tool_calls across chunks.

If multiple tool_calls per choice are possible, consider deriving the index from position (e.g. `enumerate(choice.delta.tool_calls)`) or inferring it inside `ChatCompletionStreamState`, where you can use previously seen tool_calls for that choice. Otherwise, you risk incorrect tool_call reconstruction while only masking the missing-field issue.


@dosubot dosubot bot added the area:provider The bug / feature is about AI Provider, Models, LLM Agent, LLM Agent Runner. label Mar 20, 2026
@gemini-code-assist bot left a comment
Code Review

This pull request adds a workaround for OpenAI-compatible providers that omit the index field in streaming tool call deltas. The change correctly patches the chunk before it's processed. However, the current implementation has a flaw where it assigns a hardcoded index=0, which can cause issues if multiple tool calls are streamed concurrently. I've left a comment with details on this potential bug and a suggestion for a more robust implementation. I've also added a minor suggestion to improve code style.

```python
if choice.delta and choice.delta.tool_calls:
    for tc in choice.delta.tool_calls:
        if not hasattr(tc, "index") or tc.index is None:
            tc.index = 0
```
Severity: high

Hardcoding tc.index = 0 can lead to incorrect behavior if the streaming provider sends multiple tool calls concurrently without an index. All tool calls would be assigned to the same index, causing them to be merged incorrectly by ChatCompletionStreamState. This could result in lost tool calls or malformed arguments.

A more robust solution would be to assign a unique index to each tool call. This can be achieved by maintaining a state within _query_stream that maps tool call IDs to indices. When a new tool call ID is seen, assign it the next available index.

If you can confirm that the targeted OpenAI-compatible proxies will only ever stream one tool call at a time, then this fix is sufficient. Otherwise, I strongly recommend implementing the stateful indexing logic to prevent bugs with multiple tool calls.

```python
choice = chunk.choices[0]
if choice.delta and choice.delta.tool_calls:
    for tc in choice.delta.tool_calls:
        if not hasattr(tc, "index") or tc.index is None:
```
Severity: medium

This condition can be written more concisely and safely using getattr.

Suggested change:

```diff
-if not hasattr(tc, "index") or tc.index is None:
+if getattr(tc, "index", None) is None:
```
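The two forms are equivalent for the missing-attribute case, the present-but-None case, and the has-a-value case, which a quick check confirms (`ToolCallDelta` here is a stand-in class, not the SDK type):

```python
# Check that getattr(tc, "index", None) is None behaves exactly like
# the original hasattr-based condition in all three cases.
class ToolCallDelta:  # stand-in, not the SDK type
    pass

tc = ToolCallDelta()  # attribute absent entirely
assert (not hasattr(tc, "index") or tc.index is None) == (getattr(tc, "index", None) is None)

tc.index = None       # attribute present but None
assert (not hasattr(tc, "index") or tc.index is None) == (getattr(tc, "index", None) is None)

tc.index = 3          # attribute present with a value: neither form triggers
assert (not hasattr(tc, "index") or tc.index is None) == (getattr(tc, "index", None) is None)
print("equivalent in all cases")
```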

- Use enumerate() to assign correct index based on list position
- Iterate over all choices (not just the first) for completeness
- Addresses review feedback from sourcery-ai and gemini-code-assist
@Yaohua-Leo (Contributor, Author) commented
Hi, I've addressed the review feedback from @sourcery-ai and @gemini-code-assist. The fix now uses enumerate() for correct index assignment and iterates over all choices. Please take another look when you have time. Thanks!
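A sketch of that revised logic, assuming stand-in delta objects (the surrounding streaming loop and the SDK types are omitted, and the helper name `patch_chunk` is illustrative):

```python
from types import SimpleNamespace

def patch_chunk(chunk):
    """Revised normalization: walk every choice and derive a missing
    index from the tool call's position within the delta list."""
    for choice in chunk.choices:
        if choice.delta and choice.delta.tool_calls:
            for idx, tc in enumerate(choice.delta.tool_calls):
                if not hasattr(tc, "index") or tc.index is None:
                    tc.index = idx

# Demo: two tool calls in one delta, neither carrying an index.
calls = [SimpleNamespace(index=None, id="a"), SimpleNamespace(index=None, id="b")]
chunk = SimpleNamespace(
    choices=[SimpleNamespace(delta=SimpleNamespace(tool_calls=calls))]
)
patch_chunk(chunk)
print([tc.index for tc in calls])  # → [0, 1]
```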

@gemini-code-assist bot left a comment
Code Review

This pull request introduces a fix to handle streaming tool call deltas from OpenAI-compatible proxies that omit the index field. The change correctly patches the incoming chunks by adding a sequential index to tool calls where it's missing, preventing arguments from being dropped. The implementation is straightforward and effectively addresses the issue. I have one suggestion to improve the code's readability and maintainability by using a more idiomatic Python pattern.

```python
for choice in chunk.choices:
    if choice.delta and choice.delta.tool_calls:
        for idx, tc in enumerate(choice.delta.tool_calls):
            if not hasattr(tc, "index") or tc.index is None:
```
Severity: medium

For improved readability and maintainability, consider using getattr with a default value to check for the missing or None index. This is a more concise and idiomatic Python pattern for this type of check.

Suggested change:

```diff
-if not hasattr(tc, "index") or tc.index is None:
+if getattr(tc, "index", None) is None:
```


Labels

area:provider The bug / feature is about AI Provider, Models, LLM Agent, LLM Agent Runner. size:XS This PR changes 0-9 lines, ignoring generated files.

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Streaming tool_call arguments lost when OpenAI-compatible proxy omits index field (e.g. Gemini)

1 participant