
Conversation

@max-wittig
Contributor

@max-wittig max-wittig commented Jul 15, 2025

Even when stream_options is not specified

This is a follow-up to https://github.com/vllm-project/vllm/pull/19695/files, where I didn't consider that we also want to include the usage when no stream options are specified.

Essential Elements of an Effective PR Description Checklist

  • The purpose of the PR, such as "Fix some issue (link existing issues this PR will resolve)".
  • The test plan, such as providing a test command.
  • The test results, such as pasting the results comparison before and after, or e2e results.
  • (Optional) The necessary documentation update, such as updating supported_models.md and examples for a new model.

Purpose

Test Plan

  1. Run vLLM with --enable-force-include-usage
  2. Send a normal streaming request to vLLM and observe the usage being included at the end (see the sketch below)
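
A minimal sketch of such a check, assuming a local server started with --enable-force-include-usage and the openai Python client (the model name and port are placeholders):

    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

    # Deliberately no stream_options: before this fix, usage stayed absent here.
    stream = client.completions.create(
        model="<your-model>",
        prompt="Hello",
        max_tokens=8,
        stream=True,
    )
    for chunk in stream:
        if chunk.usage is not None:
            # With the flag set, the final chunk carries the usage stats.
            print(chunk.usage)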

Test Result

(Optional) Documentation Update

Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Summary of Changes

Hello @max-wittig, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request implements a fix to ensure that usage statistics are consistently included in the completion API's responses. It addresses a specific scenario where these statistics were omitted when stream_options were not explicitly provided, aligning the behavior with the intended functionality of always reporting usage. This is a follow-up to a previous change to cover this edge case.

Highlights

  • Completion API: Ensured that usage statistics are always included in the completion response, even when stream_options are not specified, by leveraging the enable_force_include_usage flag.
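
In other words, a sketch of the gating this describes (attribute names are assumed for illustration, not the exact vLLM code):

    # Usage is streamed if the client opted in via stream_options.include_usage,
    # or if the server was started with --enable-force-include-usage.
    include_usage = enable_force_include_usage or (
        request.stream_options is not None
        and bool(request.stream_options.include_usage)
    )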
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point in your pull request by creating an issue comment (i.e., a comment on the pull request page) using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands.

  • Code Review (/gemini review): Performs a code review for the current pull request in its current state.
  • Pull Request Summary (/gemini summary): Provides a summary of the current pull request in its current state.
  • Comment (@gemini-code-assist): Responds in comments when explicitly tagged, both in issue comments and review comments.
  • Help (/gemini help): Displays a list of available commands.

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by creating and adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist is currently in preview and may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments to provide feedback.

You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double-check it and use code with caution.

@mergify mergify bot added the frontend label Jul 15, 2025
Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request correctly ensures that usage information is included in completion streams when stream_options is not specified, by respecting the enable_force_include_usage flag. The change is simple, well-contained, and directly addresses the issue described. The code is clear and I see no issues with it.

@max-wittig max-wittig force-pushed the fix/stream-options-always-usage branch from 658b824 to 9b42f26 on July 15, 2025 12:14
@github-actions

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, covering a small and essential subset of tests to quickly catch errors. You can run other CI tests on top of those by going to your fastcheck build on the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run full CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

🚀

@max-wittig max-wittig marked this pull request as ready for review July 15, 2025 14:21
@max-wittig max-wittig requested a review from aarnphm as a code owner July 15, 2025 14:21
@max-wittig
Contributor Author

@aarnphm maybe you could take a look. This is a fix for the contribution that I made in #19695. Thank you!

Collaborator

@NickLucche NickLucche left a comment


I think you need to edit at least serving_chat and serving_transcriptions too, which could be an opportunity to group the logic into a util function.
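
A hypothetical shape for such a shared helper (all names are illustrative, not the actual vLLM API), callable from serving_completion, serving_chat, and serving_transcription alike:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class StreamOptions:  # stand-in for the OpenAI-style request field
        include_usage: bool = False

    def should_stream_usage(stream_options: Optional[StreamOptions],
                            enable_force_include_usage: bool) -> bool:
        """True if the final streamed chunk should carry a usage payload."""
        if stream_options is not None and stream_options.include_usage:
            return True
        # Honor the server-wide flag even when no stream_options were sent.
        return enable_force_include_usage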

@max-wittig
Contributor Author

@NickLucche Thanks for the review. I will check it!

@max-wittig max-wittig force-pushed the fix/stream-options-always-usage branch from 9b42f26 to 6f81ce5 on July 18, 2025 10:37
@mergify

mergify bot commented Jul 18, 2025

This pull request has merge conflicts that must be resolved before it can be
merged. Please rebase the PR, @max-wittig.

https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork

@mergify mergify bot added the needs-rebase label Jul 18, 2025
@max-wittig max-wittig force-pushed the fix/stream-options-always-usage branch from 6f81ce5 to c6c5ff4 on July 18, 2025 10:43
@mergify mergify bot removed the needs-rebase label Jul 18, 2025
@max-wittig max-wittig force-pushed the fix/stream-options-always-usage branch 15 times, most recently from c1e0090 to 88021ff on July 18, 2025 14:26
@max-wittig max-wittig force-pushed the fix/stream-options-always-usage branch 2 times, most recently from c3a35cb to cbbc7f6 on October 13, 2025 11:42
@mergify

mergify bot commented Oct 13, 2025

This pull request has merge conflicts that must be resolved before it can be
merged. Please rebase the PR, @max-wittig.

https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork

@mergify mergify bot added the needs-rebase label Oct 13, 2025
max-wittig and others added 2 commits October 13, 2025 16:00
Even when stream_options is not specified

Signed-off-by: Max Wittig <[email protected]>
Signed-off-by: Antoine Auger <[email protected]>
@max-wittig max-wittig force-pushed the fix/stream-options-always-usage branch from cbbc7f6 to 77cd6a8 on October 13, 2025 14:00
@mergify mergify bot removed the needs-rebase label Oct 13, 2025
@max-wittig max-wittig force-pushed the fix/stream-options-always-usage branch from 77cd6a8 to 460e4fa on October 13, 2025 14:05
@max-wittig max-wittig force-pushed the fix/stream-options-always-usage branch from 460e4fa to d2f2425 on October 13, 2025 14:08
@max-wittig
Contributor Author

@NickLucche This is ready and passing again!

@NickLucche NickLucche merged commit fd85c9f into vllm-project:main Oct 14, 2025
48 checks passed
@NickLucche
Collaborator

Thanks for your patience @max-wittig !

@max-wittig
Contributor Author

@NickLucche Thanks for the reviews and the merge!

@max-wittig max-wittig deleted the fix/stream-options-always-usage branch October 14, 2025 07:30
Dhruvilbhatt pushed a commit to Dhruvilbhatt/vllm that referenced this pull request Oct 14, 2025
… ` (vllm-project#20983)

Signed-off-by: Max Wittig <[email protected]>
Signed-off-by: Antoine Auger <[email protected]>
Co-authored-by: Antoine Auger <[email protected]>
Signed-off-by: Dhruvil Bhatt <[email protected]>
max-wittig added a commit to siemens/vllm that referenced this pull request Oct 15, 2025
This was introduced by accident in vllm-project#20983.
Sorry about that.
max-wittig added a commit to siemens/vllm that referenced this pull request Oct 15, 2025
This was introduced by accident in vllm-project#20983.
Sorry about that.

Signed-off-by: Max Wittig <[email protected]>
@max-wittig max-wittig mentioned this pull request Oct 15, 2025
bbartels pushed a commit to bbartels/vllm that referenced this pull request Oct 16, 2025
… ` (vllm-project#20983)

Signed-off-by: Max Wittig <[email protected]>
Signed-off-by: Antoine Auger <[email protected]>
Co-authored-by: Antoine Auger <[email protected]>
Signed-off-by: bbartels <[email protected]>
lywa1998 pushed a commit to lywa1998/vllm that referenced this pull request Oct 20, 2025
… ` (vllm-project#20983)

Signed-off-by: Max Wittig <[email protected]>
Signed-off-by: Antoine Auger <[email protected]>
Co-authored-by: Antoine Auger <[email protected]>
alhridoy pushed a commit to alhridoy/vllm that referenced this pull request Oct 24, 2025
… ` (vllm-project#20983)

Signed-off-by: Max Wittig <[email protected]>
Signed-off-by: Antoine Auger <[email protected]>
Co-authored-by: Antoine Auger <[email protected]>
xuebwang-amd pushed a commit to xuebwang-amd/vllm that referenced this pull request Oct 24, 2025
… ` (vllm-project#20983)

Signed-off-by: Max Wittig <[email protected]>
Signed-off-by: Antoine Auger <[email protected]>
Co-authored-by: Antoine Auger <[email protected]>
Signed-off-by: xuebwang-amd <[email protected]>
0xrushi pushed a commit to 0xrushi/vllm that referenced this pull request Oct 26, 2025
… ` (vllm-project#20983)

Signed-off-by: Max Wittig <[email protected]>
Signed-off-by: Antoine Auger <[email protected]>
Co-authored-by: Antoine Auger <[email protected]>
Signed-off-by: 0xrushi <[email protected]>
rtourgeman pushed a commit to rtourgeman/vllm that referenced this pull request Nov 10, 2025
… ` (vllm-project#20983)

Signed-off-by: Max Wittig <[email protected]>
Signed-off-by: Antoine Auger <[email protected]>
Co-authored-by: Antoine Auger <[email protected]>
Zhathw pushed a commit to Zhathw/vllm that referenced this pull request Nov 12, 2025
… ` (vllm-project#20983)

Signed-off-by: Max Wittig <[email protected]>
Signed-off-by: Antoine Auger <[email protected]>
Co-authored-by: Antoine Auger <[email protected]>

Labels

frontend, ready (ONLY add when PR is ready to merge/full CI is needed)

4 participants