[responsesAPI][8] input/output messages for ResponsesParser #30158
base: main
Conversation
Code Review
The pull request introduces input/output message tracking for ResponsesParser and enables enable_response_messages for ParsableContext. However, several issues were identified, including incorrect population of input_messages in multi-turn conversations, unresolved TODOs that indicate known parsing shortcomings, and a debugger breakpoint left in the code. These issues could lead to incorrect API responses and should be addressed.
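The multi-turn input_messages issue called out above can be illustrated with a minimal sketch. All class and field names below are hypothetical stand-ins, not vLLM's actual implementation: the point is that the per-turn input snapshot must be copied from the live conversation at the start of each turn, and the output buffer must be reset, or earlier turns leak into later ones.

```python
# Hypothetical sketch (not vLLM's code) of per-turn message tracking.
class MessageTracker:
    def __init__(self):
        self.input_messages = []   # what the model was prompted with this turn
        self.output_messages = []  # what the model produced this turn

    def start_turn(self, conversation):
        # Snapshot the prompt for *this* turn only; copying avoids
        # aliasing the live conversation list across turns.
        self.input_messages = list(conversation)
        self.output_messages = []

    def record_output(self, message):
        self.output_messages.append(message)


tracker = MessageTracker()
conversation = [{"role": "user", "content": "hi"}]

# Turn 1: one input message, one recorded output.
tracker.start_turn(conversation)
tracker.record_output({"role": "assistant", "content": "hello"})
conversation.append({"role": "assistant", "content": "hello"})

# Turn 2: input_messages must now reflect the full three-message
# history, while output_messages restarts empty.
conversation.append({"role": "user", "content": "how are you?"})
tracker.start_turn(conversation)
print(len(tracker.input_messages))   # 3
print(len(tracker.output_messages))  # 0
```

Skipping the `list(conversation)` copy (or never calling `start_turn` again) is one way a parser ends up reporting stale or merged input_messages across turns.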
Documentation preview: https://vllm--30158.org.readthedocs.build/en/30158/

This pull request has merge conflicts that must be resolved before it can be merged.
Signed-off-by: Andrew Xia <[email protected]>
Signed-off-by: Andrew Xia <[email protected]>
💡 Codex Review
Here are some automated review suggestions for this pull request.
from openai.types.responses import ResponseFunctionToolCall, ResponseOutputItem
from openai.types.responses.response_function_tool_call_output_item import (
    ResponseFunctionToolCallOutputItem,
)
from openai.types.responses.response_input_item import McpCall
Import McpCall output schema for parsed tool calls
The parser now converts MCP tool call outputs into McpCall objects inside make_response_output_items_from_parsable_context, but the import here pulls McpCall from response_input_item. That class models request inputs and (unlike the output version used elsewhere, e.g. harmony_utils) does not carry the output field this method populates, so parsing an MCP call will either raise validation/TypeError or drop the tool output when assembling the response. This should import the output-item McpCall to correctly serialize tool results for parsable contexts.
return self

def make_response_output_items_from_parsable_context(
just a refactor, no functional changes.
Signed-off-by: Andrew Xia <[email protected]>
chaunceyjiang left a comment
Thanks~
Purpose
NOTE: This PR must be merged in after #30230, this builds on top of #30115
Test Plan
Test Result
Essential Elements of an Effective PR Description Checklist
supported_models.md and examples for a new model.