Autogen Support for gpt-5 reasoning #6962
Unanswered · ravneetgrewal asked this question in Q&A
Has the OpenAI ext module been fixed to support reasoning configurations and to return reasoning chunks?
I was able to port the existing code to GPT-5 Fast (using 'gpt-5-chat-latest' as the model name/id in the API), along with some other modifications such as max_completion_tokens instead of max_tokens.
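For reference, the rename I had to make looks roughly like this (a minimal sketch; `adapt_request_for_gpt5` is just an illustrative helper name, not part of the autogen or openai API):

```python
def adapt_request_for_gpt5(kwargs: dict) -> dict:
    """Rewrite legacy chat-completion kwargs for GPT-5 models.

    Illustrative helper: GPT-5 models take `max_completion_tokens`
    rather than the older `max_tokens` parameter, so we rename the
    key before sending the request.
    """
    adapted = dict(kwargs)
    if "max_tokens" in adapted:
        adapted["max_completion_tokens"] = adapted.pop("max_tokens")
    return adapted


# Example: a legacy request dict is rewritten for a GPT-5 model.
print(adapt_request_for_gpt5({"model": "gpt-5", "max_tokens": 1024}))
# → {'model': 'gpt-5', 'max_completion_tokens': 1024}
```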
GPT-5 Fast works well, and quickly, but the thinking models (using 'gpt-5' as the model name/id in the API) are very slow, and there are two issues here:
Anyone with any insight on this for newer thinking models?
Thanks!
R