LLM Feature with gpt-5-mini not working due to wrong temperature #9719

@ulrichmueller

Description

Hello,

While setting up with OpenAI GPT-5-mini (or -nano), a request fails with:

[400] Unsupported value: 'temperature' does not support 0.0 with this model. Only the default (1) value is supported

GPT-4o-mini works fine.
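A possible workaround on the caller's side, while the feature is unpatched, is to omit the `temperature` parameter for models that only accept the default. The model-prefix check below is an illustrative assumption, not the project's actual fix:

```python
def build_chat_params(model: str, temperature: float = 0.0) -> dict:
    """Build kwargs for an OpenAI chat completion request.

    GPT-5 family models (e.g. gpt-5-mini, gpt-5-nano) reject any
    temperature other than the default of 1 with a 400 error, so the
    parameter is dropped for them. The startswith("gpt-5") check is
    a heuristic assumption for illustration.
    """
    params: dict = {"model": model}
    if not model.startswith("gpt-5"):
        params["temperature"] = temperature
    return params


# GPT-4o-mini keeps the explicit temperature; GPT-5-mini omits it.
print(build_chat_params("gpt-4o-mini"))  # {'model': 'gpt-4o-mini', 'temperature': 0.0}
print(build_chat_params("gpt-5-mini"))   # {'model': 'gpt-5-mini'}
```

The resulting dict would be passed via `client.chat.completions.create(**params, messages=...)`.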

Thanks
