Issues
Search results
- #16104 in BerriAI/litellm (Status: Open)
- #16099 in BerriAI/litellm (Status: Open)
- #16098 in BerriAI/litellm (Status: Open)
- #16095 in BerriAI/litellm (Status: Open)
- #16094 in BerriAI/litellm (Status: Open)
- #16090 in BerriAI/litellm (Status: Open): [Bug]: LiteLLM Proxy doesn't drop temperature parameter for gpt-5 models when drop_params is enabled
- #16089 in BerriAI/litellm (Status: Open)
- #16088 in BerriAI/litellm (Status: Open)
- #16083 in BerriAI/litellm (Status: Open)
- #16081 in BerriAI/litellm (Status: Open)
- #16080 in BerriAI/litellm (Status: Open)
- #16079 in BerriAI/litellm (Status: Open)