Unable to get Leo AI BYOM to work with OpenAI gpt-5 model

I am trying to use the new gpt-5 model from OpenAI in the BYOM (bring your own model) mode of Leo AI, but it fails with the error message “There was a network issue connecting to Leo, check your connection and try again.”

I have successfully used Leo AI BYOM with gpt-4o and gpt-4.1 using the same API key, and I’ve verified with curl that gpt-5 works fine with that same key; it just does not work within Leo AI. Could it be that Brave automatically passes a temperature value to https://api.openai.com/v1/chat/completions? That parameter is rejected by gpt-5 (though it works with the older models). If so, that should be fixed, or at least the temperature should be made into a parameter that can be adjusted in settings.


Hey,

Thanks for letting us know; it looks like you could be right: https://github.com/brave/brave-core/blob/c3fffdfb10c274b45eab35495c5ac142b505de5c/components/ai_chat/core/browser/engine/oai_api_client.cc#L67

dict.Set("temperature", 0.7);

I’ll raise an issue and try to get it fixed, sorry about this!
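One possible shape for a fix is to only attach `temperature` for models known to accept it. This is a hypothetical, self-contained sketch using a plain `std::map` in place of Brave's actual request-building code, and the "starts with gpt-5" check is an assumption for illustration; the real fix would presumably consult Brave's model metadata:

```cpp
#include <cassert>
#include <map>
#include <string>

// Assumption for illustration: gpt-5 models reject a non-default temperature,
// so skip the parameter for them. Not Brave's actual check.
bool ModelSupportsTemperature(const std::string& model) {
  // rfind(prefix, 0) == 0 means the string starts with the prefix.
  return model.rfind("gpt-5", 0) != 0;
}

// Hypothetical stand-in for the request-body construction in
// oai_api_client.cc: temperature is only set when the model supports it.
std::map<std::string, std::string> BuildRequestBody(const std::string& model) {
  std::map<std::string, std::string> dict;
  dict["model"] = model;
  if (ModelSupportsTemperature(model)) {
    dict["temperature"] = "0.7";
  }
  return dict;
}
```

With that change, a gpt-5 request body would omit `temperature` entirely while gpt-4o requests keep the current default of 0.7.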


It seems like you understand the issue, but for others following along: this error occurs because gpt-5, accessed through the OpenAI API, does not support the temperature parameter (only the default value is accepted).

By the way, it would also be nice to be able to customize the reasoning_effort parameter when using a model that supports it.
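A settings-driven version could treat both knobs as optional and only serialize the ones the user has configured. This is a rough sketch under that assumption; the function and field names are hypothetical, not Brave's actual API:

```cpp
#include <cassert>
#include <map>
#include <optional>
#include <string>

// Hypothetical sketch: build the request body from user-configurable settings
// instead of a hard-coded temperature. An unset optional means the parameter
// is omitted from the request entirely.
std::map<std::string, std::string> BuildConfigurableBody(
    const std::string& model,
    std::optional<double> temperature,
    const std::optional<std::string>& reasoning_effort) {
  std::map<std::string, std::string> body;
  body["model"] = model;
  if (temperature) {
    body["temperature"] = std::to_string(*temperature);
  }
  if (reasoning_effort) {
    body["reasoning_effort"] = *reasoning_effort;
  }
  return body;
}
```

For example, a gpt-5 user could leave temperature unset and select a reasoning effort, while gpt-4o users keep their temperature setting and omit reasoning_effort.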