I am trying to use OpenAI's new gpt-5 model in the BYOM (bring your own model) mode of Leo AI, but it fails with the error message “There was a network issue connecting to Leo, check your connection and try again.”
I have successfully used Leo AI BYOM with gpt-4o and gpt-4.1 using the same API key, and I've verified with curl that gpt-5 works fine with that same key. But not within Leo AI. Could it be that Brave automatically passes a temperature value to https://api.openai.com/v1/chat/completions? That parameter does not work with gpt-5 (though it works with all the older models). If so, that should be fixed, or at least the temperature should be exposed as a setting that can be adjusted.
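To illustrate what I mean, here is a minimal sketch of how the request body could be built so the parameter is only sent to models that accept it. This is purely hypothetical on my part: `build_chat_payload` is a made-up helper, I don't know Brave's actual request code, and the assumption that gpt-5 rejects non-default temperature values is my guess based on the behavior above.

```python
def build_chat_payload(model, messages, temperature=None):
    """Build a /v1/chat/completions request body, omitting the
    temperature parameter for models assumed not to accept it."""
    payload = {"model": model, "messages": messages}
    # Assumption: gpt-5-family models only accept the default temperature,
    # so the parameter is dropped instead of being sent along.
    if temperature is not None and not model.startswith("gpt-5"):
        payload["temperature"] = temperature
    return payload
```

With something like this, older models would keep receiving whatever temperature Brave uses today, while gpt-5 requests would simply omit the field.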