I want to use Ollama Cloud (not local) models with Brave, but I can’t get it to work.
Here is Ollama’s documentation for their Cloud models: https://docs.ollama.com/cloud
I’ve generated my API key, and it does work when I use it from the terminal:
export OLLAMA_API_KEY=**** (I used my actual key)
curl https://ollama.com/api/chat \
-H "Authorization: Bearer $OLLAMA_API_KEY" \
-d '{
"model": "gpt-oss:120b",
"messages": [{
"role": "user",
"content": "Why is the sky blue?"
}],
"stream": false
}'
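For reference, here is the same working chat request sketched in Python with only the standard library (the helper name `build_chat_request` is mine, not from Brave or Ollama; it mirrors the curl call above and reads the key from the same `OLLAMA_API_KEY` environment variable):

```python
import json
import os
import urllib.request

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    # Same payload shape as the working curl call: /api/chat takes "messages".
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return urllib.request.Request(
        "https://ollama.com/api/chat",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {os.environ.get('OLLAMA_API_KEY', '')}",
            "Content-Type": "application/json",
        },
    )

req = build_chat_request("gpt-oss:120b", "Why is the sky blue?")
# Sending is left to the caller, e.g.:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["message"]["content"])
```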
When I try it in Brave Settings > Leo > Bring Your Own Model (BYOM) with:
- Model request name: gpt-oss:120b
- Server endpoint: https://ollama.com/api/chat
- API Key: **** (I used my actual key)
I get this error message when chatting:
There was a network issue connecting to Leo, check your connection and try again.
If I change one character in my API key, I get this error:
The API key configured for this model is invalid. Please check your configuration and try again.
So Brave is definitely communicating with Ollama’s servers.
I also tried the streaming-response server endpoint (see https://docs.ollama.com/api/streaming)
- Server endpoint: https://ollama.com/api/generate
Trying with "prompt" in the terminal: note that I can’t get a response when using the "prompt" parameter, only with "messages": [{ "role": "user", "content": "Why is the sky blue?" }].
curl https://ollama.com/api/chat \
-H "Authorization: Bearer $OLLAMA_API_KEY" \
-d '{
"model": "gpt-oss:120b",
"prompt": "Why is the sky blue?",
"stream": false
}'
response:
{"model":"gpt-oss:120b","created_at":"2025-12-14T05:24:39.141054444Z","message":{"role":"assistant","content":""},"done":true,"done_reason":"load"}
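Rereading Ollama’s API docs, I suspect the empty reply above is because the request posts "prompt" to /api/chat, which only understands "messages"; "prompt" belongs to /api/generate. A minimal sketch of what I think the generate call should look like (an assumption on my part, standard library only):

```python
import json
import os
import urllib.request

# Assumption: /api/generate is the endpoint that accepts "prompt",
# while /api/chat expects "messages" and silently ignores "prompt".
payload = {
    "model": "gpt-oss:120b",
    "prompt": "Why is the sky blue?",
    "stream": False,
}
req = urllib.request.Request(
    "https://ollama.com/api/generate",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {os.environ.get('OLLAMA_API_KEY', '')}",
        "Content-Type": "application/json",
    },
)
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```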