Allow Leo local LLM BYOM to communicate over HTTP for LAN

I have a desktop with Ollama installed and a GPU that is capable of running some different LLMs. I also have a laptop on the same local network that is not able to run LLMs. I’d like my laptop to use my desktop’s local IP address and communicate with the Ollama models via HTTP. I can’t input an HTTP address into the endpoint field unless it’s “localhost” (it requires HTTPS otherwise). I don’t really want to go through all the certificate signing on my desktop just for this.
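For context, the server side of this is already simple; it’s only the browser that blocks the HTTP endpoint. Something like the following is what I’m doing (the IP address is just a placeholder for my desktop’s LAN address):

```shell
# On the desktop: make Ollama listen on all interfaces instead of
# only localhost, on its default port 11434. Adjust your firewall
# to allow LAN traffic to that port if needed.
OLLAMA_HOST=0.0.0.0:11434 ollama serve

# From the laptop: confirm the Ollama API is reachable over the LAN.
# (192.168.1.50 is a placeholder for the desktop's local IP.)
curl http://192.168.1.50:11434/api/tags
```

The `curl` call returns the list of installed models as JSON, so the API itself works fine over plain HTTP on the LAN; the only blocker is Brave’s HTTPS requirement for the BYOM endpoint.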

My request would be to either detect a LAN address and allow HTTP, allow local machine names, or possibly add a checkbox with a big warning saying “Are you sure you wish to communicate over HTTP? This is insecure and may put your privacy at risk.” or something similar.

Or, can someone point me to a configuration file or something where I can force the address to be HTTP and skip the standard Brave Settings UI from blocking me from doing so?


Wait. Leo has BYOM support??? HOW?

You can check it out in the beta now. Here is the blog post from when it was launched to Nightly.


+1 to this request. We definitely need a way to access Ollama models over a LAN IP, or the HTTPS requirement removed.

I second this request.
It is not reasonable to install Ollama on every computer that runs Brave, nor is it reasonable to install the certificate on the local machine.

Yeah, it would be so convenient to be able to use my LAN Ollama.

Looks like the Brave team added this feature! You can now enable “brave-ai-chat-allow-private-ips” in “brave://flags/”.
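In case it helps anyone setting this up: before pointing Leo at the remote endpoint, it’s worth sanity-checking from the laptop that Ollama’s OpenAI-compatible chat endpoint answers over the LAN (the IP address and model name below are placeholders for your own):

```shell
# Quick check from the laptop that the desktop's Ollama instance
# responds to an OpenAI-style chat completion request over HTTP.
curl http://192.168.1.50:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3", "messages": [{"role": "user", "content": "hello"}]}'
```

If that returns a completion, the same `http://<desktop-ip>:11434` address should work as the server endpoint in Leo’s BYOM settings once the flag is enabled.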

Thanks!