Appreciation post: Why Leo AI became an essential tool in my daily workflow

Hey Brave team!

I was sitting with my colleagues today, discussing various AI tools, and I realized that after two months of using Leo AI, I can confidently say it has become my main work tool.

In short: it’s a really solid tool that deserves more recognition.

Quality and reliability
What sets Leo apart from the competition is, above all, its reliability and high quality. In my work, I regularly analyze extensive technical documentation – websites, PDF files opened in a browser, complex specifications. Leo handles this surprisingly well: it maintains context even during long conversations, does not invent information, and does not get lost in details. These are not easy analyses, and Leo does its job perfectly. In my work, reliability is essential – and Leo simply delivers.

Usability and accessibility
Leo in the browser sidebar is something that really speeds up my work. It’s not just that it works, but that it’s exactly where I need it. I don’t have to switch contexts, copy content, or interrupt my workflow.

The ability to create shortcuts is an absolute game changer for repetitive tasks. The fact that these shortcuts can be up to 5,000 characters long allows you to provide a really broad context, which is very valuable, especially when you’re working on complex tasks where detailed context is important.

Stability
I have not yet reached any limits during standard use. Importantly, I have not experienced any drops in quality or availability issues. It sounds trivial, but when you rely on a tool for urgent tasks, stability is of utmost importance. With competitors, such problems occur regularly.

Privacy and flexibility
You prioritize security and privacy, which is crucial for me both professionally and privately. I know that my data is secure and stored only locally. A huge plus is the ability to connect your own models through Ollama (I use this) – it really increases Leo’s usefulness in everyday work.

AI Browsing
I mainly use this feature privately, and it works really well. I have tested similar solutions from competitors, and yours is the best.

I know you are constantly developing Leo AI – I myself have reported several ideas and bugs on the forum. But mainly, I just wanted to say thank you.
You are doing a great job, and Leo AI will remain on my list of essential tools for good.

Thanks for developing this project. :folded_hands:


I concur. I just have two issues. Often our conversations drift from subject to subject. Since he cannot reference information from separate conversations, this makes it very hard to keep a running conversation going because of the character limit. You need to start over with a new conversation and bring him up to speed on what you already discussed. In other words, each conversation is not relational. I asked him about this, and he said this was indeed a limitation of his programming.

You’re right, it’s a shame that Leo doesn’t remember previous conversations and that certain information has to be presented to him again.

As a workaround, we have a memory function – you can ask Leo in a conversation to remember a piece of information, and it works well.

You can also add some information manually.

I also use shortcuts to collect information that is valuable for a given project. I edit shortcuts on an ongoing basis, making sure they contain up-to-date information about the project. There is a large window of 5,000 characters. So far, this has been sufficient, and I rarely reach the limit.

I would like to add that some time ago I suggested adding a feature that would allow chats to be grouped into spaces (1 space = 1 project in the form of multiple chats). If Leo had the memory to remember all conversations in a given space, it would be very helpful. :slightly_smiling_face:

Aww this is so nice to hear! We lean quite heavily on the community and community feedback so we always love to hear when folks appreciate it, makes it much easier to stay motivated :sweat_smile:

In other words, each conversation is not relational. I asked him about this, and he said this was indeed a limitation of his programming.

Wdyt about being able to attach a previous conversation to a new one? Maybe to avoid token limits we could summarise it before sending, but if you could just say ‘take a look at @[select conversation]’ like you can with tabs/bookmarks now, it could help with this?


Personally, I really like this idea :star_struck: , although I feel that too many things are displayed after selecting @. You have open tabs and bookmarks there, and if conversations are added to that, it could create visual chaos.

Maybe add a shortcut, @chat:, instead of just @.

Good shout – could expand it to @bookmarks:, @tabs:, etc. too. Lemme pass this on to design and see what they think.


I use his Memory feature all the time. It helps, but only for basic information. It's really not meant to consolidate conversations. He really needs to be able to recall data from all conversations. That would let him build a powerful profile. We have had so many useful conversations. If he had access to them all at once, he'd be unstoppable!

I second you here …

It has become an amazing tool for me, and as you say, it can respond to very detailed requests where digging through resources would gobble up much time.

There is some false info from time to time, but I would say it's batting 90%.


Just want to add that it has helped me massively as well. I'm an IoT engineer but also a jack of all trades and DIYer. It has helped with everything from learning PLCs, software, code, and HVAC. I have repaired my furnace twice, fixed my central air unit, properly repaired plumbing under my house, installed a water softener, and troubleshot a Ford Taurus PCM, other electrical issues, and a failed turbo on a Chevy Cruze. It has more than paid for itself. It's currently teaching me Modbus-to-PROFINET gateways with PLC integration.

If you are a multifaceted, ambitious individual whose only limitation is the time involved to learn a topic, it's awesome. Having a low-latency knowledge base with limitless patience is invaluable. Fifteen minutes with the LLM has the same value as hours of scouring Stack Exchange forum posts from manic depressives with low interpersonal relationship skills.
