Use your locally running AI models to assist you in your web browsing.
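The reviews below repeatedly mention pairing the extension with Ollama. As a rough sketch of what that pairing involves (assuming a default Ollama install listening on http://localhost:11434; this is not the extension's own code), a client can check which models are available locally before sending prompts:

```typescript
// Minimal sketch: query a local Ollama server for its installed models.
// Assumes Ollama is running on its default port (11434); model names will
// vary by machine. Illustrative only, not the extension's actual source.

interface OllamaModel {
  name: string;        // e.g. "llama3:latest"
  modified_at: string;
  size: number;
}

async function listLocalModels(baseUrl = "http://localhost:11434"): Promise<OllamaModel[]> {
  const res = await fetch(`${baseUrl}/api/tags`); // Ollama's model-listing endpoint
  if (!res.ok) throw new Error(`Ollama not reachable: ${res.status}`);
  const data = await res.json();
  return data.models as OllamaModel[];
}

listLocalModels()
  .then((models) => console.log(models.map((m) => m.name)))
  .catch((err) => console.error(err));
```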
Total ratings: 4.45 (rating count: 22)
Review summary
Pros
- Works very well with local AI models
- Highly polished and feature-rich
- Excellent user interface
- Easy setup and use
- Saves system resources compared to other solutions
Cons
- High CPU utilization even when idle
- Limited configurability for custom prompts
- Does not respect certain prompt formats
- Some minor spelling and wording mistakes
Most mentioned
- High CPU utilization
- Ease of setup
- Polished interface
- Need for better prompt configurability
- Simple to use for beginners in local AI
User reviews
Recent rating average: 4.00
All-time rating average: 4.45
Date | Author | Comment
---|---|---
2025-01-21 | AltB | Excellent with ollama!
2025-01-17 | Bob Tao |
2025-01-02 | nn | Works very well in its current state (as of 2025-01-02). It would be great if the custom pilot prompts were more configurable (a single "custom" entry isn't much), and opening the options window drives one core to 100% CPU utilization (web extension process). Overall a great add-on, very helpful.
2024-12-31 | jords | Highly polished and feature-rich extension.
2024-12-31 | Hous | The extension takes 100% CPU load even when you are not actively using it.
2024-12-24 | Sabryabdallah | The best application for running AI models on your own device.
2024-12-14 | 无能狂怒气死自己 |
2024-11-19 | lukp12 | Absolutely fantastic, exactly what I was looking for. You might want to add "LLM" or "Ollama" keywords to the name so it is easier to find.
2024-11-05 | Firefox user 18669759 | Nice extension, but could you fix it to follow different prompt formats? For example, it doesn't respect or use <|end_of_text|> tokens for ChatML models and keeps outputting more and more text.
2024-11-02 | winphreak |
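One review above notes that generation keeps running past <|end_of_text|> for some prompt formats. A common workaround when talking to Ollama directly, sketched below under the assumption that requests go through Ollama's /api/generate endpoint (the extension's actual internals are not shown here), is to pass explicit stop sequences in the request options:

```typescript
// Sketch of a workaround for runaway generation: supply stop sequences so the
// server truncates output at the model's end-of-turn markers. Assumes a local
// Ollama server; the model name and stop tokens are illustrative.

async function generateWithStops(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",   // any locally installed model
      prompt,
      stream: false,     // return a single JSON object instead of a stream
      options: {
        // Generation halts as soon as any of these sequences appears.
        stop: ["<|end_of_text|>", "<|im_end|>"],
      },
    }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.response as string; // generated text, cut off at the stop sequence
}
```

Whether the extension exposes such stop sequences in its settings is the same configurability gap the reviewers point out.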