Use your locally running AI models to assist you while browsing the web.
Total ratings: 4.53 (15 ratings)
Review summary
Pros
- User-friendly and simple setup
- Effective for local LLMs without needing extensive configurations
- Excellent user interface compared to other local LLM applications
- Works well with low-spec computers
- Valuable for those new to local AI models
Cons
- Does not respect certain stop tokens, such as <|end_of_text|> for ChatML models
- Minor spelling and wording mistakes in the UI
Most mentioned
- Easy setup and usability
- Great user interface
- Performance on low-spec machines
Recent reviews
Recent rating average: 4.30
All-time rating average: 4.53
Date | Author | Comment
---|---|---
2024-11-19 | lukp12 | Absolutely fantastic - exactly what I was looking for. You might want to add "LLM" or "Ollama" keywords to the name so it's found more easily.
2024-11-05 | Firefox user 18669759 | Nice extension, but could you fix it to follow different prompt formats? For example, it doesn't respect or use <|end_of_text|> tokens for ChatML models and keeps outputting more and more text.
2024-11-02 | winphreak |
2024-10-28 | JackMack | I rarely leave reviews for things, but this is an extension that deserves to be better known. When you're just getting into local LLMs, it's exactly this sort of simple local web UI that's needed. No need for hours of tinkering with CLI and GUI interfaces. Straight into the action with this.
2024-10-24 | Alt | Excellent!
2024-10-09 | Firefox user 18630842 |
2024-10-09 | Firefox user 17890642 | Excellent
2024-10-09 | Firefox user 18629937 | Great
2024-09-30 | Kloppix |
2024-09-29 | emrgncr | Worked great with Ollama; with llama3.2 3b you can summarize your web pages even on low-spec computers.
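The last review's setup (Ollama serving llama3.2 for page summarization) and the stop-token complaint above can both be illustrated with a direct call to Ollama's HTTP API. This is a minimal sketch, not the extension's actual implementation: it assumes Ollama is running locally on its default port with a `llama3.2` model pulled, and shows how explicit `stop` sequences keep a ChatML-style model from generating past its end-of-text marker.

```python
# Sketch: summarize text via a local Ollama server, with explicit stop tokens.
# Assumes Ollama is running at its default address and `llama3.2` is pulled;
# the model name and endpoint are assumptions, not taken from the extension.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint


def build_payload(text: str, model: str = "llama3.2") -> dict:
    """Build the request body, including stop sequences so generation
    halts at ChatML-style end markers instead of running on."""
    return {
        "model": model,
        "prompt": f"Summarize the following page:\n\n{text}",
        "stream": False,
        "options": {"stop": ["<|end_of_text|>", "<|im_end|>"]},
    }


def summarize(text: str, model: str = "llama3.2") -> str:
    """Send the prompt to the local Ollama server and return its response."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(text, model)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Passing the stop sequences in `options` asks the server itself to cut off output at those markers, which is one way a client can work around a model that keeps emitting text after its end token.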