Use your locally running AI models to assist you in your web browsing.
Total ratings: 4.56 (16 ratings)
Review summary
Pros
- User-friendly interface
- Easy setup process
- Works well with local LLMs like Ollama and LLaMA
- Saves resources compared to Docker and other UI interfaces
- Quick to get started without extensive tinkering
Cons
- Doesn't respect certain prompt formats (e.g., <|end_of_text|> tokens for ChatML models)
- A few spelling and wording mistakes in the UI
Most mentioned
- User-friendly interface
- Easy setup
- Works with local AI models
- Resource-efficient alternative to other options
User reviews
Recent rating average: 4.30
All-time rating average: 4.56
| Date | Author | Rating | Comment |
|---|---|---|---|
| 2024-12-14 | 无能狂怒气死自己 | | |
| 2024-11-19 | lukp12 | | Absolutely fantastic - exactly what I was looking for. You might want to add "LLM" or "Ollama" keywords to the name so it's easier to find. |
| 2024-11-05 | Firefox user 18669759 | | Nice extension, but could you fix it to follow different prompt formats? For example, it doesn't respect or use `<\|end_of_text\|>` tokens for ChatML models and keeps outputting more and more text. |
| 2024-11-02 | winphreak | | |
| 2024-10-28 | JackMack | | I rarely leave reviews, but this extension deserves to be better known. When you're just getting into local LLMs, a simple local web UI like this is exactly what's needed - no hours of tinkering with CLI and GUI interfaces, straight into the action. |
| 2024-10-24 | Alt | | Excellent! |
| 2024-10-09 | Firefox user 18630842 | | |
| 2024-10-09 | Firefox user 17890642 | | Excellent |
| 2024-10-09 | Firefox user 18629937 | | Great |
| 2024-09-30 | Kloppix | | |