Reviews for Page Assist - A Web UI for Local AI Models
Page Assist - A Web UI for Local AI Models by Muhammed Nazeem
Review by geeknik
Rated 5 out of 5
by geeknik, 9 months ago

39 reviews
- Rated 5 out of 5 by Eduardo, 20 hours ago
- Rated 5 out of 5 by 晓明, 2 days ago
- Rated 4 out of 5 by Firefox user 10396487, 2 days ago
- Rated 5 out of 5 by null, 6 days ago
- Rated 5 out of 5 by JuiceFruit, 9 days ago
- Rated 5 out of 5 by Firefox user 18824818, 9 days ago
- Rated 4 out of 5 by Bugloss, 9 days ago
- Rated 5 out of 5 by Firefox user 18821506, 11 days ago
- Rated 5 out of 5 by zyb, 12 days ago
- Rated 4 out of 5 by Elliott, 13 days ago
Very nice ollama frontend. It automatically recognises the ollama daemon if one is already running on the PC. It provides a nice way to interact with local LLMs, complete with web search integration which, having now seen it firsthand, really transforms the usefulness of the models; even those that appear stupid without web search can be good at summarising information, and become actually helpful when they don't have to rely only on their built-in knowledge. This extension is probably the easiest way to get a graphical interface for ollama running, particularly one with integrated web search.
It does have a few bugs though. Sometimes if you close the window too soon after generating an answer, it won't be saved in your history and you will have to generate it again (usually if you close it before all of the statistics at the bottom become available). I have also seen clicking the regenerate button make existing answers suddenly disappear (I think after I switched models). Sometimes questions you asked disappear after a reload even when the answer remains. Another thing is that attaching images and asking vision models about them just results in an error.
I also tried it on my Android phone in Firefox, connecting to ollama on my laptop, which the extension recognises as running. However, on my phone it does not display the drop-down menu for selecting a model or prompt, so I cannot use it. It seems that it does not see any models as installed on Android. Do they have to be installed locally on the phone to work?
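On that question: models only need to exist on the machine running ollama, not on the phone. An empty model list usually means the extension cannot query the daemon, since ollama binds to 127.0.0.1 by default and generally has to be started with OLLAMA_HOST=0.0.0.0 (plus a suitable OLLAMA_ORIGINS) before other devices on the LAN can reach it. Below is a minimal sketch of the same model-listing call the extension presumably makes, using Ollama's /api/tags endpoint; the address 192.168.1.10 is a hypothetical placeholder for the laptop's LAN IP.

```typescript
// Minimal reachability check against Ollama's /api/tags endpoint,
// which returns the models installed on the machine running ollama.
// 192.168.1.10 is a placeholder; substitute the laptop's actual LAN address.
const OLLAMA_URL = "http://192.168.1.10:11434";

async function listModels(): Promise<string[]> {
  const res = await fetch(`${OLLAMA_URL}/api/tags`);
  if (!res.ok) {
    // Daemon reachable but request refused (e.g. origin not allowed).
    throw new Error(`Ollama responded with HTTP ${res.status}`);
  }
  const data = (await res.json()) as { models: { name: string }[] };
  return data.models.map((m) => m.name);
}

// An empty array here would explain an empty model drop-down in the UI;
// a network error means the phone cannot reach the daemon at all.
listModels().then(console.log).catch(console.error);
```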
Overall, it has flaws, but it is still a fantastic tool, enabling you to put local models to use conveniently instead of just messing around with them in ollama.
- Rated 5 out of 5 by robouden, 13 days ago
Amazing plugin. The only thing I would love to see is an option in the plugin's settings to choose between a popup and the sidebar. And when right-clicking on text, the chat popup could be activated.
Regards
Rob Oudendijk
- Rated 5 out of 5 by redspectre, 14 days ago
- Rated 1 out of 5 by Breanna Johnson, 15 days ago
- Rated 5 out of 5 by Firefox user 18810652, 17 days ago
- Rated 1 out of 5 by malisipi, 19 days ago
- Rated 5 out of 5 by 赤黑, 19 days ago
- Rated 5 out of 5 by Alwaysliumx, 21 days ago
- Rated 5 out of 5 by Bob Tao, a month ago
- Rated 4 out of 5 by nn, 2 months ago
Works very well in its current state (02/01/2025). It would be great to have the custom pilot prompts be more configurable (just one "custom" entry isn't a lot), and using the options window creates 100% CPU utilization on one core (the web extension process).
Overall a great addon, very helpful.
Developer response
posted 2 months ago
Thank you for the suggestions and review. Sorry about the 100% CPU utilization issue; I will release a fix in the next update.
- Rated 1 out of 5 by Hous, 2 months ago
The extension takes 100% CPU load even when you are not actively using it.
Developer response
posted 2 months ago
Apologies for the issue; we will release a fix in the next update.
- Rated 5 out of 5 by Sabryabdallah, 2 months ago
The best application for running AI models on your own device.
- Rated 5 out of 5 by 无能狂怒气死自己, 2 months ago
- Rated 5 out of 5 by lukp12, 3 months ago
Absolutely fantastic - exactly what I was looking for. You might want to add "LLM" or "Ollama" keywords to the name so it's easier to find.