Unable to connect to Ollama #45
Hey, sorry about that. Is your Ollama server running locally? If not, you need to run it. You can download Ollama from here: https://ollama.com/download
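A quick way to start and verify a local server, assuming a default install on port 11434:

```sh
# Start the Ollama server (skip this if the desktop app is already running)
ollama serve

# In another terminal: a running server answers with a JSON list of models
curl http://127.0.0.1:11434/api/tags
```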
I'm jumping in to ask about using a remote installation of Ollama. When I point the extension at a remote server URL, it doesn't connect.
Just to say that the project is really impressive (I've tried with a local server as well). Thanks!
Usually, Page Assist works with a remote server. I will try to debug it. Maybe it's because of this issue: ollama/ollama#2335
Thanks :)
Great!
🤔 Can you try this command?
It did. |
Hey guys, a new fix has been released for this issue. All you need to do is go to Settings > Ollama Settings and enable the custom origin URL, which will resolve the issue. For more information: https://github.com/n4ze3m/page-assist/blob/main/docs/connection-issue.md
Hey @MicPec, I just tested it on the Vivaldi browser (Windows), and it seems to be working fine. Can you try turning on this setting from the web UI Settings > Ollama Settings? It may resolve the issue. For more details: https://www.youtube.com/watch?v=fydtRnxjfJU or https://github.com/n4ze3m/page-assist/blob/main/docs/connection-issue.md#solutions
Thanks for the quick response. I did try this before, with no luck. BTW, this is a Linux machine.
Hey @MicPec, I will test on my Linux machine with Vivaldi. OpenWebUI and LobeChat have a server, but with Page Assist everything happens in the browser, which is why this issue exists. Another fix is to change the Custom Origin URL in the advanced settings.
source: ollama/ollama#2335 (comment)
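Because the requests come straight from the browser, the Ollama server has to allow the extension's origin. A minimal sketch of that server-side setting, with illustrative origin patterns:

```sh
# Example only: allow browser-extension origins to call the Ollama API.
# OLLAMA_ORIGINS is a comma-separated list; "*" would allow any origin.
export OLLAMA_ORIGINS="chrome-extension://*,moz-extension://*"
ollama serve
```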
I've managed to run it; the problem was completely different. The connection was blocked by uBlock 🤦🏻♂️, so I had to add the extension ID (jfgfiigpkhlkbnfnbobbkinehhfdhndo) to uBlock's trusted list.
Talking of which (and it's not specific to page-assist), I had to change OLLAMA_ORIGINS to get the browser to connect.
Hey @oatmealm, in the latest version of Page Assist there is an advanced URL configuration. All you need to do is enable the custom origin URL option.
Is it needed when OLLAMA_ORIGINS is set correctly? |
Not needed :) It's an alternative method only.
Firefox uses moz-extension:// origins. My settings that work are below (the last line is just so I use an HDD with lots of space for all the models).
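The exact values here are a sketch, assuming the variables are exported before starting the server; the paths are hypothetical:

```sh
# Serve on all interfaces rather than only localhost
export OLLAMA_HOST="0.0.0.0:11434"
# Firefox extensions request from moz-extension:// origins
export OLLAMA_ORIGINS="moz-extension://*"
# Hypothetical path: keep the (large) model files on a roomy HDD
export OLLAMA_MODELS="/mnt/hdd/ollama/models"
ollama serve
```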
Hey, sorry about the issue, @uygur55. Are Ollama and the extension you're trying to connect on the same computer, or are they on two different ones?
on the same computer. |
Hey @uygur55, that's weird. It usually connects automatically. If possible, can you add the OLLAMA_ORIGINS variable to your environment?
Hi, I added the environment variable, but it still doesn't connect.
Hey @uygur55, that's unfortunate. Since you added the env, could you enable the advanced URL option and set the custom origin URL? Also, if you use uBlock, make sure to allow the extension ID 'jfgfiigpkhlkbnfnbobbkinehhfdhndo', as MicPec pointed out in the comment above.
Hey @uygur55, you don't need to add the extension URL as the origin.
Hi @wuchewuche, you don't need to use the Chrome extension URL as the origin URL. Instead, change it to the Ollama server URL. Alternatively, you can set OLLAMA_ORIGINS on the Ollama side. Hopefully, it works fine.
Hi @n4ze3m, it is working, but it cannot find the model on the other computer that I want to access.
Emmm @n4ze3m, the other PC is http://192.168.1.83:11434. What can I do?
Hey @wuchewuche, can you check if this endpoint is working: http://192.168.1.83:11434/api/tags?
Hey @n4ze3m, that endpoint is not working.
Hey @wuchewuche, that's why Page Assist can't access models from another computer.
Hey @n4ze3m, what should I do next?
Hey @wuchewuche, sorry for asking so many questions. Are you running Ollama on Windows and trying to access it from a Mac, or running it on a Mac and trying to access it from Windows?
Hey @n4ze3m, I have a model on my Mac, and it has been successfully started locally. I plan to allow other Windows machines to directly use the model on my Mac without needing the model files locally.
Hey @wuchewuche, you need to set OLLAMA_HOST and OLLAMA_ORIGINS on the Mac and make sure to restart the Ollama daemon. It should work.
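A minimal sketch of that on macOS, using the launchctl approach that comes up later in this thread; the wildcard origin is an example value:

```sh
# Listen on all interfaces so other machines can reach the server
launchctl setenv OLLAMA_HOST "0.0.0.0:11434"
# Example value: "*" allows any origin; a stricter list also works
launchctl setenv OLLAMA_ORIGINS "*"
# Quit and reopen the Ollama app so it picks up the new environment
```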
Hey @n4ze3m, the Mac is set up as described, but Windows still cannot connect to the model on the Mac.
Hey @wuchewuche, sorry about that :/ Can you try accessing http://192.168.1.83:11434/api/tags from your Mac to see if it works there? Also, let's move this issue here: #355.
Hey @n4ze3m, my Mac cannot access http://192.168.1.83:11434/api/tags. I am sorry, what is the next step? I'm out of ideas.
@wuchewuche How did you set OLLAMA_HOST?
nano ~/.bash_profile. That's what I did.
It needs to be set with launchctl.
How can I do that? I have not used launchctl before. @rick-github
Click on the link I provided. |
I ran launchctl setenv OLLAMA_HOST "0.0.0.0:11434" and did the same thing for the origins. Now http://192.168.1.83:11434/api/tags shows: {"models":[{"name":"nomic-embed-text:latest","model":"nomic-embed-text:latest","modified_at":"2025-02-08T17:31:56.437666829+08:00","size":274302450,"digest":"0a109f422b47e3a30ba2b10eca18548e944e8a23073ee3f3e947efcf3c45e59f","details":{"parent_model":"","format":"gguf","family":"nomic-bert","families":["nomic-bert"],"parameter_size":"137M","quantization_level":"F16"}},{"name":"deepseek-r1:8b","model":"deepseek-r1:8b","modified_at":"2025-02-06T11:37:47.650769748+08:00","size":4920738407,"digest":"28f8fd6cdc677661426adab9338ce3c013d7e69a5bea9e704b364171a5d61a10","details":{"parent_model":"","format":"gguf","family":"llama","families":["llama"],"parameter_size":"8.0B","quantization_level":"Q4_K_M"}}]}
What next step? Ollama is working. Connect your client. |
"I mean, I want to directly access the Ollama model I deployed on my Mac from my Windows host, and both are on the same network. I also want others, who are also on the same network, to be able to access it. My Mac would act like a broadcast. I’m not sure if I understand it correctly, so what’s the next step?" @rick-github |
Configure your client to connect to 192.168.1.83:11434. |
The Windows machine can connect to the Mac at 192.168.1.83:11434, but it only shows "Ollama is running." Page Assist is unable to retrieve the models, and the "Select a model" option in the plugin displays "No data." @rick-github
Have you configured Page Assist to connect to 192.168.1.83:11434? What happens if you run the following on the Windows machine:
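Presumably a request to the tags endpoint, judging by the reply that follows (assuming curl is available on the Windows machine):

```sh
curl http://192.168.1.83:11434/api/tags
```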
http://192.168.1.83:11434/api/tags shows the same model list as above. What other Page Assist settings need to be configured to connect to 192.168.1.83:11434? @rick-github
So Ollama is working.
I don't know. Read the documentation for Page Assist. |
Hello!
I'm experiencing an issue connecting to the local Ollama server while using your page-assist extension. The application displays an "Unable to connect to Ollama 🦙" message and a red icon.
I've tried various recommendations from the documentation and discussions, but the problem persists.
Could you provide instructions for running the local Ollama server or suggest other solutions to this issue?
Thank you for your assistance!