
Running in Docker, Can not connect to local Ollama #4461

Closed
4 tasks done
zblabs opened this issue May 16, 2024 · 6 comments
Labels
🐞 bug Something isn't working

Comments

zblabs commented May 16, 2024

Self Checks

  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • Please do not modify this template :) and fill in all the required fields.

Dify version

0.6.6

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

Run Dify in Docker
Browse to http://localhost/explore/apps
Select Ollama
Enter:
model name: llama3:latest
base URL: http://localhost:11343 (have also tried the PC's IP address, with the firewall opened).
Completion mode: chat
Model Context size: 4096
Upper bound for max tokens: 4096
Vision Support: no
Press: Save
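
For reference, a quick way to confirm the Ollama API is reachable on the host before configuring Dify (note: a default Ollama install listens on port 11434, so the 11343 above may be a typo or a custom port):

    # List locally available models via the Ollama HTTP API
    curl http://localhost:11434/api/tags

A JSON list of models means Ollama itself is up; the remaining question is whether the Dify container can reach it.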

✔️ Expected Behavior

Configuration would be saved

❌ Actual Behavior

Received error:
[screenshot: Dify-Ollama-Connection-Failure]

dosubot bot commented May 16, 2024

Hello @zblabs! I'm Dosu, a bot here to help you solve bugs, answer questions, and guide you toward becoming a contributor while you wait for a human maintainer. I'm taking a look at your issue now and will get back to you with a full answer shortly.

@dosubot dosubot bot added the 🐞 bug Something isn't working label May 16, 2024

zblabs commented May 16, 2024

Dify looks awesome; however, corporate AI governance requires that all AI tools run 100% in-house. I am in the research phase, so I cannot POC anything that cannot run in-house. We are allowed to, and do, contribute to open-source projects.

hjlarry commented May 17, 2024

try http://host.docker.internal:11343 instead of localhost
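
For reference, host.docker.internal resolves automatically under Docker Desktop on Windows and Mac; on a Linux host it must be mapped explicitly. A minimal sketch, assuming Docker 20.10 or newer:

    # Map host.docker.internal to the host's gateway when starting the container
    docker run --add-host=host.docker.internal:host-gateway ...

    # The equivalent in a compose file is an extra_hosts entry:
    #   extra_hosts:
    #     - "host.docker.internal:host-gateway"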

dosubot bot commented May 17, 2024

Switching to http://host.docker.internal:11343 instead of localhost is indeed the correct approach when Dify runs inside a Docker container and needs to reach a service on the host machine, especially on Windows or Mac. The special DNS name host.docker.internal is designed to let containers reach the host, which localhost cannot do in a containerized environment (inside the container, localhost refers to the container itself).

Ensure that the Ollama service is actively listening on the correct port (11343 in your case) and that there are no firewall or network configurations blocking the connection. It's also crucial to verify the Docker container's network settings to confirm they're set up to allow connections between the container and the host machine.

If after applying this change you still face issues, double-check the port numbers for consistency and ensure that your Docker networking configurations are correctly set up to support this type of communication.
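
As a concrete check of the points above, a hedged sketch for testing connectivity from inside the container (the container name dify-api is an assumption; substitute the actual Dify API container, and this assumes curl is available in its image):

    # Call the host's Ollama API (port 11434 by default) from inside the Dify container
    docker exec -it dify-api curl http://host.docker.internal:11434/api/tags

A JSON response means container-to-host networking works; a refused or timed-out connection points at the firewall or at Ollama's bind address.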

zblabs commented May 17, 2024

I have tried http://host.docker.internal:11343 and it still failed. Looking at the logs in Docker I see:
172.20.0.1 - - [17/May/2024:13:14:48 +0000] "POST /console/api/workspaces/current/model-providers/ollama/models HTTP/1.1" 400 371 "http://localhost/apps" "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:125.0) Gecko/20100101 Firefox/125.0" "-"

I am running Win11, and Ollama works fine with non-Docker apps.

Notice that it is trying to use localhost and getting a 400 error. Is there a configuration setting I should change so that Dify doesn't use localhost internally?

I tried the solution from #3912 and it did not resolve the issue.
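
For reference, one host-side setting that often matters in this situation (a general note, not a diagnosis of this report): Ollama binds to 127.0.0.1 by default, so requests arriving via host.docker.internal can still be refused. Its documented OLLAMA_HOST environment variable makes it listen on all interfaces; a Windows sketch:

    # Make Ollama listen on all interfaces so Docker containers can reach it,
    # then restart Ollama for the change to take effect
    setx OLLAMA_HOST 0.0.0.0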

zblabs commented May 17, 2024

SOLVED: After applying the update from #3912 and rebuilding the Docker container, I ran http://localhost/install (on a whim), and it now recognizes my Ollama instance. Thank you for your support!

@zblabs zblabs closed this as completed May 17, 2024