Ollama offline connection failed #1901
start-life started this conversation in General · Replies: 1 comment
- It would be great if this could work completely offline.
---

Ollama offline connection failed

When the machine is online it connects fine; when it is fully offline, the connection to Ollama fails. Server log:

```
INFO: Started server process [8800]
INFO: Waiting for application startup.
INFO:apps.litellm.main:start_litellm_background
INFO:apps.litellm.main:run_background_process
INFO:apps.litellm.main:Executing command: ['litellm', '--port', '14365', '--host', '127.0.0.1', '--telemetry', 'False', '--config', 'C:\zxcv\open-webui\backend\data/litellm/config.yaml']
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:8080 (Press CTRL+C to quit)
INFO:apps.litellm.main:Subprocess started successfully.
INFO: 127.0.0.1:51747 - "GET /api/config HTTP/1.1" 200 OK
INFO: 127.0.0.1:51748 - "GET /manifest.json HTTP/1.1" 200 OK
INFO: 127.0.0.1:51748 - "GET /api/v1/auths/ HTTP/1.1" 200 OK
INFO:apps.ollama.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host localhost:11434 ssl:default [getaddrinfo failed]
INFO:apps.ollama.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host localhost:11434 ssl:default [getaddrinfo failed]
INFO: 127.0.0.1:51748 - "GET /ollama/api/tags HTTP/1.1" 200 OK
INFO:apps.openai.main:get_all_models()
INFO:apps.openai.main:get_all_models()
INFO: 127.0.0.1:51748 - "GET /openai/api/models HTTP/1.1" 200 OK
INFO: 127.0.0.1:51748 - "GET /litellm/api/v1/models HTTP/1.1" 200 OK
INFO: 127.0.0.1:51754 - "GET /api/v1/modelfiles/ HTTP/1.1" 200 OK
INFO: 127.0.0.1:51754 - "GET /api/v1/prompts/ HTTP/1.1" 200 OK
INFO: 127.0.0.1:51754 - "GET /api/v1/documents/ HTTP/1.1" 200 OK
INFO: 127.0.0.1:51754 - "GET /api/v1/chats/tags/all HTTP/1.1" 200 OK
INFO:apps.ollama.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host localhost:11434 ssl:default [getaddrinfo failed]
INFO:apps.ollama.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host localhost:11434 ssl:default [getaddrinfo failed]
INFO: 127.0.0.1:51754 - "GET /ollama/api/tags HTTP/1.1" 200 OK
INFO: 127.0.0.1:51754 - "GET /api/v1/chats/ HTTP/1.1" 200 OK
INFO: 127.0.0.1:51754 - "GET /api/changelog HTTP/1.1" 200 OK
INFO:apps.ollama.main:get_all_models()
INFO:apps.openai.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host localhost:11434 ssl:default [getaddrinfo failed]
INFO:apps.openai.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host localhost:11434 ssl:default [getaddrinfo failed]
INFO: 127.0.0.1:51754 - "GET /openai/api/models HTTP/1.1" 200 OK
INFO: 127.0.0.1:51759 - "GET /ollama/api/version HTTP/1.1" 500 Internal Server Error
INFO: 127.0.0.1:51754 - "GET /static/favicon.png HTTP/1.1" 200 OK
INFO: 127.0.0.1:51759 - "GET /litellm/api/v1/models HTTP/1.1" 200 OK
INFO:apps.ollama.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host localhost:11434 ssl:default [getaddrinfo failed]
INFO: 127.0.0.1:51761 - "GET /ollama/urls HTTP/1.1" 200 OK
INFO:apps.ollama.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host localhost:11434 ssl:default [getaddrinfo failed]
ERROR:apps.ollama.main:Connection error: Cannot connect to host localhost:11434 ssl:default [getaddrinfo failed]
INFO: 127.0.0.1:51761 - "GET /ollama/api/version HTTP/1.1" 500 Internal Server Error
INFO: 127.0.0.1:51761 - "GET /litellm/api/model/info HTTP/1.1" 200 OK
```
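A plausible reading of the log (an inference from the error text, not a confirmed diagnosis from this thread): `getaddrinfo failed` means the lookup of the hostname `localhost` failed before any TCP connection to Ollama was even attempted, which can happen on some Windows setups when the machine has no active network connection and `localhost` is not resolvable locally. The repeated 500 on `/ollama/api/version` is downstream of the same resolution failure. Here is a minimal diagnostic sketch to separate name resolution from Ollama itself, assuming Ollama's default port 11434:

```python
# Hypothetical diagnostic (not from the thread): check whether the failure
# is name resolution of "localhost" rather than Ollama being down.
# "getaddrinfo failed" in the log means the lookup step died before any
# TCP connection was attempted.
import socket

for host in ("localhost", "127.0.0.1"):
    try:
        infos = socket.getaddrinfo(host, 11434, type=socket.SOCK_STREAM)
        print(f"{host}: resolves to {sorted({i[4][0] for i in infos})}")
    except socket.gaierror as e:
        # The same failure aiohttp surfaces as
        # "Cannot connect to host localhost:11434 ... [getaddrinfo failed]".
        print(f"{host}: getaddrinfo failed ({e})")
        continue

    # If resolution works, try an actual TCP connect to the Ollama port.
    try:
        with socket.create_connection((host, 11434), timeout=3):
            print(f"{host}: TCP connect on 11434 OK, something is listening")
    except OSError as e:
        print(f"{host}: resolved, but connect failed ({e})")
```

If `127.0.0.1` resolves and connects while `localhost` does not, pointing Open WebUI at the literal address should sidestep hostname resolution entirely, for example by setting the Ollama base URL to `http://127.0.0.1:11434` in the connection settings (the `OLLAMA_BASE_URL` environment variable in recent releases, `OLLAMA_API_BASE_URL` in older ones). Treat this as a workaround hypothesis rather than a confirmed fix.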