What is the issue?
I am using the latest version of Windows. As described in the README, hitting http://localhost:11434 returns "Ollama is running", but http://localhost:11434/api gives a 404 error.
OS
Windows
GPU
AMD
CPU
Intel
Ollama version
0.1.33
I am experiencing issues with the API endpoint as well. Here is the server log:

```
[GIN] 2024/05/07 - 07:41:01 | 200 | 4.894709434s | 127.0.0.1 | POST "/api/chat"
[GIN] 2024/05/07 - 07:41:07 | 404 |      9.017µs | 127.0.0.1 | GET  "/api/generate/"
[GIN] 2024/05/07 - 07:41:13 | 200 |     10.527µs | 127.0.0.1 | GET  "/"
```
Note the 404 for the /api/generate/ endpoint above.
Ollama version: 0.1.33, OS: macOS
Solution: the API endpoints don't respond to GET requests; POST requests work as expected.
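To illustrate the difference, here is a minimal Python sketch that builds the POST request /api/generate expects (the model name `llama3` is a placeholder for whatever model you have pulled; the request is only sent when a local Ollama server is running):

```python
import json
import urllib.request

OLLAMA = "http://localhost:11434"

def generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST to /api/generate; a plain GET to this path returns 404."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        f"{OLLAMA}/api/generate",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = generate_request("llama3", "Why is the sky blue?")
print(req.get_method(), req.full_url)
# Sending it with urllib.request.urlopen(req) returns the completion JSON
# only when the Ollama server is running locally.
```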
@ritesh7911 are you trying to use GET to this API? See https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-completion for usage information.
If it isn't working as described in the API docs, please share more information about how you're calling it and I'll re-open the issue.
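For completeness, the /api/chat endpoint that returns 200 in the log follows the same pattern: a POST with a JSON body containing a messages array. A sketch under the same assumptions (`llama3` is a placeholder model name):

```python
import json
import urllib.request

def chat_request(model, messages, host="http://localhost:11434"):
    """Build a POST to /api/chat; like /api/generate, it does not serve GET."""
    body = json.dumps({"model": model, "messages": messages, "stream": False})
    return urllib.request.Request(
        f"{host}/api/chat",
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = chat_request("llama3", [{"role": "user", "content": "Hello!"}])
print(req.get_method(), req.full_url)
```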