Openai API endpoint location #472
Replies: 4 comments 1 reply
-
Hi @matbee-eth, I understand that you're looking for a way to use Ollama's functionality with an OpenAI-compatible API endpoint. However, the Ollama WebUI project is separate from Ollama, and neither offers this capability. Instead, I would recommend checking out alternative projects like LiteLLM+Ollama or LocalAI for accessing local models via an OpenAI-compatible API. Additionally, based on the continue.dev documentation, it seems that it can work with Ollama's API directly, without requiring an OpenAI-compatible endpoint, so you may want to explore this option as well.
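For anyone landing here later: a minimal sketch of the LiteLLM+Ollama route might look like the following. This assumes LiteLLM's proxy CLI and its `ollama/<model>` naming convention; the model name `llama2` and default port are assumptions, so check the LiteLLM docs for your version.

```shell
# Install LiteLLM with the proxy extra (assumes Python/pip is available)
pip install 'litellm[proxy]'

# Start an OpenAI-compatible server backed by a local Ollama model.
# By default this talks to Ollama at http://localhost:11434.
litellm --model ollama/llama2
```

Your OpenAI-compatible client would then point its base URL at the LiteLLM proxy instead of api.openai.com.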
-
Yeah, I use it locally as well. Is there an API key system, or a way to disable auth for localhost?
-
To be clear, Ollama WebUI does not provide an API intended for use by external applications, and the author has stated previously that an option for disabling authentication isn't a goal of this particular project. If you're looking for a lighter-weight version of the application for personal local usage, you can check out Ollama WebUI Lite. If you're looking for a way to provide an OpenAI-compatible API and manage API keys for Ollama, LiteLLM would be ideal. I recommend reading their documentation for a thorough understanding of its capabilities.
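As a rough sketch of the API-key side of this: LiteLLM's proxy can be driven by a config file, and a master key enables its key-management features. The model name, `api_base`, and key value below are placeholders, so treat this as a starting point, not a definitive config.

```yaml
# config.yaml for the LiteLLM proxy (sketch; verify fields against the LiteLLM docs)
model_list:
  - model_name: llama2            # the name clients will request
    litellm_params:
      model: ollama/llama2        # route to a local Ollama model
      api_base: http://localhost:11434

general_settings:
  master_key: sk-replace-me       # enables auth; clients must send this (or a derived key)
```

You would then start the proxy with something like `litellm --config config.yaml`.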
-
Quick question: it looks like LiteLLM is already included in the Docker image. Do you know if it's possible to make it externally available? It seems to only listen on localhost at the moment, and I can't pass it any requests.
-
Trying to use this ollama-webui as an OpenAI endpoint, but it doesn't seem to work. I'm trying to use continue.dev with Ollama.
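As mentioned earlier in the thread, continue.dev documents a native Ollama provider, so pointing it at Ollama directly may be simpler than going through the WebUI. A minimal sketch of what that config might look like (field names assumed from continue.dev's documentation, which has changed across versions; the model name is a placeholder):

```json
{
  "models": [
    {
      "title": "Ollama (llama2)",
      "provider": "ollama",
      "model": "llama2"
    }
  ]
}
```

This talks to Ollama's own API on its default port rather than treating anything as an OpenAI endpoint.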