Docker
- Go to the `docker` directory, then into `cpu-only` or `gpu`, and run `docker compose up -d`
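For example, to start the CPU-only stack (directory names taken from the step above):

```bash
cd docker/cpu-only   # or docker/gpu for GPU inference
docker compose up -d
```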
Kubernetes
- Not yet supported (under construction)

Remember to add the necessary API keys. For example, if you want to use OpenAI models, provide an OpenAI API key in the docker-compose file.
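A minimal sketch of what that could look like (the service name `app` is an assumption; `OPENAI_API_KEY` is the variable name OpenAI's SDKs conventionally read):

```yaml
# docker-compose.yml (fragment) -- service name is hypothetical
services:
  app:
    environment:
      - OPENAI_API_KEY=sk-...   # replace with your actual OpenAI API key
```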
Usage
Local LLM inference is done using Ollama, so any model from the Ollama library is supported.
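Models can be pulled through the standard Ollama CLI; for example (`llama3` is just an illustrative model name):

```bash
# Download a model so it is available for local inference
ollama pull llama3
```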
By default, the Langfuse URL points to the URL defined as MONITORING_SERVER_URL, so if you are deploying this app with Docker Compose, there is no need to change anything here.
You can, of course, provide a URL pointing to Langfuse Cloud as well.
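A minimal sketch, assuming MONITORING_SERVER_URL is read from the compose environment (the service name `app` is hypothetical; `https://cloud.langfuse.com` is Langfuse's cloud endpoint):

```yaml
services:
  app:
    environment:
      # Point monitoring at Langfuse Cloud instead of the bundled instance
      - MONITORING_SERVER_URL=https://cloud.langfuse.com
```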