
Any guidance for Ollama API support? #539

Answered by lgrammel
iam4x asked this question in Help
With the Vercel AI SDK 3.1, there is a community provider for Ollama that works with the new AI functions: https://github.com/sgomez/ollama-ai-provider
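A minimal sketch of how that provider plugs into the AI SDK's `generateText` function, assuming a local Ollama server on the default port and a pulled `llama3` model (model name and base URL are illustrative, not from the original answer):

```typescript
import { generateText } from 'ai';
import { createOllama } from 'ollama-ai-provider';

// Point the provider at a locally running Ollama instance
// (http://localhost:11434 is Ollama's default; adjust as needed).
const ollama = createOllama({
  baseURL: 'http://localhost:11434/api',
});

async function main() {
  // Any model you have pulled with `ollama pull <name>` should work here.
  const { text } = await generateText({
    model: ollama('llama3'),
    prompt: 'Why is the sky blue? Answer in one sentence.',
  });

  console.log(text);
}

main().catch(console.error);
```

Because the provider implements the AI SDK's language-model interface, the same `generateText` / `streamText` calls used with the official OpenAI provider should work unchanged; only the `model` argument differs.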

Replies: 7 comments

Answer selected by lgrammel
7 participants