Feat: use ollama in latest llamaindex #77
base: main
Conversation
@thucpn is attempting to deploy a commit to the LlamaIndex Team on Vercel. A member of the Team first needs to authorize it.
@@ -146,15 +151,15 @@ export async function POST(request: NextRequest) {
     );
   }

-  const llm = new OpenAI({
-    model: config.model,
+  const llm = new Ollama({
@thucpn how about if `config.model` is not one of `ALL_AVAILABLE_OPENAI_MODELS`, we fall back to Ollama?
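A minimal sketch of the suggested fallback, assuming `llamaindex` exports `OpenAI`, `Ollama`, and the `ALL_AVAILABLE_OPENAI_MODELS` map referenced in the comment; `config` stands in for the route's existing configuration object and is hypothetical here:

```ts
import { OpenAI, Ollama, ALL_AVAILABLE_OPENAI_MODELS } from "llamaindex";

// Hypothetical stand-in for the config object already built in the route handler.
const config = { model: process.env.MODEL ?? "gpt-4" };

// Use OpenAI only when the configured model is a known OpenAI model;
// otherwise fall back to a local Ollama model, as the review suggests.
const llm =
  config.model in ALL_AVAILABLE_OPENAI_MODELS
    ? new OpenAI({ model: config.model })
    : new Ollama({ model: config.model });
```

This keeps the existing OpenAI path working while letting any unrecognized model name route to Ollama, rather than replacing OpenAI outright as the current diff does.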
@@ -193,5 +202,3 @@ export async function POST(request: NextRequest) {

 export const runtime = "nodejs";
-export const dynamic = "force-dynamic";
-// Set max running time of function, for Vercel Hobby use 10 seconds, see https://vercel.com/docs/functions/serverless-functions/runtimes#maxduration
don't remove
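If those exports are restored as requested, the tail of the route file would look roughly like the sketch below. The `maxDuration` line is hypothetical, inferred from the removed comment about Vercel Hobby's 10-second limit; only the `runtime`, `dynamic`, and comment lines appear in the diff itself.

```ts
// Next.js route segment config for the chat API route.
export const runtime = "nodejs";
export const dynamic = "force-dynamic";
// Set max running time of function, for Vercel Hobby use 10 seconds, see
// https://vercel.com/docs/functions/serverless-functions/runtimes#maxduration
export const maxDuration = 10; // hypothetical: not shown in the diff
```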
No description provided.