
Increase connection Timeout #105

Open
shafiqalibhai opened this issue May 11, 2024 · 5 comments

Comments

@shafiqalibhai

Maybe add a variable in settings to change the default timeout.
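
A minimal sketch of what that could look like, assuming a Python client that talks to Ollama over HTTP with the requests library; the project's actual language and settings mechanism aren't shown in this thread, and the `OLLAMA_TIMEOUT` name and default are hypothetical:

```python
# Hypothetical sketch: a configurable timeout for Ollama requests.
# The setting name OLLAMA_TIMEOUT and the default of 300 s are illustrative only.
import os
import requests

# Read the timeout from the environment (or a settings file), falling back
# to a generous default so slow first-time model loads don't get cut off.
DEFAULT_TIMEOUT_SECONDS = float(os.environ.get("OLLAMA_TIMEOUT", "300"))

def generate(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to a local Ollama server with a configurable read timeout."""
    response = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        # (connect timeout, read timeout): the read timeout is the one that
        # needs to be large while the model is still being loaded into memory.
        timeout=(10, DEFAULT_TIMEOUT_SECONDS),
    )
    response.raise_for_status()
    return response.json()["response"]
```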

@adamierymenko

Seconded, since sometimes the initial load of the model into Ollama times out for big models and you have to re-submit your prompt. Once it's "warmed up" it's fine.

Maybe there's an alternative way for the program to see if Ollama is still running and just taking a long time to respond.
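
A rough illustration of that idea, again as a hypothetical Python sketch rather than the project's actual approach: run the slow request in a background thread and periodically hit Ollama's lightweight `/api/tags` endpoint as a liveness probe (assuming it stays responsive while a model is loading), so a dead server can be told apart from one that is merely busy:

```python
# Hypothetical sketch: distinguish "Ollama is down" from "Ollama is still
# loading a large model" by polling a cheap endpoint while the real request runs.
import threading
import requests

def ollama_alive(base_url: str = "http://localhost:11434") -> bool:
    """Cheap liveness probe: listing local models is fast and does not
    depend on the model currently being loaded."""
    try:
        requests.get(f"{base_url}/api/tags", timeout=5)
        return True
    except requests.RequestException:
        return False

def generate_with_patience(prompt: str, model: str, poll_seconds: float = 10.0):
    result = {}

    def worker():
        # No hard read timeout on the slow call; liveness is checked separately.
        result["value"] = requests.post(
            "http://localhost:11434/api/generate",
            json={"model": model, "prompt": prompt, "stream": False},
            timeout=None,
        ).json()

    t = threading.Thread(target=worker, daemon=True)
    t.start()
    while t.is_alive():
        t.join(poll_seconds)
        if t.is_alive() and not ollama_alive():
            raise RuntimeError("Ollama appears to be down; giving up.")
    return result.get("value")
```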

@haydonryan

Totally agree on this. The timeout needs to be increased.

@egaralmeida commented May 19, 2024

This is important for folks with low-end hardware. I agree, it should be in the settings.

@adamierymenko

It's important even for high-end hardware if you're using a giant model. Sometimes the initial model load times out and you have to resubmit, after which it works.

@haydonryan

100%. I'm running a 16-core EPYC as my LLM machine; it really chugs trying to load Mixtral 8x22B, even when loading off NVMe into RAM.
