
Running Ollama with LLama3 and Phi3 #341

Open

arhaang13 opened this issue May 13, 2024 · 1 comment

@arhaang13

Hello,

I wanted to open this issue because, when using TaskWeaver with Ollama running on my local machine, none of the models provided by Ollama are functional.

Here is how I configured taskweaver_config.json:
(screenshot of the taskweaver_config.json contents)
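Since the screenshot doesn't render here, a minimal sketch of the kind of configuration I mean, assuming TaskWeaver's OpenAI-compatible settings and Ollama's default local endpoint (the exact values in my setup may differ):

```json
{
  "llm.api_base": "http://localhost:11434/v1",
  "llm.api_key": "ollama",
  "llm.api_type": "openai",
  "llm.model": "phi3"
}
```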

When I try running Ollama with phi3, the output that I get is:
(screenshot of the error output)

The same issue occurs when configuring TaskWeaver with Llama3 in the same way.

I hope to hear back!

Best,
Arhaan

@liqul
Contributor

liqul commented May 13, 2024

The problem is not with Ollama itself but with the capability of the model served through Ollama. The model fails to follow the instructions in the prompt and therefore does not generate a response in the correct format. Phi3 is typically too small to produce a correctly formatted response.
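You can check this yourself by bypassing TaskWeaver and probing the model directly through Ollama's OpenAI-compatible API: a small model will often ignore even a simple strict-format instruction. A minimal sketch, assuming the `openai` Python package and an Ollama server on its default port (the model name is whatever you have pulled):

```python
from openai import OpenAI

# Ollama exposes an OpenAI-compatible API on localhost:11434;
# the api_key is required by the client but ignored by Ollama.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

resp = client.chat.completions.create(
    model="phi3",  # or "llama3"
    messages=[
        {
            "role": "user",
            "content": (
                "Reply with ONLY a JSON object of the form "
                '{"answer": "<text>"} and nothing else. '
                "Question: what is 2 + 2?"
            ),
        }
    ],
)

# If the model cannot honor this one-line format instruction,
# it will also fail on TaskWeaver's much longer structured prompt.
print(resp.choices[0].message.content)
```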
