
Wrong encoding in responses with local ollama #1214

Open

anhonestboy opened this issue May 1, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@anhonestboy

Before submitting your bug report

Relevant environment info

- OS: macOS 14.3
- Continue: 0.8.25 (also tested 0.9 pre-release)
- IDE: VSCode 1.88.1

Description

(Screenshot, 2024-05-01 at 20:39:39: Ollama's response rendered with wrong character encoding)

To reproduce

No response

Log output

No response

@anhonestboy anhonestboy added the bug Something isn't working label May 1, 2024
@sestinj
Contributor

sestinj commented May 2, 2024

@anhonestboy Is there any extra information you can share about your OS or Ollama setup? I have a very similar system (macOS 14.3, VS Code 1.89.0, Ollama with codellama:7b), but I haven't been able to reproduce the problem.

@serhatay

serhatay commented May 3, 2024

I had the same issue with llama3:8b and codellama. After the Ollama update, llama3:8b started working, but codellama still returns nonsense.
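To narrow down whether the garbled text comes from Ollama itself or from the extension's rendering, a quick standalone check against the local Ollama HTTP API can help. The sketch below is not from this thread; it assumes Ollama's default local endpoint (`http://localhost:11434`) and its `/api/generate` route, and the `looks_misencoded` heuristic is a hypothetical helper for spotting common encoding damage.

```python
# Diagnostic sketch (assumption: Ollama running locally on the default port
# 11434). Queries the server directly, bypassing the editor, so mojibake seen
# here would point at the model/server rather than the extension's rendering.
import json
import urllib.request


def looks_misencoded(text: str) -> bool:
    """Heuristic check for common encoding damage (hypothetical helper)."""
    # U+FFFD appears when invalid bytes were decoded with errors="replace";
    # sequences like "Ã©"/"Ã¨" are typical of UTF-8 bytes re-read as Latin-1.
    return "\ufffd" in text or "Ã©" in text or "Ã¨" in text


def query_ollama(prompt: str, model: str = "codellama") -> str:
    """Send one non-streaming generate request and decode it explicitly."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False})
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        raw = resp.read()
    # Decoding with errors="replace" makes bad byte sequences visible as U+FFFD
    # instead of raising, so the damage can be inspected.
    text = raw.decode("utf-8", errors="replace")
    return json.loads(text)["response"]
```

If `looks_misencoded(query_ollama("Say hello in French"))` is true, the server output itself is already broken; if the raw output is clean, the problem is more likely in how the extension decodes or renders the stream.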
