Can't make it work with LMStudio #1224

Closed
3 tasks done
tthierryEra opened this issue May 2, 2024 · 7 comments
Assignees: sestinj
Labels: bug (Something isn't working)

Comments

tthierryEra commented May 2, 2024

Before submitting your bug report

Relevant environment info

- OS: Windows
- Continue: v0.8.25
- IDE: VsCode

I'm using LMStudio with Mistral, Phi, or Llama 3; the model doesn't seem to matter.

Description

[screenshot]

[screenshot]

Hello,

When using LMStudio the result is the same in the chat and in file edits: I always get the system prompt echoed back, and nothing I try with the prompt formatting fixes it.

I really need help here. It's working with Ollama and with external APIs.

Thanks

To reproduce

No response

Log output

No response

@tthierryEra tthierryEra added the bug Something isn't working label May 2, 2024
@sestinj sestinj self-assigned this May 2, 2024
sestinj (Contributor) commented May 2, 2024

@tthierryEra can you share your config.json so I can better help debug?

tthierryEra (Author) commented

Hello, thank you.

I didn't change anything other than adding these models:
{
  "title": "LM Studio - Llama 8G",
  "provider": "lmstudio",
  "model": "lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF"
},
{
  "title": "Crusoeai Llama 8B - 262k",
  "provider": "lmstudio",
  "model": "crusoeai/Llama-3-8B-Instruct-262k-GGUF"
},
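
For reference, these entries go inside the top-level "models" array of config.json. A minimal sketch, assuming LM Studio's local server is running at its default address (in which case the "apiBase" line is optional):

{
  "models": [
    {
      "title": "LM Studio - Llama 8G",
      "provider": "lmstudio",
      "model": "lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF",
      "apiBase": "http://localhost:1234/v1"
    }
  ]
}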

sestinj (Contributor) commented May 3, 2024

@tthierryEra thanks. I think the best place to check would be the "Output" tab next to the VS Code terminal, then in the dropdown on the right select "Continue - LLM Prompts/Completions". This will show the exact prompt sent to the LLM. This, in addition to the logs that you see on the side of LM Studio, will likely show us what we need to see (it's definitely just a prompt formatting mistake).

The first possible solution I can think of is to double-check your prompt formatting settings on the LM Studio side. I believe you can edit these in the right side panel.
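
For a Llama 3 Instruct model, the prompt that ultimately reaches the model (visible in LM Studio's server log) should come out wrapped in the Llama 3 chat template, roughly like the sketch below (the system message is just an example), rather than the system text being passed straight through:

<|begin_of_text|><|start_header_id|>system<|end_header_id|>

You are a useful coding assistant.<|eot_id|><|start_header_id|>user<|end_header_id|>

hello<|eot_id|><|start_header_id|>assistant<|end_header_id|>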

tthierryEra (Author) commented May 3, 2024

So from my testing, whenever I add a system prompt, whether it's directly in LMStudio or in the continue.dev config like config.ts, that system prompt gets printed back into the chat.

If I leave everything empty it's "working", but that's not really useful since the LLM doesn't know how to react/interact.

Example:

Output from Continue:

Settings:
contextLength: 4096
model: MaziyarPanahi/WizardLM-2-7B-GGUF
maxTokens: 1024

<user>
hello

Completion:


You are a useful coding assistant.hello! How can I assist you today?

As a coding assistant, I'm here to help with a wide range of programming-related questions and tasks. Whether you need assistance writing or understanding code, debugging issues, learning about best practices, or exploring new technologies, feel free to ask. What do you need help with today?

I tried leaving everything empty, or just adding a prefix for the user message.
Not working either :-(

Settings:
contextLength: 4096
model: MaziyarPanahi/WizardLM-2-7B-GGUF
maxTokens: 1024

############################################

<user>
hello. How's it going?

==========================================================================
==========================================================================
Completion:

You are a useful coding assistant.<user>hello. How's it going?</user>Hello there! I'm doing well, thank you. It's always a pleasure to assist with any questions or tasks you might have. How can I help you today?

Maybe I'm doing something wrong?

I tried this:
[screenshot]

Or leaving everything empty:
[screenshot]

It's not working, I still get this as the result:
[screenshot]

xinnod commented May 4, 2024

I just ran into this problem and fixed it on my end. The broken behavior happens when running LM Studio 0.2.22. Reverting back to version 0.2.21 fixed the prompt formatting.
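
Alternatively, assuming your Continue version supports a per-model "template" option (this field and its "llama3" value are an assumption here, not something confirmed in this thread), pinning the chat template in config.json might also sidestep the formatting problem; a rough sketch:

{
  "title": "LM Studio - Llama 8G",
  "provider": "lmstudio",
  "model": "lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF",
  "template": "llama3"
}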

@sestinj sestinj closed this as completed May 6, 2024
tthierryEra (Author) commented

Did you close the issue because the solution is to downgrade LMStudio?
Or is there a fix in Continue?

tthierryEra (Author) commented

> I just ran into this problem and fixed it on my end. The broken behavior happens when running LM Studio 0.2.22. Reverting back to version 0.2.21 fixed the prompt formatting.

@xinnod Where did you find the old version? Thanks.
