
New context size limit for Llama 3 in settings #572

Open
sdudzin opened this issue May 17, 2024 · 1 comment
Labels
bug Something isn't working

Comments


sdudzin commented May 17, 2024

What happened?

Llama 3 has a new context size of 8192, as opposed to 4096 for Llama 2. However, the settings dialog caps the context size at 4096, and the text description says the limit is 2048.

Relevant log output or stack trace

No response

Steps to reproduce

No response

CodeGPT version

2.7.1

Operating System

macOS

@sdudzin sdudzin added the bug Something isn't working label May 17, 2024
Contributor

PhilKes commented May 18, 2024

You are right, the UI text is a bit misleading/unhelpful, and the context size should be model-dependent.
The question is: should we define hard context size limits at all?
The thing is, you can run any model with a larger context size than it was trained on; completion quality usually gets much worse at some point, but if a user still wants to do that, why not?

I think I would prefer not to put a hard limit on the prompt context size input field, and instead only show a hint below it with the context size the model was trained on (e.g. 8k for Llama 3, 64k for CodeQwen, etc.), or even set that as the default value, rather than defining a max value.
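For illustration, the hint/default approach described above could be sketched as a simple lookup of trained context sizes per model. This is a hypothetical sketch in Java, not code from the CodeGPT plugin; the class and method names are invented, and the sizes come from the models mentioned in this thread (Llama 2 4k, Llama 3 8k, CodeQwen 64k):

```java
import java.util.Map;

public class ContextSizeHint {
    // Hypothetical table of trained context sizes per model.
    // Values reflect the models discussed above.
    private static final Map<String, Integer> TRAINED_CONTEXT_SIZES = Map.of(
            "llama2", 4096,
            "llama3", 8192,
            "codeqwen", 65536
    );

    // Conservative fallback when the model is unknown.
    private static final int FALLBACK = 4096;

    /**
     * Returns the trained context size to show as a hint or default
     * below the input field -- not enforced as a hard maximum.
     */
    public static int hintFor(String modelId) {
        return TRAINED_CONTEXT_SIZES.getOrDefault(modelId, FALLBACK);
    }

    public static void main(String[] args) {
        System.out.println(hintFor("llama3"));   // trained size for Llama 3
        System.out.println(hintFor("unknown"));  // falls back to 4096
    }
}
```

The user could still type any value into the field; the lookup only feeds the hint text and the pre-filled default.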
