
the prompt format of different models #405

Open
ghost opened this issue Mar 21, 2024 · 1 comment
ghost commented Mar 21, 2024

System Info

Information

  • The official example scripts
  • My own modified scripts

🐛 Describe the bug

Hello,
where can I find the prompt format definitions for the different Llama models, e.g. the Llama 2 7B base model and the Llama 2 7B Chat model?

Error logs

Expected behavior

@HamidShojanazeri
Contributor

@mxjyst for the chat models you need this format; also, if you are an HF user, the same formatting can be applied via tokenizer.apply_chat_template.
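For reference, the Llama 2 chat models wrap each user turn in `[INST] ... [/INST]` tags, with an optional `<<SYS>> ... <</SYS>>` block carrying the system prompt inside the first turn. A minimal sketch of that format (the helper function name is hypothetical, not part of any repo):

```python
def build_llama2_chat_prompt(user_message: str, system_prompt: str = "") -> str:
    """Build a single-turn Llama 2 chat prompt.

    The <<SYS>> block, when present, is embedded inside the first
    [INST] ... [/INST] user turn, as the Llama 2 chat models expect.
    """
    if system_prompt:
        user_message = f"<<SYS>>\n{system_prompt}\n<</SYS>>\n\n{user_message}"
    return f"<s>[INST] {user_message} [/INST]"


prompt = build_llama2_chat_prompt(
    "What is the capital of France?",
    system_prompt="You are a helpful assistant.",
)
print(prompt)
```

With Hugging Face transformers, `tokenizer.apply_chat_template(messages, tokenize=False)` produces the equivalent string from a list of `{"role": ..., "content": ...}` messages for models that ship a chat template, so you don't have to assemble the tags by hand.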
