
llama3 instruct template will have different outputs depending on system tokens #4312

Open
Sneakr opened this issue May 10, 2024 · 1 comment
Labels: bug (Something isn't working)

Sneakr commented May 10, 2024

What is the issue?

The absence or presence of a system token produces different outputs, based on my findings:
ggerganov/llama.cpp#7062 (comment)

This is even more important for fine-tunes of the instruct models, as it can break them entirely.

The official Ollama llama3 template removes the system tokens when no system message is present:
https://github.com/ollama/ollama/blob/main/docs/modelfile.md
https://ollama.com/library/llama3:instruct/blobs/8ab4849b038c

The corrected template should be:
TEMPLATE """<|start_header_id|>system<|end_header_id|> {{ .System }} <|eot_id|>{{ if .Prompt }} <|start_header_id|>user<|end_header_id|> {{ .Prompt }} <|eot_id|>{{ end }} <|start_header_id|>assistant<|end_header_id|> {{ .Response }} <|eot_id|>"""
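To make the difference concrete, here is a minimal Go text/template sketch (Ollama's templates use Go's text/template syntax). The `current` string is my abridged reading of the linked blob, and the `vars` struct is a stand-in for whatever Ollama actually passes to the template, so treat this as an illustration rather than the exact implementation:

```go
package main

import (
	"os"
	"text/template"
)

// vars is a stand-in for the data Ollama passes to Modelfile templates;
// the field names mirror the .System/.Prompt/.Response variables from
// the Modelfile docs.
type vars struct {
	System, Prompt, Response string
}

func render(label, tmpl string, v vars) {
	t := template.Must(template.New(label).Parse(tmpl))
	os.Stdout.WriteString("--- " + label + " ---\n")
	if err := t.Execute(os.Stdout, v); err != nil {
		panic(err)
	}
	os.Stdout.WriteString("\n")
}

func main() {
	// My paraphrase of the current llama3 template: the whole system
	// header block is wrapped in {{ if .System }} ... {{ end }}, so the
	// tokens vanish when no system message is set.
	current := `{{ if .System }}<|start_header_id|>system<|end_header_id|>

{{ .System }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>

{{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>

{{ .Response }}<|eot_id|>`

	// The corrected template proposed above: the system header tokens
	// are always emitted, even when .System is empty.
	proposed := `<|start_header_id|>system<|end_header_id|> {{ .System }} <|eot_id|>{{ if .Prompt }} <|start_header_id|>user<|end_header_id|> {{ .Prompt }} <|eot_id|>{{ end }} <|start_header_id|>assistant<|end_header_id|> {{ .Response }} <|eot_id|>`

	v := vars{Prompt: "Hello"} // no system message set
	render("current, no system message", current, v)
	render("proposed, no system message", proposed, v)
}
```

Rendering both with an empty .System shows the point: the current template drops the system header tokens entirely, while the corrected one still emits `<|start_header_id|>system<|end_header_id|> ... <|eot_id|>` with an empty body, so the surrounding special tokens are always present.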

Edit: I've opened an official thread and am awaiting a response:
meta-llama/llama3#203

Sneakr added the bug label on May 10, 2024

Sneakr commented May 11, 2024

So Meta just changed the template page without answering my issue :) At least give some credit where credit is due.
