Llama 3 chat template #371

Open
woheller69 opened this issue Apr 26, 2024 · 1 comment
Comments

@woheller69

When using your chat template from here, the output starts with "assistant":
https://huggingface.co/jartine/Meta-Llama-3-8B-Instruct-llamafile#prompting

<|begin_of_text|><|start_header_id|>system<|end_header_id|>
{{prompt}}<|eot_id|>{{history}}<|start_header_id|>{{char}}<|end_header_id|>

It works if I change {{char}} to assistant:

<|begin_of_text|><|start_header_id|>system<|end_header_id|>
{{prompt}}<|eot_id|>{{history}}<|start_header_id|>assistant<|end_header_id|>
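For reference, a single exchange in the Llama 3 Instruct format renders roughly like this, assuming {{prompt}} is the system prompt and {{history}} expands to alternating user/assistant turns, each terminated by <|eot_id|> (the example messages below are placeholders; the blank line after each <|end_header_id|> is part of the format):

<|begin_of_text|><|start_header_id|>system<|end_header_id|>

You are a helpful assistant.<|eot_id|><|start_header_id|>user<|end_header_id|>

Hello!<|eot_id|><|start_header_id|>assistant<|end_header_id|>

The model then generates the assistant reply after that final header and should stop at <|eot_id|>.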

@Mayorc1978

Can you specify what goes in Template and what goes in Chat Template?
Also, there was a problem with stop tokens in early Llama 3; how can I be sure that is fixed in llamafile?
