
Port HuggingChat prompt templates to transformers tokenizers #978

Open
nsarrazin opened this issue Apr 4, 2024 · 0 comments
Labels
huggingchat For issues related to HuggingChat specifically

Comments


nsarrazin commented Apr 4, 2024

We added the option of using the `tokenizer` field to determine the chat template. (PR)

However, we couldn't port the templates used in prod because we need to support system prompts. One way we could handle it:

  1. Try to apply the chat template.
  2. If an error is thrown, remove the system prompt, inject its content directly into the first user message, and apply the chat template again.
  3. If an error is thrown again, then something is wrong with the chat template itself 😅

Haven't tested this against our prod config yet, though.
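The fallback above could be sketched roughly like this. This is just an illustration, not tested code: `apply_template` is a hypothetical callable standing in for something like transformers' `tokenizer.apply_chat_template`, and the exact exception type a real template raises would depend on the backend.

```python
from copy import deepcopy


def apply_template_with_fallback(apply_template, messages):
    """Render a chat conversation, retrying without a system message.

    `apply_template` is a placeholder for the real template-rendering
    call (e.g. `tokenizer.apply_chat_template` in transformers).
    """
    try:
        # Step 1: try to apply the chat template as-is.
        return apply_template(messages)
    except Exception:
        # Step 2: drop the system message and prepend its content to
        # the first user message, then try again.
        merged = deepcopy(messages)
        if merged and merged[0]["role"] == "system":
            system = merged.pop(0)
            for msg in merged:
                if msg["role"] == "user":
                    msg["content"] = system["content"] + "\n\n" + msg["content"]
                    break
        # Step 3: if this raises again, the template itself is broken,
        # so let the error propagate.
        return apply_template(merged)


def strict_template(messages):
    """Mock template that rejects system prompts, like some prod models."""
    rendered = ""
    for msg in messages:
        if msg["role"] == "system":
            raise ValueError("System role not supported!")
        rendered += f"[{msg['role']}] {msg['content']}\n"
    return rendered


conversation = [
    {"role": "system", "content": "Be brief."},
    {"role": "user", "content": "Hi"},
]
prompt = apply_template_with_fallback(strict_template, conversation)
```

Here `strict_template` fails on the first attempt, so the system prompt ends up folded into the first user turn of `prompt`.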

@nsarrazin nsarrazin added the huggingchat For issues related to HuggingChat specifically label Apr 4, 2024