
transformers > 4.38 causes bug in inference for HF models #463

Open
rmitsch opened this issue Apr 24, 2024 · 0 comments
Labels: bug (Something isn't working), feat/model (Feature: models)

Comments

rmitsch (Collaborator) commented Apr 24, 2024

Inference fails with:

```
TypeError: transformers.generation.utils.GenerationMixin.generate() got multiple values for keyword argument 'pad_token_id'
```

The cause is unclear so far.

The workaround for the time being is to pin `transformers` to <= 4.38.
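For context, this class of `TypeError` arises whenever a keyword argument reaches a function both explicitly and inside a forwarded `**kwargs` dict. The sketch below is a hypothetical minimal reproduction of that pattern in plain Python — the `generate` stub stands in for the real `GenerationMixin.generate()` and is not the actual call path in this library:

```python
# Stub standing in for transformers' GenerationMixin.generate():
# it accepts pad_token_id explicitly and collects everything else in **kwargs.
def generate(input_ids, pad_token_id=None, **kwargs):
    return input_ids

# If the caller sets pad_token_id explicitly AND it is also present in the
# forwarded kwargs dict, Python raises the same TypeError seen above.
forwarded_kwargs = {"pad_token_id": 0}
try:
    generate([1, 2], pad_token_id=0, **forwarded_kwargs)
except TypeError as err:
    print(err)  # ... got multiple values for keyword argument 'pad_token_id'
```

A likely explanation (unconfirmed here) is that a newer `transformers` release started injecting `pad_token_id` into the generation kwargs on its own, colliding with the value the caller already passes.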
rmitsch added the labels bug, feat/model on Apr 24, 2024