
Training for word task on a custom dataset for NER #78

Closed
SGidentification opened this issue May 17, 2024 · 2 comments

Comments

@SGidentification

Hi, I was running the following command and got the error below. Do you have any idea what is causing it?

```
python experiments/run_word_task.py train_configs/word-task/Llama2-bi-mntp.json
```

```
File "/workspace/llm2vec/experiments/run_word_task.py", line 740, in <module>
  main()
File "/workspace/llm2vec/experiments/run_word_task.py", line 736, in main
  trainer.train()
File "/usr/local/lib/python3.10/dist-packages/transformers/trainer.py", line 1885, in train
  return inner_training_loop(
File "/usr/local/lib/python3.10/dist-packages/transformers/trainer.py", line 2216, in _inner_training_loop
  tr_loss_step = self.training_step(model, inputs)
File "/usr/local/lib/python3.10/dist-packages/transformers/trainer.py", line 3238, in training_step
  loss = self.compute_loss(model, inputs)
File "/usr/local/lib/python3.10/dist-packages/transformers/trainer.py", line 3264, in compute_loss
  outputs = model(**inputs)
File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1501, in _call_impl
  return forward_call(*args, **kwargs)
File "/workspace/llm2vec/experiments/run_word_task.py", line 93, in forward
  outputs = self.model(
File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1501, in _call_impl
  return forward_call(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/peft/peft_model.py", line 642, in forward
  return self.get_base_model()(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1501, in _call_impl
  return forward_call(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/transformers/models/llama/modeling_llama.py", line 940, in forward
  causal_mask = self._update_causal_mask(
TypeError: LlamaBiModel._update_causal_mask() takes from 4 to 5 positional arguments but 6 were given
```

@vaibhavad
Collaborator

Hi @SGidentification,

Are you using the latest version of llm2vec? We pushed a fix for this issue (5492ca5) when it was raised in #33.

@SGidentification
Author

Thanks! I changed the transformers version to `"transformers_version": "4.40.0"` and llm2vec to 0.1.5, and it is working now.
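
For anyone hitting the same error, the working combination above can be pinned in a requirements file. These exact versions are taken from this comment; newer releases that include the 5492ca5 fix may also work:

```
transformers==4.40.0
llm2vec==0.1.5
```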
