
[BUG] I downloaded the baichuan7b model from Hugging Face; running the test program included with it produces a CUDA error #143

Open
QIANXUNZDL123 opened this issue Apr 18, 2024 · 1 comment
Labels
bug Something isn't working

Comments

@QIANXUNZDL123

Required prerequisites

System information

from transformers import AutoModelForCausalLM, AutoTokenizer
import os
os.environ["CUDA_VISIBLE_DEVICES"] = '0,1,2,3'

tokenizer = AutoTokenizer.from_pretrained("./", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("./", device_map="auto", trust_remote_code=True)
inputs = tokenizer('登鹳雀楼->王之涣\n夜雨寄北->', return_tensors='pt')

inputs = inputs.to('cuda:2')

pred = model.generate(**inputs, max_new_tokens=16, repetition_penalty=1.1)
print(tokenizer.decode(pred.cpu()[0], skip_special_tokens=True))
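A plausible cause (an assumption, not confirmed by the report): with `device_map="auto"` the model's layers are sharded across the visible GPUs, so moving the inputs to a hard-coded `cuda:2` can land them on a different device than the embedding layer. A minimal device-safe sketch of the repro above, assuming the standard `transformers` generate API (`generate_reply` is a hypothetical helper name, not from the issue):

```python
def generate_reply(model, tokenizer, prompt):
    """Generate text, keeping inputs on the same device as the model's
    first shard instead of a hard-coded GPU index."""
    inputs = tokenizer(prompt, return_tensors='pt')
    # model.device reports the device of the first parameters when
    # device_map="auto" shards the model across GPUs.
    inputs = inputs.to(model.device)
    pred = model.generate(**inputs, max_new_tokens=16,
                          repetition_penalty=1.1,
                          pad_token_id=tokenizer.pad_token_id)
    return tokenizer.decode(pred.cpu()[0], skip_special_tokens=True)
```

Called as `generate_reply(model, tokenizer, '登鹳雀楼->王之涣\n夜雨寄北->')`, this replaces the hard-coded `inputs.to('cuda:2')` in the snippet above.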

Problem description

[screenshots of the CUDA error omitted]

Reproducible example code

The Python snippets:

Command lines:

Extra dependencies:


Steps to reproduce:

Traceback

No response

Expected behavior

No response

Additional context

No response

Checklist

  • I have provided all relevant and necessary information above.
  • I have chosen a suitable title for this issue.
@QIANXUNZDL123 QIANXUNZDL123 added the bug Something isn't working label Apr 18, 2024
@QIANXUNZDL123
Author

[screenshot of the pad_token_id setting omitted]
I've checked pad_token_id in several places and it is 0 everywhere, yet the error still occurs. Could someone help me figure this out?
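Since the error persists even with `pad_token_id` showing as 0, one thing worth checking (an assumption, not a confirmed fix) is whether the tokenizer and the model's generation config actually agree on the pad token at generate time. A sketch, where `sync_pad_token` is a hypothetical helper:

```python
def sync_pad_token(model, tokenizer):
    # If the tokenizer has no pad token, fall back to eos — a common
    # workaround for decoder-only models — then make the model's
    # generation config agree with the tokenizer.
    if tokenizer.pad_token_id is None:
        tokenizer.pad_token_id = tokenizer.eos_token_id
    model.generation_config.pad_token_id = tokenizer.pad_token_id
    return tokenizer.pad_token_id
```

Calling `sync_pad_token(model, tokenizer)` once after loading, before `generate`, rules out a tokenizer/config mismatch as the source of the assert.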
