Adapted to the Ascend NPU #3933

Open
wants to merge 1 commit into master
Conversation

Dbassqwer

This adapts the current 0.2.10 release to the Ascend NPU. FastChat has already been adapted upstream in its official repo; this commit adapts the following (a rough sketch of the idea follows below):
1. The embedding model can now be loaded on the NPU.
2. The NPU is available as an optional device.
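
A minimal sketch (not the actual patch) of what the second point amounts to, assuming the torch_npu Ascend adapter is installed and the embedding model is loaded through sentence-transformers; the helper name `load_embedding_model` and the `"auto"` default are illustrative only:

```python
import torch

def load_embedding_model(model_name: str, device: str = "auto"):
    # Hypothetical helper, not code from this PR: shows how an optional "npu"
    # device slot is typically wired in once torch_npu has registered the device.
    from sentence_transformers import SentenceTransformer

    if device == "auto":
        if torch.cuda.is_available():
            device = "cuda"
        else:
            device = "cpu"
            try:
                import torch_npu  # noqa: F401  (assumed installed on Ascend hosts)
                if torch.npu.is_available():
                    device = "npu"
            except ImportError:
                pass
    # Any torch device string is accepted here, so "npu" works once the
    # Ascend backend is registered with PyTorch.
    return SentenceTransformer(model_name, device=device)
```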

@dosubot dosubot bot added the size:S label (This PR changes 10-29 lines, ignoring generated files.) on Apr 30, 2024
@@ -512,25 +512,28 @@ def _get_proxies():
def detect_device() -> Literal["cuda", "mps", "cpu"]:
try:
import torch
import mindspore as ms
It would be better to move the import on line 515 down to just before line 520; CUDA and MPS users would then return early without ever importing mindspore.
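
A minimal sketch of the reordering this comment suggests. The exact body of detect_device() and the way the patch probes for Ascend via mindspore are assumptions here; the point is only the placement of the import after the CUDA/MPS early returns:

```python
from typing import Literal

def detect_device() -> Literal["cuda", "mps", "npu", "cpu"]:
    try:
        import torch
        # CUDA and MPS users return here and never import mindspore.
        if torch.cuda.is_available():
            return "cuda"
        if torch.backends.mps.is_available():
            return "mps"
        # Probe for the Ascend NPU only after the common backends are ruled out.
        # (Assumption: MindSpore's default device_target resolves to "Ascend"
        # when an NPU is present.)
        import mindspore as ms
        if ms.get_context("device_target") == "Ascend":
            return "npu"
    except Exception:
        pass
    return "cpu"
```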


Why use mindspore instead of torch_npu directly?
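
For comparison, a hedged sketch of the torch_npu route this question points at, assuming the torch_npu package is installed and, on import, patches torch with an `npu` module that exposes `is_available()`:

```python
from typing import Literal

def detect_device() -> Literal["cuda", "mps", "npu", "cpu"]:
    try:
        import torch
        if torch.cuda.is_available():
            return "cuda"
        if torch.backends.mps.is_available():
            return "mps"
        try:
            # Ascend PyTorch adapter; importing it registers the "npu" device.
            import torch_npu  # noqa: F401
            if torch.npu.is_available():
                return "npu"
        except ImportError:
            pass
    except Exception:
        pass
    return "cpu"
```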
