
Is Qwen1.5-14B-Chat-GPTQ-Int4 not supported now? #1510

Open · 1 of 2 tasks
wongyan-data opened this issue May 11, 2024 · 0 comments
Labels
documentation (Improvements or additions to documentation), Waiting for reply

Comments

@wongyan-data

Search before asking

  • I have searched the issues and found no similar feature request.

Description

The quantized model works when loaded directly through vLLM 0.4.0, but loading it through DB-GPT produces an error, and it does not appear in the list of supported models. Is this model currently incompatible?
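
For reference, here is a minimal sketch of the direct-vLLM path that works, assuming vLLM 0.4.0 and the Hugging Face model ID Qwen/Qwen1.5-14B-Chat-GPTQ-Int4 (this is not the DB-GPT loading path that fails):

```python
# Minimal sketch: load the GPTQ Int4 model directly with vLLM.
# Assumes vLLM 0.4.0 and the Hugging Face model ID below.
from vllm import LLM, SamplingParams

llm = LLM(
    model="Qwen/Qwen1.5-14B-Chat-GPTQ-Int4",
    quantization="gptq",       # weights are GPTQ Int4
    trust_remote_code=True,
)

params = SamplingParams(temperature=0.7, max_tokens=256)
outputs = llm.generate(["Hello, who are you?"], params)
print(outputs[0].outputs[0].text)
```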

Documentation Links

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!
wongyan-data added the documentation (Improvements or additions to documentation) and Waiting for reply labels on May 11, 2024
wongyan-data reopened this on May 11, 2024
Aries-ckt changed the title from the Chinese original (请问Qwen1.5-14B-Chat-GPTQ-Int4是不是现在不支持?) to Is Qwen1.5-14B-Chat-GPTQ-Int4 not supported now? on May 13, 2024