
Error: Models based on 'LlamaForCausalLM' are not yet supported #14

Closed
3 tasks done
cklogic opened this issue Apr 30, 2024 · 2 comments

cklogic commented Apr 30, 2024

The following items must be checked before submitting

  • Make sure you are using the latest code from the repository (git pull).
  • I have read the FAQ section of the project documentation and searched the existing issues; no similar problem or solution was found.
  • Third-party tool issues: for tools such as llama.cpp or text-generation-webui, please look for a solution in the corresponding project first.

Issue type

Model quantization and deployment

Base model

Llama-3-Chinese-Instruct-8B (base model)

Operating system

None

Detailed description of the problem

ollama version is 0.1.32
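
The report does not show the command that triggered the error. A typical trigger, assumed here purely for illustration, is running ollama create against a Modelfile whose FROM line points at the original Hugging Face checkpoint directory (safetensors plus a config.json whose architectures field lists LlamaForCausalLM) rather than at a GGUF file:

# hypothetical reproduction; the path is a placeholder, not taken from the report
# Modelfile: FROM /path/to/Llama-3-Chinese-Instruct-8B
ollama create llama3-chinese -f Modelfile

Ollama 0.1.32 starts the import ("transferring model data", "unpacking model metadata") and then aborts, because this release cannot convert that architecture on its own, as the log below shows.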

Dependencies (must be provided for code-related issues)

Runtime logs or screenshots

transferring model data 
unpacking model metadata 
Error: Models based on 'LlamaForCausalLM' are not yet supported
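
A common workaround is to convert the checkpoint to GGUF with llama.cpp and point the Modelfile at the converted file. The sketch below assumes a local llama.cpp checkout and uses placeholder paths; exact script and binary names can vary between llama.cpp versions:

# 1. convert the Hugging Face checkpoint to a 16-bit GGUF file
python llama.cpp/convert-hf-to-gguf.py /path/to/Llama-3-Chinese-Instruct-8B \
    --outtype f16 --outfile llama-3-chinese-instruct-8b-f16.gguf

# 2. optionally quantize to reduce memory use (quantize tool built from llama.cpp)
./llama.cpp/quantize llama-3-chinese-instruct-8b-f16.gguf \
    llama-3-chinese-instruct-8b-Q4_K_M.gguf Q4_K_M

# 3. the Modelfile now references the GGUF file instead of the raw checkpoint
# FROM ./llama-3-chinese-instruct-8b-Q4_K_M.gguf

ollama create llama3-chinese -f Modelfile
ollama run llama3-chinese

Newer Ollama releases may import this architecture directly, but no specific version is confirmed in this thread, so converting to GGUF is the safer path.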

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your consideration.

github-actions bot added the stale label on May 14, 2024

Closing the issue, since no updates observed. Feel free to re-open if you need any further assistance.

github-actions bot closed this as not planned (won't fix, can't repro, duplicate, stale) on May 22, 2024