A question: after deploying the model locally, how do I use it through the demo/demo_qwen_agent.ipynb example? That is, once deployment is done, how do I instantiate RolePlay to call the locally deployed qwen-7b? I added the following entry to config/cfg_model_template.json:

    "qwen1.5-7b-chat": {
        "type": "openai",
        "model": "qwen/Qwen1.5-7B-Chat",
        "api_base": "http://localhost:8000/v1",
        "is_chat": true,
        "is_function_call": false
    }

then removed the api_key and set llm_config to 'qwen1.5-7b-chat', but the local LLM still cannot be called.
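For debugging, the JSON entry above can be mirrored in plain Python and the request the agent would ultimately send to the local endpoint sketched by hand. This is a minimal stdlib-only illustration assuming the local server exposes an OpenAI-compatible /v1/chat/completions endpoint; the `build_chat_request` helper is hypothetical, not modelscope-agent's actual config-loading code.

```python
import json

# Mirror of the cfg_model_template.json entry for the local model
# (an illustration; modelscope-agent's real config handling may differ).
local_model_cfg = {
    "qwen1.5-7b-chat": {
        "type": "openai",
        "model": "qwen/Qwen1.5-7B-Chat",
        "api_base": "http://localhost:8000/v1",
        "is_chat": True,
        "is_function_call": False,
    }
}

def build_chat_request(cfg_name: str, prompt: str) -> tuple:
    """Build the URL and JSON body for an OpenAI-compatible
    /chat/completions call from a config entry (hypothetical helper)."""
    cfg = local_model_cfg[cfg_name]
    url = cfg["api_base"].rstrip("/") + "/chat/completions"
    body = json.dumps({
        "model": cfg["model"],
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, body

url, body = build_chat_request("qwen1.5-7b-chat", "hello")
print(url)  # http://localhost:8000/v1/chat/completions
```

Printing the URL and body before the agent sends them makes it easy to check that the config entry actually resolves to the endpoint and model name the local server is serving.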
See this example: https://github.com/modelscope/modelscope-agent/blob/master/examples/llms/local_llm.ipynb
See this example: https://github.com/modelscope/modelscope-agent/blob/master/demo/demo_qwen_local_llm.ipynb
Hi, I can't find this demo file in the repository.
https://github.com/modelscope/modelscope-agent/blob/master/examples/llms/local_llm.ipynb Hi, the examples directory was restructured this week and the links haven't been updated yet.
I'm running the model locally with modelscope-swift and calling the interface locally, but execution fails with an error. I've tried several approaches without success; any guidance would be appreciated.
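When the call errors out, one useful step is to isolate whether the server's reply even matches the OpenAI response shape the client expects, separately from the agent code. A stdlib-only sketch (the sample response below is illustrative; a real reply from a swift-served model should have the same top-level shape):

```python
import json

# Illustrative OpenAI-style chat-completions response body.
sample = json.dumps({
    "id": "chatcmpl-0",
    "object": "chat.completion",
    "choices": [
        {"index": 0,
         "message": {"role": "assistant", "content": "Hello!"},
         "finish_reason": "stop"}
    ],
})

def extract_reply(raw: str) -> str:
    """Pull the assistant text out of an OpenAI-compatible response,
    raising a readable error when the shape is off."""
    data = json.loads(raw)
    try:
        return data["choices"][0]["message"]["content"]
    except (KeyError, IndexError, TypeError) as exc:
        raise ValueError(f"unexpected response shape: {data}") from exc

print(extract_reply(sample))  # Hello!
```

Running the raw server reply through a checker like this quickly tells you whether the problem is on the serving side (wrong response shape) or in the agent's configuration.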