Conversations error out when switching the model from chatglm3-6b to qwen-1_8B #4032
Comments
The error you're encountering,
By following these steps, you should be able to resolve the error and ensure smooth operation when switching models.
Based on the error message `File "/usr/local/lib/python3.11/dist-packages/langchain_community/chat_models/openai.py", line 493, in _astream`, open openai.py at line 493 and change the condition to `if chunk is None or chunk["choices"] is None or len(chunk["choices"]) == 0:`. After that it runs normally and answers questions from the knowledge base; I also switched between several different LLMs and they all answered correctly.
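A minimal sketch of the guard that patch introduces, assuming the stock check in `_astream` only tests `len(chunk["choices"]) == 0`: some OpenAI-compatible backends (such as a locally served qwen-1_8B) may stream chunks where `"choices"` is `None`, which makes `len(None)` raise a `TypeError`. The helper name `should_skip_chunk` is hypothetical, for illustration only:

```python
def should_skip_chunk(chunk) -> bool:
    """Return True when a streamed chunk carries no usable choices.

    Mirrors the patched condition from langchain_community's openai.py:
    short-circuit evaluation means chunk["choices"] is never touched
    when chunk itself is None.
    """
    return chunk is None or chunk["choices"] is None or len(chunk["choices"]) == 0


# The extra `chunk["choices"] is None` test is what prevents the crash:
assert should_skip_chunk(None)                       # no chunk at all
assert should_skip_chunk({"choices": None})          # backend sent null choices
assert should_skip_chunk({"choices": []})            # empty choices list
assert not should_skip_chunk({"choices": [{"delta": {}}]})  # normal chunk
```

Note that editing the installed package in place is a workaround; the patch is lost on reinstall or upgrade of `langchain_community`.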
Thanks, I'll try it.
Problem Description
When I switch the LLM from chatglm3-6b to qwen-1_8B, the conversation errors out. This only happens when running in Docker; running directly on the Linux server does not reproduce the problem, and the same models are used in both cases.
The error message is as follows:
This is my Dockerfile: