
For LLAMA3, are parts 1, 2, 3, and 4 meant to be run one after another in order? #94

Open

YadiHe opened this issue Apr 24, 2024 · 6 comments

Comments

@YadiHe

YadiHe commented Apr 24, 2024

After I finished the first one:
[image]
I don't know how to continue the conversation.

@KMnO4-zx
Contributor

The first one deploys llama3 as an API, the second deploys llama3 behind a LangChain integration, the third deploys the WebDemo chat, and the fourth is LoRA fine-tuning of llama3.
If you want to chat, try the third one.
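
For reference, if step 1 (the API deployment) is already running, you can also chat with it directly over HTTP. Below is a minimal sketch, assuming the service listens on 127.0.0.1:6006 and exchanges JSON with a `prompt` field in and a `response` field out; the real URL, port, and field names depend on the script you deployed.

```python
# Minimal sketch: chat with the API-style llama3 deployment from step 1.
# Assumptions: the service listens on 127.0.0.1:6006 and exchanges JSON with
# a "prompt" field in and a "response" field out -- adjust to your script.
import requests

def chat(prompt: str) -> str:
    resp = requests.post("http://127.0.0.1:6006", json={"prompt": prompt})
    resp.raise_for_status()
    # Fall back to the raw text if the JSON shape differs from the assumption.
    return resp.json().get("response", resp.text)

if __name__ == "__main__":
    print(chat("Hello, please introduce yourself."))
```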

@YadiHe
Author

YadiHe commented Apr 24, 2024

Thanks for the pointers. About the second one, deploying llama3 behind a LangChain integration: is LangChain the framework that can hook up many large models so you can choose which one to use?

@KMnO4-zx
Contributor

For the details, please see the repository docs.

@KMnO4-zx
Contributor

https://github.com/datawhalechina/llm-universe

If you're not very familiar with LangChain, you can take a look at this repository.
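
On the LangChain question above: LangChain is a framework for building LLM applications, and it can "plug in" many different models because each one is wrapped behind its common LLM interface. A minimal sketch of such a wrapper is below; it assumes the older `langchain.llms.base.LLM` import path and, for brevity, forwards prompts to a step-1 style API endpoint instead of loading the weights locally the way the tutorial's step 2 does, so adjust both to match the repository docs.

```python
# Minimal sketch: expose a self-hosted llama3 to LangChain by subclassing LLM.
# Assumptions: the legacy `langchain.llms.base.LLM` import path, and a step-1
# style API at 127.0.0.1:6006 that returns {"response": "..."}.
from typing import Any, List, Optional

import requests
from langchain.llms.base import LLM


class LocalLlama3(LLM):
    """LangChain wrapper around a locally deployed llama3 endpoint."""

    endpoint: str = "http://127.0.0.1:6006"  # assumed address of the local API

    @property
    def _llm_type(self) -> str:
        return "local-llama3"

    def _call(
        self,
        prompt: str,
        stop: Optional[List[str]] = None,
        run_manager: Any = None,
        **kwargs: Any,
    ) -> str:
        # Forward the prompt to the local service and return its text answer.
        resp = requests.post(self.endpoint, json={"prompt": prompt})
        resp.raise_for_status()
        return resp.json().get("response", resp.text)


if __name__ == "__main__":
    llm = LocalLlama3()
    print(llm.invoke("What is LangChain good for?"))
```

Once wrapped like this, the model can be dropped into chains or swapped for any other LangChain-compatible model, which is the sense in which LangChain lets you choose which model to use.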

@YadiHe
Author

YadiHe commented Apr 24, 2024

One more question. The docs say: "Run the following command in the terminal to start the streamlit service, map the port to your local machine following autodl's instructions, then open http://localhost:6006/ in your browser to see the chat interface."
How do I open it at this point? This is where I am:
[image]
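
On the port-mapping part: the idea is that streamlit runs on the remote machine's port 6006, autodl's SSH instructions forward that port to your own computer, and only then will http://localhost:6006/ open in a local browser. Below is a minimal sketch of such a streamlit chat page, with the launch and forwarding commands shown as comments; the script name `chatBot.py` and the host/port placeholders are assumptions, so take the real command and SSH details from the tutorial and the autodl console.

```python
# Minimal sketch of a step-3 style streamlit chat page, with the launch and
# port-forwarding commands shown as comments. The script name and the SSH
# placeholders are assumptions -- use the real ones from the tutorial/autodl.
#
#   # on the autodl machine:
#   streamlit run chatBot.py --server.address 127.0.0.1 --server.port 6006
#   # on your own computer (the autodl console shows the real host and SSH port):
#   ssh -CNg -L 6006:127.0.0.1:6006 root@<your-autodl-host> -p <your-ssh-port>
#   # then open http://localhost:6006/ in a local browser
import streamlit as st

st.title("llama3 chat demo")

# Keep the running conversation in streamlit's session state.
if "messages" not in st.session_state:
    st.session_state.messages = []

for msg in st.session_state.messages:
    st.chat_message(msg["role"]).write(msg["content"])

if prompt := st.chat_input("Say something"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    st.chat_message("user").write(prompt)
    # The real WebDemo generates a reply with the loaded llama3 model here;
    # this sketch only echoes the prompt so the port-mapping part stays in focus.
    reply = f"(model reply to: {prompt})"
    st.session_state.messages.append({"role": "assistant", "content": reply})
    st.chat_message("assistant").write(reply)
```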

@KMnO4-zx
Contributor
