[Feature] Connect to Python-based llamaindex instance #81

Open
PanCakeConnaisseur opened this issue Mar 7, 2024 · 1 comment

@PanCakeConnaisseur

I built a small RAG with a local embedding model using the regular Python-based LlamaIndex. How do I use this React-based chat application with the Python-based chat engine? Or, what is the idiomatic way to get a GUI chat for the Python-based LlamaIndex?
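
For context, a minimal sketch of the kind of setup described here, assuming llama-index >= 0.10 with the `llama-index-embeddings-huggingface` package installed; the data directory, embedding model name, and chat mode are placeholder choices, not details from the issue:

```python
# Minimal sketch (assumed, not from the issue): a small RAG with a local
# embedding model in Python-based LlamaIndex.
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Use a local embedding model instead of the default OpenAI embeddings.
# Note: the answering LLM is configured separately (Settings.llm) and
# still defaults to OpenAI unless overridden.
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Index some local documents and expose them through a chat engine.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

chat_engine = index.as_chat_engine(chat_mode="condense_question")
response = chat_engine.chat("What do my documents say about X?")
print(response)
```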

@marcusschiesser
Collaborator

I think you have two options:

  1. Continue working on the Python backend for Chat LlamaIndex, see [Feature] Python version #30 (comment)
  2. Use create-llama to generate a FastAPI backend with a Next.js frontend and integrate your Python RAG code into it (a rough sketch of this is below).
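
For option 2, the integration point would look roughly like the following: a FastAPI route that wraps your existing LlamaIndex chat engine so the Next.js frontend can call it over HTTP. This is an illustration under assumptions, not the exact code create-llama generates; the `/api/chat` path, the request schema, and the `build_chat_engine()` helper are hypothetical placeholders for your own RAG code.

```python
# Hedged sketch: wrapping an existing Python LlamaIndex chat engine in a
# FastAPI endpoint that a Next.js chat frontend can call. The route path,
# request schema, and build_chat_engine() helper are illustrative only.
from fastapi import FastAPI
from pydantic import BaseModel

from my_rag import build_chat_engine  # hypothetical: returns your chat engine

app = FastAPI()
chat_engine = build_chat_engine()


class ChatRequest(BaseModel):
    message: str


@app.post("/api/chat")
def chat(request: ChatRequest) -> dict:
    # Delegate to the LlamaIndex chat engine and return plain JSON
    # for the frontend to render.
    response = chat_engine.chat(request.message)
    return {"response": str(response)}
```

You could run this with `uvicorn main:app --reload` and point the frontend's chat API URL at it.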
