
Multiple conversations: how to search vectors with all the user input #1756

Closed
2 tasks done
xutengfei opened this issue Dec 14, 2023 · 5 comments
Labels: 👭 duplicate (this issue or pull request already exists) · 💪 enhancement (new feature or request) · no-issue-activity (issue has not had recent activity or appears to be solved; stale issues will be automatically closed)

Comments

@xutengfei

Self Checks

Dify version

0.3.2

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

In the first conversation I searched "BYD company"; in the second conversation I searched "what were its advantages". The vector search response was not about "BYD".

✔️ Expected Behavior

All vector search responses should take the full user input across the conversation into account.

❌ Actual Behavior

The vector search response reflects only the latest query, not the user's intent across the conversation.

@xutengfei xutengfei added the 🐞 bug Something isn't working label Dec 14, 2023
@guchenhe
Collaborator

Retrieval is done on a per-query basis by design, so it's expected that your second search won't return documents relevant to your first search.

However, in an actual conversation the model does have knowledge of your previous queries and retrieval results, and will respond accordingly (how well it retains previous context also depends on how you've written your prompts and on the context window).
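A minimal sketch of the pattern described above: prior turns and the current query's retrieval results are assembled into one prompt, so the model keeps conversational context even though retrieval itself is per-query. All function and template names here are illustrative, not Dify's actual internals:

```python
def build_prompt(history, retrieved_docs, query):
    """Assemble a chat prompt carrying prior turns plus retrieved context.

    `history` is a list of (role, message) tuples; `retrieved_docs` is the
    list of chunks returned by vector search for the *current* query only.
    """
    context = "\n".join(f"- {doc}" for doc in retrieved_docs)
    turns = "\n".join(f"{role}: {msg}" for role, msg in history)
    return (
        "Use the context below to answer.\n\n"
        f"Context:\n{context}\n\n"
        f"Conversation so far:\n{turns}\n\n"
        f"user: {query}\nassistant:"
    )

history = [
    ("user", "BYD company"),
    ("assistant", "BYD is a Chinese EV and battery maker."),
]
prompt = build_prompt(
    history, ["BYD builds electric vehicles."], "what were its advantages"
)
```

Because the first turn ("BYD company") is still in the prompt, the model can resolve "its" in the follow-up even though the second retrieval pass missed BYD documents.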

@xutengfei
Author

> Retrieval is done on a per-query basis by design, so it's expected that your second search won't return documents relevant to your first search.
>
> However, in an actual conversation the model does have knowledge on your previous queries and retrieval results, and will respond accordingly (how well it retains previous context also depends on how you've written your prompts & context window)

We found that if we use GPT-4 in Dify, it rewrites the second search query automatically, which is very cool. But it doesn't work when using other LLM models.
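The behavior observed with GPT-4 resembles the common "condense question" pattern: before retrieval, an LLM is asked to rewrite the follow-up into a standalone query. A hedged sketch of building such a rewrite prompt (the actual LLM call is omitted, and the template wording is an assumption, not what Dify or GPT-4 does internally):

```python
def build_condense_prompt(history, follow_up):
    """Build a prompt asking an LLM to rewrite a follow-up question so it
    is self-contained and usable for vector search on its own."""
    turns = "\n".join(f"{role}: {msg}" for role, msg in history)
    return (
        "Given the conversation below, rewrite the follow-up question as a "
        "standalone question that names any entities it refers to.\n\n"
        f"Conversation:\n{turns}\n\n"
        f"Follow-up question: {follow_up}\n"
        "Standalone question:"
    )

condense_prompt = build_condense_prompt(
    [("user", "BYD company"), ("assistant", "BYD is a Chinese automaker.")],
    "what were its advantages",
)
# The LLM's rewritten query (e.g. a standalone question mentioning BYD)
# would then be embedded and used for the vector search instead of the
# raw follow-up.
```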

@JohnJyong JohnJyong added 👭 duplicate This issue or pull request already exists 💪 enhancement New feature or request and removed 🐞 bug Something isn't working labels Jan 12, 2024
@VincePotato

We use the chat history as input context so the chat model can better understand the user query. This is not achieved by rewriting the user query.

@xutengfei
Author

> We use chat history as input context for chat model to better understand user query. That is not achieved through rewriting the user query.

This cannot take full advantage of vector search, because the raw chat history is not similar to the vectorized document data.
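A toy illustration of the mismatch being described: using bag-of-words cosine similarity as a crude stand-in for embedding similarity, the raw follow-up shares fewer terms with a BYD document than a rewritten standalone query does. Real embedding models behave differently in detail, but the overlap intuition carries over; the example texts are invented for illustration:

```python
from collections import Counter
from math import sqrt

def cosine(a, b):
    """Bag-of-words cosine similarity between two texts (toy stand-in
    for embedding similarity)."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[t] * cb[t] for t in ca)
    norm = (sqrt(sum(v * v for v in ca.values()))
            * sqrt(sum(v * v for v in cb.values())))
    return dot / norm if norm else 0.0

doc = "byd is a chinese company its advantages include battery technology"
raw = "what were its advantages"                       # follow-up as typed
rewritten = "what were the advantages of byd company"  # standalone rewrite

sim_raw = cosine(raw, doc)              # ~0.32
sim_rewritten = cosine(rewritten, doc)  # ~0.36, beats the raw follow-up
```

This is why the thread converges on query rewriting as the enhancement: putting history in the chat prompt helps the model answer, but only a rewritten query helps the retrieval step itself.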


github-actions bot commented Feb 1, 2024

Closing since it's no longer active; if you have any questions, you can reopen it.

@github-actions github-actions bot added the no-issue-activity Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Feb 1, 2024
@github-actions github-actions bot closed this as completed Feb 5, 2024
4 participants