
[Feature] Models to inform you it does not know something or has incomplete data #2286

Open
Daza99 opened this issue Apr 29, 2024 · 1 comment
Labels
enhancement New feature or request

Daza99 commented Apr 29, 2024

Feature Request

A common issue with LLMs: when asked a general question about a topic the model was not trained on, or has little information about, it will often fill the gaps creatively or state something that is plainly false. It would be great to have a stop-gap measure: if the model completely lacks data on the question, the response says (in red or cyan text) "I do not know"; if it has only partial data, it says "I have incomplete data, but what I do have on the topic is X; you could try getting more information from these sources." Also, if the user's question is too broad, the model could ask the user to redefine or narrow it, which could save compute time as well.

This feature could give a model better credibility if, instead of switching to creative mode when it doesn't know something or has incomplete data, it simply stated that it does not know, or warned that its answer might be incorrect. At least then we could check other sources for an answer.

I have a feeling that if this were easy to implement, it would already be a feature. But I thought it was worth suggesting, because not knowing how accurate an answer is is one of the major issues I run into when seeking answers. Maybe it is simply not possible, given the sheer scope of questions and topics.

@Daza99 Daza99 added the enhancement New feature or request label Apr 29, 2024
@cosmic-snow (Collaborator) commented

That's not really possible, or at least no one has come up with a good way to do it yet.

Models don't know what they don't know. And they'll happily make things up regardless (which is called hallucinating here).

What you can do, however, is augment the conversation with your own, external data. See LocalDocs.
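To illustrate the idea behind augmenting the conversation with external data, here is a minimal sketch in Python. This is not GPT4All's actual LocalDocs implementation; it is a toy version of the general retrieval-augmentation pattern, using simple word-overlap scoring (real systems use embeddings) and a made-up example corpus. The prompt it builds instructs the model to answer only from the supplied context and to say "I do not know" otherwise, which is about the closest practical approximation to the behaviour requested in this issue.

```python
# Sketch of retrieval augmentation: split local documents into chunks,
# score each chunk against the user's question, and prepend the best
# matches to the prompt so the model answers from your data.
# Scoring here is toy word overlap; embeddings would be used in practice.
import re
from collections import Counter

def tokenize(text):
    # Lowercase word counts, ignoring punctuation.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def score(query_tokens, chunk_tokens):
    # Overlap score: how many query words also appear in the chunk.
    return sum(min(query_tokens[w], chunk_tokens[w]) for w in query_tokens)

def retrieve(question, chunks, k=2):
    # Return the k chunks most similar to the question.
    q = tokenize(question)
    ranked = sorted(chunks, key=lambda c: score(q, tokenize(c)), reverse=True)
    return ranked[:k]

def build_prompt(question, chunks, k=2):
    # Assemble a grounded prompt with an explicit "I do not know" fallback.
    context = "\n".join(f"- {c}" for c in retrieve(question, chunks, k))
    return (
        "Answer using ONLY the context below. If the context does not "
        'contain the answer, say "I do not know."\n'
        f"Context:\n{context}\n\nQuestion: {question}"
    )

# Hypothetical local documents, for illustration only.
docs = [
    "The warranty on the X200 printer lasts 24 months from purchase.",
    "Support tickets are answered within two business days.",
    "The office is closed on public holidays.",
]
print(build_prompt("How long is the X200 warranty?", docs, k=1))
```

The key point is that the "I do not know" behaviour comes from the prompt plus supplied context, not from the model itself knowing the limits of its training data.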
