
Improve pg_search documentation around tokenizer details #1056

Open
neilyio opened this issue Apr 12, 2024 · 0 comments
Labels
documentation Improvements or additions to documentation pg_search Issue related to `pg_search/` priority-2-medium Medium priority issue

Comments

@neilyio
Contributor

neilyio commented Apr 12, 2024

What
We should make clear that search operations like regex and fuzzy matching operate on the tokens generated by the index tokenizer, not necessarily on the raw row data itself.

For example, a username of johnlovemarry might be tokenized into "john love marry" or kept whole as "johnlovemarry", and that choice affects how a regex like "(love)" matches: it would match against the tokens of "john love marry" but not against the single token "johnlovemarry".

Some examples of how the tokenizers work on simple words would go a long way.
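To illustrate the point above, here is a minimal Python sketch (not pg_search code) of token-level regex matching. It assumes the documented behavior that a regex query is matched against each indexed term in full; the two token lists stand in for two hypothetical tokenizer configurations applied to the same source value:

```python
import re

def regex_match_tokens(tokens, pattern):
    """Return the tokens the regex matches in full, mirroring how a
    search index applies a regex query to indexed terms rather than
    to the original row data."""
    compiled = re.compile(pattern)
    return [t for t in tokens if compiled.fullmatch(t)]

# The same username, indexed under two hypothetical tokenizers:
split_tokens = ["john", "love", "marry"]  # tokenizer splits the value
whole_token = ["johnlovemarry"]           # tokenizer keeps it whole

print(regex_match_tokens(split_tokens, "(love)"))  # → ['love']
print(regex_match_tokens(whole_token, "(love)"))   # → []
```

The same query succeeds or fails purely because of the tokenizer choice, which is exactly the behavior the docs should spell out.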

@philippemnoel philippemnoel added documentation Improvements or additions to documentation priority-2-medium Medium priority issue pg_search Issue related to `pg_search/` labels Apr 12, 2024
2 participants