
Table has no column with name doc #52

Open
Boburmirzo opened this issue Aug 15, 2023 · 2 comments

@Boburmirzo
Contributor

Currently, there is no way to send the data to the indexing process without creating a doc column from the input.

Need to fix the indexing error:

AttributeError: Table has no column with name doc.
Occurred here:
    Line: query_context = index.query(embedded_query, k=3).select(
    File: /home/bumurzokov/llm-app/src/prompt.py:14

When no doc column is defined, it always fails at the indexing stage:

# Compute embeddings for each document using the OpenAI Embeddings API
embedded_data = contextful(context=documents, data_to_embed=documents.doc)
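
A possible workaround (a minimal sketch; the body field name is a hypothetical placeholder for whatever text field the input actually contains) is to project the input into a doc column before embedding:

import pathway as pw

# Sketch of a workaround: project a hypothetical input field ("body") into
# the doc column that the embedding and indexing steps expect.
documents = documents.select(doc=pw.this.body)
embedded_data = contextful(context=documents, data_to_embed=documents.doc)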
@mdmalhou
Contributor

The index expects a doc column containing the chunks and a data column holding their corresponding embeddings. Open to any suggestions or alternatives.
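
For illustration, a minimal sketch of reshaping a table into that layout (the table name embedded_table and the source column names text and embedding are hypothetical placeholders):

import pathway as pw

# Rename hypothetical source columns into the layout the index expects:
# doc holds the chunk text, data holds the corresponding embedding.
embedded_data = embedded_table.select(doc=pw.this.text, data=pw.this.embedding)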

@Boburmirzo
Contributor Author

@mdmalhou Thank you for replying to this.

If having a doc column is a hard technical requirement for indexing, my suggestion is to abstract this step in the library so that the user can specify which fields to index and the LLM App automatically creates the doc column under the hood from the chosen fields.

If the user does not specify any fields to index, the LLM App creates the doc column from all fields.
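
A rough sketch of what such a helper could look like (just an illustration of the proposal, not existing API; field names are hypothetical and the "all fields" default is omitted):

import pathway as pw

def add_doc_column(table: pw.Table, fields: list[str]) -> pw.Table:
    # Proposed helper (not existing API): concatenate the chosen fields into
    # a single doc string column so the indexing step always finds it.
    doc = pw.apply(
        lambda *values: " ".join(str(v) for v in values),
        *(table[field] for field in fields),
    )
    return table + table.select(doc=doc)

# The user picks which fields to index; the doc column is created under the hood.
documents = add_doc_column(documents, fields=["title", "body"])
embedded_data = contextful(context=documents, data_to_embed=documents.doc)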

We already discussed the same with @janchorowski last week.

What do you think?
