
[Bug]: KnowledgeGraphQueryEngine raises an error when querying #13585

Open
yanyi74 opened this issue May 19, 2024 · 2 comments
Labels
bug Something isn't working · triage Issue needs to be triaged/prioritized

Comments


yanyi74 commented May 19, 2024

Bug Description

ValidationError: 1 validation error for LLMPredictStartEvent
template
none is not an allowed value (type=type_error.none.not_allowed)

[Screenshot 2024-05-19 20:38:04]

Version

0.10.37

Steps to Reproduce

[Screenshots 2024-05-19 20:39:57 and 2024-05-19 20:38:04]

Relevant Logs/Tracebacks

No response

yanyi74 added the bug and triage labels on May 19, 2024

dosubot bot commented May 19, 2024

The ValidationError indicating that none is not an allowed value for template suggests that a None value is being passed where it is not permitted. This typically happens when the KnowledgeGraphQueryEngine or one of its methods expects a parameter that is missing or explicitly set to None.

To address this issue, follow these steps:

  1. Verify Constructor Arguments: Ensure the KnowledgeGraphQueryEngine is instantiated with all required arguments, especially storage_context, and that storage_context includes a valid graph_store.

  2. Check Method Calls: Make sure that the arguments passed to methods like generate_query are not None, unless allowed by the method's documentation.

  3. Optional Parameters: If using optional parameters, confirm they are correctly instantiated or handled when set to None.

  4. Debugging: Implement logging within the KnowledgeGraphQueryEngine methods to trace the values of parameters at runtime, helping to pinpoint where a None value might be introduced.

  5. Review Code for Dynamic Updates: If your implementation involves dynamically updating prompts or similar features, ensure these updates do not result in None values being passed where they are not expected.

By systematically checking these areas, you should be able to identify and correct the cause of the ValidationError. If the issue persists, creating a minimal reproducible example could further help in isolating and addressing the problem.
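If the steps above check out, one thing worth trying is passing the graph query synthesis prompt to the engine explicitly rather than relying on the default, since a default that resolves to None for a given graph store would produce exactly this validation error. A minimal sketch, assuming storage_context and Settings.llm are already configured as in step 1; the prompt wording here is illustrative, not a library default, though the {schema} and {query_str} variables follow the convention of the engine's built-in prompts:

from llama_index.core import PromptTemplate, Settings
from llama_index.core.query_engine import KnowledgeGraphQueryEngine

# Assumes storage_context was built via
# StorageContext.from_defaults(graph_store=...) as in step 1.

# Illustrative natural-language-to-graph-query prompt (the wording is an
# assumption, not the library's built-in default).
graph_query_synthesis_prompt = PromptTemplate(
    "Task: generate a graph query for the graph database described below.\n"
    "Schema:\n{schema}\n"
    "Question:\n{query_str}\n"
    "Graph query:"
)

query_engine = KnowledgeGraphQueryEngine(
    storage_context=storage_context,
    llm=Settings.llm,
    # Passing the prompt explicitly rules out a default that resolves to None.
    graph_query_synthesis_prompt=graph_query_synthesis_prompt,
    verbose=True,
)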


luisenriqueramos1977 commented

I have a similar issue, here is the code:

import os
import openai
from llama_index.llms.openai import OpenAI
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.core.query_engine import KnowledgeGraphQueryEngine
from llama_index.graph_stores.neo4j import Neo4jGraphStore
from llama_index.core import StorageContext
from llama_index.core import Settings

# OpenAI API connection
os.environ["OPENAI_API_KEY"] = "sk-proj-mykey"
openai.api_key = os.environ["OPENAI_API_KEY"]

# llm = OpenAI(model_name="gpt-3.5-turbo")
# service_context = ServiceContext.from_defaults(llm=llm, chunk_size=256)

# define LLM
llm = OpenAI(temperature=0, model="gpt-3.5-turbo")
Settings.llm = llm
Settings.chunk_size = 512

graph_store = Neo4jGraphStore(
    username="neo4j",
    password="mypassword",
    url="bolt://localhost:7687",
    database="neo4j",
)

storage_context = StorageContext.from_defaults(graph_store=graph_store)

query_engine = KnowledgeGraphQueryEngine(
    storage_context=storage_context,
    llm=Settings.llm,
    verbose=True,
)

response = query_engine.query(
    "what is the query?",
)

print(response)
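
As a quick sanity check before querying, you could inspect which prompts the engine actually resolved; llama-index query engines expose get_prompts() through their prompt mixin, though the exact prompt keys can vary between versions. A missing or None entry for the graph query synthesis prompt would match the LLMPredictStartEvent validation error above:

# Print the prompts the engine resolved; a missing or None synthesis prompt
# would explain the "none is not an allowed value" error for `template`.
for name, prompt in query_engine.get_prompts().items():
    print(name, "->", type(prompt).__name__ if prompt is not None else "None")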
