custom message and query in llm #11687
Replies: 2 comments 1 reply
-
Hey there, @PradiptaAtByteIQMind-Graph2023! Fancy seeing you here again. Hope you're ready to dive back into the world of code with us 🚀

Based on the error message you're receiving, it seems like the `chat` method is trying to access a `role` attribute on a plain string. In your code, you're passing a formatted string to `chat`, but it expects `ChatMessage` objects. You can wrap your string in a `ChatMessage` like this:

```python
from llama_index.core.base.llms.types import ChatMessage, MessageRole

# Create a ChatMessage instance
message_content = f"""
{responses} \n
----------------------------------------------
{instruction_prompt} \n
==============================================
Required Return Format \n
{required_format}
"""
message = ChatMessage.from_str(content=message_content, role=MessageRole.SYSTEM)

# Pass a list of ChatMessage instances to the chat method
parse_response = self.ai_services.llm.chat([message])
```

In this example, the formatted string is wrapped in a `ChatMessage` with an explicit role before being passed to `chat`. Please review your code and ensure that you're passing the correct types of objects to your methods.

I hope this helps! If you have any other questions or need further clarification, feel free to ask.
-
I am getting this error.
-
Hi team,
I got stuck here, see below:
```python
responses = {
    "user_query": prompt,
    "sql_response": await self.postgres.run_query(prompt),
    "mongo_response": await self.mongo.run_query(prompt),
    "docusign_response": await self.docusign.run_query(prompt)
}
```
In the current setup I am passing it like below:
```python
response = f"""
{responses} \n
----------------------------------------------
{instruction_prompt} \n
==============================================
Required Return Format \n
{required_format}
"""
parse_response = self.ai_services.llm.chat(response)
```
I am getting the error below:

`AttributeError: 'str' object has no attribute 'role'`

Can someone let me know what I am implementing incorrectly?
Thanks and regards,
Pradipta