LocalLLM Error 400 - chat messages must alternate roles between 'user' and 'assistant' #1159

Open
ssling0817 opened this issue May 16, 2024 · 6 comments
Labels
bug Something isn't working

Comments

ssling0817 commented May 16, 2024

System Info

pandasai version 2.0.42
python version 3.10.0 and 3.11.9

🐛 Describe the bug

from pandasai import SmartDataframe
from pandasai.llm.local_llm import LocalLLM
import pandas as pd
df = pd.DataFrame({
    "country": [
        "United States",
        "United Kingdom",
        "France",
        "Germany",
        "Italy",
        "Spain",
        "Canada",
        "Australia",
        "Japan",
        "China",
    ],
    "gdp": [
        19294482071552,
        2891615567872,
        2411255037952,
        3435817336832,
        1745433788416,
        1181205135360,
        1607402389504,
        1490967855104,
        4380756541440,
        14631844184064,
    ],
    "happiness_index": [6.94, 7.16, 6.66, 7.07, 6.38, 6.4, 7.23, 7.22, 5.87, 5.12],
})
llm = LocalLLM(api_base=..., model=..., api_key=...)
sdf = SmartDataframe(df, config={"llm": llm, "verbose":True})
response = sdf.chat("Which are the top 5 countries by GDP?")

Running this raises the following error:

openai.BadRequestError: Error code: 400 - {'type': 'urn:inference-service:problem-details:bad-request', 'title': 'Bad Request', 'status': 400, 'detail': "chat messages must alternate roles between 'user' and 'assistant'.  Message may have a leading 'system' role message",

How can I solve this issue? Or am I doing something wrong?

By the way, the LLM is connected successfully: when I execute

llm.chat_completion("Which are the top 5 countries by GDP?", None)

it returns something like "' According to the International Monetary Fund (IMF) World Economic Outlook Update'".
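For reference, this failure mode can be reproduced outside pandasai by sending two consecutive 'user' messages to the endpoint. A minimal sketch using the openai client (base_url and model are placeholders, not the actual values from this issue):

from openai import OpenAI

client = OpenAI(base_url="https://your-endpoint/v1", api_key="dummy")

# Two consecutive 'user' messages; endpoints that enforce strict role
# alternation reject this request with the 400 shown above.
messages = [
    {"role": "user", "content": "Which are the top 5 countries by GDP?"},
    {"role": "user", "content": "Generate python code and return full updated code:"},
]

response = client.chat.completions.create(model="your-model", messages=messages)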

dosubot bot commented May 16, 2024

The error you're encountering with LocalLLM indicates a problem with how the chat messages are structured: the endpoint requires messages to alternate between the 'user' and 'assistant' roles. If your code sends two consecutive messages with the same role, the request is rejected with exactly this 400.

To resolve it, double-check the sequence of messages being sent to the LocalLLM. Each 'user' message should be followed by an 'assistant' message, with no two consecutive messages from the same role. If you generate these messages programmatically, make sure your logic enforces the alternation.

If the problem persists, the complete stack trace and the specific configuration of your LocalLLM instance can provide more insight. Pay close attention to the api_base and model parameters, and keep your API key out of public forums.

By adjusting the message sequencing and verifying your LocalLLM setup, you should be able to get past the BadRequestError.
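One generic way to enforce the alternation before a request is sent is to merge consecutive same-role messages. A minimal sketch (this helper is not part of pandasai's API; joining same-role contents with a blank line is an assumption about what the model tolerates):

def merge_consecutive_roles(messages: list[dict]) -> list[dict]:
    # Collapse runs of messages that share a role into one message, so the
    # resulting sequence strictly alternates between 'user' and 'assistant'.
    merged: list[dict] = []
    for msg in messages:
        if merged and merged[-1]["role"] == msg["role"]:
            merged[-1]["content"] += "\n\n" + msg["content"]
        else:
            merged.append(dict(msg))
    return merged

For example, a [user, user, assistant] sequence becomes [user, assistant] with the two user contents joined.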

dosubot bot added the bug label May 16, 2024

ssling0817 commented May 16, 2024

These are the logs in pandasai.log:

2024-05-16 15:59:38 [INFO] Question: Which are the top 5 countries by GDP?
2024-05-16 15:59:38 [INFO] Running PandasAI with local LLM...
2024-05-16 15:59:38 [INFO] Prompt ID: c76ff404-ecb6-4cd1-9981-d430acb60402
2024-05-16 15:59:38 [INFO] Executing Pipeline: GenerateChatPipeline
2024-05-16 15:59:38 [INFO] Executing Step 0: ValidatePipelineInput
2024-05-16 15:59:38 [INFO] Executing Step 1: CacheLookup
2024-05-16 15:59:38 [INFO] Executing Step 2: PromptGeneration
2024-05-16 15:59:41 [INFO] Using prompt:
dfs[0]:10x3
country,gdp,happiness_index
France,7981801142,5.12
Germany,8803724522,6.94
Canada,0022525473,5.87

Update this initial code:

# TODO: import the required dependencies
import pandas as pd

# Write code here

# Declare result var: type (possible values "string", "number", "dataframe", "plot"). Examples: { "type": "string", "value": f"The highest salary is {highest_salary}." } or { "type": "number", "value": 125 } or { "type": "dataframe", "value": pd.DataFrame({...}) } or { "type": "plot", "value": "temp_chart.png" }

QUERY

Which are the top 5 countries by GDP?

Variable dfs: list[pd.DataFrame] is already declared.

At the end, declare "result" variable as a dictionary of type and value.

If you are asked to plot a chart, use "matplotlib" for charts, save as png.

Generate python code and return full updated code:
2024-05-16 15:59:41 [INFO] Executing Step 3: CodeGenerator
2024-05-16 15:59:41 [INFO] HTTP Request: POST https://.../v1/chat/completions "HTTP/1.1 400 Bad Request"
2024-05-16 15:59:41 [ERROR] Pipeline failed on step 3: Error code: 400 - {'type': 'urn:inference-service:problem-details:bad-request', 'title': 'Bad Request', 'status': 400, 'detail': "chat messages must alternate roles between 'user' and 'assistant'. Message may have a leading 'system' role message", 'instance': '/v2/nvcf/pexec/functions/d7618e99-db93-4465-af4d-330213a7f51f', 'requestId': '6581f9d9-564c-41fc-b193-44f4c3d6392f'}

dosubot bot commented May 16, 2024

To address the error, the chat messages sent to the LocalLLM must alternate roles between 'user' and 'assistant'. The log shows the pipeline failing at the CodeGenerator step, which suggests the request it builds breaks that alternation; a likely cause is that the conversation memory already ends with a 'user' message (your original question), and the generated prompt is then appended as a second consecutive 'user' message.

Here are some steps to troubleshoot and potentially resolve the issue:

  1. Review the message structure: ensure each 'user' message sent to the LocalLLM is followed by an 'assistant' response before another 'user' message is sent (a small checker is sketched after this list).

  2. Check initialization and usage: verify the api_base, model, and api_key parameters of the LocalLLM instance. Misconfiguration here would not directly cause this error but can lead to other issues.

  3. Examine how messages are sent: look at how the chat messages are structured and passed to the LocalLLM. Any code path that sends multiple messages with the same role in a row will trigger the error.

  4. Check additional context: other code that interacts with the LocalLLM or the SmartDataframe object may be adding messages that are not apparent from the provided snippet.

By focusing on these areas, you should be able to identify the message sequence that is causing the BadRequestError.
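As a quick diagnostic, a checker along these lines can report where the alternation first breaks (a sketch that assumes OpenAI-style {"role": ..., "content": ...} dicts and allows the single leading 'system' message the error text mentions):

def find_alternation_break(messages: list[dict]) -> int | None:
    # Skip a single leading 'system' message, which the endpoint permits.
    body = messages[1:] if messages and messages[0]["role"] == "system" else messages
    offset = len(messages) - len(body)
    for i in range(1, len(body)):
        if body[i]["role"] == body[i - 1]["role"]:
            return i + offset  # index of the first message that repeats a role
    return None  # the sequence alternates correctly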

@qgzhaodow1

I have the same issue. What I did was write a custom local LLM and set the messages to an empty list within the chat_completion function. This is just a temporary workaround and will not include any chat history.

@ssling0817

@qgzhaodow1 Do you also use SmartDataframe? I need it to connect to the df. If so, could you attach your code for reference? Thank you!

@qgzhaodow1

@qgzhaodow1 Do you also use SmartDataframe? I need it to connect to the df. If so, could you attach your code for reference? Thank you!

from openai import OpenAI
from pandasai.llm.base import LLM
from pandasai.helpers.memory import Memory
from pandasai.prompts.base import BasePrompt
from pandasai.pipelines.pipeline_context import PipelineContext


class CustomLocalLLM(LLM):
    def __init__(self, api_base: str, model: str = "", api_key: str = "", **kwargs):
        if not api_key:
            api_key = "dummy"

        self.model = model
        self.client = OpenAI(base_url=api_base, api_key=api_key).chat.completions
        self._invocation_params = kwargs

    def chat_completion(self, value: str, memory: Memory) -> str:
        messages = memory.to_openai_messages() if memory else []
        messages = []  # temporary workaround: drop the history so roles cannot repeat

        # adding current prompt as latest query message
        messages.append(
            {
                "role": "user",
                "content": value,
            }
        )

        params = {"model": self.model, "messages": messages, **self._invocation_params}
        response = self.client.create(**params)

        return response.choices[0].message.content

    def call(self, instruction: BasePrompt, context: PipelineContext = None) -> str:
        self.last_prompt = instruction.to_string()

        memory = context.memory if context else None

        return self.chat_completion(self.last_prompt, memory)

    @property
    def type(self) -> str:
        return "local"

I just set messages to an empty list before appending the current prompt.

llm = CustomLocalLLM(api_base=..., model=..., api_key=...)
sdf = SmartDataframe(df, config={"llm": llm, "verbose":True})
response = sdf.chat("Which are the top 5 countries by GDP?")
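An alternative that keeps some history without breaking alternation would be to fold prior turns into a single leading 'system' message, which the endpoint's own error text says is permitted. A sketch building on the class above (the class name is hypothetical, and whether the model makes good use of a transcript-style system message is an assumption):

class HistoryAsSystemLLM(CustomLocalLLM):
    def chat_completion(self, value: str, memory: Memory) -> str:
        history = memory.to_openai_messages() if memory else []
        messages = []
        if history:
            # Fold prior turns into one leading 'system' message so that only
            # a single 'user' message follows it.
            transcript = "\n".join(f"{m['role']}: {m['content']}" for m in history)
            messages.append({"role": "system", "content": "Conversation so far:\n" + transcript})
        messages.append({"role": "user", "content": value})
        params = {"model": self.model, "messages": messages, **self._invocation_params}
        return self.client.create(**params).choices[0].message.content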
