
Doesn't include the summary of chat history in the chat memory by LangChain ConversationSummaryBufferMemory #21604

Open
sawinuCP opened this issue May 13, 2024 · 0 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature Ɑ: memory Related to memory module 🔌: openai Primarily related to OpenAI integrations


### Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

### Example Code

```python
    def __set_chain_memory(self, user_id, conversation_id):
        chat_mem = self.chat_memory.get(user_id, conversation_id)
        llm = ChatOpenAI(temperature=0, model_name=GPT3_MODEL)
        self.chain_memory = ConversationSummaryBufferMemory(
            llm=llm,
            chat_memory=chat_mem,
            memory_key="history",
            input_key="query",
            return_messages=True,
            max_token_limit=1000,
        )
        self.chain_memory.prune()
```

```python
    def generate_streaming_llm_response(
        self,
        user_id: str,
        conversation_id: str,
        user_input,
        llm,
        prompt: str,
        callback_handler: StreamingHandler,
    ):
        self.__set_chain_memory(user_id, conversation_id)

        chain = StreamingChain(
            llm=llm,
            prompt=prompt,
            memory=self.chain_memory,
            queue_manager=callback_handler.queue_manager,
        )

        return chain.stream(user_input, callback_handler.queue_id)
```

```python
class StreamingChain(LLMChain):
    queue_manager = QueueManager()

    def __init__(self, llm, prompt, memory, queue_manager):
        super().__init__(llm=llm, prompt=prompt, memory=memory)
        self.queue_manager = queue_manager

    def stream(self, input, queue_id, **kwargs):
        queue = self.queue_manager.get_queue(queue_id)

        def task():
            try:
                self(input)
            except Exception:
                logger.exception("Exception caught")
                self.queue_manager.close_queue(queue_id)

        t = Thread(target=task)
        t.start()

        try:
            while True:
                token = queue.get()
                if token is None:
                    break
                yield token
        finally:
            t.join()
            self.queue_manager.close_queue(queue_id)
```
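For reference, the `stream()` method above is a standard producer/consumer pattern: the LLM call runs in a background thread and pushes tokens into a queue, while the generator yields them until a `None` sentinel arrives. A minimal stdlib-only sketch of that pattern, with a hypothetical `fake_llm` producer standing in for the LLM callback handler (not part of the original code):

```python
from queue import Queue
from threading import Thread


def stream_tokens(produce):
    """Run `produce(queue)` in a background thread and yield tokens
    from the queue until the None sentinel is received."""
    q = Queue()

    def task():
        try:
            produce(q)
        finally:
            q.put(None)  # sentinel: always close the stream, even on error

    t = Thread(target=task)
    t.start()
    try:
        while True:
            token = q.get()
            if token is None:
                break
            yield token
    finally:
        t.join()


# Fake producer standing in for the streaming LLM call (assumption:
# the real callback handler pushes tokens into the queue like this)
def fake_llm(q):
    for tok in ["Hello", ", ", "world"]:
        q.put(tok)


print("".join(stream_tokens(fake_llm)))  # prints "Hello, world"
```

Putting the sentinel in a `finally` block ensures the consumer loop terminates even when the producer raises, which the original `StreamingChain` handles via `close_queue()` instead.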




### Error Message and Stack Trace (if applicable)

_No response_

### Description

I'm using a chatbot built with Python LangChain. It sends requests to several LLM models (OpenAI, Claude, Gemini). Before sending a request to the LLM, I first summarize the previous chat: I send the previous messages along with a prompt asking to summarize them. This is done by ConversationSummaryBufferMemory using `llm = ChatOpenAI(temperature=0, model_name=GPT3_MODEL)`. I then store the resulting summary of the chat history in a variable. When I send my query to the LLM, I send it together with the prompt, the query, and the stored summary of the chat history. But in the verbose output I can see the whole chat history in the prompt instead of the summary of the previous chat. In the code, `chain_memory` is the variable where I store the summary of the chat, and `chat_mem` is the whole previous chat that I fetch from the Postgres database. After summarizing the previous chat, it is passed to the StreamingChain to generate the response.
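For reference, ConversationSummaryBufferMemory only folds messages into the running summary once the buffer exceeds `max_token_limit`; recent messages under the limit stay in the prompt verbatim. A simplified, hypothetical sketch of that pruning logic (illustrative only, not LangChain's actual implementation; the token counter and summarizer here are toy stand-ins):

```python
def prune(messages, summary, max_tokens, count_tokens, summarize):
    """Simplified illustration of summary-buffer pruning: pop the
    oldest messages while the buffer is over the token limit, then
    fold only those overflow messages into the running summary."""
    total = sum(count_tokens(m) for m in messages)
    overflow = []
    while messages and total > max_tokens:
        m = messages.pop(0)  # drop oldest first
        overflow.append(m)
        total -= count_tokens(m)
    if overflow:
        summary = summarize(summary, overflow)
    return messages, summary


# Toy token counter and summarizer for demonstration (assumptions,
# standing in for the LLM-backed summarization call)
count = lambda m: len(m.split())
fold = lambda s, msgs: (s + " | " if s else "") + f"summary of {len(msgs)} msgs"

msgs, summ = prune(
    ["a b c", "d e", "f g h i"], "", max_tokens=6,
    count_tokens=count, summarize=fold,
)
# Only the oldest message is summarized; the last 6 tokens stay verbatim
print(msgs, summ)  # prints ['d e', 'f g h i'] summary of 1 msgs
```

Under this logic, a history that fits entirely within `max_token_limit` would produce no summary at all and appear in the prompt in full, which may be relevant to the behavior described above.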

### System Info

```
langchain==0.1.8
langchain-community==0.0.21
langchain-core==0.1.31
langchain-experimental==0.0.52
openai==1.12.0
langchain_anthropic==0.1.4
langchain_mistralai==0.0.5
```
@dosubot dosubot bot added Ɑ: memory Related to memory module 🔌: openai Primarily related to OpenAI integrations 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature labels May 13, 2024