ConversationSummaryBufferMemory does not work as expected with MongoDBChatMessageHistory #21610

Open
Sameera2001Perera opened this issue May 13, 2024 · 4 comments
Labels
Ɑ: memory (Related to memory module) · 🔌: mongo (Primarily related to Mongo integrations)

Comments


Sameera2001Perera commented May 13, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

import os
from langchain.memory import ConversationSummaryBufferMemory
from langchain_openai import OpenAI
from langchain_mongodb.chat_message_histories import MongoDBChatMessageHistory
from langchain.chains.conversation.base import ConversationChain

os.environ["OPENAI_API_KEY"] = "#######################"

llm = OpenAI()

connection_string = "mongodb+srv://sa.................."
database_name = "langchain-chat-history"
collection_name = "collection_1"
session_id = "session31"

chat_memory = MongoDBChatMessageHistory(
    session_id=session_id,
    connection_string=connection_string,
    database_name=database_name,
    collection_name=collection_name,
)

memory = ConversationSummaryBufferMemory(
    llm=llm, chat_memory=chat_memory, max_token_limit=10
)

conversation_with_summary = ConversationChain(
    llm=llm,
    memory=memory,
    verbose=True,
)

print(conversation_with_summary.predict(input="Hi, what's up?"))
print(conversation_with_summary.predict(input="Just working on writing some documentation!"))
print(conversation_with_summary.predict(input="For LangChain! Have you heard of it?"))

Error Message and Stack Trace (if applicable)

Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:

Human: Hi, what's up?
AI:

Finished chain.
Hello! I am an AI program designed and created by a team of developers at OpenAI. Currently, I am running on a server with a powerful processor and a lot of memory, allowing me to process and store vast amounts of information. I am constantly learning and improving my abilities through various algorithms and data sets. Is there something specific you would like to know or discuss?

Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:
System:
The human greets the AI and asks about its capabilities. The AI explains that it is a program designed and created by a team of developers at OpenAI, constantly learning and improving through algorithms and data sets. It also mentions its powerful processor and memory. The human is curious to know more.
Human: Hi, what's up?
AI: Hello! I am an AI program designed and created by a team of developers at OpenAI. Currently, I am running on a server with a powerful processor and a lot of memory, allowing me to process and store vast amounts of information. I am constantly learning and improving my abilities through various algorithms and data sets. Is there something specific you would like to know or discuss?
Human: Just working on writing some documentation!
AI:

Finished chain.
That's great to hear! I have access to a vast amount of information and can assist you with any questions you may have. Is there a specific topic or area you need help with in your documentation?

Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:
System:
The human greets the AI and asks about its capabilities. The AI explains that it is a program designed and created by a team of developers at OpenAI, constantly learning and improving through algorithms and data sets. It also mentions its powerful processor and memory. The human is curious to know more and the AI offers its assistance, stating that it has access to a vast amount of information and can help with any questions about documentation. The human also shares that they are currently working on writing documentation.
Human: Hi, what's up?
AI: Hello! I am an AI program designed and created by a team of developers at OpenAI. Currently, I am running on a server with a powerful processor and a lot of memory, allowing me to process and store vast amounts of information. I am constantly learning and improving my abilities through various algorithms and data sets. Is there something specific you would like to know or discuss?
Human: Just working on writing some documentation!
AI: That's great to hear! I have access to a vast amount of information and can assist you with any questions you may have. Is there a specific topic or area you need help with in your documentation?
Human: For LangChain! Have you heard of it?
AI:

Finished chain.
Yes, I am familiar with LangChain. It is a blockchain platform that aims to provide secure and transparent language translation services. Is there anything specific you would like to know about LangChain for your documentation?

Description

Although the conversation is summarized, the entire chat history is still sent to the LLM: the messages that have already been folded into the summary are never pruned from the prompt. With the default in-memory message list, ConversationSummaryBufferMemory works as expected.
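As far as I can tell from the source (so treat this as my reading, not a confirmed diagnosis), ConversationSummaryBufferMemory.prune() pops messages off the list returned by chat_memory.messages and never writes the result back. With the default in-memory history that list is the stored buffer itself, so the truncation sticks; with MongoDBChatMessageHistory the messages property rebuilds the list from the collection on every access, so the in-place pops only affect a throwaway copy and the full history is re-read on the next turn. A minimal sketch of the difference:

from langchain.memory import ChatMessageHistory
from langchain_core.messages import HumanMessage

# Default in-memory history: `messages` is the stored list itself,
# so prune()'s in-place pop(0) calls really shrink the history.
in_memory = ChatMessageHistory()
in_memory.add_message(HumanMessage(content="Hi, what's up?"))
buffer = in_memory.messages
buffer.pop(0)
print(len(in_memory.messages))  # 0 -> the truncation persisted

# MongoDBChatMessageHistory.messages is a property that reloads the list
# from the collection on every access, so the same pop(0) is lost, e.g.:
#
#   buffer = chat_memory.messages   # fresh list loaded from MongoDB
#   buffer.pop(0)                   # mutates only the local copy
#   len(chat_memory.messages)       # unchanged -> nothing was pruned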

Example (works as expected):

import os
from langchain.memory import ConversationSummaryBufferMemory
from langchain_openai import OpenAI
from langchain.chains.conversation.base import ConversationChain


os.environ["OPENAI_API_KEY"] = "##########################"

llm = OpenAI()

memory = ConversationSummaryBufferMemory(
    llm=llm, max_token_limit=10
)

conversation_with_summary = ConversationChain(
    llm=llm,
    memory=memory,
    verbose=True,
)


print(conversation_with_summary.predict(input="Hi, what's up?"))
print(conversation_with_summary.predict(input="Just working on writing some documentation!"))
print(conversation_with_summary.predict(input="For LangChain! Have you heard of it?"))

Expected output:

Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:

Human: Hi, what's up?
AI:

Finished chain.
Hello! Not much is up with me, I am an AI after all. But my servers are running smoothly and I am ready to assist you with any questions or tasks you may have. How about you? Is there anything I can help you with today?

Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:
System:
The human greets the AI and asks how it is doing. The AI responds by saying it is an AI and its servers are running smoothly. The AI also offers to assist the human with any questions or tasks.
Human: Just working on writing some documentation!
AI:

Finished chain.
That sounds like a productive task! As an AI, I don't experience fatigue or boredom like humans, so I am always ready to assist with any tasks or questions you may have. Is there something specific you need help with?

Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:
System:
The human greets the AI and asks how it is doing. The AI responds by saying it is an AI and its servers are running smoothly. The AI also offers to assist the human with any questions or tasks, mentioning its lack of fatigue or boredom. The human mentions working on writing documentation, to which the AI offers its assistance and asks for specific needs.
Human: For LangChain! Have you heard of it?
AI:

Finished chain.
Yes, I am familiar with LangChain. It is a blockchain platform that focuses on language and translation services. It was founded in 2019 and has gained significant popularity in the tech industry. Is there something specific you would like to know about LangChain? I can provide you with more detailed information if needed.

System Info

langchain==0.1.17
langchain-community==0.0.36
langchain-core==0.1.50
langchain-mongodb==0.1.3

dosubot bot added the Ɑ: memory and 🔌: mongo labels on May 13, 2024
@keenborder786
Contributor

I tried this on my local machine and everything works as expected. Can you please check your MongoDB and confirm that the messages are being stored in the desired collection?
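For reference, one quick way to check would be to read the history back directly (just a sketch, reusing the session and collection names from the issue; the connection string is whatever you used above):

from langchain_mongodb.chat_message_histories import MongoDBChatMessageHistory

connection_string = "mongodb+srv://..."  # same connection string as in the issue

history = MongoDBChatMessageHistory(
    session_id="session31",
    connection_string=connection_string,
    database_name="langchain-chat-history",
    collection_name="collection_1",
)

# `messages` reads the stored history back from the collection.
for message in history.messages:
    print(type(message).__name__, ":", message.content)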

@sawinuCP

I'm facing the same issue with PostgresChatMessageHistory.

import os
import config
from langchain.memory import ConversationSummaryBufferMemory
from langchain_openai import OpenAI
from langchain_community.chat_message_histories import (
    PostgresChatMessageHistory,
)
from langchain.chains.conversation.base import ConversationChain

os.environ["OPENAI_API_KEY"] = ""

llm = OpenAI()

connection_string = config.POSTGRES_CHAT_MEMORY_CONNECTION_STRING
database_name = "langchain-chat-history"
collection_name = "collection_1"
session_id = "session42"

chat_memory = PostgresChatMessageHistory(
    session_id=session_id,
    connection_string=connection_string,
)

memory = ConversationSummaryBufferMemory(
    llm=llm, chat_memory=chat_memory, max_token_limit=10
)

conversation_with_summary = ConversationChain(
    llm=llm,
    memory=memory,
    verbose=True,
)

print(conversation_with_summary.predict(input="Hi, what's up?"))
print(conversation_with_summary.predict(input="Just working on writing some documentation!"))
print(conversation_with_summary.predict(input="For LangChain! Have you heard of it?"))

@Sameera2001Perera
Author

I tried this on my local machine and everything works as expected. Can you please check your MongoDB and confirm that the messages are being stored in the desired collection?

@keenborder786 Yes, the messages are being stored in the given collection.

@Sameera2001Perera
Author

Hope this will be fixed with #11374.
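In the meantime, the workaround I'm experimenting with is a small subclass that writes the pruned buffer back to the external store after summarizing. This is only a sketch based on my reading of prune() in langchain 0.1.17, not a tested fix, and note that moving_summary_buffer itself is still only held in memory:

from langchain.memory import ConversationSummaryBufferMemory


class PersistedSummaryBufferMemory(ConversationSummaryBufferMemory):
    """Hypothetical variant that persists pruning to external chat histories
    such as MongoDBChatMessageHistory or PostgresChatMessageHistory."""

    def prune(self) -> None:
        buffer = self.chat_memory.messages  # fresh copy loaded from the store
        curr_buffer_length = self.llm.get_num_tokens_from_messages(buffer)
        if curr_buffer_length <= self.max_token_limit:
            return
        pruned_memory = []
        while curr_buffer_length > self.max_token_limit:
            pruned_memory.append(buffer.pop(0))
            curr_buffer_length = self.llm.get_num_tokens_from_messages(buffer)
        # Fold the pruned messages into the running summary, as the base class does.
        self.moving_summary_buffer = self.predict_new_summary(
            pruned_memory, self.moving_summary_buffer
        )
        # Write the truncated buffer back so the external store matches
        # what the memory will actually send to the LLM on the next turn.
        self.chat_memory.clear()
        for message in buffer:
            self.chat_memory.add_message(message)

Passing memory = PersistedSummaryBufferMemory(llm=llm, chat_memory=chat_memory, max_token_limit=10) into the ConversationChain then keeps the stored history in sync with the prompt, although the running summary is still lost on restart.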
