sawinuCP opened this issue on May 13, 2024 · 0 comments
Labels: 🤖:bug (Related to a bug, vulnerability, unexpected error with an existing feature), Ɑ: memory (Related to memory module), 🔌: openai (Primarily related to OpenAI integrations)
### Checked other resources

- I searched the LangChain documentation with the integrated search.
- I used the GitHub search to find a similar question and didn't find it.
- I am sure that this is a bug in LangChain rather than my code.
- The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
### Example Code
```
def __set_chain_memory(self, user_id, conversation_id):
    chat_mem = self.chat_memory.get(user_id, conversation_id)
    llm = ChatOpenAI(temperature=0, model_name=GPT3_MODEL)
    self.chain_memory = ConversationSummaryBufferMemory(
        llm=llm,
        chat_memory=chat_mem,
        memory_key="history",
        input_key="query",
        return_messages=True,
        max_token_limit=1000,
    )
    self.chain_memory.prune()
```
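(For context, this is a debug sketch I can drop in right after the `prune()` call to inspect what the memory actually holds; `moving_summary_buffer` and `load_memory_variables` are the public `ConversationSummaryBufferMemory` attributes in langchain 0.1.x, and the snippet is illustrative only, not part of the app.)

```
# Debug sketch only: inspect the memory right after prune().
# moving_summary_buffer holds the summary of any pruned messages ("" if nothing was pruned);
# load_memory_variables shows exactly what the chain will receive under "history".
print(self.chain_memory.moving_summary_buffer)
print(self.chain_memory.load_memory_variables({})["history"])
```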
```
def generate_streaming_llm_response(
    self,
    user_id: str,
    conversation_id: str,
    user_input,
    llm,
    prompt: str,
    callback_handler: StreamingHandler,
):
    self.__set_chain_memory(user_id, conversation_id)
    chain = StreamingChain(
        llm=llm,
        prompt=prompt,
        memory=self.chain_memory,
        queue_manager=callback_handler.queue_manager,
    )
    return chain.stream(user_input, callback_handler.queue_id)
```
```
class StreamingChain(LLMChain):
    queue_manager = QueueManager()

    def __init__(self, llm, prompt, memory, queue_manager):
        super().__init__(llm=llm, prompt=prompt, memory=memory)
        self.queue_manager = queue_manager

    def stream(self, input, queue_id, **kwargs):
        queue = self.queue_manager.get_queue(queue_id)

        def task():
            try:
                self(input)
            except Exception as e:
                logger.exception("Exception caught")
                self.queue_manager.close_queue(queue_id)

        t = Thread(target=task)
        t.start()
        try:
            while True:
                token = queue.get()
                if token is None:
                    break
                yield token
        finally:
            t.join()
            self.queue_manager.close_queue(queue_id)
```
### Error Message and Stack Trace (if applicable)
_No response_
### Description
I'm using a chatbot built with Python LangChain. It sends requests to several LLM models (OpenAI, Claude, Gemini). Before sending a request to the LLM, I first summarize the previous chat: I pass the previous messages, together with a prompt asking to summarize them, through ConversationSummaryBufferMemory using llm = ChatOpenAI(temperature=0, model_name=GPT3_MODEL). The resulting summary of the chat history is stored in a variable. When I then send my query to the LLM, I send the prompt, the query, and that stored summary. However, in the verbose output I can see the whole chat history in the prompt instead of the summary of the previous chat. In the code, chain_memory is the variable where I store the summary, and chat_mem is the full previous chat fetched from the Postgres database. After the previous chat is summarized, it is passed to StreamingChain to generate the response.
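One way to narrow this down (a rough sketch, assuming the documented `ConversationSummaryBufferMemory` behaviour in langchain 0.1.x, where `prune()` only summarizes messages beyond `max_token_limit` and keeps everything under the limit in the buffer verbatim) is to compare the token count of the stored history against the limit:

```
# Sketch: if the stored history is under max_token_limit (1000 here),
# prune() has nothing to summarize and the full history stays in the buffer,
# which would show up as the whole chat in the prompt.
chat_mem = self.chat_memory.get(user_id, conversation_id)   # same call as in __set_chain_memory
llm = ChatOpenAI(temperature=0, model_name=GPT3_MODEL)
print(llm.get_num_tokens_from_messages(chat_mem.messages))  # tokens currently in the history
print(repr(self.chain_memory.moving_summary_buffer))        # "" means nothing was summarized
```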
### System Info
langchain==0.1.8
langchain-community==0.0.21
langchain-core==0.1.31
langchain-experimental==0.0.52
openai==1.12.0
langchain_anthropic==0.1.4
langchain_mistralai==0.0.5
dosubotbot added the Ɑ: memory, 🔌: openai, and 🤖:bug labels on May 13, 2024.