[AssistantResponse] - to save streaming data to external database #1461
Comments
Hi @ayepRahman, you can use the `sendMessage` function available on the `AssistantResponse` callback to save the current response to your external database. If you want the whole chat history, you can use the `threadId` to fetch all the messages.
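A minimal sketch of wiring a save step into a `sendMessage`-style callback, as the comment above suggests. `AssistantHandlerArgs`, `handleAssistantMessage`, and `saveToDb` are illustrative stand-ins, not real Vercel AI SDK types:

```typescript
type AssistantMessage = { id: string; role: 'assistant'; content: string }

// Hypothetical shape of what the AssistantResponse callback hands you.
interface AssistantHandlerArgs {
  threadId: string
  sendMessage: (message: AssistantMessage) => void
}

// In-memory stand-in for an external database.
const saved: Array<{ threadId: string; content: string }> = []

async function saveToDb(threadId: string, message: AssistantMessage): Promise<void> {
  // Replace with a real insert (Prisma, Drizzle, raw SQL, ...).
  saved.push({ threadId, content: message.content })
}

async function handleAssistantMessage(
  args: AssistantHandlerArgs,
  message: AssistantMessage
): Promise<void> {
  args.sendMessage(message)              // stream to the client as usual
  await saveToDb(args.threadId, message) // then persist it externally
}
```

The point is simply that the same callback that forwards the message to the client is a natural place to persist it.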
Do you by any chance have an example of using `sendMessage()`?
@ravvi-kumar do you know how to map the response into the shape `useAssistant` expects? Something like:

```tsx
const { setMessages } = useAssistant({
  // ...
})

useEffect(() => {
  if (props.messages) {
    const mapped = props.messages.map((message) => {
      // ...
    })
    setMessages(mapped)
  }
}, [props.messages, setMessages])
```
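The mapping step above can be sketched as a pure function. The `DbMessage` shape is hypothetical (it depends on your schema); the target mirrors the `{ id, role, content }` shape that `setMessages` from `useAssistant` expects:

```typescript
// Hypothetical row shape coming back from an external database.
type DbMessage = { uuid: string; author: 'user' | 'assistant'; body: string }

// The message shape useAssistant works with.
type UiMessage = { id: string; role: 'user' | 'assistant'; content: string }

function mapDbMessages(rows: DbMessage[]): UiMessage[] {
  return rows.map((row) => ({
    id: row.uuid,
    role: row.author,
    content: row.body,
  }))
}
```

Keeping the mapping pure makes it easy to unit-test independently of React.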
I just save the string that the model outputs at the very end of my action.tsx; I have not had any issues with saving them. In an API route I used the per-token partial save: if the user stops the chat in the middle of the streaming response, we have always stored the latest tokens in the database :) Hope this helps
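The "store the latest partial on every token" idea above can be sketched like this. The in-memory `db` map and `makeTokenHandler` are illustrative stand-ins for a real database upsert:

```typescript
// Stand-in for an external database keyed by message id.
const db = new Map<string, string>()

// Returns an onToken-style handler that upserts the accumulated partial
// response after every token, so a user stopping mid-stream still leaves
// the latest partial persisted.
function makeTokenHandler(messageId: string): (token: string) => void {
  let partial = ''
  return (token: string) => {
    partial += token
    db.set(messageId, partial) // replace with a real upsert in practice
  }
}
```

In a real route you would likely debounce or batch these writes rather than hitting the database on literally every token.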
Something like this here:
This is what `LangChainStreamCustom()` looks like. It only has the `onCompletion` callback, but you get the gist and can implement the other callbacks like `onToken` etc. if you need them.

```ts
export const LangChainStreamCustom = (
  stream: any,
  { onCompletion }: { onCompletion: (completion: string) => Promise<void> }
) => {
  let completion = ''
  // Reuse one decoder with { stream: true } so multi-byte characters
  // split across chunk boundaries are decoded correctly.
  const decoder = new TextDecoder('utf-8')
  const transformStream = new TransformStream({
    transform(chunk, controller) {
      completion += decoder.decode(chunk, { stream: true })
      controller.enqueue(chunk)
    },
    flush(controller) {
      completion += decoder.decode() // flush any trailing bytes
      onCompletion(completion)
        .then(() => {
          controller.terminate()
        })
        .catch((e: any) => {
          console.error('Error', e)
          controller.terminate()
        })
    }
  })
  return stream.pipeThrough(transformStream)
}
```

Then in your API route you get the stream or response from the LangChain call:

```ts
// ...
const response = llm.stream({}) // assuming this comes from a typical LangChain setup
```

Then you can use it like all the other Vercel streams (`OpenAIStream`, `AnthropicStream`, etc.):

```ts
const stream = LangChainStreamCustom(response, {
  onCompletion: async (completion: string) => {
    console.log('COMPLETE!', completion)
  }
})
return new StreamingTextResponse(stream)
```
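For reference, the accumulate-and-forward idea above can be exercised end to end with a synthetic token stream. `tapCompletion` and `demo` are illustrative names, not part of any SDK; this relies only on the Web Streams API available in modern Node and edge runtimes:

```typescript
// Pass chunks through untouched while accumulating the decoded text,
// then hand the full completion to a callback when the stream ends.
function tapCompletion(
  stream: ReadableStream<Uint8Array>,
  onCompletion: (completion: string) => void
): ReadableStream<Uint8Array> {
  let completion = ''
  const decoder = new TextDecoder('utf-8')
  return stream.pipeThrough(
    new TransformStream<Uint8Array, Uint8Array>({
      transform(chunk, controller) {
        completion += decoder.decode(chunk, { stream: true })
        controller.enqueue(chunk) // forward the chunk to the client unchanged
      },
      flush() {
        completion += decoder.decode() // flush any trailing bytes
        onCompletion(completion)
      },
    })
  )
}

// Drive it with a synthetic token stream standing in for the LLM response.
async function demo(): Promise<string> {
  const encoder = new TextEncoder()
  const tokens = ['Hello', ', ', 'world']
  const source = new ReadableStream<Uint8Array>({
    start(controller) {
      for (const t of tokens) controller.enqueue(encoder.encode(t))
      controller.close()
    },
  })
  let saved = ''
  const tapped = tapCompletion(source, (c) => { saved = c }) // e.g. write to your DB here
  // Drain the tapped stream, as StreamingTextResponse would.
  const reader = tapped.getReader()
  while (!(await reader.read()).done) { /* chunks would be forwarded to the client */ }
  return saved
}
```

Because `flush` runs when the source closes, the callback is guaranteed to see the full concatenated text after the client has received every chunk.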
Feature Description
Save streaming thread message data to external DB
Use Case
Is there a way to save thread messages to an external database after streaming completes via AssistantResponse, like how OpenAIStream has a callback for completion?
Additional context
No response