Summary Max Tokens cuts off summary #129

Open
Oliver84 opened this issue Jul 13, 2023 · 1 comment
Comments

Oliver84 commented Jul 13, 2023

So the "max_tokens" feature from open ai sounds great, but in reality it just cuts the response off at the given limit. So if the response is 700 tokens but you ask it to cap it off at 512, then it will literally just cut the response off at 512.

... The AI discusses how challenges are part of",

It doesn't matter whether the result still makes sense; anything after token 512 is simply dropped, which makes the summary less cohesive. I do like being able to limit the summary, though, because otherwise a long chat conversation can produce a really long, and therefore really expensive, summary.

What I've found works better than sending "max_tokens" is to make the limit part of the prompt itself: "Summarize the following in 512 tokens or less". This keeps the response within a target length without cutting it off prematurely.
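To make the idea concrete, here's a rough Go sketch of the prompt-based approach. The helper name and the hard-coded messages are just illustrative, not anything from this repo:

```go
package main

import (
	"fmt"
	"strings"
)

// buildSummaryPrompt embeds the token budget in the instruction itself
// instead of relying on max_tokens to enforce it. (Hypothetical helper,
// not existing code in this project.)
func buildSummaryPrompt(conversation []string, maxSummaryTokens int) string {
	return fmt.Sprintf(
		"Summarize the following conversation in %d tokens or less:\n\n%s",
		maxSummaryTokens,
		strings.Join(conversation, "\n"),
	)
}

func main() {
	prompt := buildSummaryPrompt([]string{
		"Human: I've been struggling at work lately.",
		"AI: Challenges are a normal part of growth...",
	}, 512)
	fmt.Println(prompt)

	// The request's max_tokens can then be set well above the target (or
	// left at a generous ceiling) so it only acts as a safety net rather
	// than truncating the summary mid-sentence.
}
```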

It would also be great if this number were easily configurable from LangChain, so we didn't have to dig into the config file while figuring out which limit works best for our use case.

Update:
Here's a draft PR with my proposed solution, minus making it configurable from LangChain. I've never worked on a Go project before, but you should be able to see what I'm trying to do here. Let me know what you think, and I can keep working on it if you'd like: #130

danielchalef (Member) commented

Great. Hopefully the Go was straightforward. I've left some thoughts on the PR. Thanks for contributing!
