
Incorrect finish_reason for generate? #320

Open
harupy opened this issue Oct 4, 2023 · 1 comment

Comments

harupy commented Oct 4, 2023

I'm not sure if this is the right place to ask this, but generate with max_tokens=3 finishes with finish_reason=COMPLETE:

$ curl --request POST \
     --url https://api.cohere.ai/v1/generate \
     --header 'accept: application/json' \
     --header "authorization: Bearer $COHERE_API_KEY" \
     --header 'content-type: application/json' \
     --data '
{
  "max_tokens": 3,
  "stream": true,
  "prompt": "Please explain to me how LLMs work"
}
'
{"text":" LL","is_finished":false}
{"text":"Ms","is_finished":false}
{"text":",","is_finished":false}
{"is_finished":true,"finish_reason":"COMPLETE","response":{"id":"05835269-8d06-422d-8f4e-fc3e8a2b8a96","generations":[{"id":"73b93e8c-ee35-4831-a3be-c7534b31dbb9","text":" LLMs,","finish_reason":"COMPLETE"}],"prompt":"Please explain to me how LLMs work"}}

Should it finish with MAX_TOKENS?
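For anyone inspecting the stream, a minimal sketch of how to accumulate the line-delimited JSON events above and read out the final finish_reason (the event lines are copied from the transcript; this is not SDK code, just plain JSON parsing):

```python
import json

# Streamed response lines exactly as returned by the generate endpoint
# in the transcript above (line-delimited JSON).
stream = [
    '{"text":" LL","is_finished":false}',
    '{"text":"Ms","is_finished":false}',
    '{"text":",","is_finished":false}',
    '{"is_finished":true,"finish_reason":"COMPLETE","response":{"id":"05835269-8d06-422d-8f4e-fc3e8a2b8a96","generations":[{"id":"73b93e8c-ee35-4831-a3be-c7534b31dbb9","text":" LLMs,","finish_reason":"COMPLETE"}],"prompt":"Please explain to me how LLMs work"}}',
]

text = ""
finish_reason = None
for line in stream:
    event = json.loads(line)
    if not event["is_finished"]:
        # Intermediate events carry a text fragment.
        text += event["text"]
    else:
        # The terminal event carries the finish_reason and the full response.
        finish_reason = event["finish_reason"]

print(text)           # " LLMs,"
print(finish_reason)  # "COMPLETE" here, where "MAX_TOKENS" was expected
```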

samuelpath commented Nov 2, 2023

Hi @harupy, I'm not sure this is the best place to ask your question, since it isn't really related to the Python SDK per se; your question concerns a cURL request.

You would stand a better chance of getting a prompt answer by asking your question on Cohere's Discord community, in the #general-chat channel.

However, I'm not able to reproduce this; I get MAX_TOKENS as expected:

$ curl --request POST \
     --url https://api.cohere.ai/v1/generate \
     --header 'accept: application/json' \
     --header "authorization: Bearer $COHERE_API_KEY" \
     --header 'content-type: application/json' \
     --data '
{
  "max_tokens": 3,
  "stream": true,
  "prompt": "Please explain to me how LLMs work"
}
'
{"text":" LL","is_finished":false}
{"text":"Ms","is_finished":false}
{"text":",","is_finished":false}
{"is_finished":true,"finish_reason":"MAX_TOKENS","response":{"id":"379fb392-fa7c-4e26-9940-0e0a5b057b08","generations":[{"id":"fa961ef2-6f0e-4fc6-98eb-0df16e9960c5","text":" LLMs,","finish_reason":"MAX_TOKENS"}],"prompt":"Please explain to me how LLMs work"}}

Can you try again to see if you still encounter this issue?
