[Bug]: logprobs is not compatible with the OpenAI spec #4795
Labels: bug (Something isn't working), good first issue (Good for newcomers), help wanted (Extra attention is needed)

Comments
simon-mo added the good first issue and help wanted labels on May 14, 2024
I will take a look!
This was referenced May 24, 2024

DarkLight1337 added a commit to DarkLight1337/vllm-rocm that referenced this issue on May 24, 2024
DarkLight1337 added a commit to DarkLight1337/vllm-rocm that referenced this issue on May 25, 2024
simon-mo pushed a commit that referenced this issue on May 29, 2024
blinkbear pushed a commit to blinkbear/vllm that referenced this issue on May 31, 2024
dtrifiro pushed a commit to opendatahub-io/vllm that referenced this issue on May 31, 2024
mawong-amd pushed a commit to ROCm/vllm that referenced this issue on Jun 3, 2024
triple-Mu pushed a commit to CC-LLM/vllm that referenced this issue on Jun 5, 2024
blinkbear pushed a commit to blinkbear/vllm that referenced this issue on Jun 6, 2024
Your current environment

I'm using Runpod Serverless vLLM (https://github.com/runpod-workers/worker-vllm), so I can't run the environment-collection command. However, I confirmed that the issue is present in the codebase on main: https://github.com/vllm-project/vllm/blob/0fca3cdcf265cd375bca684d951702b6b7adf65a/vllm/entrypoints/openai/protocol.py
🐛 Describe the bug

The behavior of `logprobs=True` does not match OpenAI's. I identified two issues:

(1) vLLM throws an error when `logprobs=True` and `top_logprobs` is missing. OpenAI accepts such a request; vLLM rejects it, as the repro sketch below shows.
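For concreteness, here is a minimal repro sketch using the official `openai` Python client; the base URL, API key, and model name are placeholders, not taken from the original report:

```python
from openai import OpenAI

# Point the client at a vLLM OpenAI-compatible server (placeholder URL/key).
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

# OpenAI accepts logprobs=True without top_logprobs (you simply get the
# chosen token's logprob with no alternatives). vLLM at 0fca3cd instead
# raises a validation error for this same request.
resp = client.chat.completions.create(
    model="my-model",  # placeholder model name
    messages=[{"role": "user", "content": "Say hi"}],
    logprobs=True,  # note: no top_logprobs supplied
)
print(resp.choices[0].logprobs)
```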
(via vllm/vllm/entrypoints/openai/protocol.py, line 162 in 0fca3cd)
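To illustrate what spec-compatible validation would look like, here is a hypothetical sketch (this is not the actual vLLM code at that line; the schema subset and messages are made up for illustration):

```python
from typing import Optional
from pydantic import BaseModel, model_validator

class ChatCompletionRequest(BaseModel):
    # Hypothetical subset of the request schema, for illustration only.
    logprobs: Optional[bool] = False
    top_logprobs: Optional[int] = None

    @model_validator(mode="after")
    def check_logprobs(self):
        # Per the OpenAI spec: top_logprobs requires logprobs=true, but
        # logprobs=true on its own is valid (top_logprobs may be omitted).
        if self.top_logprobs is not None:
            if not self.logprobs:
                raise ValueError("top_logprobs requires logprobs to be true")
            if not 0 <= self.top_logprobs <= 20:
                raise ValueError("top_logprobs must be in [0, 20]")
        return self
```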
(2) Even with `top_logprobs=1`, the behavior doesn't match: OpenAI and vLLM return differently shaped logprobs objects, compared in the sketch below.
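The exact payloads from the report were not preserved here; the following sketch shows the general shape difference, based on the OpenAI chat logprobs schema versus the legacy completions-style schema vLLM was emitting (all field values are placeholders):

```python
# OpenAI chat completions: choices[i].logprobs.content is a list of
# per-token objects, each carrying its own top_logprobs list.
openai_logprobs = {
    "content": [
        {
            "token": "Hi",
            "logprob": -0.31,
            "bytes": [72, 105],
            "top_logprobs": [
                {"token": "Hi", "logprob": -0.31, "bytes": [72, 105]},
            ],
        },
    ],
}

# vLLM at the time returned completions-style logprobs instead: parallel
# arrays such as token_logprobs, which do not exist in the OpenAI shape.
vllm_logprobs = {
    "tokens": ["Hi"],
    "token_logprobs": [-0.31],
    "top_logprobs": [{"Hi": -0.31}],
    "text_offset": [0],
}
```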
Notice, for example, that `token_logprobs` appears in vLLM's response but not in OpenAI's. These issues break libraries that expect OpenAI-compatible responses, such as Rust's async_openai crate, which we use.