
[FEATURE] OpenAI Chat Completion response streaming support #46

Open
Boburmirzo opened this issue Aug 8, 2023 · 1 comment
Labels
enhancement New feature or request

Comments

@Boburmirzo
Contributor

I assume that the LLM App's API client wrapper for the OpenAI API does not currently support streaming completions.

It would be nice to have: we could stream ChatGPT's final responses into Pathway's output connectors, such as Kafka, Redpanda, or Debezium.

References:

https://github.com/openai/openai-cookbook/blob/main/examples/How_to_stream_completions.ipynb
https://platform.openai.com/docs/api-reference/completions/create#completions/create-stream
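To illustrate what the feature would consume, here is a minimal sketch of streamed chat completions, assuming the pre-1.0 `openai` Python SDK that was current when this issue was filed (Aug 2023). The `iter_content` helper and the function names are hypothetical, not part of the LLM App; the wrapper would need something similar to surface tokens incrementally instead of waiting for the full response.

```python
def iter_content(chunks):
    """Yield the text pieces from a stream of chat-completion chunks.

    With stream=True, the API returns chunks whose `delta` carries either
    role metadata or a piece of content; only the content pieces are text.
    """
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]


def stream_chat(messages, model="gpt-3.5-turbo"):
    """Request a streamed completion and yield tokens as they arrive.

    Requires the pre-1.0 `openai` package and OPENAI_API_KEY to be set.
    """
    import openai  # imported here so the pure helper above stays dependency-free

    response = openai.ChatCompletion.create(
        model=model,
        messages=messages,
        stream=True,  # server sends incremental chunks instead of one response
    )
    yield from iter_content(response)
```

Each yielded piece could then be forwarded to an output connector as it arrives, rather than buffering the whole answer first.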

@Boburmirzo Boburmirzo changed the title OpenAI Chat Completion response streaming support [FEATURE] OpenAI Chat Completion response streaming support Aug 8, 2023
@Boburmirzo Boburmirzo added the enhancement New feature or request label Aug 8, 2023
@mihir1739

Can I work on this? Please assign this issue to me.
