
Feature request: batch request #174

Open
perone opened this issue Mar 2, 2023 · 2 comments

Comments

@perone commented Mar 2, 2023

It would be nice if the API provided an endpoint that accepts a batch request; for example, Co.Generate could accept a list of prompts. Executing multiple calls, each with its own overhead, is inefficient both for inference on the backend and for the client, compared with batching all prompts into a single call.
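For illustration, here is a minimal sketch of the per-call workaround versus the requested batched call, assuming the `cohere` Python SDK; the `prompts=` form in the final line is hypothetical and only shows the shape of the feature being requested:

```python
import cohere

co = cohere.Client("YOUR_API_KEY")  # placeholder key

prompts = [
    "Write a tagline for a coffee shop.",
    "Write a tagline for a bookstore.",
    "Write a tagline for a gym.",
]

# Today: one round trip per prompt, paying the per-call overhead each time.
responses = [co.generate(prompt=p) for p in prompts]

# Requested: a single call carrying the whole batch
# (hypothetical signature, not an existing API).
# responses = co.generate(prompts=prompts)
```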

@amorisot (Contributor)

There's an undocumented feature to do this, here!

@rajveer43

Can I take this?
