How can I use this with a local model?
You can't without making some changes; we're relying on structured output: https://python.langchain.com/docs/modules/model_io/chat/structured_output. It should be possible once we add support for local models.
Until then, your best bet is a parsing approach, so you'd need to rewrite some of the code in the service to parse the model's raw text instead.
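To illustrate the parsing approach mentioned above, here is a minimal sketch (not from this repo): instead of using a provider's structured-output API, you prompt the local model to answer in JSON and then extract and parse the object from its free-form reply. The function name and regex strategy are my own illustration, not the service's actual code.

```python
import json
import re

def parse_structured_output(raw: str) -> dict:
    """Extract the first JSON object from a model's free-form reply.

    Local models often wrap JSON in extra prose, so we grab the
    outermost {...} span and parse it with the standard library.
    """
    match = re.search(r"\{.*\}", raw, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in model output")
    return json.loads(match.group(0))

# Example: a local model's reply that wraps JSON in surrounding text.
reply = 'Sure! Here is the result: {"answer": "Paris", "confidence": 0.9}'
print(parse_structured_output(reply))  # → {'answer': 'Paris', 'confidence': 0.9}
```

In practice you'd also want to handle malformed JSON (e.g. retry the model or fall back to a more lenient parser), but this shows the core idea.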
Thanks. I asked a question about this function; I could probably copy it from the partner folder.
I could also create a PR if this is something you want.
@amztc34283 Were you able to set it up with a local model? I want to test the Mistral model through Ollama — any ideas on the implementation?
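For the Ollama route, a minimal sketch using only the standard library and Ollama's documented `/api/generate` endpoint (assumes a default local Ollama install with the `mistral` model pulled; none of this is code from this repo):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(prompt: str, model: str = "mistral") -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint.

    stream=False asks Ollama to return a single JSON response
    instead of a stream of partial chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str, model: str = "mistral") -> str:
    """Send a prompt to a locally running Ollama server and return its reply."""
    payload = json.dumps(build_request(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

You could combine `generate` with the parsing approach discussed earlier: prompt Mistral to reply in JSON, then parse its raw output.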