
[Feature] python: Provide a way to get or use the max context size supported by the LLM #2283

Open
Ben-Epstein opened this issue Apr 29, 2024 · 0 comments
Labels: bindings (gpt4all-binding issues), enhancement (New feature or request), python-bindings (gpt4all-bindings Python specific issues)

Comments

@Ben-Epstein
Feature Request

Currently, the default context length is always set to 2048, but many models support a larger context window. I tried to showcase a simple first step by adding the context limit to models3.json, but that was rejected, with a request that I open an issue instead.

According to @cebtenzzre (ref), this is already done in the GUI.

Request: Set the default context length in the bindings to None, and when it is not set, dynamically determine the context length from the chosen model.
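A minimal sketch of the requested fallback logic. Note that `resolve_n_ctx`, `model_max`, and `DEFAULT_N_CTX` are hypothetical names for illustration, not part of the gpt4all API; the real implementation would read the maximum context size from the loaded model's metadata:

```python
from typing import Optional

# Historical default used by the Python bindings today.
DEFAULT_N_CTX = 2048

def resolve_n_ctx(requested: Optional[int], model_max: Optional[int]) -> int:
    """Pick the effective context length.

    An explicit user request wins; otherwise fall back to the model's
    advertised maximum; otherwise keep the historical 2048 default.
    """
    if requested is not None:
        return requested
    if model_max is not None:
        return model_max
    return DEFAULT_N_CTX
```

For example, a user passing `n_ctx=None` with a model whose metadata reports an 8192-token window would get 8192 instead of 2048.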

@Ben-Epstein added the "enhancement" label Apr 29, 2024
@cebtenzzre added the "bindings" and "python-bindings" labels May 10, 2024
@cebtenzzre changed the title from "[Feature] Dynamically set context window size for chosen LLM in the bindings" to "[Feature] python: Provide a way to get or use the max context size supported by the LLM" May 10, 2024
Development

No branches or pull requests

2 participants