Before submitting your bug report
Relevant environment info
Description
I'm comparing different LLMs using the above stack. When I add a model to my local instance of Ollama, I expect it to appear in Continue's list of available models. However, I currently need to completely close VSCodium and reopen it to get the updated list.
Here's my config fragment:
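For context, a minimal Ollama model entry in Continue's config.json looks roughly like the following sketch (the title and model name are placeholders, not my actual setup):

```json
{
  "models": [
    {
      "title": "Fancy New Model",
      "provider": "ollama",
      "model": "fancy_new_model"
    }
  ]
}
```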
I tried just disabling and re-enabling Continue, but that didn't work.
To reproduce
- `"models"` configured as above
- run `ollama pull $fancy_new_model`
- open the list of available models in Continue
- actual: `$fancy_new_model` is not listed
- expected: `$fancy_new_model` is listed without restarting VSCodium
Log output
No response