Add preliminary MistralAI support #732
Conversation
Codecov Report

Attention: Patch coverage is …

Additional details and impacted files:

```
@@            Coverage Diff             @@
##           master     #732      +/-   ##
==========================================
+ Coverage   98.46%   98.51%   +0.05%
==========================================
  Files          24       24
  Lines        1364     1142     -222
==========================================
+ Hits         1343     1125     -218
+ Misses         15       10       -5
- Partials        6        7       +1
```

View full report in Codecov by Sentry.
I think this PR signals a mature direction worth considering for where this project should move.
Actually, I think the model declaration needs to be revamped.
@iMilnb Thank you for the PR! We're definitely not merging that, as implicit dependency on the environment variable is not what we want. (And I'm getting strong xz vibes from this PR, tbh.) Regarding the list of consts: there is no need to rely on it, the …
```diff
@@ -224,6 +225,9 @@ func decodeString(body io.Reader, output *string) error {

 // fullURL returns full URL for request.
 // args[0] is model name, if API type is Azure, model name is required to get deployment name.
 func (c *Client) fullURL(suffix string, args ...any) string {
+	if val := os.Getenv("OPENAI_BASEURL"); val != "" {
+		c.config.BaseURL = val
+	}
```
The fullURL method gets called throughout the usage of the client (days, weeks), so this change would let anyone quietly change the base URL from anywhere in the process just by setting an environment variable.
I didn't realize that, my bad. I wanted to provide a non-disruptive and quick way of supporting another API endpoint.
Is alternative API endpoint support planned somehow, or is this project focused only on OpenAI?
You probably want something like this:

```go
config := openai.DefaultConfig("token")
config.BaseURL = "your url" // probably "https://api.mistral.ai"
client := openai.NewClientWithConfig(config)

response, err := client.CreateChatCompletion(
	context.Background(),
	openai.ChatCompletionRequest{
		Model: "mistral-small",
		Messages: []openai.ChatCompletionMessage{
			{
				Role:    openai.ChatMessageRoleUser,
				Content: "Hello!",
			},
		},
	},
)
```
Ok, I totally missed the NewClientWithConfig() function, thanks for clarifying.
As for the xz vibes: I use go-openai in this little project of mine: https://gitlab.com/iMil/cbg
Again, sorry for the misunderstanding, and keep up the good work.
Yes, let me clarify. Expanding the library to support a growing number of compatible models is a solid direction for a library that is becoming mature. The actual implementation should be carefully considered, as this is a foundational piece of the library's architecture. It is a promising direction of thought. However, I echo @sashabaranov's sentiment that the currently suggested implementation is less than ideal.
Just trying to understand the design philosophy here. Is the goal to prevent the constants list from being polluted with non-OpenAI models?
This small patch provides the ability to use another API endpoint, declared via an environment variable, and adds the Mistral models.
Here is the Python example from MistralAI: https://github.com/mistralai/client-python/blob/0d55818364fd2a7ca67401dd550632f8ca70e5af/examples/chatbot_with_streaming.py#L15
Successfully tested with MistralAI.