
Add preliminary MistralAI support #732

Closed · wants to merge 6 commits
Conversation

@iMilnb commented May 2, 2024

This small patch provides the ability to use another API endpoint declared in an environment variable, and adds Mistral models.

Here is the python example from MistralAI: https://github.com/mistralai/client-python/blob/0d55818364fd2a7ca67401dd550632f8ca70e5af/examples/chatbot_with_streaming.py#L15

Successfully tested with MistralAI.

codecov bot commented May 2, 2024

Codecov Report

Attention: Patch coverage is 0%, with 2 lines in your changes missing coverage. Please review.

Project coverage is 98.51%. Comparing base (774fc9d) to head (26912c2).
Report is 15 commits behind head on master.

| Files | Patch % | Lines |
| --- | --- | --- |
| client.go | 0.00% | 1 Missing and 1 partial ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #732      +/-   ##
==========================================
+ Coverage   98.46%   98.51%   +0.05%     
==========================================
  Files          24       24              
  Lines        1364     1142     -222     
==========================================
- Hits         1343     1125     -218     
+ Misses         15       10       -5     
- Partials        6        7       +1     


@kmesiab (Contributor) commented May 19, 2024

I think this PR signals a mature direction to start thinking about moving this project.

@iMilnb (Author) commented May 20, 2024

> I think this PR signals a mature direction to start thinking about moving this project.

Actually, I think the model declaration needs to be revamped. go-openai would certainly work with Ollama just like Mistral, but adding models to the completion.go file is not sustainable. The const list should be extendable via an environment variable or a config file.
I can work on this if you agree with the idea.

@sashabaranov (Owner) commented
@iMilnb Thank you for the PR! We're definitely not merging that, as implicit dependency on the environment variable is not what we want. (And I'm getting strong xz vibes from this PR tbh).

Regarding the list of consts: there is no need to rely on it, the Model parameter is just a string, and you can pass any model name you want.
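A minimal self-contained sketch of that point (the struct here is a stand-in mirroring go-openai's ChatCompletionRequest, not the library itself): because Model is a plain string field, any provider's model name can be passed without touching the const list.

```go
package main

import "fmt"

// ChatCompletionRequest is a stand-in mirroring go-openai's struct:
// Model is just a string, not restricted to the const list.
type ChatCompletionRequest struct {
	Model    string
	Messages []string
}

// newRequest builds a request for an arbitrary model name.
func newRequest(model string) ChatCompletionRequest {
	return ChatCompletionRequest{Model: model, Messages: []string{"Hello!"}}
}

func main() {
	// The consts are only a convenience; any model name works.
	fmt.Println(newRequest("mistral-small").Model)
}
```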

```diff
@@ -224,6 +225,9 @@ func decodeString(body io.Reader, output *string) error {
 // fullURL returns full URL for request.
 // args[0] is model name, if API type is Azure, model name is required to get deployment name.
 func (c *Client) fullURL(suffix string, args ...any) string {
+	if val := os.Getenv("OPENAI_BASEURL"); val != "" {
+		c.config.BaseURL = val
+	}
```
@sashabaranov (Owner):

The fullURL method is called throughout the lifetime of the client (days, weeks), so this change would let anyone who can set an environment variable quietly redirect the base URL at any point.

@iMilnb (Author):

I didn't realize that, my bad. I wanted to provide a non-disruptive and quick way of supporting another API endpoint.
Is support for alternative API endpoints planned somehow, or is this project focused only on OpenAI?

@sashabaranov (Owner):

You probably want something like this:

```go
config := openai.DefaultConfig("token")
config.BaseURL = "your url" // probably "https://api.mistral.ai"
client := openai.NewClientWithConfig(config)

response, err := client.CreateChatCompletion(
	context.Background(),
	openai.ChatCompletionRequest{
		Model: "mistral-small",
		Messages: []openai.ChatCompletionMessage{
			{
				Role:    openai.ChatMessageRoleUser,
				Content: "Hello!",
			},
		},
	},
)
```

@iMilnb (Author):

Ok, I totally missed the NewClientWithConfig() function, thanks for clarifying.
As for the xz vibes, I use go-openai in this little project of mine: https://gitlab.com/iMil/cbg
Again, sorry for the misunderstanding, and keep up the good work.

@kevin-mesiab:
> I think this PR signals a mature direction to start thinking about moving this project.

Yes, let me clarify. Expanding the library to support a growing number of compatible models is a solid direction for this library, which is becoming mature.

The actual implementation should be carefully considered, as this is a foundational piece of the library's architecture. It is a promising direction of thought.

However, I echo @sashabaranov's sentiment that the currently suggested implementation is less than ideal.

@kevin-mesiab:
> Regarding the list of consts: there is no need to rely on it.

Just trying to understand the design philosophy here. Is the goal to prevent the constants list from being polluted with non-OpenAI models?
