
Add response_format support for openai and a way to set the functions #170

Closed
Arttii wants to merge 4 commits

Conversation

Arttii commented Feb 23, 2024

Description

I needed support for the JSON output mode that OpenAI enables by setting the response_format type. This is already supported by the underlying go-openai client.
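For reference, here is a minimal sketch of the feature at the go-openai level, i.e. what the lingoose wrapper would ultimately pass through (the model constant and prompt are only illustrative; the exact lingoose option API is what this PR is about):

```go
package main

import (
	"context"
	"fmt"
	"os"

	openai "github.com/sashabaranov/go-openai"
)

func main() {
	client := openai.NewClient(os.Getenv("OPENAI_API_KEY"))

	// response_format=json_object asks the model to emit a valid JSON object.
	resp, err := client.CreateChatCompletion(context.Background(), openai.ChatCompletionRequest{
		Model: openai.GPT4Turbo,
		ResponseFormat: &openai.ChatCompletionResponseFormat{
			Type: openai.ChatCompletionResponseFormatTypeJSONObject,
		},
		Messages: []openai.ChatCompletionMessage{
			{Role: openai.ChatMessageRoleUser, Content: "Return a JSON object with a single key \"answer\"."},
		},
	})
	if err != nil {
		fmt.Println("chat completion error:", err)
		return
	}
	fmt.Println(resp.Choices[0].Message.Content)
}
```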

Type of change

  • New feature (non-breaking change which adds functionality)

How Has This Been Tested?

Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration

  • No tests

Checklist:

  • I have read the Contributing documentation.
  • I have read the Code of conduct documentation.
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes
  • I have checked my code and corrected any misspellings

BabySid commented Apr 10, 2024

hi, I do need this feature too. Is there a plan to merge this PR? thanks @henomis

henomis (Owner) commented Apr 10, 2024

> hi, I do need this feature too. Is there a plan to merge this PR? thanks @henomis

@BabySid this PR needs some refactor.

BabySid commented Apr 11, 2024

> hi, I do need this feature too. Is there a plan to merge this PR? thanks @henomis
>
> @BabySid this PR needs some refactor.

Got it, thanks for the reply.

Arttii (Author) commented Apr 14, 2024

@henomis please let me know what you think should be changed. I would be happy to incorporate it.

henomis (Owner) commented Apr 14, 2024

> @henomis please let me know what you think should be changed. I would be happy to incorporate it.

@Arttii just follow the review comments.

Comment on lines +79 to +84
// WithFunctions sets the functions to use for the OpenAI instance.
func (o *OpenAI) WithFunctions(functions map[string]Function) *OpenAI {
o.functions = functions
return o
}

henomis (Owner): This is out of context of this PR.

Arttii (Author): Why is this out of scope?

henomis (Owner): I'm not 100% sure of this method and this change must be discussed.

@@ -37,6 +37,7 @@ type OpenAI struct {
streamCallbackFn StreamCallback
toolChoice *string
cache *cache.Cache
responseFormat openai.ChatCompletionResponseFormatType
henomis (Owner): Do not use the go-openai type here.

Arttii (Author): Should we use a local alias for this, or what is your preference?

henomis (Owner): Yes, use an alias; see the comment below.

func (o *OpenAI) WithToolChoice(toolChoice *string) *OpenAI {
o.toolChoice = toolChoice
return o
}

func (o *OpenAI) WithResponseFormat(format openai.ChatCompletionResponseFormatType) *OpenAI {
henomis (Owner): Don't use go-openai types here.

henomis (Owner): Lingoose's users must deal with lingoose types.

temperature: DefaultOpenAITemperature,
maxTokens: DefaultOpenAIMaxTokens,
functions: make(map[string]Function),
responseFormat: openai.ChatCompletionResponseFormatTypeText,
henomis (Owner): Don't use go-openai types here.

@@ -299,6 +312,9 @@ func (o *OpenAI) buildChatCompletionRequest(t *thread.Thread) openai.ChatComplet
N: DefaultOpenAINumResults,
TopP: DefaultOpenAITopP,
Stop: o.stop,
ResponseFormat: &openai.ChatCompletionResponseFormat{
henomis (Owner): ResponseFormat is omitempty in the go-openai package. This means you can use a pointer that defaults to nil and eventually overwrite it with the internal value.
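
To illustrate the suggested pattern, here is a self-contained sketch (buildRequest is a hypothetical helper standing in for the PR's buildChatCompletionRequest): the wrapper keeps a nil pointer by default and only sets ResponseFormat when the caller chose one, so the field is dropped from the serialized request thanks to omitempty.

```go
package main

import (
	"encoding/json"
	"fmt"

	goopenai "github.com/sashabaranov/go-openai"
)

// buildRequest attaches ResponseFormat only when it was explicitly set.
// Since the field is a pointer tagged omitempty in go-openai, a nil value
// simply never appears in the JSON payload sent to the API.
func buildRequest(responseFormat *goopenai.ChatCompletionResponseFormat) goopenai.ChatCompletionRequest {
	req := goopenai.ChatCompletionRequest{
		Model: goopenai.GPT4Turbo,
		Messages: []goopenai.ChatCompletionMessage{
			{Role: goopenai.ChatMessageRoleUser, Content: "hello"},
		},
	}
	if responseFormat != nil {
		req.ResponseFormat = responseFormat
	}
	return req
}

func main() {
	// Default: no response_format key in the payload.
	b, _ := json.Marshal(buildRequest(nil))
	fmt.Println(string(b))

	// Explicit JSON mode: response_format is included.
	b, _ = json.Marshal(buildRequest(&goopenai.ChatCompletionResponseFormat{
		Type: goopenai.ChatCompletionResponseFormatTypeJSONObject,
	}))
	fmt.Println(string(b))
}
```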

@@ -208,7 +208,7 @@ func (c *Content) Format(input types.M) *Content {
prompt := prompt.NewPromptTemplate(c.Data.(string))
err := prompt.Format(input)
if err != nil {
return c
panic(c)
henomis (Owner): Don't panic. Out of scope of this PR.

Arttii (Author): This is an interesting question, as we should at least surface the error. This tripped me up a few times: the LLM was effectively being prompted with an empty prompt. Should we log, or what is your preference?
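
One possible direction, sketched here purely for discussion and not as the maintainer's preferred solution: keep the fluent signature but record the failure on the content so callers (or a logger) can inspect it. The Err field, the map-based input, and the use of text/template instead of lingoose's prompt package are all illustrative substitutions, not the real API.

```go
package main

import (
	"bytes"
	"errors"
	"fmt"
	"text/template"
)

// Content is a stripped-down stand-in for the lingoose thread Content type;
// only the parts needed for this sketch are shown, and Err is an invented field.
type Content struct {
	Data any
	Err  error // records the last formatting failure instead of hiding it
}

// Format keeps the fluent signature but surfaces failures via c.Err.
func (c *Content) Format(input map[string]any) *Content {
	s, ok := c.Data.(string)
	if !ok {
		c.Err = errors.New("content data is not a string")
		return c
	}
	tmpl, err := template.New("content").Parse(s)
	if err != nil {
		c.Err = err
		return c
	}
	var buf bytes.Buffer
	if err := tmpl.Execute(&buf, input); err != nil {
		c.Err = err
		return c
	}
	c.Data = buf.String()
	return c
}

func main() {
	c := (&Content{Data: "Hello {{.name}}"}).Format(map[string]any{"name": "world"})
	fmt.Println(c.Data, c.Err) // Hello world <nil>
}
```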

Comment on lines +30 to +31
github.com/sashabaranov/go-openai v1.21.0 h1:isAf3zPSD3VLd0pC2/2Q6ZyRK7jzPAaz+X3rjsviaYQ=
github.com/sashabaranov/go-openai v1.21.0/go.mod h1:lj5b/K+zjTSFxVLijLSTDZuP7adOgerWeFyZLUhAKRg=
henomis (Owner): This is out of scope of this PR.

Arttii (Author): Fair enough, I will remove this.

Comment on lines +36 to +37
GPT4Turbo Model = openai.GPT4Turbo
GPT4Turbo20240409 Model = openai.GPT4Turbo20240409
henomis (Owner): Out of scope.

@@ -324,7 +324,7 @@ func (o *OpenAI) getChatCompletionRequestTools() []openai.Tool {
for _, function := range o.functions {
tools = append(tools, openai.Tool{
Type: "function",
Function: openai.FunctionDefinition{
Function: &openai.FunctionDefinition{
henomis (Owner): Out of scope.


BabySid commented May 20, 2024

Hi @henomis, it seems this PR has been on hold for a long time. Does the lingoose project have any plans to develop this feature? To be honest, this feature is still very important for OpenAI usage. I also understand this PR is in an awkward situation, and the author may not be able to continue development for reasons unknown. It would be very commendable if the project could integrate this feature.

henomis (Owner) commented May 20, 2024

@BabySid I'll open a new PR for this feature.

henomis (Owner) commented May 20, 2024

Implemented here: #199

henomis closed this May 20, 2024
BabySid commented May 20, 2024

> @BabySid I'll open a new PR for this feature.

Many thanks:)
