Add support for OpenAI tools/function-calling API #321

Open
bitnom opened this issue Feb 5, 2024 · 3 comments
bitnom commented Feb 5, 2024

Investigate the feasibility of adding support for the OpenAI function-calling API, and (if feasible) integrate it.

Rationale

source: https://gorilla.cs.berkeley.edu/blogs/4_open_functions.html

References

https://platform.openai.com/docs/guides/function-calling

Example request/response:

curl https://api.openai.com/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $OPENAI_API_KEY" \
-d '{
  "model": "gpt-3.5-turbo",
  "messages": [
    {
      "role": "user",
      "content": "What is the weather like in Boston?"
    }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {
              "type": "string",
              "description": "The city and state, e.g. San Francisco, CA"
            },
            "unit": {
              "type": "string",
              "enum": ["celsius", "fahrenheit"]
            }
          },
          "required": ["location"]
        }
      }
    }
  ],
  "tool_choice": "auto"
}'

Response:

{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1699896916,
  "model": "gpt-3.5-turbo-0613",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": null,
        "tool_calls": [
          {
            "id": "call_abc123",
            "type": "function",
            "function": {
              "name": "get_current_weather",
              "arguments": "{\n\"location\": \"Boston, MA\"\n}"
            }
          }
        ]
      },
      "logprobs": null,
      "finish_reason": "tool_calls"
    }
  ],
  "usage": {
    "prompt_tokens": 82,
    "completion_tokens": 17,
    "total_tokens": 99
  }
}
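
Note that in the response, `arguments` is a JSON-encoded string rather than a parsed object, so a client has to decode it in a second step (here `response` stands for the parsed response body shown above):

import json

# `arguments` arrives as a string of JSON, not as an object:
args = json.loads(
    response["choices"][0]["message"]["tool_calls"][0]["function"]["arguments"]
)
# args == {"location": "Boston, MA"}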


lbeurerkellner (Collaborator) commented

I understand the request, and I think OAI did a great job with their function-calling models. I am uncertain whether support for this in LMQL makes sense, since it is a very vendor-specific API that will be hard to generalize in a model-agnostic way.

I am open to design proposals, however. It would be great to somehow abstract their implementation away and provide a common interface that also works, e.g., for Gorilla models or other forms of more open function calling.
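
To make the design discussion concrete, here is one possible shape for such an abstraction. This is purely a sketch; none of these names (ToolCall, ToolCallingAdapter) exist in LMQL, they only illustrate how vendor-specific wire formats could be normalized behind a common interface:

from dataclasses import dataclass
from typing import Protocol

@dataclass
class ToolCall:
    # Normalized, vendor-independent representation of a tool call.
    id: str
    name: str
    arguments: dict  # already parsed from the vendor's JSON string

class ToolCallingAdapter(Protocol):
    def encode_tools(self, tools: list[dict]) -> dict:
        # Translate a generic tool schema into vendor request fields
        # (OpenAI: the "tools" array) or into a prompt template
        # (Gorilla-style open models).
        ...

    def decode_response(self, response: dict) -> list[ToolCall]:
        # Extract normalized tool calls from a vendor response
        # (OpenAI: choices[0].message.tool_calls; open models:
        # parse the generated text).
        ...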

bitnom (Author) commented Feb 18, 2024

> I understand the request, and I think OAI did a great job with their function-calling models. I am uncertain whether support for this in LMQL makes sense, since it is a very vendor-specific API that will be hard to generalize in a model-agnostic way.
>
> I am open to design proposals, however. It would be great to somehow abstract their implementation away and provide a common interface that also works, e.g., for Gorilla models or other forms of more open function calling.

I did a bit more research on this issue. There are several major implementations of function/tool calling, and of course there are others; every agentic framework has some version of this. All of the ones I have looked at so far have standardized on the OpenAI parallel function-calling API and use the openai Python package for integration, both for calls to OpenAI and for function calling via local LLMs.

This standardization is not confined to agentic frameworks. The fact that Gorilla and litellm both depend on the OpenAI spec for all function calling is a pretty good signal.
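
For reference, the curl request above expressed through the openai Python package (1.x client), i.e. the interface these frameworks have standardized on:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "What is the weather like in Boston?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        },
    }],
    tool_choice="auto",
)

for call in response.choices[0].message.tool_calls or []:
    print(call.function.name, call.function.arguments)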

I would have already submitted a PR, but the LMQL OpenAI client is more complex than most other packages' OpenAI integrations, for obvious reasons. Regardless, I haven't yet had time to sit down and take it all in.

Note: as I vaguely recall from my first attempt, the logit_bias and/or logprobs parameters had no effect on the output of a function/tool definition or call. I could be misremembering, so a citation or example is needed, because that doesn't sound right. Streaming (chunked responses) is supported.
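
On the streaming point: tool calls do stream, but the function arguments arrive as incremental JSON fragments keyed by the tool call's index, so a client has to accumulate them before parsing. A minimal sketch against the openai 1.x streaming interface:

from openai import OpenAI

client = OpenAI()
stream = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "What is the weather like in Boston?"}],
    tools=[{  # same schema as the example above, trimmed for brevity
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "parameters": {
                "type": "object",
                "properties": {"location": {"type": "string"}},
                "required": ["location"],
            },
        },
    }],
    stream=True,
)

# Arguments stream in as JSON fragments; accumulate per tool-call index.
calls = {}
for chunk in stream:
    delta = chunk.choices[0].delta
    for tc in delta.tool_calls or []:
        entry = calls.setdefault(tc.index, {"name": "", "arguments": ""})
        if tc.function.name:
            entry["name"] = tc.function.name
        if tc.function.arguments:
            entry["arguments"] += tc.function.arguments

# Each "arguments" value is complete JSON only after the stream ends.
print(calls)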

bitnom (Author) commented Feb 18, 2024

Another thought: check the prompt templates that Gorilla and litellm use for function calling with local models.
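
For context on the litellm side: as far as I can tell, it exposes the same OpenAI-shaped interface for local backends and translates tools into the model's prompt template where the backend supports it. A hedged sketch only; the model name and the backend's tool support here are assumptions, not something I have verified:

import litellm

# OpenAI-format tools passed through litellm's common interface;
# "ollama/llama2" is a placeholder local model, and whether `tools`
# is honored depends on litellm's translation layer for that backend.
response = litellm.completion(
    model="ollama/llama2",
    messages=[{"role": "user", "content": "What is the weather like in Boston?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "parameters": {
                "type": "object",
                "properties": {"location": {"type": "string"}},
                "required": ["location"],
            },
        },
    }],
)
print(response.choices[0].message)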
