rubenselander/openai-function-tokens

# OpenAI Function Tokens Estimator

Estimate OpenAI token usage for chat completions, including functions, with this Python utility!

This package is based on hmarr's openai-chat-tokens. As of September 2023, OpenAI provides no official documentation on how to predict the number of tokens consumed by function definitions. This package fills that gap: use it to get a precise estimate of the token count for chat completions and better manage your OpenAI API usage.

Most often it is correct down to the token.

## Installation

1. Install the package via pip:

   ```shell
   pip install openai-function-tokens
   ```

2. Import the estimation function:

   ```python
   from openai_function_tokens import estimate_tokens
   ```

## Usage

To use the estimator, call the `estimate_tokens` function:

```python
estimate_tokens(messages, functions=None, function_call=None)
```

Pass in the `messages`, and optionally `functions` and `function_call`, to receive a precise token count.

## Acknowledgments

Credit to hmarr for the original TypeScript tool, openai-chat-tokens. For a better understanding of the token-counting logic, check out his blog post.

## Further Reading

- Function Calling
- How to call functions with chat models
- How to use functions with a knowledge base
- JSON Schema documentation
- Counting tokens (only messages)

## Contributing

Feedback, suggestions, and contributions are highly appreciated. Help make this tool even better!