llm-together

PyPI Changelog License

Plugin for LLM adding support for Together AI, which hosts a large collection of open-source LLMs

Installation

Install this plugin in the same environment as LLM.

llm install llm-together

Configuration

You will need an API key from Together. You can obtain one by creating an account and visiting the 'API Keys' page.

You can set it as an environment variable called TOGETHER_API_KEY, or add it to LLM's set of saved keys using:

llm keys set together
Enter key: <paste key here>
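Either form of configuration ends up supplying the same key at request time. The sketch below illustrates the general idea, an environment variable taking effect alongside a store of saved keys; the function name and the precedence shown are illustrative assumptions, not the plugin's actual code (LLM's real key-resolution logic lives in the llm package).

```python
import os

def resolve_together_key(saved_keys: dict) -> "str | None":
    # Illustrative sketch: prefer the TOGETHER_API_KEY environment
    # variable, then fall back to the saved-keys store that
    # `llm keys set together` would populate. Hypothetical helper,
    # not the plugin's real implementation.
    return os.environ.get("TOGETHER_API_KEY") or saved_keys.get("together")

os.environ["TOGETHER_API_KEY"] = "sk-example"  # demo value only
print(resolve_together_key({}))  # → sk-example
```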

Usage

This plugin adds Together-hosted models that support serverless inference, i.e. models that respond without a VM start-up delay, so prompts run quickly.

llm models list
llm -m <one-together-model> "Three names for my new AI project"
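Under the hood, a prompt to one of these models becomes an authenticated HTTP POST to Together's inference API. The sketch below shows the general shape of such a request using only the standard library; the endpoint URL, payload fields, and model name are assumptions for illustration, not the plugin's exact code.

```python
import json
import urllib.request

API_URL = "https://api.together.xyz/inference"  # assumed endpoint

def build_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    # Build the kind of serverless-inference request the plugin sends:
    # a JSON payload with the model name and prompt, authenticated with
    # a Bearer token. Field names here are illustrative.
    payload = json.dumps(
        {"model": model, "prompt": prompt, "max_tokens": 128}
    ).encode()
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

req = build_request("togethercomputer/llama-2-7b", "Hello", "example-key")
print(req.get_header("Authorization"))  # → Bearer example-key
```

In practice you never build this request yourself; the llm CLI and the plugin handle it once your key is configured.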

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd llm-together
python3 -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

pip install -e '.[test]'

Test

Run the unit tests with:

pytest
