
LLAMA_TK_CHAT

Simple chat interface for local AI using llama-cpp-python and llama-cpp-agent

Quickstart

The TK GUI is built on the llama-cpp-python, llama-cpp-agent, typer, and tkinter packages (install them with pip install ...).
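For example, a typical install might look like the following (tkinter usually ships with Python itself; on some Linux distributions it is provided by a system package such as python3-tk rather than pip):

```shell
pip install llama-cpp-python llama-cpp-agent typer
```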

The GUI is a self-contained Python script named LLAMA_TK_GUI.py. As long as its package dependencies are present, you can download and run it from wherever you like.

Command-line options:

- -m/--model: path to the model file.
- -f/--format: prompt template (default: CHATML; options: MIXTRAL, CHATML, VICUNA, LLAMA_2, SYNTHIA, NEURAL_CHAT, SOLAR, OPEN_CHAT, ALPACA, CODE_DS, B22, LLAMA_3, PHI_3).
- -s/--sysprompt: optional system prompt.
- -c/--context-length: context length (default: 2048).
- -t/--n-threads: number of threads (default: 4).
- -l/--mlock: use MLOCK instead of MMAP.

Everything runs on the CPU. An example invocation is sketched below.
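A sketch of a possible invocation; the model path and option values here are illustrative placeholders, not files shipped with the repository:

```shell
# Model path, format, context length, and thread count are placeholders;
# adapt them to your own setup.
python LLAMA_TK_GUI.py -m /path/to/model.gguf -f LLAMA_3 -c 4096 -t 8
```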

Enter your request in the bottom window and click Generate.

Inference can be interrupted using the Stop button.
