lori_server
is an intermediary server for a locally run LLM. It implements web crawling, sandboxed file access, and other useful chatbot integrations for ollama-chat using a plugin architecture.
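The plugin architecture itself is not documented here. As a rough illustration only, a registry-and-dispatch pattern like the following is one common way to wire up integrations such as web crawling or file access; all class and method names below are hypothetical and not part of lori_server's actual API:

```python
from abc import ABC, abstractmethod


class Plugin(ABC):
    """Hypothetical base class for a lori_server-style plugin (illustrative sketch)."""

    name: str  # unique identifier used to route requests to this plugin

    @abstractmethod
    def handle(self, request: str) -> str:
        """Process a chatbot request and return a response fragment."""


class EchoPlugin(Plugin):
    """Trivial example plugin that echoes its input back."""

    name = "echo"

    def handle(self, request: str) -> str:
        return f"echo: {request}"


class PluginRegistry:
    """Keeps registered plugins and dispatches requests by plugin name."""

    def __init__(self) -> None:
        self._plugins: dict[str, Plugin] = {}

    def register(self, plugin: Plugin) -> None:
        self._plugins[plugin.name] = plugin

    def dispatch(self, name: str, request: str) -> str:
        # Look up the plugin by name and delegate the request to it.
        return self._plugins[name].handle(request)


registry = PluginRegistry()
registry.register(EchoPlugin())
print(registry.dispatch("echo", "hello"))  # -> echo: hello
```

In this pattern the intermediary server stays small: each integration lives in its own plugin class, and new capabilities are added by registering another plugin rather than modifying the server core.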
This project is currently in alpha and is not usable for anything beyond casual chat with a local Ollama model, with some Markdown rendering and syntax highlighting. Expect breaking changes.