WordPress LEMP stack with PHP 8.2, Composer, WP-CLI and more
Updated May 30, 2024 · Jinja
🤖 Neovim code suggestion and completion (just like GitHub Copilot, but locally using Ollama)
A simple, private transcription tool that segments speakers and converts audio to text.
Local-GenAI-Search is a generative search engine based on Llama 3, langchain and qdrant that answers questions based on your local files
Work with LLMs on a local environment using containers
The Open Science community in Uppsala
An automated local cluster setup with TLS, monitoring, ingress, and DNS configuration.
🔹 A Home Assistant integration to handle Tuya devices locally (a fork of localtuya)
Fully-featured, beautiful web interface for Ollama LLMs - built with NextJS. Deploy with a single click.
Supabase CLI. Manage Postgres migrations, run Supabase locally, deploy edge functions, back up Postgres, and generate types from your database schema.
A lightweight Python-based software package for daily use
A flexible, configuration-driven CLI for Apache Airflow local development
Utility provider used to manage local resources, such as creating files.
Wingman is the fastest and easiest way to run Llama models on your PC or Mac.
ezlocalai is an easy-to-set-up local artificial intelligence server with OpenAI-style endpoints.
Lowkey Vault is a small test double for Azure Key Vault. Developer feedback is needed; please vote here: https://github.com/nagyesta/lowkey-vault/discussions/272