Tagging using LLM (Jupyter Notebook, updated Dec 20, 2023)
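As an illustration of what tagging with an LLM can look like, here is a minimal sketch that prompts an instruction-tuned Mistral model for topic tags. The Hugging Face transformers library, the checkpoint name, and the prompt wording are assumptions for this example, not details taken from the notebook above.

```python
# Minimal sketch: ask an instruction-tuned LLM to emit tags for a piece of text.
# Assumes the Hugging Face `transformers` library and a local GPU; the checkpoint
# and prompt are illustrative, not taken from the repository above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-Instruct-v0.2"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

text = "LangChain adds retrieval-augmented generation on top of local LLMs."
messages = [{"role": "user",
             "content": f"Return 3 short topic tags, comma-separated, for:\n{text}"}]

# Build the chat prompt, generate deterministically, and decode only the new tokens.
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True,
                                       return_tensors="pt").to(model.device)
output = model.generate(inputs, max_new_tokens=30, do_sample=False)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```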
Reference implementation of Mistral AI 7B v0.1 model.
Notes on the Mistral AI model
Examples of RAG using LangChain with local LLMs - Mixtral 8x7B, Llama 2, Mistral 7B, Orca 2, Phi-2, Neural 7B
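For context, a minimal LangChain RAG loop with a local model might look like the sketch below; it assumes the langchain-community and faiss-cpu packages plus an Ollama server with a "mistral" model pulled, none of which is specified by the repository above.

```python
# Minimal RAG sketch with LangChain and a local model served by Ollama.
# Assumes `langchain-community`, `faiss-cpu`, and a running Ollama server;
# the documents and model names are illustrative only.
from langchain_community.llms import Ollama
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import FAISS

docs = [
    "Mixtral 8x7B is a sparse mixture-of-experts model from Mistral AI.",
    "Phi-2 is a 2.7B-parameter small language model from Microsoft.",
]
# Index the documents, retrieve the closest one, and answer from that context.
vectorstore = FAISS.from_texts(docs, OllamaEmbeddings(model="mistral"))
llm = Ollama(model="mistral")

question = "What kind of model is Mixtral 8x7B?"
context = "\n".join(d.page_content for d in vectorstore.similarity_search(question, k=1))
print(llm.invoke(f"Answer using only this context:\n{context}\n\nQuestion: {question}"))
```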
This repository contains a web application with a chat interface for interacting with Dolphin2.7-Mixtral-8x7b AI model.
Train LLMs (BLOOM, LLaMA, Baichuan2-7B, ChatGLM3-6B) with DeepSpeed pipeline mode. Faster than ZeRO/ZeRO++/FSDP.
A Python module for running the Mixtral-8x7B language model with customisable precision and attention mechanisms.
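A common way to make precision and the attention backend configurable when running Mixtral-8x7B is through the transformers loading options. The sketch below is an assumed illustration of that general pattern (transformers, bitsandbytes, a CUDA GPU), not the module's actual API.

```python
# Sketch: load Mixtral-8x7B with a chosen precision and attention backend.
# Assumes `transformers`, `bitsandbytes`, and sufficient GPU memory; the options
# the repository actually exposes may differ from this generic pattern.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

# 4-bit NF4 quantisation keeps the ~47B-parameter model within a single-GPU budget.
quant = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_quant_type="nf4",
                           bnb_4bit_compute_dtype=torch.float16)

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant,      # or torch_dtype=torch.float16 for plain fp16
    attn_implementation="sdpa",     # "eager", "sdpa", or "flash_attention_2"
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
```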
A Streamlit-based job offer application featuring interactive visualizations and chatbot/Q&A bots powered by the Mistral AI Mixtral-8x7B LLM, with offers stored in a SQLite data warehouse and Dockerised.
Examples of RAG using Llamaindex with local LLMs in Linux - Gemma, Mixtral 8x7B, Llama 2, Mistral 7B, Orca 2, Phi-2, Neural 7B
Examples of RAG using Llamaindex with local LLMs - Gemma, Mixtral 8x7B, Llama 2, Mistral 7B, Orca 2, Phi-2, Neural 7B
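Similarly, a minimal LlamaIndex RAG example with local models served by Ollama could look like the following sketch; the package names, documents, and model choices are assumptions for illustration, not taken from these repositories.

```python
# Minimal LlamaIndex RAG sketch with local models served by Ollama.
# Assumes `llama-index-core`, `llama-index-llms-ollama`, and
# `llama-index-embeddings-ollama`, plus a running Ollama server.
from llama_index.core import Document, Settings, VectorStoreIndex
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.ollama import Ollama

# Route both generation and embeddings through local Ollama models.
Settings.llm = Ollama(model="mistral", request_timeout=120.0)
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

index = VectorStoreIndex.from_documents([
    Document(text="Gemma is a family of open models released by Google."),
    Document(text="Orca 2 is a small reasoning-focused model from Microsoft."),
])
print(index.as_query_engine().query("Who released Gemma?"))
```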
Train an LLM locally with your own data.
Bash scripts for in-line CLI review, editing and execution of Bash commands suggested by an LLM in response to a user's requests for assistance.
Aditya, an AI assistant that uses Mixtral-8x7B as the base LLM and other tools to get the job done.
Tool for testing different large language models without code.
Like grep but for natural language questions. Based on Mistral 7B or Mixtral 8x7B.
Unofficial .NET SDK for the Mistral AI platform.
Turn any YouTube video into a nice blog post, using Groq and Deepgram.
LLM prompt augmentation with RAG, integrating external custom data from a variety of sources to allow chatting with those documents.
Newly updated! Open-source Groq-LangChain-Streamlit-Mixtral 8x7B-Llama 2 AI chatbot. New code adds a Docker dev container and a Streamlit folder for easy installation, so the app can be launched and run on GitHub Codespaces.