Repository for running LLMs efficiently on Mac silicon (M1, M2, M3). Features Jupyter notebook for Meta-Llama-3 setup using MLX framework, with install guide & perf tips. Aims to optimize LLM performance on Mac silicon for devs & researchers.

GusLovesMath/Llama3_MacSilicon

Meta-Llama-3 on Mac Silicon

Overview

This Jupyter notebook demonstrates how to run the Meta-Llama-3 model on Apple silicon devices (M1, M2, M3), as described in the accompanying Medium post. It includes examples of generating responses from simple prompts and delves into more complex scenarios, such as solving mathematical problems.

Requirements

  • Apple Mac with M1, M2, or M3 chip
  • macOS Monterey or later
  • Python 3.x
  • Required Python packages: ipywidgets, torch, mlx-lm

Setup

Clone this repository, then install the required packages:

pip install ipywidgets torch mlx-lm
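Once the packages are installed, the notebook loads a model and generates text with the `mlx_lm` `load` and `generate` helpers. The sketch below shows the general shape; the 4-bit model ID is an assumption (any Llama-3 build from the mlx-community Hugging Face organization should work), and the chat-template helper is a hypothetical convenience function, not part of mlx-lm:

```python
def format_llama3_prompt(user_message: str) -> str:
    """Wrap a user message in the Llama-3 Instruct chat template."""
    return (
        "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"
        f"{user_message}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )


def run_demo() -> None:
    """Load a Llama-3 checkpoint and generate a completion.

    Requires Apple silicon with mlx-lm installed; the model ID below
    is an example, not the only option.
    """
    from mlx_lm import load, generate

    model, tokenizer = load("mlx-community/Meta-Llama-3-8B-Instruct-4bit")
    prompt = format_llama3_prompt("What is 12 * 17? Show your work.")
    response = generate(model, tokenizer, prompt=prompt, max_tokens=256)
    print(response)
```

In a notebook cell, calling `run_demo()` downloads the checkpoint from Hugging Face on first use and prints the model's answer.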

MLX Community on Hugging Face
