This Jupyter notebook demonstrates how to run the Meta-Llama-3 model on Apple silicon Macs, as described in my Medium post. It includes examples of generating responses from simple prompts and works through more complex scenarios, such as solving mathematical problems.
- Apple Mac with M1, M2, or M3 chip
- macOS Monterey or later
- Python 3.x
- Required Python packages: `ipywidgets`, `torch`, `mlx-lm`
Clone this repository and install the necessary packages:
```shell
pip install ipywidgets torch mlx-lm
```
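Once the packages are installed, generation boils down to loading a model and passing it a prompt. Instruct-tuned Llama-3 models expect a specific chat format with special header tokens, so the sketch below builds such a prompt by hand (in practice, mlx-lm's tokenizer can apply the template for you). The mlx-community repo id in the usage comment is an assumption; substitute whichever converted model you download:

```python
def format_llama3_prompt(messages):
    """Build a Meta-Llama-3 Instruct chat prompt from (role, content) pairs.

    Roles are typically "system", "user", and "assistant". This mirrors the
    Llama-3 chat template; mlx-lm's tokenizer can also apply it via
    tokenizer.apply_chat_template.
    """
    prompt = "<|begin_of_text|>"
    for role, content in messages:
        prompt += f"<|start_header_id|>{role}<|end_header_id|>\n\n{content}<|eot_id|>"
    # Cue the model to respond as the assistant.
    prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt


# Hypothetical usage with mlx-lm (requires an Apple silicon Mac; the model
# is downloaded on first use, and the repo id below is an assumption):
#
# from mlx_lm import load, generate
# model, tokenizer = load("mlx-community/Meta-Llama-3-8B-Instruct-4bit")
# prompt = format_llama3_prompt([("user", "What is 17 * 23?")])
# print(generate(model, tokenizer, prompt=prompt, max_tokens=128))
```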