🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading
Slicing a PyTorch Tensor Into Parallel Shards
Large-scale 4D-parallel pre-training for 🤗 transformers Mixture-of-Experts models *(still a work in progress)*
JORA: JAX Tensor-Parallel LoRA Library
Tensor Parallelism with JAX + Shard Map
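To illustrate the core idea behind the repositories above, here is a minimal, dependency-free sketch of column-parallel sharding: a weight matrix is split column-wise across "devices", each device computes its partial matvec, and the outputs are concatenated. All function names here are illustrative; real implementations would use e.g. `torch.chunk` or JAX `shard_map` over actual device meshes.

```python
# Illustrative sketch only: column-parallel tensor sharding with plain
# Python lists standing in for device-local tensors.

def shard_columns(matrix, num_shards):
    """Split a 2-D matrix (list of rows) into `num_shards` column blocks."""
    cols = len(matrix[0])
    assert cols % num_shards == 0, "columns must divide evenly across shards"
    width = cols // num_shards
    return [
        [row[i * width:(i + 1) * width] for row in matrix]
        for i in range(num_shards)
    ]

def matvec(matrix, vec):
    """y = x @ W for a row vector x and matrix W (rows of W indexed by x)."""
    rows, cols = len(matrix), len(matrix[0])
    return [sum(vec[r] * matrix[r][c] for r in range(rows)) for c in range(cols)]

def column_parallel_matvec(vec, matrix, num_shards):
    """Each 'device' multiplies against its own column shard; the shard
    outputs are disjoint slices of the result, so concatenation recovers
    the full, unsharded product with no reduction needed."""
    partials = [matvec(shard, vec) for shard in shard_columns(matrix, num_shards)]
    return [v for p in partials for v in p]

W = [[1, 2, 3, 4],
     [5, 6, 7, 8]]
x = [1, 1]

print(column_parallel_matvec(x, W, 2))  # → [6, 8, 10, 12], same as matvec(W, x)
```

Row-wise sharding is the dual design choice: each device then holds full output columns but only partial sums, so an all-reduce (rather than a concatenation) is needed to combine results.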