🎉CUDA notes / hand-written CUDA kernels for large models / C++ notes, updated occasionally: flash_attn, sgemm, sgemv, warp reduce, block reduce, dot product, elementwise, softmax, layernorm, rmsnorm, hist, etc.
Updated Jun 3, 2024 · Cuda
Code for the paper "On the Expressivity Role of LayerNorm in Transformers' Attention" (Findings of ACL'2023)
Implementation of a layer-normalized GRU in PyTorch
MNIST Digit Prediction using Batch Normalization, Group Normalization, Layer Normalization and L1-L2 Regularizations
WGAN with feedback from the discriminator & LayerNorm instead of BatchNorm
Fundamentals of Artificial Intelligence and Deep Learning Frameworks
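All of the repositories above revolve around layer normalization. As a shared reference point, here is a minimal NumPy sketch of the operation itself (the function name and shapes are illustrative, not taken from any listed repo): normalize each row to zero mean and unit variance over the last axis, then apply a learned elementwise affine transform.

```python
import numpy as np

def layernorm(x, gamma, beta, eps=1e-5):
    # Normalize over the last axis, then apply the learned scale/shift.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

x = np.array([[1.0, 2.0, 3.0]])
out = layernorm(x, gamma=np.ones(3), beta=np.zeros(3))
# Each row of `out` has (approximately) zero mean and unit variance.
```

The `eps` term guards against division by zero for near-constant rows; PyTorch's `torch.nn.LayerNorm` uses the same default of `1e-5`.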