ryyzn9 / flash-linear-attention
This project is forked from sustcsonglin/flash-linear-attention.
Efficient implementations of state-of-the-art linear attention models in PyTorch and Triton.
License: MIT
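The key idea behind linear attention, which this repository implements, is to replace the softmax with a feature map φ so that attention can be computed as φ(Q)(φ(K)ᵀV) instead of (φ(Q)φ(K)ᵀ)V, reducing the cost from quadratic to linear in sequence length. A minimal NumPy sketch of this associativity trick (using an illustrative ReLU feature map and hypothetical function names, not the repository's actual API):

```python
import numpy as np

def linear_attention(q, k, v, eps=1e-6):
    """Non-causal linear attention via the kernel trick.

    q, k: (L, d) queries and keys; v: (L, d_v) values.
    Illustrative sketch only; real models use other feature maps
    (e.g. elu(x) + 1) and causal/chunked variants.
    """
    # Feature map phi: simple ReLU keeps scores non-negative.
    q, k = np.maximum(q, 0.0), np.maximum(k, 0.0)
    # Associativity: phi(Q) @ (phi(K).T @ V) avoids the (L, L) matrix,
    # so the cost is O(L * d * d_v) instead of O(L^2 * d).
    kv = k.T @ v                        # (d, d_v) summary of keys/values
    z = q @ k.sum(axis=0) + eps         # (L,) row-wise normalizer
    return (q @ kv) / z[:, None]        # (L, d_v)
```

The output matches the quadratic formulation `(phi(Q) phi(K)^T) V` with row normalization, but never materializes the L×L attention matrix, which is what makes chunked/recurrent Triton kernels for these models efficient.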