🎉 flash-attention-mma 0.0.1
What's Changed
- [HGEMM] CuTe HGEMM debug Makefile target by @DefTruth in https://github.com/DefTruth/CUDA-Learn-Notes/pull/154
- [Softmax] Update Online Softmax bindings by @DefTruth in https://github.com/DefTruth/CUDA-Learn-Notes/pull/155
- [FlashAttention] Refactor toy-flash-attn codes part-1 by @DefTruth in https://github.com/DefTruth/CUDA-Learn-Notes/pull/156
- [Bug] Fix typo by @wjj19950828 in https://github.com/DefTruth/CUDA-Learn-Notes/pull/157
- [FlashAttention] Release flash-attention-mma 0.0.1 🎉 by @DefTruth in https://github.com/DefTruth/CUDA-Learn-Notes/pull/158
New Contributors
- @wjj19950828 made their first contribution in https://github.com/DefTruth/CUDA-Learn-Notes/pull/157
Full Changelog: DefTruth/CUDA-Learn-Notes@v2.6.5...v2.6.6