
add sm_margin for hopper flash_attn_qkvpacked_func #1603

Merged
tridao merged 1 commit into Dao-AILab:main from TopIdiot:feature/qkv_packed_sm_margin
Apr 22, 2025
Conversation

@TopIdiot (Contributor)

No description provided.
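Although no description was provided, the title indicates the change threads the `sm_margin` argument through the Hopper (FA3) `flash_attn_qkvpacked_func` wrapper. In the FA3 interface, `sm_margin` reserves a number of SMs that the attention kernel will not occupy, leaving them free for a concurrent kernel such as communication. A minimal usage sketch; the module path, argument names, and packed-QKV layout `(batch, seqlen, 3, nheads, headdim)` follow the FA3 interface conventions, but treat the exact signature as an assumption:

```python
import torch
from flash_attn_interface import flash_attn_qkvpacked_func  # Hopper (FA3) interface

# Packed QKV tensor: (batch, seqlen, 3, nheads, headdim)
qkv = torch.randn(2, 4096, 3, 16, 128, device="cuda", dtype=torch.bfloat16)

# sm_margin=16 asks the kernel to leave 16 SMs idle so another kernel
# (e.g. an overlapped all-gather) can run alongside the attention kernel;
# sm_margin=0 (the default) uses every available SM.
result = flash_attn_qkvpacked_func(qkv, causal=True, sm_margin=16)

# Some FA3 interface versions return (out, softmax_lse); handle both.
out = result[0] if isinstance(result, tuple) else result
```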

@tridao tridao merged commit 75f90d6 into Dao-AILab:main Apr 22, 2025
playerzer0x pushed a commit to Liqhtworks/flash-attention that referenced this pull request on Jul 24, 2025:
Co-authored-by: yowenchen <yowenchen@tencent.com>
elewarr pushed a commit to elewarr/flash-attention that referenced this pull request on Feb 4, 2026:
Co-authored-by: yowenchen <yowenchen@tencent.com>