Allow installation/use without flash attention dependency #3
Merged
pengzhangzhi merged 4 commits into pengzhangzhi:main on Dec 3, 2024
Conversation
Owner
Nice catch! Did you run the example code in the README? Did it pass? I'd love to merge if the code runs! Thank you :)
Contributor
Author
I was able to install and run the example code after making these changes, and it seemed to work fine. I haven't tried running your tests yet but would be happy to do that; will report back!
Owner
That's amazing! Thanks, Alex! Cheers,
Owner
Beautiful! I think we can still get a ~30% reduction if we use the PyTorch SDPA. Not bad :)
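For context, PyTorch's `torch.nn.functional.scaled_dot_product_attention` (available since PyTorch 2.0) computes softmax(QKᵀ/√d)V and dispatches to a fused kernel when one applies, which is what makes it a reasonable fallback when `flash-attn` is absent. A minimal pure-Python reference of the computation it performs (illustrative only; real code would call the PyTorch API on tensors):

```python
import math

def scaled_dot_product_attention(q, k, v):
    """Reference scaled dot-product attention on nested lists.

    q, k, v have shape [seq_len][head_dim]; returns softmax(QK^T/sqrt(d)) V.
    """
    d = len(q[0])
    # scores[i][j] = (q_i . k_j) / sqrt(d)
    scores = [[sum(qi * kj for qi, kj in zip(q_row, k_row)) / math.sqrt(d)
               for k_row in k] for q_row in q]
    out = []
    for row in scores:
        # numerically stable softmax over each score row
        m = max(row)
        exps = [math.exp(s - m) for s in row]
        z = sum(exps)
        weights = [e / z for e in exps]
        # weighted sum of value rows
        out.append([sum(w * v_row[c] for w, v_row in zip(weights, v))
                    for c in range(len(v[0]))])
    return out
```

This is only a readability aid; in practice `F.scaled_dot_product_attention(q, k, v)` on batched tensors is the drop-in path the PR discussion refers to.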

Nice repo! Thanks for sharing.
Apologies in advance if I misunderstood something, but I wanted to use `faesm` with just the SDPA upgrade and found that I was unable to install or run it without `flash-attn`. I made a couple of small changes as a workaround:

- `setup.py`: make `flash-attn` an optional dependency
- make the imports of the `rotary` and `utils` modules conditional on having `flash-attn` installed
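The conditional-import part of the workaround can be sketched roughly as follows (a hypothetical illustration; the module and class names here are assumptions, not necessarily the exact ones used in `faesm`):

```python
# Guard the optional flash-attn imports so the package still loads without it.
try:
    # flash-attn's fused rotary-embedding implementation, used when available
    from flash_attn.layers.rotary import RotaryEmbedding
    FLASH_ATTN_AVAILABLE = True
except ImportError:
    # flash-attn not installed: fall back to the pure-PyTorch SDPA path
    RotaryEmbedding = None
    FLASH_ATTN_AVAILABLE = False
```

On the `setup.py` side, the usual way to make a dependency optional is setuptools' `extras_require`, e.g. `extras_require={"flash-attn": ["flash-attn"]}` so users who want the fast path can `pip install faesm[flash-attn]` (the extra's name is an assumption here).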