
RuntimeError: FlashAttention only supports Ampere GPUs or newer. #84

Open
nizhidao opened this issue Dec 22, 2024 · 6 comments

nizhidao commented Dec 22, 2024

OS: Windows; GPU: RTX 2080 Ti 22 GB

It happens when I try to generate. Can it not run on my old Turing GPU? 😢

Can FlashAttention be disabled? I can accept a lower speed.

@cjjkoko
Copy link

cjjkoko commented Dec 22, 2024

os.environ['ATTN_BACKEND'] = 'xformers'
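For anyone landing here, a minimal sketch of where that line has to go, assuming the layout of the repo's example script: the variable should be set before the trellis modules are imported, since the backend is typically chosen at import time. The pipeline class and model id below are taken from the project README and are assumptions, not verified against your checkout.

```python
import os

# Select the attention backend before any trellis imports; the setting is
# typically read once when the attention modules are first imported.
# 'xformers' avoids the FlashAttention path, which requires an Ampere
# (compute capability >= 8.0) GPU.
os.environ['ATTN_BACKEND'] = 'xformers'

# Pipeline class and model id as in the project README (assumption).
from trellis.pipelines import TrellisImageTo3DPipeline

pipeline = TrellisImageTo3DPipeline.from_pretrained("JeffreyXiang/TRELLIS-image-large")
pipeline.cuda()
```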

nizhidao (Author) commented Dec 22, 2024

> os.environ['ATTN_BACKEND'] = 'xformers'

Thanks, but now there is a new problem: ModuleNotFoundError: No module named 'diff_gaussian_rasterization'

It happens during generation, at the rendering step. How can I install this dependency?
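A quick diagnostic that can help here (my own, nothing repo-specific): diff_gaussian_rasterization is a compiled CUDA extension, so a ModuleNotFoundError usually means it was never built or was installed into a different Python environment than the one running the pipeline.

```python
# Diagnostic sketch: check which interpreter is running and whether the
# compiled extension is visible to it.
import importlib.util
import sys

print(sys.executable)  # the Python environment the pipeline actually uses
spec = importlib.util.find_spec("diff_gaussian_rasterization")
print(spec.origin if spec else "diff_gaussian_rasterization is not installed here")
```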

nizhidao (Author) commented Dec 23, 2024

Duplicate of #26. I found the same issue there. My CUDA version is 12.7; should I downgrade to 11.8?

cjjkoko commented Dec 23, 2024

> Duplicate of #26. I found the same issue there. My CUDA version is 12.7; should I downgrade to 11.8?

See the Kaolin installation notes: https://kaolin.readthedocs.io/en/latest/notes/installation.html

cjjkoko commented Dec 23, 2024

Maybe; a cu121 build is a good choice.
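One caveat worth noting: the "12.7" is likely the driver's reported CUDA version (as shown by nvidia-smi), not an installed toolkit, so what matters is the CUDA version your PyTorch build was compiled against. A quick check using only standard PyTorch calls:

```python
import torch

print(torch.__version__)   # e.g. '2.4.0+cu121' -> built against CUDA 12.1
print(torch.version.cuda)  # CUDA version this PyTorch build was compiled with
print(torch.cuda.get_device_capability(0))  # (7, 5) on a 2080 Ti (Turing);
                                            # FlashAttention needs (8, 0)+ (Ampere)
```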
