
master-1c168d9

github-actions released this 23 Nov 06:04
1c168d9
fix: repair flash attention support (#386)

* repair flash attention in _ext; this does not fix the currently broken flash attention behind the compile-time define, which is only used by the VAE

Co-authored-by: FSSRepo <[email protected]>

* make flash attention in the diffusion model a runtime flag; no support for SD3 or video yet (see the sketch at the end of these notes)

* remove the old flash attention option and switch the VAE over to attn_ext

* update docs

* format code

---------

Co-authored-by: FSSRepo <[email protected]>
Co-authored-by: leejet <[email protected]>
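
As a rough illustration of the runtime-flag change above, the sketch below dispatches between a flash-attention path and the standard attention path based on a boolean checked at inference time rather than a compile-time define. It is a minimal standalone example; `DiffusionConfig`, `run_attention`, and the flag name `flash_attn` are hypothetical names for illustration only and do not reflect the project's actual API.

```cpp
// Hypothetical sketch only: these names are illustrative, not the library's API.
#include <cstdio>

// Runtime configuration for the diffusion model; the flash-attention switch
// is a plain bool consulted during inference, not a build-time define.
struct DiffusionConfig {
    bool flash_attn = false;  // assumed equivalent of the new runtime flag
};

// Dispatch between a fused flash-attention kernel and the standard
// softmax(Q K^T / sqrt(d)) V path based on the runtime flag.
void run_attention(const DiffusionConfig& cfg) {
    if (cfg.flash_attn) {
        std::printf("using flash attention kernel\n");
        // ... call the fused flash-attention kernel here ...
    } else {
        std::printf("using standard attention\n");
        // ... compute standard attention here ...
    }
}

int main() {
    DiffusionConfig cfg;
    cfg.flash_attn = true;  // enabled per run, e.g. from a CLI option
    run_attention(cfg);
    return 0;
}
```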