
[WIP] Add support for flex attention (paged attention) #35419

Draft
blzheng wants to merge 15 commits into huggingface:main from blzheng:beilei/enable_flex_attn