8b5014d3dd · Lucas Wilkinson · 2026-03-01 23:44:57 +00:00
  [Attention] FA4 integration (#32974)
  Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>
  Signed-off-by: Matthew Bonanni <mbonanni@redhat.com>
  Signed-off-by: Lucas Wilkinson <LucasWilkinson@users.noreply.github.com>
  Co-authored-by: Matthew Bonanni <mbonanni@redhat.com>
  Co-authored-by: Tyler Michael Smith <tyler@neuralmagic.com>

c7914d30f9 · Lucas Wilkinson · 2026-02-11 07:07:56 -08:00
  Reapply [Attention][FA3] Update FA3 to include new swizzle optimization (#34043)
  Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>

e3bf79ffa0 · Luka Govedič · 2026-02-04 19:54:27 -08:00
  Revert "[Attention][FA3] Update FA3 to include new swizzle optimization" (#33841)

2267cb1cfd · Lucas Wilkinson · 2026-02-03 08:08:47 -08:00
  [Attention][FA3] Update FA3 to include new swizzle optimization (#23465)
  Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>

be6a81f31b · Lucas Wilkinson · 2026-01-07 23:24:18 -08:00
  [chore] Update FA commit (#30460)
  Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>

c68c7b403d · Lucas Wilkinson · 2025-11-21 13:58:32 -08:00
  [BugFix] Fix missing symbol triggering FA2 fallback on Hopper (#29107)
  Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>

4c23690f43 · Matthew Bonanni · 2025-11-18 20:06:21 -08:00
  [Attention] FlashAttention ViT support, make default backend (#28763)
  Signed-off-by: Matthew Bonanni <mbonanni@redhat.com>

8cc40f8992 · Matthew Bonanni · 2025-11-14 09:13:37 -08:00
  [Attention] Bump FA for removed method (#28429)
  Signed-off-by: Matthew Bonanni <mbonanni@redhat.com>
  Co-authored-by: Cyrus Leung <tlleungac@connect.ust.hk>

9c84ca8293 · Jonas M. Kübler · 2025-11-10 12:06:04 -08:00
  [FA/Chore] Bump FA version for FP8 two-level accumulation (#27889)
  Signed-off-by: Jonas Kuebler <kuebj@amazon.com>
  Co-authored-by: Lucas Wilkinson <LucasWilkinson@users.noreply.github.com>

9f020f4f31 · Lucas Wilkinson · 2025-10-18 12:44:39 -06:00
  [BugFix] Fix failing gemma-3-1b-it test: test_lm_eval_accuracy_v1_engine[google/gemma-3-1b-it] (#27111)
  Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>

3b736e1c38 · Ming Yang · 2025-10-09 08:06:29 -07:00
  [Attention][DCP] Support DCP with query length > 1 (MTP) with FA3 (#25049)
  Signed-off-by: Ming Yang <minos.future@gmail.com>

418d111f8c · Lucas Wilkinson · 2025-10-02 11:06:14 -04:00
  [FA/Chore] Bump vllm-flash-attention (#25537)
  Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>

402759d472 · Lucas Wilkinson · 2025-09-04 02:47:59 -07:00
  [Attention] FlashAttn MLA (#14258)
  Signed-off-by: Lucas Wilkinson <lwilkinson@neuralmagic.com>
  Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>
  Signed-off-by: Matthew Bonanni <mbonanni001@gmail.com>
  Co-authored-by: Matthew Bonanni <mbonanni001@gmail.com>
  Co-authored-by: Matthew Bonanni <mbonanni@redhat.com>

292084e72a · Lucas Wilkinson · 2025-08-17 08:52:04 -07:00
  [BugFix] Fix for IMA in FA3 varlen combine (#22967)
  Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>

177e55e3bd · Lucas Wilkinson · 2025-08-15 17:41:07 -04:00
  [Attention] FA3 Attention Sinks Perf Boost (#22478)
  Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>

bd875d2eb7 · Thomas Parnell · 2025-08-08 16:10:25 -07:00
  [Bugfix] Update FA commit hash (#22546)
  Signed-off-by: Thomas Parnell <tpa@zurich.ibm.com>

2cb6ef8996 · Lucas Wilkinson · 2025-08-06 08:03:03 -07:00
  [BugFix] Fix FA2 RuntimeError when sinks is provided (#22365)
  Signed-off-by: LucasWilkinson <lwilkinson@neuralmagic.com>

e3c876dca3 · Woosuk Kwon · 2025-08-05 21:36:21 -07:00
  Upgrade FA3 for attention sink (#22313)
  Signed-off-by: Woosuk Kwon <woosuk.kwon@berkeley.edu>

8acb4badee · Woosuk Kwon · 2025-07-01 09:07:36 -07:00
  [CUDA graphs] Enable full cuda graphs with FA3 AoT scheduling (#20301)
  Signed-off-by: Woosuk Kwon <woosuk.kwon@berkeley.edu>

0d06b533a0 · Eli Uriegas · 2025-06-24 22:44:10 +00:00
  cmake: Update vllm_flash_attn for vllm_kernels (#20032)
  Signed-off-by: Eli Uriegas <eliuriegas@meta.com>

a045b7e89a · Lucas Wilkinson · 2025-06-24 13:09:01 -04:00
  [Perf] Improve/Fix-regression for FA3 in High QPS regimes (#19463)
  Signed-off-by: Lucas Wilkinson <lwilkinson@neuralmagic.com>

07334959d8 · Lucas Wilkinson · 2025-06-17 12:32:49 +09:00
  [Wheel Size] Only build FA2 8.0+PTX (#19336)

a3896c7f02 · Luka Govedič · 2025-05-27 20:49:24 -04:00
  [Build] Fixes for CMake install (#18570)

b22980a1dc · yexin (叶鑫) · 2025-04-25 14:52:28 +08:00
  [Perf] Optimize rotary_emb implementation to use Triton operator for improved inference performance (#16457)
  Signed-off-by: cynthieye <yexin93@qq.com>
  Co-authored-by: MagnetoWang <magnetowang@outlook.com>

41ca7eb491 · Lucas Wilkinson · 2025-04-24 20:12:21 -07:00
  [Attention] FA3 decode perf improvement - single mma warp group support for head dim 128 (#16864)
  Signed-off-by: Lucas Wilkinson <lwilkinson@neuralmagic.com>

183dad7a85 · Lucas Wilkinson · 2025-04-17 15:14:07 -07:00
  [Attention] Update to latest FA3 code (#13111)
  Signed-off-by: Lucas Wilkinson <lwilkinson@neuralmagic.com>

a597a57595 · Mickaël Seznec · 2025-03-20 01:14:20 -04:00
  [Attention] Flash Attention 3 - fp8 (#14570)
  Signed-off-by: Mickael Seznec <mickael@mistral.ai>

ed6ea06577 · Pavani Majety · 2025-03-05 22:01:37 -08:00
  [Hardware] Update the flash attn tag to support Blackwell (#14244)

f95903909f · Lucas Wilkinson · 2025-02-27 10:35:08 +08:00
  [Kernel] FlashMLA integration (#13747)
  Signed-off-by: Lucas Wilkinson <lwilkinson@neuralmagic.com>
  Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>