biondizzle / vllm
Files at commit a3e4e85ece3c01cf58ffe049540b988e3751001c
vllm / vllm / attention
History

Latest commit 7721ef1786 by Li, Jiang: [CI/Build][CPU] Fix CPU CI and remove all CPU V0 files (#20560)
Signed-off-by: jiang1.li <jiang1.li@intel.com>
2025-07-07 22:13:44 -07:00
backends       [CI/Build][CPU] Fix CPU CI and remove all CPU V0 files (#20560)                            2025-07-07 22:13:44 -07:00
ops            [CI/Build][CPU] Fix CPU CI and remove all CPU V0 files (#20560)                            2025-07-07 22:13:44 -07:00
utils          Quick Fix by adding conditional import for flash_attn_varlen_func in flash_attn (#20143)   2025-06-27 05:48:13 +00:00
__init__.py    [Misc] Add SPDX-FileCopyrightText (#19100)                                                 2025-06-03 11:20:17 -07:00
layer.py       [V1] Support any head size for FlexAttention backend (#20467)                              2025-07-06 09:54:36 -07:00
selector.py    [V1] Support any head size for FlexAttention backend (#20467)                              2025-07-06 09:54:36 -07:00