biondizzle/vllm
6,868 Commits · 2 Branches · 140 Tags
Commit Graph: 2 Commits (at c55d8046723325e09521a24ac076a8a7e64eaa52)
27bebcd897  Convert examples to ruff-format (#18400)
    Author: Harry Mellor
    Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
    Date: 2025-05-26 16:57:54 +00:00
60f7624334  Implements dual-chunk-flash-attn backend for dual chunk attention with sparse attention support (#11844)
    Author: Tao He
    Date: 2025-05-12 19:52:47 -07:00