biondizzle / vllm
Files at commit 34ddcf9ff433a9a80657ec507a57c049d7ef8183
vllm / vllm / attention
Latest commit: 2dd72d23d9 by weiliang — update flashinfer to v0.2.9rc1 (#21485)
Signed-off-by: Weiliang Liu <weiliangl@nvidia.com>
2025-07-24 14:06:11 -07:00
backends      update flashinfer to v0.2.9rc1 (#21485)                                 2025-07-24 14:06:11 -07:00
ops           [V0 Deprecation] Deprecate BlockSparse Attention & Phi3-Small (#21217)  2025-07-19 13:53:17 -07:00
utils         [MISC] Add init files for python package (#20908)                       2025-07-15 12:16:33 +00:00
__init__.py   [Misc] Add SPDX-FileCopyrightText (#19100)                              2025-06-03 11:20:17 -07:00
layer.py      [V1] Fix local chunked attention always disabled (#21419)               2025-07-23 15:59:30 -07:00
selector.py   [V0 Deprecation] Deprecate BlockSparse Attention & Phi3-Small (#21217)  2025-07-19 13:53:17 -07:00