biondizzle/vllm · vllm/attention · at commit fe743b798dfa56aea3e2cb7182365ba3495489ee
Latest commit: fe743b798d by youkaichao, [bugfix] fix early import of flash attention (#12959), Signed-off-by: youkaichao <youkaichao@gmail.com>, 2025-02-09 00:06:56 +08:00
backends/      [bugfix] fix early import of flash attention (#12959)                               2025-02-09 00:06:56 +08:00
ops/           [ROCM][AMD][TRITON] Halving warps number for fw_prefill to reduce spilling (#12713) 2025-02-05 03:58:22 +00:00
__init__.py    [Misc] Add SPDX-License-Identifier headers to python source files (#12628)          2025-02-02 11:58:18 -08:00
layer.py       Merging PR #12536                                                                   2025-02-05 13:24:26 -08:00
selector.py    [Misc] Add SPDX-License-Identifier headers to python source files (#12628)          2025-02-02 11:58:18 -08:00