vllm / vllm / attention (at commit 67a6882da474a45dde0d35b3789e096e7bd0fd4e)

Latest commit: 55137e8ee3 by ErkinSagiroglu, "Fix: MI100 Support By Bypassing Custom Paged Attention" (#9560), 2024-10-26 12:12:57 +00:00
Name          Last commit message                                                          Last commit date
backends      Fix: MI100 Support By Bypassing Custom Paged Attention (#9560)              2024-10-26 12:12:57 +00:00
ops           [Hardware][CPU] using current_platform.is_cpu (#9536)                       2024-10-22 00:50:43 -07:00
__init__.py   [Core] Add AttentionState abstraction (#7663)                               2024-08-20 18:50:45 +00:00
layer.py      [Kernel] Support sliding window in flash attention backend (#9403)          2024-10-20 10:57:52 -07:00
selector.py   [Hardware][openvino] is_openvino --> current_platform.is_openvino (#9716)   2024-10-26 10:59:06 +00:00
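
The listing suggests how the package divides responsibilities: layer.py holds the model-facing attention layer, selector.py picks a backend for the current platform, and backends/ and ops/ hold the per-hardware implementations and kernels. As a quick orientation only, the sketch below assumes that vLLM around this commit re-exports these names from vllm/attention/__init__.py (the AttentionState name comes from commit #7663 above); exact re-exports and signatures vary between vLLM versions, so treat this as a hedged sketch rather than a verified API listing.

# orientation_sketch.py
# Assumption: vLLM near commit 67a6882 (late 2024); the re-exported names
# below are inferred from the listing and commit messages, not guaranteed.
from vllm.attention import (
    Attention,         # defined in layer.py: the model-facing attention module
    AttentionBackend,  # defined under backends/: per-hardware backend interface
    AttentionState,    # the abstraction added by #7663
    get_attn_backend,  # defined in selector.py: chooses a backend at runtime
)

# Inspect where each name is actually defined, mirroring the directory layout.
print(Attention.__module__)         # expected: vllm.attention.layer
print(get_attn_backend.__module__)  # expected: vllm.attention.selector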