biondizzle/vllm
Files at commit 947dfda9c281c2b2d779a29e73bbc20170dcfab3
Path: vllm/platforms
Latest commit: f355ad5412 by Fadi Arafeh
[CPU][FIX] Fix build failures on Arm CPUs with torch nightly (#30481)
Signed-off-by: Fadi Arafeh <fadi.arafeh@arm.com>
2025-12-12 02:09:25 +00:00
File          Latest commit                                                                Date
__init__.py   [TPU] Rename path to tpu platform (#28452)                                   2025-11-11 19:16:47 +00:00
cpu.py        [CPU][FIX] Fix build failures on Arm CPUs with torch nightly (#30481)        2025-12-12 02:09:25 +00:00
cuda.py       [BugFix][DeepSeek-V3.2] Fix backend selection logic for Blackwell (#30195)   2025-12-07 10:53:51 -05:00
interface.py  [v1] Add PrefixLM support to FlexAttention backend (#27938)                  2025-12-07 15:51:36 +00:00
rocm.py       [ROCm] Fix broken import in platform attention backend dispatching (#30432)  2025-12-11 01:12:58 +00:00
tpu.py        [v1] Add PrefixLM support to FlexAttention backend (#27938)                  2025-12-07 15:51:36 +00:00
xpu.py        [v1] Add PrefixLM support to FlexAttention backend (#27938)                  2025-12-07 15:51:36 +00:00