# Installation
vLLM supports the following hardware platforms:
- [GPU](gpu.md)
    - [NVIDIA CUDA](gpu.md#nvidia-cuda)
    - [AMD ROCm](gpu.md#amd-rocm)
    - [Intel XPU](gpu.md#intel-xpu)
- [CPU](cpu.md)
    - [Intel/AMD x86](cpu.md#intelamd-x86)
    - [ARM AArch64](cpu.md#arm-aarch64)
    - [Apple silicon](cpu.md#apple-silicon)
    - [IBM Z (S390X)](cpu.md#ibm-z-s390x)
## Hardware Plugins
vLLM supports third-party hardware plugins that live **outside** the main `vllm` repository. These follow the [Hardware-Pluggable RFC](../../design/plugin_system.md).
A full list of supported hardware is available on the vLLM website under [Universal Compatibility - Hardware](https://vllm.ai/#compatibility).
If you want to add support for new hardware, please contact us on [Slack](https://slack.vllm.ai/) or by [email](mailto:collaboration@vllm.ai).