[ROCm][Documentation] update quickstart and installation to include rocm nightly docker tips (#38367)

Signed-off-by: Hongxia Yang <hongxiay.yang@amd.com>
Co-authored-by: Hongxia Yang <hongxiay.yang@amd.com>
Hongxia Yang
2026-03-27 19:20:19 -04:00
committed by GitHub
parent 731285c939
commit 83a4df049d
2 changed files with 16 additions and 22 deletions

@@ -56,9 +56,12 @@ This guide will help you quickly get started with vLLM to perform:
!!! note
It currently supports Python 3.12, ROCm 7.0, and `glibc >= 2.35`.
!!! note
Previously, docker images were published via AMD's docker release pipeline and were located at `rocm/vllm-dev`. That pipeline is being deprecated in favor of vLLM's docker release pipeline.
!!! tip
A nightly Docker image is also available as [vllm/vllm-openai-rocm:nightly](https://hub.docker.com/r/vllm/vllm-openai-rocm/tags) for testing the latest development builds.
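As a sketch of how the nightly image might be pulled and run (the device passthrough flags and the `--model` placeholder below are illustrative assumptions based on common ROCm container usage, not part of this commit):

```shell
# Pull the nightly ROCm image named in the tip above.
docker pull vllm/vllm-openai-rocm:nightly

# Run it with the AMD GPU device nodes passed through to the container.
# The --device, --group-add, and --ipc flags are the usual ROCm container
# requirements, shown here as an assumption; <your-model> is a placeholder.
docker run -it --rm \
  --device=/dev/kfd --device=/dev/dri \
  --group-add video \
  --ipc=host \
  vllm/vllm-openai-rocm:nightly \
  --model <your-model>
```

Because this is a nightly build, it tracks the latest development branch and may be less stable than a tagged release.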
=== "Google TPU"
To run vLLM on Google TPUs, you need to install the `vllm-tpu` package.