[ROCm][Documentation] update quickstart and installation to include rocm nightly docker tips (#38367)
Signed-off-by: Hongxia Yang <hongxiay.yang@amd.com> Co-authored-by: Hongxia Yang <hongxiay.yang@amd.com>
This commit is contained in:
@@ -56,9 +56,12 @@ This guide will help you quickly get started with vLLM to perform:
!!! note
    It currently supports Python 3.12, ROCm 7.0, and `glibc >= 2.35`.
!!! note
    Previously, docker images were published via AMD's docker release pipeline and located at `rocm/vllm-dev`. That pipeline is being deprecated in favor of vLLM's docker release pipeline.
!!! tip
    A nightly Docker image is also available as [vllm/vllm-openai-rocm:nightly](https://hub.docker.com/r/vllm/vllm-openai-rocm/tags) for testing the latest development builds.
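    As a sketch of how the nightly image might be used, the commands below pull it and start a container. The device flags (`/dev/kfd`, `/dev/dri`) and group settings follow common ROCm container conventions, and the model name is only a placeholder; adjust both for your hardware and workload.

    ```shell
    # Pull the nightly ROCm image (tag from the docs above)
    docker pull vllm/vllm-openai-rocm:nightly

    # Run it with the GPU devices passed through.
    # Flags below are typical ROCm container settings, not taken from this commit.
    docker run -it --rm \
        --device /dev/kfd --device /dev/dri \
        --group-add video \
        --ipc=host \
        vllm/vllm-openai-rocm:nightly \
        vllm serve Qwen/Qwen2.5-1.5B-Instruct  # placeholder model
    ```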
=== "Google TPU"
    To run vLLM on Google TPUs, you need to install the `vllm-tpu` package.
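    A minimal install-and-verify sketch, assuming `vllm-tpu` is available from your configured package index (the package name comes from the sentence above; the verification step is an assumption, not part of this commit):

    ```shell
    # Install the TPU build of vLLM
    pip install vllm-tpu

    # Sanity-check that the package imports and report its version
    python -c "import vllm; print(vllm.__version__)"
    ```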