docs: update CPU Docker images to reference Docker Hub instead of AWS ECR (#34882)
Signed-off-by: Maxime Grenu <69890511+cluster2600@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
@@ -136,20 +136,20 @@ Testing has been conducted on AWS Graviton3 instances for compatibility.
 # --8<-- [end:build-wheel-from-source]
 
 # --8<-- [start:pre-built-images]
 
-To pull the latest image:
+To pull the latest image from Docker Hub:
 
 ```bash
-docker pull public.ecr.aws/q9t5s3a7/vllm-arm64-cpu-release-repo:latest
+docker pull vllm/vllm-openai-cpu:latest-arm64
 ```
 
 To pull an image with a specific vLLM version:
 
 ```bash
 export VLLM_VERSION=$(curl -s https://api.github.com/repos/vllm-project/vllm/releases/latest | jq -r .tag_name | sed 's/^v//')
-docker pull public.ecr.aws/q9t5s3a7/vllm-arm64-cpu-release-repo:v${VLLM_VERSION}
+docker pull vllm/vllm-openai-cpu:v${VLLM_VERSION}-arm64
 ```
 
-All available image tags are here: [https://gallery.ecr.aws/q9t5s3a7/vllm-arm64-cpu-release-repo](https://gallery.ecr.aws/q9t5s3a7/vllm-arm64-cpu-release-repo).
+All available image tags are here: [https://hub.docker.com/r/vllm/vllm-openai-cpu/tags](https://hub.docker.com/r/vllm/vllm-openai-cpu/tags).
 
 You can run these images via:

@@ -158,7 +158,7 @@ docker run \
     -v ~/.cache/huggingface:/root/.cache/huggingface \
     -p 8000:8000 \
     --env "HF_TOKEN=<secret>" \
-    public.ecr.aws/q9t5s3a7/vllm-arm64-cpu-release-repo:<tag> <args...>
+    vllm/vllm-openai-cpu:latest-arm64 <args...>
 ```
 
 You can also access the latest code with Docker images. These are not intended for production use and are meant for CI and testing only. They will expire after several days.
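The versioned pull above assembles the image tag from the latest GitHub release name. A minimal offline sketch of that tag construction, with the version hardcoded instead of fetched via `curl`/`jq` (0.8.0 is an arbitrary example value, not a reference to a specific release):

```shell
# Sketch: build the arm64 image reference the same way the docs do, but with
# VLLM_VERSION hardcoded instead of taken from the GitHub releases API.
VLLM_VERSION=0.8.0   # example value; normally: curl ... | jq -r .tag_name | sed 's/^v//'
echo "vllm/vllm-openai-cpu:v${VLLM_VERSION}-arm64"
```

Note the tag layout: the release version keeps its `v` prefix inside the tag, and the architecture suffix (`-arm64` here, `-x86_64` below) comes last.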
@@ -161,13 +161,20 @@ uv pip install dist/*.whl
 # --8<-- [end:build-wheel-from-source]
 
 # --8<-- [start:pre-built-images]
 
-You can pull the latest available CPU image here via:
+You can pull the latest available CPU image from Docker Hub:
 
 ```bash
-docker pull public.ecr.aws/q9t5s3a7/vllm-cpu-release-repo:latest
+docker pull vllm/vllm-openai-cpu:latest-x86_64
 ```
 
-If you want a more specific build you can find all published CPU based images here: [https://gallery.ecr.aws/q9t5s3a7/vllm-cpu-release-repo](https://gallery.ecr.aws/q9t5s3a7/vllm-cpu-release-repo)
+To pull an image for a specific vLLM version:
+
+```bash
+export VLLM_VERSION=$(curl -s https://api.github.com/repos/vllm-project/vllm/releases/latest | jq -r .tag_name | sed 's/^v//')
+docker pull vllm/vllm-openai-cpu:v${VLLM_VERSION}-x86_64
+```
+
+All available image tags are here: [https://hub.docker.com/r/vllm/vllm-openai-cpu/tags](https://hub.docker.com/r/vllm/vllm-openai-cpu/tags)
 
 You can run these images via:

@@ -176,7 +183,7 @@ docker run \
     -v ~/.cache/huggingface:/root/.cache/huggingface \
     -p 8000:8000 \
     --env "HF_TOKEN=<secret>" \
-    public.ecr.aws/q9t5s3a7/vllm-cpu-release-repo:<tag> <args...>
+    vllm/vllm-openai-cpu:latest-x86_64 <args...>
 ```
 
 !!! warning
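The `sed 's/^v//'` step in the version pipeline above strips the leading `v` that GitHub release tag names carry (e.g. `v0.8.0`), so it can be re-added together with the architecture suffix when forming the image tag. An offline sketch of that step, using a canned tag name in place of the live API response (`v0.8.0` is an arbitrary example):

```shell
# Sketch of the sed step: strip the leading "v" from a release tag name.
TAG_NAME="v0.8.0"                              # stands in for: curl ... | jq -r .tag_name
VLLM_VERSION=$(printf '%s' "${TAG_NAME}" | sed 's/^v//')
echo "vllm/vllm-openai-cpu:v${VLLM_VERSION}-x86_64"
```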