[CI/Build] Replace vllm.entrypoints.openai.api_server entrypoint with vllm serve command (#25967)

Signed-off-by: DarkLight1337 <tlleungac@connect.ust.hk>
Cyrus Leung
2025-10-03 01:04:57 +08:00
committed by GitHub
parent 3b279a84be
commit d00d652998
22 changed files with 101 additions and 66 deletions


@@ -20,7 +20,7 @@ To get started with Open WebUI using vLLM, follow these steps:
 For example:
 ```console
-python -m vllm.entrypoints.openai.api_server --host 0.0.0.0 --port 8000
+vllm serve <model> --host 0.0.0.0 --port 8000
 ```
 3. Start the Open WebUI Docker container: