[Feature] Support multiple api keys in server (#18548)

Signed-off-by: Yan Pashkovsky <yanp.bugz@gmail.com>
This commit is contained in:
Yan Pashkovsky
2025-07-30 15:03:23 +01:00
committed by GitHub
parent da3e0bd6e5
commit bf668b5bf5
3 changed files with 30 additions and 29 deletions


@@ -126,6 +126,7 @@ curl http://localhost:8000/v1/models
```
You can pass in the argument `--api-key` or environment variable `VLLM_API_KEY` to enable the server to check for API key in the header.
You can pass multiple keys after `--api-key`, and the server will accept any of them; this is useful for key rotation.
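For example, a server started with two keys will accept requests bearing either one. This is a sketch; the model name and key values are placeholders, not part of this change:

```shell
# Start the server with two accepted API keys (old and new, e.g. during rotation).
vllm serve meta-llama/Llama-3.1-8B-Instruct \
    --api-key sk-old-key sk-new-key

# Clients may authenticate with either key via the Authorization header:
curl http://localhost:8000/v1/models \
    -H "Authorization: Bearer sk-new-key"
```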
### OpenAI Completions API with vLLM