[Feature] specify model in config.yaml (#14855)

Signed-off-by: weizeng <weizeng@roblox.com>
Wei Zeng
2025-03-21 00:26:03 -07:00
committed by GitHub
parent da6ea29f7a
commit 0fa3970deb
7 changed files with 102 additions and 30 deletions


@@ -184,6 +184,7 @@ For example:
```yaml
# config.yaml
model: meta-llama/Llama-3.1-8B-Instruct
host: "127.0.0.1"
port: 6379
uvicorn-log-level: "info"
@@ -192,12 +193,13 @@ uvicorn-log-level: "info"
To use the above config file:
```bash
vllm serve SOME_MODEL --config config.yaml
vllm serve --config config.yaml
```
:::{note}
If an argument is supplied both on the command line and in the config file, the value from the command line takes precedence.
The order of priority is `command line > config file values > defaults`.
e.g. in `vllm serve SOME_MODEL --config config.yaml`, `SOME_MODEL` takes precedence over the `model` setting in the config file.
:::
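The precedence rule above can be sketched as a simple layered merge, where later sources override earlier ones. This is a minimal illustration, not vLLM's actual implementation; the function and variable names are hypothetical.

```python
def resolve_args(defaults, config_file_values, cli_args):
    """Merge argument sources with priority: command line > config file > defaults.

    Hypothetical sketch of the precedence rule; not vLLM's real code.
    """
    merged = dict(defaults)                # lowest priority: built-in defaults
    merged.update(config_file_values)      # config.yaml overrides defaults
    # CLI values override everything; unset CLI args (None) are skipped
    merged.update({k: v for k, v in cli_args.items() if v is not None})
    return merged

# Example mirroring the docs: `vllm serve SOME_MODEL --config config.yaml`
defaults = {"host": "0.0.0.0", "port": 8000, "model": None}
config_file = {"model": "meta-llama/Llama-3.1-8B-Instruct", "port": 6379}
cli = {"model": "SOME_MODEL"}  # positional model argument from the command line

print(resolve_args(defaults, config_file, cli))
```

Here the command-line `SOME_MODEL` wins over the config file's `model`, while `port` still comes from `config.yaml` because it was not set on the command line.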
## API Reference