Make distinct code and console admonitions so readers are less likely to miss them (#20585)

Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
This commit is contained in:
Harry Mellor
2025-07-08 03:55:28 +01:00
committed by GitHub
parent 31c5d0a1b7
commit af107d5a0e
52 changed files with 192 additions and 162 deletions

@@ -60,7 +60,7 @@ And then you can send out a query to the OpenAI-compatible API to check the avai
 curl -o- http://localhost:30080/models
 ```
-??? Output
+??? console "Output"
 ```json
 {
@@ -89,7 +89,7 @@ curl -X POST http://localhost:30080/completions \
 }'
 ```
-??? Output
+??? console "Output"
 ```json
 {
@@ -121,7 +121,7 @@ sudo helm uninstall vllm
 The core vLLM production stack configuration is managed with YAML. Here is the example configuration used in the installation above:
-??? Yaml
+??? code "Yaml"
 ```yaml
 servingEngineSpec:
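
The renamed markers use the Material for MkDocs collapsible-admonition syntax (`???` renders a collapsed-by-default block with a custom type and quoted title). A minimal sketch of how the new blocks read in a docs page; note that `console` and `code` are not built-in admonition types, so this assumes the project registers them as custom admonitions (typically via extra CSS defining an icon and color for each type):

```markdown
??? console "Output"

    A collapsed block holding command output, visually styled as a terminal.

??? code "Yaml"

    A collapsed block holding a configuration listing, visually styled as code.
```

Giving command output and code listings distinct admonition styles is what makes them stand out from surrounding prose, which is the stated goal of this change.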