Replace "online inference" with "online serving" (#11923)

Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Author: Harry Mellor
Date: 2025-01-10 12:05:56 +00:00 (committed via GitHub)
Parent: ef725feafc
Commit: d85c47d6ad
11 changed files with 16 additions and 16 deletions

@@ -5,7 +5,7 @@
 This guide will help you quickly get started with vLLM to perform:
 - [Offline batched inference](#quickstart-offline)
-- [Online inference using OpenAI-compatible server](#quickstart-online)
+- [Online serving using OpenAI-compatible server](#quickstart-online)
 ## Prerequisites