Signed-off-by: zxw <1020938856@qq.com>
@@ -134,7 +134,7 @@ outputs = llm.chat(conversation, chat_template=custom_template)
## Online Serving
-Our [OpenAI-Compatible Server][openai-compatible-server] provides endpoints that correspond to the offline APIs:
+Our [OpenAI-Compatible Server][serving-openai-compatible-server] provides endpoints that correspond to the offline APIs:
- [Completions API][completions-api] is similar to `LLM.generate` but only accepts text.
- [Chat API][chat-api] is similar to `LLM.chat`, accepting both text and [multi-modal inputs][multimodal-inputs] for models with a chat template.
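The two endpoints above map onto request bodies as sketched below. This is a minimal illustration, assuming a server running at `http://localhost:8000` and a placeholder model id (both are assumptions, not part of the docs above); it only builds the JSON payloads that would be POSTed to `/v1/completions` and `/v1/chat/completions`.

```python
import json

# Assumed deployment details -- adjust to your own server and model.
BASE_URL = "http://localhost:8000/v1"
MODEL = "my-model"  # placeholder model id

# Completions API: text-only prompt, mirroring LLM.generate.
completion_request = {
    "model": MODEL,
    "prompt": "vLLM is",
    "max_tokens": 16,
}

# Chat API: a message list, mirroring LLM.chat; the server applies
# the model's chat template (or a custom one) to these messages.
chat_request = {
    "model": MODEL,
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is vLLM?"},
    ],
}

# completion_request would be POSTed to f"{BASE_URL}/completions",
# chat_request to f"{BASE_URL}/chat/completions".
print(json.dumps(completion_request))
print(json.dumps(chat_request))
```

Note how the Completions body carries a bare `prompt` string while the Chat body carries structured `messages`, matching the `LLM.generate` vs `LLM.chat` split described above.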