[Core] Consolidate prompt arguments to LLM engines (#4328)
Co-authored-by: Roger Wang <ywang@roblox.com>
docs/source/dev/offline_inference/llm_inputs.rst (new file, 14 lines)
@@ -0,0 +1,14 @@
LLM Inputs
==========

.. autodata:: vllm.inputs.PromptStrictInputs

.. autoclass:: vllm.inputs.TextPrompt
    :show-inheritance:
    :members:
    :member-order: bysource

.. autoclass:: vllm.inputs.TokensPrompt
    :show-inheritance:
    :members:
    :member-order: bysource
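For context, the page above documents the consolidated prompt types. Below is a minimal, self-contained sketch of what such an input union can look like, using local ``TypedDict`` stand-ins rather than the real classes; the field names ``prompt`` and ``prompt_token_ids`` and the ``normalize`` helper are illustrative assumptions, not vLLM's actual API (see ``vllm.inputs`` for the authoritative definitions):

```python
# Hedged sketch: local stand-ins approximating a text-or-tokens prompt union.
# These are NOT the real vLLM classes; shapes are assumptions for illustration.
from typing import List, TypedDict, Union


class TextPrompt(TypedDict):
    """A raw text prompt, to be tokenized by the engine (assumed shape)."""
    prompt: str


class TokensPrompt(TypedDict):
    """A pre-tokenized prompt (assumed shape)."""
    prompt_token_ids: List[int]


# The union accepts a plain string or one of the explicit dict forms.
PromptStrictInputs = Union[str, TextPrompt, TokensPrompt]


def normalize(p: PromptStrictInputs) -> dict:
    """Hypothetical helper: coerce any accepted prompt form into a dict."""
    if isinstance(p, str):
        return {"prompt": p}
    return dict(p)
```

A usage sketch: ``normalize("Hello")`` yields the text-prompt dict form, while a ``{"prompt_token_ids": [...]}`` dict passes through unchanged, which is the kind of uniform handling a consolidated input type enables inside the engine.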