Support download models from www.modelscope.cn (#1588)

This commit is contained in:
liuyhwangyh
2023-11-18 12:38:31 +08:00
committed by GitHub
parent bb00f66e19
commit edb305584b
4 changed files with 58 additions and 4 deletions


@@ -81,4 +81,18 @@ Alternatively, you can raise an issue on our `GitHub <https://github.com/vllm-pr
    output = llm.generate("Hello, my name is")
    print(output)

To use models from www.modelscope.cn:

.. code-block:: shell

    $ export VLLM_USE_MODELSCOPE=True

.. code-block:: python

    from vllm import LLM

    llm = LLM(model=..., revision=..., trust_remote_code=True)  # Name or path of your model
    output = llm.generate("Hello, my name is")
    print(output)

If vLLM successfully generates text, it indicates that your model is supported.
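The ``VLLM_USE_MODELSCOPE`` flag can also be set from Python before vLLM is imported, which is handy in notebooks where exporting a shell variable is awkward. A minimal sketch (equivalent to the ``export`` command above; setting it before the import is an assumption about when the flag is read):

```python
import os

# Equivalent to `export VLLM_USE_MODELSCOPE=True`; set it before
# importing vllm so the flag is visible when the library initializes.
os.environ["VLLM_USE_MODELSCOPE"] = "True"

# from vllm import LLM  # subsequent imports now see the flag
```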