[Model] Initialize support for Deepseek-VL2 models (#11578)
Signed-off-by: Isotr0py <2037008807@qq.com>
Co-authored-by: Cyrus Leung <cyrus.tl.leung@gmail.com>
@@ -610,6 +610,13 @@ See [this page](#generative-models) for more information on how to use generativ
  -
  - ✅︎
  - ✅︎
* - `DeepseekVLV2ForCausalLM`
  - DeepSeek-VL2
  - T + I<sup>+</sup>
  - `deepseek-ai/deepseek-vl2-tiny` (WIP), `deepseek-ai/deepseek-vl2-small`, `deepseek-ai/deepseek-vl2`, etc. (see note)
  -
  - ✅︎
  - ✅︎
* - `FuyuForCausalLM`
  - Fuyu
  - T + I
@@ -755,8 +762,19 @@ See [this page](#generative-models) for more information on how to use generativ

<sup>E</sup> Pre-computed embeddings can be inputted for this modality.
<sup>+</sup> Multiple items can be inputted per text prompt for this modality.

````{note}
`deepseek-ai/deepseek-vl2-tiny` is not supported yet.

To use the `DeepSeek-VL2` series models, you need to install a forked version of the `deepseek_vl2` package:

```shell
pip install git+https://github.com/Isotr0py/DeepSeek-VL2.git
```

In addition, to run `DeepSeek-VL2` series models, you have to pass `--hf_overrides '{"architectures": ["DeepseekVLV2ForCausalLM"]}'` when running vLLM.
````
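As a concrete illustration of the override described above, a launch command might look like the following. This is a sketch only: the model name is chosen for illustration, and any GPU, memory, or parallelism flags your setup needs are omitted.

```shell
# Sketch: serve a DeepSeek-VL2 model with the required architecture override.
# The --hf_overrides value is a JSON object merged into the model's HF config.
vllm serve deepseek-ai/deepseek-vl2-small \
    --hf_overrides '{"architectures": ["DeepseekVLV2ForCausalLM"]}'
```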

```{note}
To use `TIGER-Lab/Mantis-8B-siglip-llama3`, you have to pass `--hf_overrides '{"architectures": ["MantisForConditionalGeneration"]}'` when running vLLM.
```

```{note}