[Model] Enable LoRA support for tower and connector in H2OVL (#31696)

Signed-off-by: shwetha-s-poojary <shwetha.s-poojary@ibm.com>
This commit is contained in:
Shwetha Poojary
2026-03-18 18:56:47 +05:30
committed by GitHub
parent 17c47fb869
commit cef1f302d2
2 changed files with 15 additions and 1 deletions


@@ -707,7 +707,7 @@ These models primarily accept the [`LLM.generate`](./generative_models.md#llmgen
| `GraniteSpeechForConditionalGeneration` | Granite Speech | T + A | `ibm-granite/granite-speech-3.3-8b` | ✅︎ | ✅︎ |
| `HCXVisionForCausalLM` | HyperCLOVAX-SEED-Vision-Instruct-3B | T + I<sup>+</sup> + V<sup>+</sup> | `naver-hyperclovax/HyperCLOVAX-SEED-Vision-Instruct-3B` | | |
| `HCXVisionV2ForCausalLM` | HyperCLOVAX-SEED-Think-32B | T + I<sup>+</sup> + V<sup>+</sup> | `naver-hyperclovax/HyperCLOVAX-SEED-Think-32B` | | |
- | `H2OVLChatModel` | H2OVL | T + I<sup>E+</sup> | `h2oai/h2ovl-mississippi-800m`, `h2oai/h2ovl-mississippi-2b`, etc. | | ✅︎ |
+ | `H2OVLChatModel` | H2OVL | T + I<sup>E+</sup> | `h2oai/h2ovl-mississippi-800m`, `h2oai/h2ovl-mississippi-2b`, etc. | ✅︎ | ✅︎ |
| `HunYuanVLForConditionalGeneration` | HunyuanOCR | T + I<sup>E+</sup> | `tencent/HunyuanOCR`, etc. | ✅︎ | ✅︎ |
| `Idefics3ForConditionalGeneration` | Idefics3 | T + I | `HuggingFaceM4/Idefics3-8B-Llama3`, etc. | ✅︎ | |
| `IsaacForConditionalGeneration` | Isaac | T + I<sup>+</sup> | `PerceptronAI/Isaac-0.1` | ✅︎ | ✅︎ |


@@ -163,3 +163,17 @@ class H2OVLChatModel(InternVLChatModel):
        else:
            msg = "Monolith mode is not applicable to H2OVL"
            raise NotImplementedError(msg)
+
+    def get_num_mm_encoder_tokens(self, num_image_tokens: int) -> int:
+        if num_image_tokens <= 0 or self.num_image_token <= 0:
+            return 0
+        num_patches = num_image_tokens // self.num_image_token
+        return num_patches * (self.patch_tokens + 1)
+
+    def get_num_mm_connector_tokens(self, num_vision_tokens: int) -> int:
+        if num_vision_tokens <= 0 or self.num_image_token <= 0:
+            return 0
+        num_patches = num_vision_tokens // (self.patch_tokens + 1)
+        return num_patches * self.num_image_token
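
The two added methods convert token counts in opposite directions: from LLM-side image placeholder tokens to vision-encoder tokens (one extra token per patch on the encoder side), and back. A standalone sketch of that round trip is below; `NUM_IMAGE_TOKEN` and `PATCH_TOKENS` are illustrative values, not H2OVL's actual configuration, and `self` state is replaced by module constants for brevity.

```python
# Hypothetical example values; real H2OVL sets these from the model config.
NUM_IMAGE_TOKEN = 256   # LLM-side tokens produced per image patch (assumed)
PATCH_TOKENS = 1024     # vision-encoder tokens per patch, excluding the extra token (assumed)

def get_num_mm_encoder_tokens(num_image_tokens: int) -> int:
    """LLM-side image tokens -> vision-encoder tokens (patch_tokens + 1 per patch)."""
    if num_image_tokens <= 0 or NUM_IMAGE_TOKEN <= 0:
        return 0
    num_patches = num_image_tokens // NUM_IMAGE_TOKEN
    return num_patches * (PATCH_TOKENS + 1)

def get_num_mm_connector_tokens(num_vision_tokens: int) -> int:
    """Inverse mapping: vision-encoder tokens -> LLM-side image tokens."""
    if num_vision_tokens <= 0 or NUM_IMAGE_TOKEN <= 0:
        return 0
    num_patches = num_vision_tokens // (PATCH_TOKENS + 1)
    return num_patches * NUM_IMAGE_TOKEN

# Round trip for a 3-patch image: 3 * 256 = 768 LLM tokens.
print(get_num_mm_encoder_tokens(768))    # 3 * 1025 = 3075 encoder tokens
print(get_num_mm_connector_tokens(3075)) # back to 768 LLM tokens
```

The guard clauses mirror the diff: non-positive inputs (or a non-positive per-patch token count) short-circuit to zero rather than dividing by zero or returning a negative count.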