[Bugfix] Fix Positive Feature Layers in Llava Models (#13514)

Signed-off-by: Alex-Brooks <Alex.brooks@ibm.com>
Author: Alex Brooks
Date: 2025-02-19 01:50:07 -07:00
Committed by: GitHub
Parent: fdc5df6f54
Commit: 983a40a8bb
6 changed files with 44 additions and 9 deletions


@@ -969,7 +969,7 @@ class PixtralHFTransformer(nn.Module):
         position_embeddings: torch.Tensor,
         return_all_hidden_states: bool,
     ) -> torch.Tensor:
-        hidden_states_pool = []
+        hidden_states_pool = [x]
         for layer in self.layers:
             x = layer(x, attention_mask, position_embeddings)
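
The change above seeds the hidden-state pool with the layer input `x` before the loop runs. This matches the Hugging Face convention where `hidden_states[0]` holds the pre-layer embeddings, so a positive feature-layer index `i` selects the output of layer `i` (1-based) rather than layer `i + 1`. A minimal sketch of the pattern, using hypothetical toy "layers" (plain callables standing in for transformer blocks) rather than the actual Pixtral modules:

```python
def run_layers(x, layers, return_all_hidden_states=True):
    # Seed the pool with the input so that hidden_states_pool[0] is the
    # pre-layer state. With an empty initial pool, index i would point at
    # the output of layer i + 1, which is the off-by-one the fix removes.
    hidden_states_pool = [x]
    for layer in layers:
        x = layer(x)
        hidden_states_pool.append(x)
    return hidden_states_pool if return_all_hidden_states else x

# Hypothetical toy layers: each adds 1, standing in for transformer blocks.
layers = [lambda v: v + 1] * 3
states = run_layers(0, layers)
# states == [0, 1, 2, 3]: index 0 is the input, index i the output of layer i
```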