[Attention] add DCP support for FLASH_ATTN_MLA backend (#24453)
Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>
Signed-off-by: Matthew Bonanni <mbonanni@redhat.com>
Co-authored-by: Matthew Bonanni <mbonanni@redhat.com>
@@ -440,6 +440,9 @@ class GPUModelRunner(LoRAModelRunnerMixin, KVConnectorModelRunnerMixin):
            return

        if self.reorder_batch_threshold is not None:
            # NOTE(lucas): currently no backend supports the custom masking
            # required for DCP with q_len > 1, so we assert here. Remove this
            # assert once custom mask support is added to FA3.
            if self.dcp_world_size > 1:
                assert self.reorder_batch_threshold == 1, \
                    "DCP does not support reorder_batch_threshold > 1 yet."
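The guard in this hunk can be read in isolation: when decode context parallelism (DCP) spans more than one rank, the batch-reorder threshold must be exactly 1, because no attention backend yet implements the custom masking DCP would need for q_len > 1. A minimal standalone sketch of that check (the function name is hypothetical; only the attribute names mirror the diff, this is not the vLLM implementation itself):

```python
from typing import Optional


def check_dcp_reorder_threshold(dcp_world_size: int,
                                reorder_batch_threshold: Optional[int]) -> None:
    """Reject configurations DCP cannot serve yet.

    DCP with q_len > 1 would require custom attention masking that no
    backend currently supports, so with more than one DCP rank the
    reorder threshold must be exactly 1.
    """
    if reorder_batch_threshold is not None and dcp_world_size > 1:
        # Mirrors the assert added in the diff above.
        assert reorder_batch_threshold == 1, \
            "DCP does not support reorder_batch_threshold > 1 yet."


# Accepted: DCP active, threshold is 1.
check_dcp_reorder_threshold(dcp_world_size=2, reorder_batch_threshold=1)
# Accepted: threshold > 1 is fine when DCP is effectively off (world size 1).
check_dcp_reorder_threshold(dcp_world_size=1, reorder_batch_threshold=4)
```

Once FA3 (or another backend) gains the custom mask support the NOTE refers to, the assert is meant to be removed rather than relaxed.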