[LoRA] Adds support for bias in LoRA (#5733)
Signed-off-by: Umesh Deshpande <udeshpa@us.ibm.com>
Co-authored-by: Umesh Deshpande <udeshpa@us.ibm.com>
@@ -1687,6 +1687,7 @@ class LoRAConfig:
     # This is a constant.
     lora_vocab_padding_size: ClassVar[int] = 256
     long_lora_scaling_factors: Optional[Tuple[float]] = None
+    bias_enabled: bool = False

     def __post_init__(self):
         # Setting the maximum rank to 256 should be able to satisfy the vast
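The hunk adds a single `bias_enabled` flag, defaulting to off, alongside the existing LoRA config fields. A minimal standalone sketch of how such a dataclass field behaves (field names taken from the diff; this illustrative class is not vLLM's actual `LoRAConfig`, which carries many more fields and validation in `__post_init__`):

```python
from dataclasses import dataclass
from typing import ClassVar, Optional, Tuple


@dataclass
class LoRAConfig:
    # Class-level constant, excluded from generated __init__ (ClassVar).
    lora_vocab_padding_size: ClassVar[int] = 256
    long_lora_scaling_factors: Optional[Tuple[float]] = None
    # New field from this commit: LoRA bias support is opt-in, off by default.
    bias_enabled: bool = False


default_cfg = LoRAConfig()
print(default_cfg.bias_enabled)  # False: existing configs are unaffected

biased_cfg = LoRAConfig(bias_enabled=True)
print(biased_cfg.bias_enabled)  # True only when explicitly requested
```

Defaulting the new field to `False` keeps the change backward compatible: any caller constructing the config without the new argument gets the pre-commit behavior.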