Add LoRA Alpha parameter to modules that support enable_lora #21111
Labels: keras-team-review-pending (pending review by a Keras team member), type:feature (the user is asking for a new feature).
I suggest adding an `alpha` parameter to the `enable_lora` function. Currently, only the `rank` can be specified. Typically, LoRA decomposes the weight update into two matrices A and B and then multiplies their product by α/rank to control its magnitude.

For backward compatibility, we can default `alpha` to `rank`. That way, if you were already using LoRA with a given rank, setting the same value for alpha reproduces the old behavior exactly. I think this could be implemented by multiplying the product of matrices A and B by (`self.alpha` / `self.rank`) when the kernel is computed.