
Add LoRA Alpha parameter to modules that support enable_lora #21111

Open
b05505027 opened this issue Mar 31, 2025 · 1 comment
Assignees
Labels
keras-team-review-pending Pending review by a Keras team member. type:feature The user is asking for a new feature.

Comments

b05505027 commented Mar 31, 2025

I suggest adding an alpha parameter to the enable_lora function. Currently, only the rank can be specified.

Typically, LoRA decomposes the weight update into two matrices A and B and then multiplies their product by α/rank to control its magnitude.
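To make the scaling concrete, here is a minimal NumPy sketch of the update rule described above. The shapes, names (in_dim, out_dim, rank, alpha), and initialization scheme are assumptions for illustration, not taken from the Keras source:

```python
import numpy as np

rng = np.random.default_rng(0)
in_dim, out_dim, rank, alpha = 8, 4, 2, 4.0  # hypothetical sizes

W = rng.normal(size=(in_dim, out_dim))  # frozen base weight
A = rng.normal(size=(in_dim, rank))     # LoRA down-projection
B = np.zeros((rank, out_dim))           # LoRA up-projection, zero-initialized

# Effective kernel: W + (alpha / rank) * (A @ B)
W_eff = W + (alpha / rank) * (A @ B)
```

With B initialized to zero, the low-rank term vanishes and the effective kernel equals the base weight, so training starts from the pretrained model regardless of alpha.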

For backward compatibility, the default can be alpha = rank: the scale factor α/rank is then 1, so anyone already using LoRA with a given rank gets exactly the old behavior.

This could be implemented by multiplying the product of matrices A and B by (self.alpha / self.rank) when computing the effective kernel.
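A hypothetical sketch of that idea (this is not the actual Keras layer; the class, attribute names like lora_alpha, and the initialization are assumptions for illustration):

```python
import numpy as np


class DenseWithLoRA:
    """Toy stand-in for a LoRA-enabled layer, sketching the proposed alpha scaling."""

    def __init__(self, base_kernel, rank, alpha=None):
        self.base_kernel = base_kernel
        self.lora_rank = rank
        # Default alpha = rank gives a scale factor of 1, i.e. the old behavior.
        self.lora_alpha = alpha if alpha is not None else rank
        in_dim, out_dim = base_kernel.shape
        rng = np.random.default_rng(0)
        self.lora_a = rng.normal(size=(in_dim, rank)) * 0.01
        self.lora_b = np.zeros((rank, out_dim))  # zero-init: no change at start

    @property
    def kernel(self):
        # Scale the low-rank update by alpha / rank before adding it.
        scale = self.lora_alpha / self.lora_rank
        return self.base_kernel + scale * (self.lora_a @ self.lora_b)
```

Because the default is alpha = rank, existing code that only passes a rank would be unaffected, while users who want a larger or smaller update magnitude could set alpha independently.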

@sonali-kumari1 sonali-kumari1 added the type:feature The user is asking for a new feature. label Mar 31, 2025
@dhantule dhantule added the keras-team-review-pending Pending review by a Keras team member. label Apr 1, 2025
@divyashreepathihalli
Collaborator

Please feel free to open a PR for this!


5 participants