
[SFT] fix: lora_alpha set default to twice lora_rank with warning#3100

Open
caowencai wants to merge 1 commit into verl-project:main from caowencai:caowencai-patch-1

Conversation

@caowencai commented Aug 18, 2025

What does this PR do?

Add the lora_alpha parameter and, if it is unset, default it to twice lora_rank while emitting a warning.
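
For context, a minimal, self-contained sketch of the behavior this PR describes (illustrative only: LoraSettings and resolve_lora_alpha are hypothetical names, not the verl trainer code; only the lora_rank and lora_alpha fields mirror the PR):

    import warnings
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class LoraSettings:
        # Hypothetical container mirroring the PR's config fields.
        lora_rank: int = 8
        lora_alpha: Optional[int] = None  # None means the user did not provide it

    def resolve_lora_alpha(cfg: LoraSettings) -> int:
        """Return lora_alpha, defaulting to 2 * lora_rank with a warning when unset."""
        if cfg.lora_alpha is None:
            alpha = cfg.lora_rank * 2
            warnings.warn(f"lora_alpha is not set, defaulting to twice the rank value: {alpha}")
            return alpha
        return cfg.lora_alpha

    # Example: no lora_alpha given -> warns and returns 16 for rank 8.
    print(resolve_lora_alpha(LoraSettings(lora_rank=8)))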

Contributor

@gemini-code-assist bot left a comment


Code Review

This pull request correctly implements a default value for lora_alpha, setting it to twice the LoRA rank when not explicitly provided. The changes to argument parsing and configuration are well-integrated. My main feedback is on improving the warning mechanism by using the warnings module instead of print, and avoiding a potential side effect by not modifying the configuration object within the save_lora_adapter method. This will make the code more robust and align with best practices.

Comment on lines +253 to +258

+        if self.config.lora_alpha is None:
+            print(f"Warning: lora_alpha is unset and set to twice the rank value {lora_rank*2}")
+            self.config.lora_alpha = lora_rank*2
         peft_dict = {
             "r": lora_rank,
-            "lora_alpha": 0,  # lora_alpha is not set. An error should be raised to inform the user to set it manually.
+            "lora_alpha": self.config.lora_alpha,
Contributor

high
It's better to use the warnings module for issuing warnings instead of print, as it provides more control for the user of the library (e.g., filtering or redirecting warnings). The warnings module is already imported in this file.

Additionally, modifying the configuration object (self.config.lora_alpha) within this method introduces a side effect. It's generally safer to compute the value and use it locally without altering the shared config state, unless the modified value is intended to be used elsewhere.

Suggested change
-        if self.config.lora_alpha is None:
-            print(f"Warning: lora_alpha is unset and set to twice the rank value {lora_rank*2}")
-            self.config.lora_alpha = lora_rank*2
-        peft_dict = {
-            "r": lora_rank,
-            "lora_alpha": self.config.lora_alpha,
+        lora_alpha = self.config.lora_alpha
+        if lora_alpha is None:
+            lora_alpha = lora_rank * 2
+            warnings.warn(f"lora_alpha is not set, defaulting to twice the rank value: {lora_alpha}")
+        peft_dict = {
+            "r": lora_rank,
+            "lora_alpha": lora_alpha,

@techkang
Collaborator

Thanks for your code. As Gemini proposed, I think we should not change the config's value. Can you modify your code like this?

        lora_alpha = self.config.lora_alpha
        if lora_alpha is None:
            lora_alpha = lora_rank * 2

@chenhaiq self-requested a review August 19, 2025 04:01
@techkang
Collaborator

@caowencai The pre-commit checks failed. Could you please take a look at the errors?
