[SFT] fix: lora_alpha set default to twice lora_rank with warning #3100
caowencai wants to merge 1 commit into verl-project:main from
Conversation
Code Review
This pull request correctly implements a default value for lora_alpha, setting it to twice the LoRA rank when not explicitly provided. The changes to argument parsing and configuration are well-integrated. My main feedback is on improving the warning mechanism by using the warnings module instead of print, and avoiding a potential side effect by not modifying the configuration object within the save_lora_adapter method. This will make the code more robust and align with best practices.
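For context, here is a minimal sketch of how lora_alpha typically enters a PEFT configuration; the LoraConfig fields come from the peft library, while the concrete rank and target modules are illustrative, not taken from this repo:

```python
from peft import LoraConfig

lora_rank = 32                  # illustrative rank
lora_alpha = lora_rank * 2      # the default this PR applies when unset

peft_config = LoraConfig(
    r=lora_rank,
    lora_alpha=lora_alpha,
    target_modules=["q_proj", "v_proj"],  # illustrative target modules
)

# PEFT scales the LoRA update by lora_alpha / r, so alpha = 2 * r keeps a
# constant scaling factor of 2 regardless of the chosen rank.
```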
```diff
+if self.config.lora_alpha is None:
+    print(f"Warning: lora_alpha is unset and set to twice the rank value {lora_rank*2}")
+    self.config.lora_alpha = lora_rank*2
 peft_dict = {
     "r": lora_rank,
-    "lora_alpha": 0,  # lora_alpha is not set. An error should be raised to inform the user to set it manually.
+    "lora_alpha": self.config.lora_alpha,
```
It's better to use the warnings module for issuing warnings instead of print, as it provides more control for the user of the library (e.g., filtering or redirecting warnings). The warnings module is already imported in this file.
Additionally, modifying the configuration object (self.config.lora_alpha) within this method introduces a side effect. It's generally safer to compute the value and use it locally without altering the shared config state, unless the modified value is intended to be used elsewhere.
Suggested change:

```diff
-if self.config.lora_alpha is None:
-    print(f"Warning: lora_alpha is unset and set to twice the rank value {lora_rank*2}")
-    self.config.lora_alpha = lora_rank*2
+lora_alpha = self.config.lora_alpha
+if lora_alpha is None:
+    lora_alpha = lora_rank * 2
+    warnings.warn(f"lora_alpha is not set, defaulting to twice the rank value: {lora_alpha}")
 peft_dict = {
     "r": lora_rank,
-    "lora_alpha": self.config.lora_alpha,
+    "lora_alpha": lora_alpha,
```
Thanks for your code. As Gemini proposed, I think we should not change the config's value. Can you modify your code like this?

@caowencai The pre-commit checks failed. Could you please take a look at the errors?
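If it helps, the failing hooks can usually be reproduced locally before pushing, assuming the repo's standard .pre-commit-config.yaml setup:

```bash
pip install pre-commit
pre-commit run --all-files
```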
What does this PR do?