On my quest to configure the most promising LoRA adapter, I inspected the documentation on common LoRA parameters:
> `lora_alpha`: LoRA scaling factor.
>
> `use_rslora`: When set to `True`, uses Rank-Stabilized LoRA, which sets the adapter scaling factor to `lora_alpha/math.sqrt(r)`, since it was proven to work better. Otherwise, it will use the original default value of `lora_alpha/r`.
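To make the difference between the two scaling modes concrete, here is a small sketch (not peft code, just the two formulas from the docs) comparing the default factor `lora_alpha/r` with the rank-stabilized factor `lora_alpha/math.sqrt(r)`:

```python
import math

def lora_scaling(lora_alpha: float, r: int, use_rslora: bool = False) -> float:
    """Adapter scaling factor as described in the docs:
    rsLoRA uses lora_alpha / sqrt(r); the default is lora_alpha / r."""
    return lora_alpha / math.sqrt(r) if use_rslora else lora_alpha / r

# With lora_alpha=16 and r=64 the two factors differ by sqrt(r):
print(lora_scaling(16, 64))                   # default: 16/64  -> 0.25
print(lora_scaling(16, 64, use_rslora=True))  # rsLoRA: 16/8    -> 2.0
```

Note how the default factor shrinks quickly as the rank grows, which is exactly what rsLoRA is meant to counteract.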
I am using the latest stable version of peft (0.8.2). However, setting `use_rslora` to either `True` or `False` leads to:

```
TypeError: LoraConfig.__init__() got an unexpected keyword argument 'use_rslora'
```
Moreover, the `lora_alpha` argument is mandatory anyway, so what would be the point of additionally setting `use_rslora=True`?
In the end, I just want to set the adapter scaling factor to `lora_alpha/math.sqrt(r)`. How can I do that?
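One workaround sketch that follows directly from the two formulas in the quoted docs (this assumes the installed peft version computes the scaling as `lora_alpha/r`, as the docs state for the non-rsLoRA case): pass a rescaled alpha of `lora_alpha * math.sqrt(r)`, which makes the old formula reproduce the rank-stabilized factor.

```python
import math

# Assumption: a peft version without use_rslora computes the adapter
# scaling as lora_alpha / r. Rescaling alpha by sqrt(r) then yields the
# rsLoRA factor lora_alpha / sqrt(r):
#   (lora_alpha * sqrt(r)) / r == lora_alpha / sqrt(r)
r = 64
lora_alpha = 16
rescaled_alpha = lora_alpha * math.sqrt(r)  # 16 * 8 = 128.0

# e.g. LoraConfig(r=r, lora_alpha=rescaled_alpha, ...)
assert rescaled_alpha / r == lora_alpha / math.sqrt(r)
print(rescaled_alpha)  # -> 128.0
```

This is purely arithmetic on the documented formulas, not a peft feature, so it only holds as long as the installed version really uses the `lora_alpha/r` scaling.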