Issues with LoRA documentation!

On my quest to configure the most promising LoRA adapter, I inspected the documentation on common LoRA parameters:

  • lora_alpha: LoRA scaling factor.
  • use_rslora: When set to True, uses Rank-Stabilized LoRA which sets the adapter scaling factor to lora_alpha/math.sqrt(r), since it was proven to work better. Otherwise, it will use the original default value of lora_alpha/r.
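To make the difference between the two formulas concrete, here is a quick sketch (the r and lora_alpha values are purely illustrative, not recommendations):

```python
import math

r, lora_alpha = 8, 16  # illustrative values only

default_scaling = lora_alpha / r            # original LoRA: alpha / r
rslora_scaling = lora_alpha / math.sqrt(r)  # rank-stabilized LoRA: alpha / sqrt(r)

print(default_scaling)             # 2.0
print(round(rslora_scaling, 3))    # 5.657
```

So for the same lora_alpha and r, the rank-stabilized variant scales the adapter output considerably more strongly.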

I am using the latest stable version of peft: 0.8.2. However, setting use_rslora to True or to False leads to:
TypeError: LoraConfig.__init__() got an unexpected keyword argument 'use_rslora'

Moreover, lora_alpha has to be set in any case. So what does use_rslora=True add, beyond changing how the scaling factor is derived from lora_alpha and r?

In the end, I just want to set the adapter scaling factor to lora_alpha/math.sqrt(r). How can I do that?
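One possible workaround while use_rslora is rejected (my own assumption, not something from the docs): since the default adapter scaling is always lora_alpha / r, passing an inflated alpha of lora_alpha * sqrt(r) produces exactly the rank-stabilized scaling lora_alpha / sqrt(r):

```python
import math

r = 8            # illustrative rank
lora_alpha = 16  # the alpha you want rsLoRA-style scaling for

# With default LoRA, scaling = effective_alpha / r. Choosing
# effective_alpha = lora_alpha * sqrt(r) yields lora_alpha / sqrt(r):
effective_alpha = lora_alpha * math.sqrt(r)

assert math.isclose(effective_alpha / r, lora_alpha / math.sqrt(r))
```

The resulting effective_alpha would then be passed as lora_alpha to LoraConfig in place of the original value.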

For reference, here is the config I am trying to build with use_rslora=True (completed into a minimal form; r and lora_alpha are left at their defaults):

from peft import LoraConfig

config = LoraConfig(
    use_rslora=True,
    target_modules=["query", "key", "value", "query_proj", "key_proj", "value_proj"],
)