Issues with LoRA documentation!

On my quest to configure the most promising LoRA adapter, I inspected the documentation on common LoRA parameters:

  • lora_alpha: LoRA scaling factor.
  • use_rslora: When set to True, uses Rank-Stabilized LoRA which sets the adapter scaling factor to lora_alpha/math.sqrt(r), since it was proven to work better. Otherwise, it will use the original default value of lora_alpha/r.
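
For concreteness, the two rules above only differ in the denominator. A tiny sketch with illustrative numbers (not my actual setup):

import math

r = 16           # example rank
lora_alpha = 32  # example alpha

default_scaling = lora_alpha / r             # original LoRA:   32 / 16 = 2.0
rslora_scaling = lora_alpha / math.sqrt(r)   # rank-stabilized: 32 / 4  = 8.0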

I am using the latest stable version of peft (0.8.2). However, setting use_rslora to either True or False leads to:
TypeError: LoraConfig.__init__() got an unexpected keyword argument 'use_rslora'

Moreover, the lora_alpha argument is mandatory, so the scaling is already specified. What would be the point of configuring it a second time via use_rslora=True?

In the end, I just want to set the adapter scaling factor to lora_alpha/math.sqrt(r). How can I do that?
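
One workaround I can think of (my own idea, assuming the default scaling really is lora_alpha/r as documented): pre-scale lora_alpha myself so that the effective factor comes out to lora_alpha/math.sqrt(r):

import math

r = 16              # example rank
desired_alpha = 32  # the alpha I actually want to appear in alpha / sqrt(r)

# Passing lora_alpha = desired_alpha * sqrt(r) to LoraConfig would give
# (desired_alpha * sqrt(r)) / r == desired_alpha / sqrt(r).
effective_alpha = desired_alpha * math.sqrt(r)
assert math.isclose(effective_alpha / r, desired_alpha / math.sqrt(r))

But that feels like a hack, so I would rather use the documented option if it exists in my version.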

For reference, this is the config I am trying to use with use_rslora=True:

from peft import LoraConfig, TaskType

r = 16  # rank of the LoRA update matrices (example value)

config = LoraConfig(
    # GUIDE => https://huggingface.co/docs/peft/main/en/conceptual_guides/lora#common-lora-parameters-in-peft
    # rsLoRA paper: https://arxiv.org/abs/2312.03732
    r=r,
    target_modules=["query", "key", "value", "query_proj", "key_proj", "value_proj"],
    bias="lora_only",
    use_rslora=True,
    task_type=TaskType.TOKEN_CLS,
    lora_dropout=0.2,
)
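
For completeness, this is roughly how I then wrap the base model (just a sketch; the model name is a placeholder, any token-classification backbone whose attention modules match target_modules would do):

from transformers import AutoModelForTokenClassification
from peft import get_peft_model

# Placeholder backbone; BERT's attention modules are named "query"/"key"/"value",
# which matches the target_modules above.
base_model = AutoModelForTokenClassification.from_pretrained("bert-base-cased")

model = get_peft_model(base_model, config)
model.print_trainable_parameters()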