Facing error while adding multiple adapters to a model

Hi Huggingfacers,

I have fine-tuned a 4-bit quantized gemma-2b on a task. The fine-tuned model has a 'default' PEFT/LoRA adapter. I want to train this model further on another task by adding one more adapter. I am loading the fine-tuned model as a PEFT model using the AutoPeftModelForCausalLM class. When I try to add an adapter to the loaded model with model.add_adapter(peft_config, adapter_name="t2"), I get: TypeError: PeftModel.add_adapter() got multiple values for argument 'adapter_name'

Has anyone tried training a base model for multiple tasks using PEFT? If yes, please help!

The order of arguments in your add_adapter call is the problem. PeftModel.add_adapter() takes adapter_name as its first positional parameter, followed by peft_config. Because you passed peft_config positionally, Python bound it to adapter_name, and the adapter_name="t2" keyword then supplied a second value for that same parameter, which is exactly the TypeError you see. Pass both arguments by keyword (or swap the order):

model.add_adapter(peft_config=peft_config, adapter_name="t2")
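For illustration, the error can be reproduced without PEFT at all. The stub below is not the real library method; it only mimics the (adapter_name, peft_config) parameter order of PeftModel.add_adapter to show why the positional call collides:

```python
# Hypothetical stub mimicking the parameter order of PeftModel.add_adapter;
# the real method lives in the peft library.
def add_adapter(adapter_name, peft_config):
    return adapter_name

# Passing the config positionally binds it to adapter_name, so the
# adapter_name keyword then supplies a second value for the same
# parameter -> "got multiple values for argument 'adapter_name'".
try:
    add_adapter({"r": 8}, adapter_name="t2")
except TypeError as e:
    print(type(e).__name__)  # TypeError

# Keyword arguments (or the correct positional order) work fine.
print(add_adapter(peft_config={"r": 8}, adapter_name="t2"))  # t2
```

The same reasoning applies to the real method: model.add_adapter("t2", peft_config) would also work.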