I have trained two LoRA adapters on top of the same base model and saved each of them with `model.save_pretrained()`.
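For reference, each adapter was trained and saved roughly like this (a minimal sketch with placeholder hyperparameters and paths, not my exact configuration):

```python
from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, get_peft_model

base = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# placeholder LoRA settings; with task_type="SEQ_CLS" the classification head
# is (as far as I can tell) added to modules_to_save automatically
peft_config = LoraConfig(task_type="SEQ_CLS", r=8, lora_alpha=16)
model = get_peft_model(base, peft_config)

# ... training loop for the first task ...
model.save_pretrained(adapter_1)

# the second adapter was trained the same way on a fresh copy of the base model
# and saved to adapter_2
```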
Right now, I am trying to load both adapters for inference. My current approach is this:
```python
from transformers import AutoModelForSequenceClassification
from peft import PeftModelForSequenceClassification

base_model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2, output_hidden_states=False)

# load the first adapter onto the base model
model = PeftModelForSequenceClassification.from_pretrained(base_model, adapter_1, adapter_name="adapter_1", num_labels=2)

# load the second adapter, then combine both into a single weighted adapter
model.load_adapter(adapter_2, adapter_name="adapter_2")

weighted_adapter_name = "two-lora"
model.add_weighted_adapter(
    adapters=["adapter_1", "adapter_2"],
    weights=[0.7, 0.3],
    adapter_name=weighted_adapter_name,
    combination_type="linear",
)
```
But this gives me the error: `Cannot add weighted adapters if they target the same module with modules_to_save, but found 1 such instance(s)`.
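My guess is that this comes from the classification head, which both adapters seem to wrap via `modules_to_save`. A way to inspect which modules are involved (an assumption on my part that the `peft_config` attribute exposes this):

```python
# print the modules_to_save entry of each loaded adapter config;
# for a SEQ_CLS task I would expect both adapters to list the score/classifier head here
for name, cfg in model.peft_config.items():
    print(name, cfg.modules_to_save)
```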
Then, I tried another method from this documentation:
```python
from peft import PeftMixedModel

base_model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2, output_hidden_states=False)

# load both adapters into a mixed model and activate them together
model = PeftMixedModel.from_pretrained(base_model, adapter_1, adapter_name="adapter_1")
model.load_adapter(adapter_2, adapter_name="adapter_2")
model.set_adapter(["adapter_1", "adapter_2"])
```
But this too throws an error: `ValueError: Only one adapter can be set at a time for modules_to_save`.
I don’t understand what I am doing wrong. Should I try this instead:

- call `get_peft_model` with `base_model` and `adapter_1`, and train this adapter
- call `add_adapter` with `adapter_2` on the same model, and train the second adapter (rough sketch below)
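In code, I imagine it would look something like this (a rough, untested sketch; the LoRA settings and save path are placeholders):

```python
from peft import LoraConfig, get_peft_model

lora_cfg = LoraConfig(task_type="SEQ_CLS", r=8, lora_alpha=16)  # placeholder settings

# first adapter
model = get_peft_model(base_model, lora_cfg, adapter_name="adapter_1")
# ... train adapter_1 ...

# second adapter added to the same wrapped model
model.add_adapter("adapter_2", lora_cfg)
model.set_adapter("adapter_2")
# ... train adapter_2 ...

# as far as I understand, this saves each adapter into its own sub-folder
model.save_pretrained("both_adapters")
```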
But with this approach, how would I load both adapters for inference?