Adapter Training - Merging Weights

Hey guys,

I'd like to preface this by saying that I'm new to all of this.
I fine-tuned a T5 model using the adapter-transformers library. I was just wondering what the function
model.merge_adapter("adapter name") actually does.

Does it somehow merge the weights of the adapter with the base model itself?
Does this mean I wouldn't have to activate the adapter manually every time with model.set_active_adapters()?
Also, I was wondering if there is a way to keep a particular adapter head activated by default, instead of having to activate it anew at the beginning of every script. For context, my current setup looks roughly like the sketch below.
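
This is a minimal sketch of what I'm doing; the checkpoint, adapter path, and adapter name are placeholders rather than my actual setup:

```python
# Minimal sketch of my current workflow (placeholder names/paths).
from transformers import T5ForConditionalGeneration  # adapter-transformers fork

model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Load the adapter I fine-tuned earlier (hypothetical local path)
adapter_name = model.load_adapter("./my_t5_adapter")

# This is the step I currently repeat at the top of every script:
model.set_active_adapters(adapter_name)

# ...and this is the call I'm asking about. Does it fold the adapter
# weights into the base model so that the line above is no longer needed?
# model.merge_adapter(adapter_name)
```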

Any help would be appreciated. Thanks a ton!