How do I merge a lora adapter back into the model weights?

I’m using this Colab I found on YouTube, and I don’t really have a clue how to merge adapters into models.

Based on the link to the code:

Your PEFT adapter would be saved in "outputs" via:
model_to_save.save_pretrained("outputs")

To load the PEFT adapter and merge it with the base model, try:

from peft import PeftModel
peft_model = PeftModel.from_pretrained(model, "outputs")
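
Note that from_pretrained only wraps the base model with the adapter; to actually fold the LoRA weights back into the base weights you can call merge_and_unload() on the PeftModel. Here is a minimal end-to-end sketch, assuming the base model is a causal LM loaded with transformers; the base model name and the "merged-model" output path are placeholders you'd replace with your own:

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_name = "base-model-name"  # placeholder: the model the adapter was trained on

# Reload the base model in full precision; merging into quantized (8-bit/4-bit)
# weights may be unsupported or lossy depending on your PEFT version.
base_model = AutoModelForCausalLM.from_pretrained(base_model_name)

# Wrap the base model with the LoRA adapter saved in "outputs".
peft_model = PeftModel.from_pretrained(base_model, "outputs")

# Fold the adapter weights into the base weights and drop the PEFT wrapper.
merged_model = peft_model.merge_and_unload()

# Save the merged model (and tokenizer) as a standalone checkpoint.
merged_model.save_pretrained("merged-model")
AutoTokenizer.from_pretrained(base_model_name).save_pretrained("merged-model")

After this, "merged-model" can be loaded directly with AutoModelForCausalLM.from_pretrained, with no PEFT dependency needed at inference time.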