Save LoRA weights only in intermediate checkpoints

Hi everyone,

I'm fine-tuning a model with the Trainer API using LoRA. I need to save intermediate checkpoints every n steps, but only the LoRA weights, not the full model weights (due to limited disk space). How can I do this?
If I set the training argument save_steps=n, the full model is saved (along with optimizer state and other training files I don't need), whereas calling model.save_pretrained() directly saves only the LoRA adapter weights. Is there an option to tell the Trainer to save checkpoints via save_pretrained?
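The closest thing I've come up with so far is a custom TrainerCallback that re-saves just the adapter into each checkpoint folder and deletes the full weights the Trainer wrote. This is only a rough sketch I put together (the callback name is mine, and I haven't fully tested it):

```python
import os
from transformers import TrainerCallback


class SavePeftCheckpointCallback(TrainerCallback):
    """Sketch: after each Trainer checkpoint, keep only the LoRA adapter.

    Assumes the model passed to the Trainer is a PeftModel, so that
    save_pretrained() writes only adapter_model.bin / adapter_config.json.
    """

    def on_save(self, args, state, control, **kwargs):
        # on_save fires right after the Trainer writes checkpoint-<step>
        checkpoint_dir = os.path.join(
            args.output_dir, f"checkpoint-{state.global_step}"
        )
        # Re-save only the adapter weights into the checkpoint folder
        kwargs["model"].save_pretrained(checkpoint_dir)
        # Remove the full model weights the Trainer already wrote, if present
        for fname in ("pytorch_model.bin", "model.safetensors"):
            full_weights = os.path.join(checkpoint_dir, fname)
            if os.path.exists(full_weights):
                os.remove(full_weights)
        return control
```

I'd register it with trainer.add_callback(SavePeftCheckpointCallback()), but this feels like a hack, so I'd prefer a built-in option if one exists.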

Thanks!