How to save PEFT adapter weights every epoch?

With training_args set to save a checkpoint every epoch, roughly like this (illustrative values; the key assumption is save_strategy="epoch"):
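```python
from transformers import TrainingArguments

# Illustrative config; the key setting is save_strategy="epoch",
# which tells the Trainer to write a checkpoint after every epoch.
training_args = TrainingArguments(
    output_dir="output_dir",
    num_train_epochs=3,
    per_device_train_batch_size=4,
    save_strategy="epoch",
)
```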


training with LoRA saves the full model weights every epoch. But I don't think these checkpoints can be loaded: AutoModelForCausalLM.from_pretrained reports that many LoRA weights were not loaded correctly.

PEFT's docs say I can use model.save_pretrained("output_dir") to get adapter_model.bin, but the Trainer only does this at the end of training.
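For reference, this is the save/load round-trip I mean ("base-model-name" and "output_dir" are placeholders):

```python
from peft import LoraConfig, PeftModel, get_peft_model
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("base-model-name")
model = get_peft_model(base, LoraConfig(task_type="CAUSAL_LM"))

# save_pretrained on a PeftModel writes only the adapter files
# (adapter_config.json and adapter_model.bin), not the full weights.
model.save_pretrained("output_dir")

# To load the adapter back, attach it to a fresh base model instead of
# pointing AutoModelForCausalLM.from_pretrained at the adapter directory.
base = AutoModelForCausalLM.from_pretrained("base-model-name")
model = PeftModel.from_pretrained(base, "output_dir")
```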

Is there an argument I'm missing that saves adapter_model.bin every epoch, or do I have to write a callback myself?

Thank you!

No, there's no such argument; you'll have to write a callback yourself.
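In case it's useful, here's a minimal sketch of one (the class name and checkpoint-directory layout are my own choices, not part of the Trainer API; on_save fires each time the Trainer writes a checkpoint, so with save_strategy="epoch" that's once per epoch):

```python
import os

from transformers import TrainerCallback


class SavePeftAdapterCallback(TrainerCallback):
    # Hypothetical helper: on_save is called after the Trainer saves a
    # checkpoint, i.e. once per epoch when save_strategy="epoch".
    def on_save(self, args, state, control, **kwargs):
        checkpoint_dir = os.path.join(args.output_dir, f"checkpoint-{state.global_step}")
        # save_pretrained on a PeftModel writes only the adapter files
        # (adapter_config.json and adapter_model.bin) into the checkpoint dir.
        kwargs["model"].save_pretrained(checkpoint_dir)
        return control


# Attach it to an existing Trainer:
# trainer.add_callback(SavePeftAdapterCallback())
```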