Checkpointing at each step

Hi
I use finetune_trainer.py. It saves the best model at checkpoint time based on an evaluation metric, so it sometimes calls _save_checkpoint, but if the metric is not better than the best one saved so far, the Hugging Face code does not save the checkpoint. What I need is to write a callback that saves the model + optimizer + scheduler and is called right after each call to _save_checkpoint, so a copy of the last updated model is always kept in a folder. I am not sure how to access the optimizer/model/scheduler in the callbacks; I'd appreciate your input on this. Thanks!
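
Something like this (untested) sketch is what I have in mind, assuming the callback's on_save hook receives the model, optimizer, and lr_scheduler through kwargs; the class name and the "last" folder are just placeholders I picked:

```python
import os

import torch
from transformers import TrainerCallback


class SaveLastCheckpointCallback(TrainerCallback):
    """Keep a copy of the most recent model/optimizer/scheduler in a fixed folder."""

    def on_save(self, args, state, control, model=None, optimizer=None, lr_scheduler=None, **kwargs):
        # "last" is an arbitrary folder name under output_dir, not something the Trainer expects
        out_dir = os.path.join(args.output_dir, "last")
        os.makedirs(out_dir, exist_ok=True)

        # save the model weights/config in the usual Hugging Face format
        model.save_pretrained(out_dir)
        # save optimizer and scheduler states so training could be resumed from this copy
        torch.save(optimizer.state_dict(), os.path.join(out_dir, "optimizer.pt"))
        torch.save(lr_scheduler.state_dict(), os.path.join(out_dir, "scheduler.pt"))
```

The idea would be to register it with `trainer.add_callback(SaveLastCheckpointCallback())` before calling `trainer.train()`, but I'm not sure whether on_save actually exposes the optimizer and scheduler this way.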

You could use the save_steps arg to specify the number of update steps after which a checkpoint is saved. If you use

--evaluation_strategy steps
--eval_steps 50
--save_steps 50

then it will evaluate and save a checkpoint every 50 steps.
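
For example, the invocation could look something like this (the model and paths are placeholders, and the remaining arguments depend on your setup):

```
python finetune_trainer.py \
    --model_name_or_path <model> \
    --data_dir <data_dir> \
    --output_dir <output_dir> \
    --do_train --do_eval \
    --evaluation_strategy steps \
    --eval_steps 50 \
    --save_steps 50
```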