Save, load, and run inference with a fine-tuned model

I’m seeing different methods for saving a fine-tuned model, and that confuses me.

Example 1: model.save_pretrained('./output/')
Example 2: trainer.save_model('./output/')
Example 3: trainer.model.save_pretrained('./output/')

and some examples also call merge_and_unload() on the model. A rough sketch of how I think these fit together is below.
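
This is only my current understanding, assuming a PEFT/LoRA fine-tune of a causal LM; the model name and output paths are placeholders:

```python
# Minimal sketch of the variants above, assuming a PEFT/LoRA fine-tune.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_name = "meta-llama/Llama-2-7b-hf"  # placeholder base model

# 1) Saving only the adapter weights after training
#    ("model" is the PEFT-wrapped model that the Trainer trained)
# model.save_pretrained("./output/")          # writes adapter config + adapter weights
# trainer.save_model("./output/")             # Trainer convenience wrapper
# trainer.model.save_pretrained("./output/")  # explicit form of the above

# 2) Merging the adapter into the base model so the result is a plain Transformers model
base_model = AutoModelForCausalLM.from_pretrained(base_model_name)
peft_model = PeftModel.from_pretrained(base_model, "./output/")
merged_model = peft_model.merge_and_unload()   # folds the LoRA weights into the base weights
merged_model.save_pretrained("./output-merged/")

tokenizer = AutoTokenizer.from_pretrained(base_model_name)
tokenizer.save_pretrained("./output-merged/")
```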

@nielsr, could you provide an example of saving, loading, and running inference with a fine-tuned model?
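
For context, this is roughly how I expect to load the result back and run inference (a sketch assuming the merged checkpoint saved above; the path and prompt are placeholders):

```python
# Sketch of loading the saved (merged) checkpoint and generating text.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "./output-merged/", torch_dtype=torch.float16, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("./output-merged/")

inputs = tokenizer("Tell me about fine-tuning.", return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```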