Continue from pretrained

I am training a transformer model and, since it is huge, I need to stop training after each epoch and then continue. I am doing the following. Can you please check and confirm that this indeed continues training from the previous epoch:

from transformers import RobertaForMaskedLM

# Load the weights saved after the previous epoch with save_pretrained()
model = RobertaForMaskedLM.from_pretrained('path to the previous epoch pretrained')
model.train()  # switch back to training mode (enables dropout etc.)
...

I think it’s right that we load our saved model like that and then retrain from there.
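
To be concrete, here is a minimal sketch of the full save/resume pattern I have in mind. The checkpoint path, the optimizer choice, and the learning rate are all placeholders, and I save the optimizer state separately with torch.save, since as far as I know from_pretrained only restores the model weights:

import torch
from transformers import RobertaForMaskedLM

ckpt_dir = 'path to the previous epoch pretrained'  # placeholder path

# Resume: from_pretrained restores the weights written by save_pretrained
model = RobertaForMaskedLM.from_pretrained(ckpt_dir)
model.train()

# from_pretrained does not restore the optimizer state, so I reload it
# separately (AdamW and the lr are placeholders for whatever I actually use)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)
optimizer.load_state_dict(torch.load(f'{ckpt_dir}/optimizer.pt'))

# ... train for one epoch ...

# After the epoch, save both the weights and the optimizer state
model.save_pretrained(ckpt_dir)
torch.save(optimizer.state_dict(), f'{ckpt_dir}/optimizer.pt')

My understanding is that without reloading the optimizer state, the weights would continue from the checkpoint but the optimizer (momentum, etc.) would start fresh each epoch.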
