I am training a transformer model, and since it is huge I need to stop training after each epoch and then continue from where I left off. I am doing the following. Can you please check and confirm that this indeed continues training from the previous epoch?
from transformers import RobertaForMaskedLM

# Reload the weights saved at the end of the previous epoch.
model = RobertaForMaskedLM.from_pretrained('path to the previous epoch pretrained')
model.train()  # put the model back into training mode
...
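My understanding is that `from_pretrained` restores only the model weights, not the optimizer or scheduler state, so an exact resume would also need those saved and reloaded. Here is a minimal sketch of that pattern as I understand it, using a tiny `nn.Linear` as a stand-in for the transformer (the checkpoint filename and `epoch` key are placeholders I made up):

```python
import torch
import torch.nn as nn

# Stand-in for the transformer; the save/load pattern is the same
# for RobertaForMaskedLM.
model = nn.Linear(4, 4)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# One training step so the optimizer accumulates internal state
# (AdamW's moment estimates), which a plain from_pretrained() would lose.
loss = model(torch.randn(2, 4)).sum()
loss.backward()
optimizer.step()

# --- end of epoch: save everything needed to resume ---
torch.save(
    {
        "model": model.state_dict(),
        "optimizer": optimizer.state_dict(),
        "epoch": 1,  # placeholder bookkeeping
    },
    "checkpoint.pt",
)

# --- later: resume from the checkpoint ---
ckpt = torch.load("checkpoint.pt")
model.load_state_dict(ckpt["model"])
optimizer.load_state_dict(ckpt["optimizer"])
start_epoch = ckpt["epoch"] + 1
model.train()
```

Is this extra bookkeeping actually necessary, or does reloading the weights alone suffice in practice?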