There’s a suggestion to use the resume_from_checkpoint
argument of Trainer, but it doesn’t seem to be available for Seq2SeqTrainer:
trainer = Seq2SeqTrainer(
model=multibert,
tokenizer=tokenizer,
args=training_args,
train_dataset=train_data,
eval_dataset=val_data,
resume_from_checkpoint=True
)
[out]:
TypeError: Seq2SeqTrainer.__init__() got an unexpected keyword argument 'resume_from_checkpoint'