Hello! I was wondering: by default, how long does the Hugging Face Trainer run for? I see that the transformers.TrainingArguments class has a num_train_epochs parameter, and that it can be overridden with the max_steps parameter (which can also be set via set_lr_scheduler). But by default, how long does the trainer run? Is it for a fixed number of epochs? Until the loss reaches a certain value?
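If it helps clarify what I'm asking, here's my rough mental model as a sketch (this is my own hypothetical reconstruction, not the actual transformers source), where I'm assuming num_train_epochs defaults to 3 and max_steps defaults to -1 (i.e. disabled):

```python
import math

# Hypothetical sketch of how I *think* the Trainer decides total training
# length -- not the real transformers implementation.
def total_training_steps(steps_per_epoch: int,
                         num_train_epochs: float = 3.0,  # assumed default
                         max_steps: int = -1) -> int:    # assumed default (-1 = unset)
    """Return the total number of optimizer steps the Trainer would run."""
    if max_steps > 0:
        # My understanding: a positive max_steps overrides num_train_epochs.
        return max_steps
    # Otherwise run for num_train_epochs full passes over the data.
    return math.ceil(num_train_epochs * steps_per_epoch)
```

So with, say, 100 update steps per epoch and everything left at defaults, I'd expect training to run for 300 steps, and setting max_steps=50 to cut it to 50. Is that roughly right?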
I can’t seem to find the information in the documentation. Any help finding this would be greatly appreciated. Thanks!