Train a model without a fixed number of epochs

Hello, I would like to train a RobertaForMaskedLM from scratch.
However, I don't want to specify a fixed stopping point; instead, I want training to stop only when there is no more improvement. Is there a way to do that? I know that `num_train_epochs` in the Trainer defaults to 3.
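For context, this is roughly how I initialize the model from scratch (the config sizes here are placeholder values, not my real ones):

```python
# Minimal sketch: a RobertaForMaskedLM built from a fresh config,
# i.e. random weights, no pretrained checkpoint.
# Assumes the `transformers` library is installed; sizes are hypothetical.
from transformers import RobertaConfig, RobertaForMaskedLM

config = RobertaConfig(
    vocab_size=5000,        # hypothetical: should match the tokenizer's vocab
    hidden_size=256,
    num_hidden_layers=4,
    num_attention_heads=4,
)
model = RobertaForMaskedLM(config)  # instantiated from config = from scratch
print(model.num_parameters())
```

I then pass this model to a `Trainer`, and my question is how to make that training loop open-ended rather than epoch-limited.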