Is it possible to set epoch less than 1 when using Trainer

I’m training a model with Trainer and I only need to train for 0.5 epochs at a time, but even when I set num_train_epochs=0.5, the Trainer still runs a full epoch.

Is there any way to set the epoch count below 1 through the API?

You can set the max_steps argument instead; when max_steps is set, it overrides num_train_epochs, so you can stop training after any number of optimizer steps.
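The step count corresponding to half an epoch can be derived from the dataset size and the effective batch size (per-device batch size × gradient accumulation × number of devices). A minimal sketch, where the dataset size and batch settings are illustrative assumptions, not values from the question:

```python
import math

def fractional_epoch_steps(num_examples, per_device_batch_size,
                           grad_accum_steps=1, num_devices=1, fraction=0.5):
    """Optimizer steps corresponding to `fraction` of one epoch."""
    effective_batch = per_device_batch_size * grad_accum_steps * num_devices
    steps_per_epoch = math.ceil(num_examples / effective_batch)
    return max(1, int(steps_per_epoch * fraction))

# Assumed setup: 10_000 examples, batch size 16, accumulate gradients over 2 steps.
# steps_per_epoch = ceil(10_000 / 32) = 313, so half an epoch is 156 steps.
steps = fractional_epoch_steps(10_000, per_device_batch_size=16, grad_accum_steps=2)
print(steps)
```

The resulting value can then be passed to TrainingArguments, e.g. `TrainingArguments(max_steps=steps, ...)`, and the Trainer will stop after that many steps regardless of num_train_epochs.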