Use ReduceLROnPlateau with DeepSpeed

The Hugging Face Trainer always lets DeepSpeed handle the scheduler, and DeepSpeed steps the scheduler at every training step. That is wrong for ReduceLROnPlateau, which should be stepped once per evaluation with a metric. The DeepSpeed docs say you can manage your scheduler outside of DeepSpeed, so how do I do that when using the Hugging Face Trainer?


Hi @BoltzmachineQ

I think this page might help if you’re using TrainingArguments:

You can also find more information here:
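To illustrate the stepping behavior the question is about, here is a minimal plain-PyTorch sketch (no Trainer or DeepSpeed involved): ReduceLROnPlateau is stepped once per evaluation and is given the eval metric, rather than being stepped at every training step with no argument.

```python
import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

param = torch.nn.Parameter(torch.zeros(1))
opt = torch.optim.SGD([param], lr=0.1)

# Halve the LR after the eval loss fails to improve for `patience` evaluations.
sched = ReduceLROnPlateau(opt, mode="min", factor=0.5, patience=1)

# Step once per *evaluation*, passing the metric — not once per training step.
for eval_loss in [1.0, 1.0, 1.0, 1.0]:
    sched.step(eval_loss)

print(opt.param_groups[0]["lr"])  # LR was reduced once the loss plateaued
```

With the Trainer, the rough idea would be to keep the scheduler out of the DeepSpeed config and drive a loop like the one above from the evaluation hook (e.g. a `TrainerCallback.on_evaluate`), so the scheduler only sees eval metrics; the exact wiring depends on your Trainer/DeepSpeed versions.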
