How to adjust the learning rate after N epochs?

I am using Hugging Face's Trainer. How can I adjust the learning rate after N epochs? For example, I have an initial learning rate of lr=2e-6, and I would like to change it to lr=1e-6 after the first epoch and keep it there for the rest of training.

I tried this so far:

import math
from transformers import AdamW, get_linear_schedule_with_warmup

optimizer = AdamW(model.parameters(),
              lr = 2e-5,
              eps = 1e-8
            )

epochs = 5
# batches per epoch at batch size 8 (the last, partial batch counts too)
batch_number = math.ceil(len(small_train_dataset) / 8)
total_steps = batch_number * epochs


# linear decay from the initial lr down to 0 over total_steps, with no warmup
scheduler = get_linear_schedule_with_warmup(optimizer, 
                                            num_warmup_steps = 0,
                                            num_training_steps = total_steps,
                                            last_epoch = -1
                                            )
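
For reference, a custom optimizer and scheduler like the ones above can be handed to Trainer through its optimizers argument instead of the default ones it builds itself. A minimal sketch, assuming model, training_args, small_train_dataset and small_eval_dataset are already defined elsewhere:

from transformers import Trainer

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=small_train_dataset,
    eval_dataset=small_eval_dataset,
    optimizers=(optimizer, scheduler),  # overrides Trainer's default optimizer and schedule
)
trainer.train()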

I know there is LambdaLR (PyTorch 1.9.0 documentation), but there it drops the learning rate every epoch, which is not what I want. I want it to drop after one epoch and then stay at that value for the rest of the training process.
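
For illustration, that kind of schedule could be written with LambdaLR over optimizer steps rather than epochs, since Trainer calls scheduler.step() after every optimization step. A rough sketch, where steps_per_epoch and the halving factor are assumptions based on the batch size of 8 and the 2e-6 to 1e-6 drop described above:

import math
from torch.optim.lr_scheduler import LambdaLR

steps_per_epoch = math.ceil(len(small_train_dataset) / 8)  # assumed batch size of 8

def lr_lambda(current_step):
    # Multiplicative factor applied to the initial lr: 1.0 during the first
    # epoch, then 0.5 (e.g. 2e-6 -> 1e-6) held constant for the rest of training.
    return 1.0 if current_step < steps_per_epoch else 0.5

scheduler = LambdaLR(optimizer, lr_lambda)

That scheduler could then replace the linear one in the optimizers tuple shown earlier.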


Were you able to resolve this? I have a similar problem where I want to implement an adaptive learning rate during training.