I am new to PyTorch and have to train deep models like ResNet and VGG, which require the following learning rate schedule:
Linearly increase the learning rate from 0 to ‘initial_lr’ over the first ‘k’ training steps/iterations.
Continue with ‘initial_lr’ for the next ‘m’ training steps.
Decay the learning rate in a step-decay manner: for example, after the 30th epoch reduce ‘initial_lr’ by a factor of 10, and after the 45th epoch reduce it by a further factor of 10 for any remaining training.
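The three phases above can be sketched as a single multiplier function of the global step, which could then be plugged into a scheduler such as PyTorch's torch.optim.lr_scheduler.LambdaLR. This is a minimal sketch, not a definitive answer; the step counts (500 warmup steps, decay boundaries at steps 3000 and 4500) are made-up placeholders standing in for ‘k’ and the 30th/45th-epoch boundaries:

```python
def lr_lambda(step, warmup_steps=500, decay_boundaries=(3000, 4500),
              decay_factor=0.1):
    """Multiplier applied to initial_lr at a given global step.

    warmup_steps and decay_boundaries are hypothetical values; in practice
    they would be derived from 'k' and from epochs 30/45 times the number
    of steps per epoch.
    """
    if step < warmup_steps:
        # Phase 1: linear warmup from 0 to 1
        return step / warmup_steps
    # Phase 2: constant 1.0 until the first decay boundary,
    # Phase 3: multiply by decay_factor at each boundary passed
    factor = 1.0
    for boundary in decay_boundaries:
        if step >= boundary:
            factor *= decay_factor
    return factor
```

With PyTorch this would typically be attached via `torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda)` and stepped once per iteration.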
This can be better visualized using the following picture:
The picture shows an example using LeNet-300-100 on MNIST with TensorFlow 2.
How can I achieve this particular learning rate schedule with ‘huggingface’ (the Hugging Face libraries)?