Using Cosine LR scheduler via TrainingArguments in Trainer

Hi, can anyone confirm whether my approach is correct? I’m fine-tuning Wav2Vec2 on a large dataset, so I need to make sure the process is right before launching the run:

I want to use a cosine LR scheduler with warmup, which is available in the library as transformers.get_cosine_schedule_with_warmup. What I’m not clear on is how to request it through the training arguments passed to the Trainer. TrainingArguments has an lr_scheduler_type argument: should I pass an instance created by transformers.get_cosine_schedule_with_warmup as lr_scheduler_type, or is there something else I’m missing?

Any help is appreciated, Thanks!

You have to set lr_scheduler_type to "cosine".
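
For example, a minimal sketch of the relevant TrainingArguments (the output directory, warmup steps, and learning rate are placeholders, and `model` / `train_dataset` are assumed to be defined elsewhere in your script):

```python
from transformers import Trainer, TrainingArguments

training_args = TrainingArguments(
    output_dir="./wav2vec2-finetuned",  # placeholder path
    lr_scheduler_type="cosine",         # selects the cosine-with-warmup schedule
    warmup_steps=500,                   # or use warmup_ratio instead
    learning_rate=3e-4,
    num_train_epochs=3,
)

# model and train_dataset are assumed to be defined already.
trainer = Trainer(model=model, args=training_args, train_dataset=train_dataset)
```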

Is there a list somewhere with the respective strings?

@sgugger What if I don’t want the learning rate to decay to 0, but rather to, say, 50% of the peak LR? Is there any way to do this?

You can pass your own learning rate scheduler to the Trainer.
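
For instance, here is a minimal sketch of a cosine-with-warmup schedule that bottoms out at 50% of the peak LR instead of 0, handed to the Trainer through its optimizers argument. The step counts and learning rate are placeholder values, and `model`, `training_args`, and `train_dataset` are assumed to be defined already:

```python
import math

import torch
from torch.optim.lr_scheduler import LambdaLR
from transformers import Trainer

# Placeholder values for illustration.
num_warmup_steps = 500
num_training_steps = 10_000
min_lr_ratio = 0.5  # decay to 50% of the peak LR instead of 0


def lr_lambda(current_step: int) -> float:
    # Linear warmup from 0 to the peak LR.
    if current_step < num_warmup_steps:
        return current_step / max(1, num_warmup_steps)
    progress = (current_step - num_warmup_steps) / max(
        1, num_training_steps - num_warmup_steps
    )
    cosine = 0.5 * (1.0 + math.cos(math.pi * progress))
    # Rescale the cosine curve so it ends at min_lr_ratio rather than 0.
    return min_lr_ratio + (1.0 - min_lr_ratio) * cosine


# model, training_args, and train_dataset are assumed to be defined already.
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
scheduler = LambdaLR(optimizer, lr_lambda)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    optimizers=(optimizer, scheduler),  # Trainer uses these instead of building its own
)
```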

I found this, hope it helps.

https://huggingface.co/transformers/v4.7.0/_modules/transformers/trainer_utils.html (see the SchedulerType enum)
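
If you just want the accepted strings for the version you have installed, you can also iterate over the SchedulerType enum directly (the exact set of values may vary by version):

```python
from transformers.trainer_utils import SchedulerType

# Prints values such as "linear", "cosine", "cosine_with_restarts",
# "polynomial", "constant", "constant_with_warmup".
print([s.value for s in SchedulerType])
```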