I have been fine-tuning Flan-T5 using the Seq2SeqTrainer, and I've observed that the learning rate changes during training even though I set it to a fixed value.
Why is this happening, and how can I control it?
Hi @rishilearn, I'm guessing the `lr_scheduler_type` training argument is using the default value ("linear"), which means the learning rate will decay linearly during training. Try setting it to "constant".
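To make that concrete, here's a minimal sketch of how you might set this in your training arguments. The `output_dir`, learning rate, and epoch count are illustrative values, not taken from the thread:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-finetune",   # hypothetical output path
    learning_rate=3e-4,              # illustrative value
    lr_scheduler_type="constant",    # default is "linear", which decays the LR to 0
    num_train_epochs=3,
)
```

With `"constant"`, the optimizer keeps the same learning rate for every step (no warmup, no decay), so the value logged during training should stay fixed.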
Thanks for the suggestion @dblakely, I'll try it out.