The training loss (at logging steps) drops suddenly after each epoch — help me, please!

I got a non-smooth loss curve (read from trainer_state.json) after training the CLIP demo (transformers/examples/pytorch/contrastive-image-text at main · huggingface/transformers · GitHub).

I recorded the details in "The training loss (logging steps) will drop suddenly after each epoch? Help me plz! Orz" · Issue #18730 · huggingface/transformers · GitHub.
Could anyone help me?

I checked the logged values for each step (in trainer_state.json) and found that the loss (with dataloader_drop_last=True) drops significantly at the first step of each epoch.
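For reference, this is roughly how I read the per-step loss out of trainer_state.json — a minimal sketch, assuming the standard file the Trainer writes to the output directory, where each training log entry in `log_history` carries a `"loss"` key (the helper name `load_loss_curve` is mine):

```python
import json

def load_loss_curve(path="trainer_state.json"):
    """Return (step, loss) pairs from a Trainer state file.

    Entries without a "loss" key (e.g. eval logs) are skipped.
    """
    with open(path) as f:
        state = json.load(f)
    return [(entry["step"], entry["loss"])
            for entry in state["log_history"]
            if "loss" in entry]
```

Plotting these pairs (step on the x-axis, loss on the y-axis) is how I see the sharp drop right at the first logging step of each new epoch.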