Train and eval loss increase suddenly

Hello, I’m fine-tuning a BERT model with MatryoshkaLoss and noticed that the eval_loss suddenly spikes partway through training. At the same time, the Spearman and Pearson correlation values drop sharply and then become NaN. After the spike, both eval_loss and train_loss stay constant for the rest of training. Do you have any idea why this might be happening?
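
For context, here is a minimal sketch of the kind of setup I mean (the model name, dataset, and hyperparameters below are illustrative placeholders, not my exact configuration):

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CoSENTLoss, MatryoshkaLoss
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator, SimilarityFunction

# Base BERT model (placeholder checkpoint)
model = SentenceTransformer("bert-base-uncased")

# STS-style data with columns: sentence1, sentence2, score
train_dataset = load_dataset("sentence-transformers/stsb", split="train")
eval_dataset = load_dataset("sentence-transformers/stsb", split="validation")

# MatryoshkaLoss wraps a base loss and applies it at several embedding dimensions
base_loss = CoSENTLoss(model)
loss = MatryoshkaLoss(model, base_loss, matryoshka_dims=[768, 512, 256, 128, 64])

# Evaluator that reports the Spearman/Pearson correlations mentioned above
dev_evaluator = EmbeddingSimilarityEvaluator(
    sentences1=eval_dataset["sentence1"],
    sentences2=eval_dataset["sentence2"],
    scores=eval_dataset["score"],
    main_similarity=SimilarityFunction.COSINE,
)

args = SentenceTransformerTrainingArguments(
    output_dir="outputs/bert-matryoshka",  # placeholder path
    num_train_epochs=1,
    per_device_train_batch_size=16,
    learning_rate=2e-5,
    warmup_ratio=0.1,
    eval_strategy="steps",
    eval_steps=500,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
    evaluator=dev_evaluator,
)
trainer.train()
```

The spike and subsequent NaN correlations appear during one of the periodic evaluations, and from that point on the logged losses no longer change.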