Prompt Tuning For Sequence Classification

Accuracy is a discrete metric (unlike virtually all loss functions used with neural networks), so it's entirely possible for your loss to decrease while your accuracy flatlines: the model can keep getting more confident in the correct class without any prediction actually crossing the decision boundary. Obviously, for any model, validation loss will never reach 0 and validation accuracy will never reach 100% unless your validation data is your training data. The point of the validation data is to help you decide when to stop training: if your training loss is decreasing but your validation loss has flatlined or is even increasing for multiple epochs, continuing to train will likely just overfit.
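To make the first point concrete, here's a toy illustration in plain Python (the probabilities are made up): cross-entropy keeps falling as the model grows more confident in the correct class, but the prediction never flips past 0.5, so accuracy never moves.

```python
import math

def cross_entropy(p_correct):
    """Negative log-likelihood of the correct class."""
    return -math.log(p_correct)

def accuracy(p_correct):
    """Binary toy case: the prediction is correct only if p > 0.5."""
    return 1.0 if p_correct > 0.5 else 0.0

# Confidence in the correct class grows each "epoch" but never
# crosses the 0.5 decision boundary, so accuracy stays at 0%.
for epoch, p in enumerate([0.30, 0.38, 0.44, 0.49], start=1):
    print(f"epoch {epoch}: loss={cross_entropy(p):.3f}, accuracy={accuracy(p):.0%}")

# epoch 1: loss=1.204, accuracy=0%
# epoch 2: loss=0.968, accuracy=0%
# epoch 3: loss=0.821, accuracy=0%
# epoch 4: loss=0.713, accuracy=0%
```

And for the second point, a minimal patience-based early-stopping sketch; the `val_losses` list is hypothetical data standing in for one evaluation per epoch, and in real training you'd compute each value on your held-out validation set:

```python
val_losses = [0.92, 0.71, 0.58, 0.55, 0.56, 0.55, 0.57]  # hypothetical values
patience = 3          # epochs to tolerate without improvement
best = float("inf")
stale = 0             # epochs since the last improvement

for epoch, val_loss in enumerate(val_losses, start=1):
    if val_loss < best:
        best, stale = val_loss, 0  # improvement: reset the counter
        # (this is also where you'd checkpoint the best model)
    else:
        stale += 1
        if stale >= patience:
            print(f"stopping after epoch {epoch}: no improvement for {patience} epochs")
            break
```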
