Training Loss Higher than Validation Loss

Hello everyone,

I have a question regarding my losses. When I fine-tune a pretrained BERT AutoModelForSequenceClassification model on a multiclass classification problem, my training loss is always higher than my validation loss (see the attached picture).

Is this something to worry about? Thanks, everyone!