Custom loss weights for training and different weights for validation

Hi, I am training a text classifier, but my data is imbalanced.
I am using the Trainer class with a custom loss function that applies class weights.

import torch.nn as nn
from transformers import Trainer


class CustomTrainer(Trainer):
    def __init__(self, *args, class_weights=None, **kwargs):
        super().__init__(*args, **kwargs)
        # class weights (a torch.Tensor) used by CrossEntropyLoss to counter the imbalance
        self.cross_entropy_loss_weights = class_weights

    # **kwargs absorbs extra arguments (e.g. num_items_in_batch) that newer
    # transformers versions may pass to compute_loss
    def compute_loss(self, model, inputs, return_outputs=False, **kwargs):
        labels = inputs.pop("labels")
        outputs = model(**inputs)
        logits = outputs.get("logits")
        loss_fct = nn.CrossEntropyLoss(weight=self.cross_entropy_loss_weights)

        loss = loss_fct(logits.view(-1, self.model.config.num_labels), labels.view(-1))

        return (loss, outputs) if return_outputs else loss
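
For context, this is roughly how I build the weights and the trainer (the weight values and model, training_args, train_ds, val_ds are just placeholders for my actual setup, assuming a 2-label problem):

import torch

# example: inverse-frequency class weights, moved to the training device
class_weights = torch.tensor([0.4, 2.5]).to(training_args.device)

trainer = CustomTrainer(
    model=model,
    args=training_args,
    train_dataset=train_ds,
    eval_dataset=val_ds,
    class_weights=class_weights,
)
trainer.train()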

How can I use a different set of class weights for the validation set during training/evaluation? Right now the same training weights are applied to the validation loss, so my validation loss numbers look bad during training (and they are not the real, unweighted values).
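
Something like this is what I have in mind, just a rough sketch. I am not sure whether checking model.training inside compute_loss is a reliable way to tell training batches apart from evaluation batches, and train_loss_weights / eval_loss_weights are placeholder names:

import torch.nn as nn
from transformers import Trainer


class CustomTrainer(Trainer):
    def __init__(self, *args, train_loss_weights=None, eval_loss_weights=None, **kwargs):
        super().__init__(*args, **kwargs)
        # separate weight tensors for the training loss and the validation loss
        self.train_loss_weights = train_loss_weights
        self.eval_loss_weights = eval_loss_weights

    def compute_loss(self, model, inputs, return_outputs=False, **kwargs):
        labels = inputs.pop("labels")
        outputs = model(**inputs)
        logits = outputs.get("logits")
        # Trainer puts the model in eval mode for the evaluation loop,
        # so model.training should indicate which set of weights to apply
        weights = self.train_loss_weights if model.training else self.eval_loss_weights
        loss_fct = nn.CrossEntropyLoss(weight=weights)
        loss = loss_fct(logits.view(-1, self.model.config.num_labels), labels.view(-1))
        return (loss, outputs) if return_outputs else loss

Is this a reasonable approach, or is there a better supported way to do it?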

Thank you