Calculating loss twice but getting two different values

I am modifying the `training_step` function of `Trainer` and ran into the following problem:

```python
with self.compute_loss_context_manager():
    loss = self.compute_loss(model, inputs)
    loss2 = self.compute_loss(model, inputs)
```

My test suggests that `loss` and `loss2` are distinct from each other, although I did not update any parameters or change the inputs. Even with a different context manager, the losses are still different.


transformers==4.28.1
I also found that the decoder's forward function returns distinct outputs for the two loss computations.
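For what it's worth, the usual cause of this is a stochastic layer such as dropout: during `training_step` the model is in train mode, so each forward pass samples a fresh dropout mask and produces a different output even on identical inputs. A minimal sketch of this effect with a toy model (not the actual `Trainer` code, just plain PyTorch standing in for the decoder):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy model with a dropout layer, standing in for the transformer decoder.
model = nn.Sequential(nn.Linear(8, 8), nn.Dropout(p=0.5), nn.Linear(8, 1))
inputs = torch.randn(4, 8)

# In train mode, dropout samples a new mask on every forward pass,
# so two calls with identical inputs give different losses.
model.train()
loss1 = model(inputs).pow(2).mean()
loss2 = model(inputs).pow(2).mean()
# loss1 and loss2 almost surely differ

# In eval mode, dropout is disabled and the two passes match exactly.
model.eval()
loss3 = model(inputs).pow(2).mean()
loss4 = model(inputs).pow(2).mean()
# loss3 and loss4 are identical
```

If calling `model.eval()` before the two `compute_loss` calls makes the losses agree, that would confirm dropout (or another train-mode-only layer) as the source of the discrepancy.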
