Hi there,
I’ve overridden the compute_loss method to add a KL-divergence loss term for distillation. Now I want to log the cross-entropy and KL-divergence components alongside the total loss. Is there an easy way to do this without having to override about half of the Trainer class?
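
For context, my override looks roughly like this (a minimal sketch; teacher_model, alpha, and temperature are placeholder names from my own setup, not Trainer arguments):

```python
import torch
import torch.nn.functional as F
from transformers import Trainer

class DistillationTrainer(Trainer):
    """Trainer with an extra KL-divergence term for distillation."""

    def __init__(self, *args, teacher_model=None, alpha=0.5, temperature=2.0, **kwargs):
        super().__init__(*args, **kwargs)
        self.teacher_model = teacher_model
        self.alpha = alpha
        self.temperature = temperature

    def compute_loss(self, model, inputs, return_outputs=False):
        outputs = model(**inputs)
        ce_loss = outputs.loss  # cross-entropy computed by the model head

        # Teacher forward pass, no gradients needed
        with torch.no_grad():
            teacher_logits = self.teacher_model(**inputs).logits

        # Temperature-scaled KL divergence between student and teacher
        T = self.temperature
        kl_loss = F.kl_div(
            F.log_softmax(outputs.logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T ** 2)

        # ce_loss and kl_loss are the values I'd like to log
        # alongside the total loss
        loss = (1 - self.alpha) * ce_loss + self.alpha * kl_loss
        return (loss, outputs) if return_outputs else loss
```

The ce_loss and kl_loss values above are exactly what I’d like to see in the training logs next to the total loss.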
Thank you in advance!