WandB does not log train loss

Hi there!

I am using the Trainer to train Llama models. When I use Comet as the logging backend, everything works well, but when I tried WandB it does not log the train loss. It logs the global step, the epoch, and all the evaluation metrics (eval/loss and others), just not the train loss.

I set WANDB_WATCH to all, set report_to in the Trainer to wandb, and also disabled Comet to rule out interference. But it still won't log the train loss. Has anyone had a similar problem?
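For reference, here is a minimal sketch of the setup described above. The names (`WANDB_WATCH`, `report_to`, `logging_steps`) come from the transformers Trainer API; the arguments are shown as a plain dict rather than a real `TrainingArguments` instance so the sketch stands alone. Note that `logging_steps` is an assumption on my part: the train loss is only emitted every `logging_steps` steps, so its value may matter here.

```python
import os

# W&B should watch gradients and parameters (as described above).
os.environ["WANDB_WATCH"] = "all"

# Sketch of the arguments that would go into transformers.TrainingArguments.
training_args = {
    "report_to": ["wandb"],  # only the W&B callback; Comet disabled
    "logging_steps": 10,     # train loss is logged only every `logging_steps` steps
}

print(training_args["report_to"])
```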

Thanks!
Ver