Is the reported loss averaged over logging steps?

Hopefully a quick question!
If I set, say, logging_steps=500 in the TrainingArguments, I get a log line with the ‘loss’, ‘learning_rate’, and ‘epoch’ every 500 steps. Is the reported loss averaged over the 500 mini-batches, or is it the value computed on the 500th batch alone?
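For reference, here's a minimal sketch of the setup I mean (output_dir and num_train_epochs are just placeholder values):

```python
from transformers import TrainingArguments

# Sketch of the configuration in question; only logging_steps matters here.
args = TrainingArguments(
    output_dir="./results",  # placeholder
    num_train_epochs=3,      # placeholder
    logging_steps=500,       # log loss, learning_rate and epoch every 500 steps
)
```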

Thank you!

It is the average since the beginning of training 🙂
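To make that concrete, here's a toy illustration (not the actual Trainer code) of what "average since the beginning of training" means for the logged value:

```python
# The value logged at step N is the mean of the per-batch losses for
# steps 1..N, not just the mean of the last logging window.
per_batch_losses = [2.0] * 500 + [1.0] * 500  # hypothetical losses for 1000 steps

logged_at_500 = sum(per_batch_losses[:500]) / 500      # 2.0
logged_at_1000 = sum(per_batch_losses[:1000]) / 1000   # 1.5, not the window mean of 1.0

print(logged_at_500, logged_at_1000)
```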

@sgugger thanks for the response - that makes sense.

One final question: does this average reset at the start of a new epoch? When training for multiple epochs, I’m seeing the training loss jump towards lower values whenever a new epoch begins.

Thanks,
Owen