Mean_iou - how come it can be greater than 1?

I’m fine-tuning a SegFormer using the example from the blog post. It’s an image segmentation task with multiple classes, on images from a city-streets dataset. With the data I have, the mean_iou metric converges towards something like 0.17 after a few dozen epochs. But in the first epoch, the initial value for mean_iou is more like 3.5.

My understanding is that the Jaccard index satisfies 0 <= JI <= 1, and the metric's code seems to follow the definition. So why do I get values greater than 1 in the first few epochs?
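
For reference, here's a minimal NumPy sketch of the definition I have in mind (the function name and signature are my own, not the metric's actual implementation). Since the intersection is always a subset of the union, each per-class ratio should be bounded by 1, and so should their mean:

```python
import numpy as np

def per_class_iou(pred: np.ndarray, label: np.ndarray, num_classes: int) -> np.ndarray:
    """Per-class Jaccard index: |pred ∩ label| / |pred ∪ label| for each class."""
    ious = []
    for c in range(num_classes):
        intersection = np.logical_and(pred == c, label == c).sum()
        union = np.logical_or(pred == c, label == c).sum()
        # The intersection is a subset of the union, so each ratio is <= 1;
        # classes absent from both maps have an empty union and are left undefined.
        ious.append(intersection / union if union > 0 else np.nan)
    return np.array(ious)

# Mean IoU averages the per-class values, skipping undefined classes,
# so it should also stay within [0, 1]:
# mean_iou = np.nanmean(per_class_iou(pred, label, num_classes))
```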

Notebook showing the behavior:
