Debugging the compute_loss function for a custom dice loss in binary segmentation tasks

Hi there! I’m training a Segformer-based model for binary segmentation with a custom loss function, but the predicted masks don’t match the mIoU reported during training. The output masks are “blotchy” and bear no relation to the input image, nowhere near what the reported mIoU values would suggest. This Colab notebook contains both the inference code (image input and displayed masks) and the custom Trainer class I use to test custom loss functions. I’m not sure where the error stems from, and I’d appreciate any advice or information that could help me track it down.
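
For context, the custom Trainer roughly follows the pattern below. This is a simplified sketch, not the exact notebook code: the class name `DiceLossTrainer` and the smoothing constant are placeholders, and it assumes a two-channel `SegformerForSemanticSegmentation` head with labels in {0, 1}.

```python
import torch
import torch.nn.functional as F
from transformers import Trainer


class DiceLossTrainer(Trainer):
    """Trainer subclass that replaces the default loss with a soft dice loss.

    Simplified sketch of my setup: binary segmentation, so the model is
    configured with two output channels and labels take values in {0, 1}.
    """

    def compute_loss(self, model, inputs, return_outputs=False, **kwargs):
        labels = inputs.pop("labels")  # (B, H, W), values in {0, 1}
        outputs = model(**inputs)
        logits = outputs.logits        # (B, 2, H/4, W/4) for Segformer

        # Segformer emits logits at 1/4 of the label resolution,
        # so upsample to the label size before computing the loss.
        upsampled = F.interpolate(
            logits, size=labels.shape[-2:], mode="bilinear", align_corners=False
        )

        # Soft dice over the foreground channel.
        probs = torch.softmax(upsampled, dim=1)[:, 1]  # (B, H, W)
        targets = labels.float()
        intersection = (probs * targets).sum(dim=(1, 2))
        union = probs.sum(dim=(1, 2)) + targets.sum(dim=(1, 2))
        smooth = 1e-6  # placeholder smoothing term to avoid division by zero
        loss = 1.0 - ((2.0 * intersection + smooth) / (union + smooth)).mean()

        return (loss, outputs) if return_outputs else loss
```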