OneFormer ID/Labels for FineTuning

Hello Forums,

TLDR: I'm trying to fine-tune OneFormer on a custom dataset. Training solely off of the provided ground truth and masks works fine, but it classifies the segmented things wrong. Adding the id2label did not work: I got an assertion/index-out-of-bounds error (I could be implementing it wrong).

I was following a method to fine-tune along with classes, and this is how I tried to do it. I have two separate JSON files, an id2label and a label2id, both containing the classes and ids present in the new dataset. In my training script, I load them like this:


import json

with open("labels/id2label.json", "r") as f:
    id2label = json.load(f)
id2label = {int(k): v for k, v in id2label.items()}  # JSON keys are strings; cast back to int
label2id = {v: k for k, v in id2label.items()}
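One thing worth sanity-checking before touching the config (a minimal, standalone sketch with a hard-coded example mapping, since I don't know your actual labels): an out-of-bounds assertion during training often means the class ids are not the contiguous integers 0..num_labels-1 that the classification head and loss expect.

```python
# Example mapping; in practice this comes from id2label.json
id2label = {0: "background", 1: "road", 2: "car"}

def check_mapping(id2label):
    ids = sorted(id2label)
    # ids must be exactly 0..len-1, with no gaps or offsets
    assert ids == list(range(len(ids))), f"expected ids 0..{len(ids) - 1}, got {ids}"
    label2id = {v: k for k, v in id2label.items()}
    # inverting the dict silently drops entries if two ids share a label name
    assert len(label2id) == len(id2label), "duplicate label names detected"
    return label2id

label2id = check_mapping(id2label)
print(label2id)  # {'background': 0, 'road': 1, 'car': 2}
```

If the check fails, remapping the dataset's ids to a dense 0-based range (and updating the masks to match) is usually the fix.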

along with my config:

config = OneFormerConfig.from_pretrained("model link", id2label=id2label, label2id=label2id, is_training=True)
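If the error shows up when the pretrained weights are loaded (rather than mid-training), it may be the classification head: the checkpoint's head was trained for a different number of classes, so its shape no longer matches the new `num_labels`. A common workaround is to let `from_pretrained` reinitialize the mismatched weights. A sketch, not verified against your setup ("model link" is the same placeholder checkpoint as above):

```python
from transformers import OneFormerConfig, OneFormerForUniversalSegmentation

config = OneFormerConfig.from_pretrained(
    "model link", id2label=id2label, label2id=label2id, is_training=True
)
# ignore_mismatched_sizes=True reinitializes any weights (e.g. the class
# prediction head) whose shape no longer matches the new num_labels,
# instead of raising a size-mismatch error at load time
model = OneFormerForUniversalSegmentation.from_pretrained(
    "model link", config=config, ignore_mismatched_sizes=True
)
```

The reinitialized head then gets trained from scratch on the new classes while the backbone keeps its pretrained weights.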

Is there something I am doing wrong with OneFormer? I was aiming for something like the Mask2Former fine-tune (Source 1).

Sources:

  1. Fine Tuning Mask2Former on Custom Dataset
  2. Fine-Tune a Semantic Segmentation Model with a Custom Dataset
  3. Niels Rogge Finetune for both Oneformer and Mask2Former

Seems it’s caused by is_training=True with OneFormer…?