I know that I can simply set num_queries to recognize one class, but in principle I don’t need classification at the moment. How can I turn it off?
And does the absence of a classifier affect performance?
I guess you can’t change the model but this one detects only one class:
Yesterday I had a similar issue: I tried to train a DETR with one class, which resulted in a stopped kernel. Today I’ll try using a dummy class in the id2label/label2id dicts and pass the model a dataset with only the relevant class annotations for training. I’ll update you as soon as I’ve got news.
If you insert a code snippet like:

```python
if len(id2label) == 1:
    id2label |= {1: 'Dummy'}
label2id = {v: k for k, v in id2label.items()}
```

you have a dummy label, and therefore it works TECHNICALLY!
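For context, here is a minimal self-contained sketch of that padding trick. The class name `ShowerHead` is just a placeholder, and the `from_pretrained` call in the comment is only one example of where the dicts would go:

```python
# Hypothetical single-class mapping; 'ShowerHead' stands in for your real class.
id2label = {0: 'ShowerHead'}

# Pad to two classes so the classification head sees more than one label.
if len(id2label) == 1:
    id2label |= {1: 'Dummy'}
label2id = {v: k for k, v in id2label.items()}

print(id2label)   # {0: 'ShowerHead', 1: 'Dummy'}
print(label2id)   # {'ShowerHead': 0, 'Dummy': 1}

# These dicts can then be passed to a model, e.g.
# AutoModelForObjectDetection.from_pretrained(checkpoint,
#     id2label=id2label, label2id=label2id, ignore_mismatched_sizes=True)
```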
You will get -1 values if you use torchmetrics’ MeanAveragePrecision; that’s fine, a value of -1 just means that metric is disabled.
But beware, there are drawbacks!
A test with an overfitting dataset showed that training with a dummy class doesn’t converge well, while training with two real labels does.
See the images below (left: only shower heads; right: shower heads and toothbrushes).
UPDATE:
A second test showed better results, so you might try it out. I used “PekingU/rtdetr_r101vd”.