Freeze RT-DETR backbone when fine-tuning on a custom dataset

Hi,

I am loading RT-DETR v2 for fine-tuning with this code:

from transformers import AutoModelForObjectDetection

model = AutoModelForObjectDetection.from_pretrained(
    "PekingU/rtdetr_v2_r50vd",
    id2label=id2label,
    label2id=label2id,
    ignore_mismatched_sizes=True,  # the classification head is resized for the custom labels
)

As shown in this script.

Once the model is saved locally, I noticed in config.json that "use_pretrained_backbone": false. Does this mean the backbone is being adjusted during my training run?

If so, how do I freeze it and use a pretrained version before the training run?

Thanks


Since that option is already set to false in the model itself, I think the standard Transformers backbone weights from the checkpoint will be used without any problems. Note that the flag doesn't freeze anything by itself, though: all parameters, backbone included, stay trainable unless you freeze them explicitly.
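
A quick way to confirm, assuming the model object from the snippet above (the attribute mirrors the config key you saw):

# Sketch: inspect the loaded config and check whether the backbone
# parameters are trainable (assumes `model` from the snippet above).
print(model.config.use_pretrained_backbone)  # False for this checkpoint

backbone_params = [p for n, p in model.named_parameters() if "backbone" in n]
print(all(p.requires_grad for p in backbone_params))  # True: trainable by default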

Thank you! And to freeze it before training:

# Freeze every parameter that belongs to the backbone
for name, param in model.named_parameters():
    if "backbone" in name:
        param.requires_grad = False

That should do it.
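
And a quick sanity check that the freeze took effect (a sketch, assuming the same model object):

# Count trainable vs. total parameters after freezing the backbone
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"Trainable: {trainable:,} / {total:,} parameters")

If you train with the Trainer, its default optimizer only collects parameters with requires_grad=True, so the frozen backbone is skipped automatically.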

