Freeze the RT-DETR backbone when fine-tuning on a custom dataset

Since that option is set to `False` in the model's default config, I think the standard Transformers backbone will be used without any problems.
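For the freezing itself, a common PyTorch pattern is to disable gradients on every parameter under the backbone submodule. The sketch below is a minimal, hedged example: it demonstrates the loop on a tiny stand-in module so it runs without downloading weights, and the `model.backbone` prefix is an assumption you should verify against `model.named_parameters()` for the actual RT-DETR model.

```python
import torch
from torch import nn


def freeze_backbone(model: nn.Module) -> None:
    """Set requires_grad=False on all parameters under a `backbone` submodule.

    Assumption: backbone parameter names start with "backbone" or
    "model.backbone"; inspect model.named_parameters() to confirm the
    prefix for your checkpoint.
    """
    for name, param in model.named_parameters():
        if name.startswith(("backbone", "model.backbone")):
            param.requires_grad = False


# Stand-in module so this sketch runs offline; with the real model you
# would load the RT-DETR checkpoint and apply the same loop.
class TinyDetector(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Conv2d(3, 8, kernel_size=3)
        self.decoder = nn.Linear(8, 4)


model = TinyDetector()
freeze_backbone(model)
trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print(trainable)  # only decoder parameters remain trainable
```

Frozen parameters are skipped by the optimizer's gradient updates, so only the detection head is fine-tuned on the custom dataset.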