MaskFormer family of models and hyperparams

Hello, I was wondering whether people typically tune some of the hyperparameters (those that do not affect the model's parameter count) when using pretrained MaskFormer models or models from its family (e.g. Mask2Former)? For example, keeping the same backbone_config but adjusting some decoder_config settings, or the loss weights such as dice_weight, cross_entropy_weight, and mask_weight?
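
To make the question concrete, here is a minimal sketch of the kind of change I mean, assuming the Hugging Face `transformers` API; the checkpoint name and weight values are just placeholders, not a recommendation. Since the loss weights only affect training, not the architecture, the pretrained weights should still load cleanly:

```python
from transformers import MaskFormerConfig, MaskFormerForInstanceSegmentation

# Placeholder checkpoint; any pretrained MaskFormer checkpoint would do.
checkpoint = "facebook/maskformer-swin-base-ade"

# Load the pretrained config, keep backbone_config untouched,
# and override only the loss-weighting hyperparameters.
config = MaskFormerConfig.from_pretrained(checkpoint)
config.dice_weight = 2.0           # placeholder value
config.cross_entropy_weight = 1.0  # placeholder value
config.mask_weight = 10.0          # placeholder value

# These weights do not change the number of parameters, so the
# pretrained weights load without any size mismatches.
model = MaskFormerForInstanceSegmentation.from_pretrained(checkpoint, config=config)
```

The same pattern would apply to decoder_config settings that don't add or remove parameters, though anything that changes layer counts or hidden sizes would obviously invalidate the pretrained weights.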