I would like to apply a function f to the parameters that pertain to the 24th layer (the uppermost layer) of the pre-trained RobertaForMultipleChoice model (roberta-large). How should I fix the loop below so that it only touches the parameters of that layer? Currently, the loop applies f to every parameter in the Transformer.
    for m in model_RobertaForMultipleChoice.modules():
        for name, value in list(m.named_parameters(recurse=False)):
            setattr(m, name, f(value))  # apply f to the parameter (not setattr(m, name, f))
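One approach I am considering (not yet verified against the real checkpoint): since roberta-large numbers its encoder layers 0 through 23, the 24th/uppermost layer should be the submodule whose qualified name contains `encoder.layer.23`, so iterating with `named_modules()` and filtering on that name should restrict the loop. Below is a minimal sketch of the idea using a small stand-in module (24 stacked `nn.Linear` layers instead of the actual RoBERTa encoder), with `f` assumed here to freeze a parameter:

```python
import torch.nn as nn

# Hypothetical stand-in for roberta-large's encoder: 24 layers indexed
# 0..23, so the 24th (uppermost) layer is named "layer.23".
class TinyEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer = nn.ModuleList(nn.Linear(4, 4) for _ in range(24))

model = TinyEncoder()

def f(p):
    # Example transform (an assumption): replace the parameter with a
    # frozen copy. Substitute whatever f actually does here.
    return nn.Parameter(p.detach().clone(), requires_grad=False)

for qual_name, m in model.named_modules():
    # Keep only the last layer and its submodules; for the real model the
    # filter would be something like "encoder.layer.23" in qual_name.
    if "layer.23" not in qual_name:
        continue
    for name, value in list(m.named_parameters(recurse=False)):
        setattr(m, name, f(value))
```

With this filter, only the parameters under `layer.23` are replaced; every other layer is left untouched.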