Setting Different Precisions for Different Modules During Mixed Precision Training

When using Hugging Face’s Trainer to train a model, how can I set different precisions for different modules? I passed --bf16 and explicitly set some modules’ dtype to torch.float32, but during training those modules automatically revert to bf16.
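
For reference, here is a minimal sketch of the setup the question describes. The model name, the chosen submodule (lm_head), and the toy dataset are placeholders used only to reproduce the combination of bf16=True with one module explicitly cast to torch.float32; it is not a proposed solution.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, Trainer, TrainingArguments

model_name = "gpt2"  # hypothetical stand-in for the actual model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)  # loaded in float32

# Explicitly cast one submodule to float32 (lm_head is chosen only for illustration).
model.lm_head.to(torch.float32)

# Tiny in-memory dataset so the script runs end to end.
enc = tokenizer(["hello world"] * 8)
train_dataset = [
    {"input_ids": ids, "attention_mask": mask, "labels": ids}
    for ids, mask in zip(enc["input_ids"], enc["attention_mask"])
]

training_args = TrainingArguments(
    output_dir="out",
    bf16=True,                      # same effect as passing --bf16 on the CLI; needs bf16-capable hardware
    per_device_train_batch_size=2,
    max_steps=1,
    report_to=[],
)

trainer = Trainer(model=model, args=training_args, train_dataset=train_dataset)
trainer.train()

# Inspect the submodule's parameter dtype after a training step to see whether it kept float32.
print(next(model.lm_head.parameters()).dtype)
```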