How to load pre-trained model parameters only in specific layers?

Hey, I want to know how to load pre-trained model parameters into specific layers only. For example, I use the EncoderDecoderModel class (a bert-base-uncased2bert-base-uncased model), and I only want to load the parameters of specific layers of the pretrained model, say layer 2 or layer 10. How can I do this?

This is what I usually did:

from transformers import BartForConditionalGeneration

original_model = BartForConditionalGeneration.from_pretrained('facebook/bart-base')
my_model = BartForConditionalGeneration.from_pretrained('facebook/bart-base', encoder_layers=2)
## for example, copy the original model's last encoder layer (layers[5] of 6)
## into my model's last encoder layer (layers[1] of 2)
my_model.model.encoder.layers[1].load_state_dict(original_model.model.encoder.layers[5].state_dict())

So I found this answer, but it’s not very helpful: it requires loading the entire pretrained model into memory, which is exactly what I want to avoid.
Does anyone know how to load only a specific submodule?
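One possible approach (a sketch, not the official API): skip `from_pretrained` for the donor model entirely and work with the checkpoint's state dict directly, since a state dict is just a mapping of tensors with dotted key names like `model.encoder.layers.5.self_attn.k_proj.weight`. You can load it to CPU, keep only the keys under the submodule prefix you need, and copy those into your target layer. The example below simulates this with a small stand-in model; the file name, the `prefix` value, and the stand-in modules are all illustrative (for a Hub model you would first fetch the checkpoint file, e.g. with `huggingface_hub.hf_hub_download`, and use a prefix like `model.encoder.layers.5.`).

```python
# Sketch: load only one submodule's weights from a checkpoint on disk,
# without instantiating the full pretrained model object.
# All names here are illustrative stand-ins, not the real BART checkpoint.
import torch
import torch.nn as nn

# Stand-in for a large pretrained model whose checkpoint lives on disk.
big = nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 4))
torch.save(big.state_dict(), "checkpoint.pt")

# Load the raw state dict to CPU (tensors only, no model is built).
full_sd = torch.load("checkpoint.pt", map_location="cpu")

# Keep only the keys belonging to the submodule we want, stripping the
# prefix so the names match the target layer's own parameter names.
prefix = "1."  # e.g. "model.encoder.layers.5." for BART's last encoder layer
sub_sd = {k[len(prefix):]: v for k, v in full_sd.items() if k.startswith(prefix)}

# Copy into the matching layer of a (possibly smaller) target model.
target = nn.Linear(4, 4)
target.load_state_dict(sub_sd)
```

Note that `torch.load` still reads the whole checkpoint file into CPU RAM (though not GPU memory, and no module objects are constructed). If the checkpoint is in safetensors format, you can avoid even that: `safetensors.safe_open` memory-maps the file and lets you read individual tensors by key, so only the submodule's tensors are ever materialized.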