Hi
How can I load pre-trained BART model weights into a custom model layer?
I have a custom decoder layer with additional nn.Module components, but the pretrained BART checkpoint (e.g. `facebook/bart-base`) stores its weights under prefixes like `decoder.layers.`, so the state-dict keys can't match my custom layer:
```python
from typing import Optional

import torch.nn as nn
from transformers import BartConfig
from transformers.models.bart.modeling_bart import BartPretrainedModel


class BartDecoder(BartPretrainedModel):
    def __init__(self, config: BartConfig, embed_tokens: Optional[nn.Embedding] = None):
        super().__init__(config)
        # assume everything here is the same as the original BartDecoder
        # ...


class BartDecoderLayer(nn.Module):
    def __init__(self, config: BartConfig):
        super().__init__()
        # assume the original BartDecoderLayer modules are here
        # ...
        self.some_new_modules1 = something_new  # my extra modules
        self.some_new_modules2 = something_new
```
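For context, this is how I inspected the checkpoint keys (a small sketch that just loads `facebook/bart-base` as a plain `BartModel`):

```python
from transformers import BartModel

# Load the pretrained checkpoint and look at the decoder key prefixes
pretrained = BartModel.from_pretrained("facebook/bart-base")
state_dict = pretrained.state_dict()
print([k for k in state_dict if k.startswith("decoder.layers.0.")][:3])
# -> ['decoder.layers.0.self_attn.k_proj.weight', ...]
```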
How can I load the pretrained `facebook/bart-base` weights into this decoder while leaving `self.some_new_modules1` and `self.some_new_modules2` randomly initialized? The `facebook/bart-base` state dict has prefixed keys like `encoder.layers.` and `decoder.layers.`.
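Here is the direction I have been trying (a rough sketch, I'm not sure it's correct): filter the state dict down to the `decoder.`-prefixed keys, strip the prefix, and load with `strict=False` so the new modules keep their random init:

```python
from transformers import BartModel

# Sketch of my current attempt: keep only decoder.* keys, strip the prefix,
# and load non-strictly so some_new_modules1/2 keep their random init.
pretrained = BartModel.from_pretrained("facebook/bart-base")
state_dict = pretrained.state_dict()

custom_decoder = BartDecoder(pretrained.config)  # my class from above

decoder_state = {
    k[len("decoder."):]: v
    for k, v in state_dict.items()
    if k.startswith("decoder.")
}
result = custom_decoder.load_state_dict(decoder_state, strict=False)
print("missing (left random):", result.missing_keys)    # hopefully just the new modules
print("unexpected (skipped):", result.unexpected_keys)
```

I'm not sure whether this is the intended way, or whether there is a cleaner built-in mechanism for remapping the prefixes.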
Thanks