Using a BART model's encoder and decoder separately

I want to use the encoder and decoder of a BART model separately. Is that possible?

Here is what I have done, but the decoder fails with: `Module [ModuleList] is missing the required "forward" function`.

import torch
from transformers import BartForConditionalGeneration, BartTokenizer

class Encoder(torch.nn.Module):
  def __init__(self):
    super(Encoder, self).__init__()
    model = BartForConditionalGeneration.from_pretrained('facebook/bart-large-cnn')
    self.enc = torch.nn.Sequential(model.get_encoder().base_model)

  def forward(self, input_ids):
    # the call site passes the tensor directly, so take input_ids, not a dict
    embedding_code = self.enc(input_ids)
    return embedding_code



class Decoder(torch.nn.Module):
  def __init__(self):
    super(Decoder, self).__init__()
    model = BartForConditionalGeneration.from_pretrained('facebook/bart-large-cnn')
    self.dec = torch.nn.Sequential(model.get_decoder().layers)
  
  def forward(self, x):
    x = self.dec(x)
    return x
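The error itself can be reproduced without BART at all: `nn.Sequential` registers the `ModuleList` as a single child module, and `ModuleList` defines no `forward()`, so calling it raises `NotImplementedError`. A minimal repro:

```python
import torch

# Minimal repro: a ModuleList is only a container and has no forward(),
# so Sequential cannot invoke it as a layer.
layers = torch.nn.ModuleList([torch.nn.Linear(4, 4), torch.nn.Linear(4, 4)])
seq = torch.nn.Sequential(layers)
try:
    seq(torch.randn(1, 4))
except NotImplementedError as err:
    print(type(err).__name__)  # NotImplementedError
```

Unpacking the list instead, `torch.nn.Sequential(*layers)`, would at least give `Sequential` callable layers, but BART's decoder layers also expect extra arguments (attention masks, encoder hidden states), so a plain `Sequential` still would not work here.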



tokenizer = BartTokenizer.from_pretrained('facebook/bart-large-cnn')
inputs = tokenizer([ARTICLE_TO_SUMMARIZE], return_tensors='pt')

code = Encoder()(inputs['input_ids'])
Decoder()(code.last_hidden_state)
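For context, calling the two submodules directly (no `nn.Sequential` wrapper) seems to work, since `BartEncoder` and `BartDecoder` each have their own `forward()`. A sketch of what I am aiming for, using a tiny randomly initialized BART built from a `BartConfig` so it runs without downloading a checkpoint; with the real model you would keep `from_pretrained('facebook/bart-large-cnn')`:

```python
import torch
from transformers import BartConfig, BartForConditionalGeneration

# Tiny config for illustration only; real BART would come from from_pretrained.
config = BartConfig(
    vocab_size=128, d_model=16,
    encoder_layers=2, decoder_layers=2,
    encoder_attention_heads=2, decoder_attention_heads=2,
    encoder_ffn_dim=32, decoder_ffn_dim=32,
    max_position_embeddings=64,
)
model = BartForConditionalGeneration(config)

encoder = model.get_encoder()  # BartEncoder: defines its own forward()
decoder = model.get_decoder()  # BartDecoder: defines its own forward()

input_ids = torch.tensor([[0, 5, 6, 7, 2]])       # dummy source token ids
decoder_input_ids = torch.tensor([[2, 0, 5, 6]])  # dummy shifted target ids

enc_out = encoder(input_ids=input_ids)  # BaseModelOutput
dec_out = decoder(
    input_ids=decoder_input_ids,
    encoder_hidden_states=enc_out.last_hidden_state,  # fed to cross-attention
)

print(enc_out.last_hidden_state.shape)  # torch.Size([1, 5, 16])
print(dec_out.last_hidden_state.shape)  # torch.Size([1, 4, 16])
```

The decoder needs the encoder's `last_hidden_state` passed as `encoder_hidden_states` for cross-attention, which is also why chaining the two through `nn.Sequential` (single-input, single-output) does not fit this architecture.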

Any thoughts on this implementation?

@sgugger @patrickvonplaten any suggestions?