BART from finetuned BERT

Hi all!

Is it possible to use a pretrained BERT model to initialize the encoder part of an encoder-decoder model like BART, leaving the decoder randomly initialized, and then fine-tune the whole thing on some seq2seq task?

If it is possible, how should I proceed? Does anyone know of previous instances where something like this has been tried?
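
To make the question concrete, here is a rough sketch of the kind of setup I have in mind, using the `EncoderDecoderModel` class from `transformers` with a BERT-style decoder (I'm not sure this is the right or idiomatic way to do it):

```python
from transformers import BertConfig, BertLMHeadModel, BertModel, EncoderDecoderModel

# Encoder: pretrained BERT weights
encoder = BertModel.from_pretrained("bert-base-uncased")

# Decoder: same architecture, but randomly initialized,
# configured as a decoder with cross-attention to the encoder
decoder_config = BertConfig.from_pretrained("bert-base-uncased")
decoder_config.is_decoder = True
decoder_config.add_cross_attention = True
decoder = BertLMHeadModel(decoder_config)  # no from_pretrained -> random weights

# Tie them together into a seq2seq model
model = EncoderDecoderModel(encoder=encoder, decoder=decoder)

# (decoder_start_token_id / pad_token_id would still need to be set
# before fine-tuning on a seq2seq task, e.g. summarization)
```

Does something along these lines make sense, or is there a better-established recipe?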

Thanks in advance!
Best,
Gabriel.