T5/BART decoder prefix

I know that for models like GPT-2 it is possible to supply a prefix, or prompt, and let the model continue from that sequence of text. I wonder whether we can do the same with encoder-decoder models, e.g., T5, BART…? (Note that I mean a decoder prefix, not an encoder prefix like in “Translate English to French: I’m hungry”.)
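To make the question concrete, here is a sketch of what I have in mind. It assumes `generate` accepts `decoder_input_ids` for encoder-decoder models and continues decoding from them (which is essentially what I am asking); `t5-small` and the example strings are just placeholders:

```python
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

# Encoder input, as usual.
enc = tokenizer(
    "summarize: The quick brown fox jumps over the lazy dog.",
    return_tensors="pt",
)

# The decoder prefix I would like the model to continue from.
# Drop the trailing </s> that the tokenizer appends.
prefix_ids = tokenizer("A fox", return_tensors="pt").input_ids[:, :-1]

# T5's decoder expects decoder_start_token_id (the pad token) first.
start = torch.tensor([[model.config.decoder_start_token_id]])
decoder_input_ids = torch.cat([start, prefix_ids], dim=-1)

out = model.generate(
    **enc,
    decoder_input_ids=decoder_input_ids,
    max_length=20,
)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

If this works the way I hope, the generated sequence should begin with the supplied prefix and the model should continue it, conditioned on the encoder input.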