Ensemble Prompting - Seq2Seq

I’ve trained two BERT2BERT models to generate text based on two different prompt types. I’m wondering if it’s possible to decode generations using both models at the same time, i.e. ensemble their predictions at each decoding step?
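To make the question concrete, here is a rough sketch of the kind of ensembled greedy decoding I have in mind, written against the `transformers` `EncoderDecoderModel` API. The checkpoint paths and prompts are placeholders, and averaging per-step log-probabilities is just one possible way to combine the two models (I'm not sure this is the recommended approach):

```python
import torch
from transformers import AutoTokenizer, EncoderDecoderModel

# Placeholder checkpoint names -- both models share the same BERT tokenizer.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model_a = EncoderDecoderModel.from_pretrained("path/to/bert2bert-prompt-a").eval()
model_b = EncoderDecoderModel.from_pretrained("path/to/bert2bert-prompt-b").eval()


@torch.no_grad()
def ensemble_greedy_decode(prompt_a: str, prompt_b: str, max_new_tokens: int = 64) -> str:
    # Each model sees its own prompt formulation of the same input.
    enc_a = tokenizer(prompt_a, return_tensors="pt")
    enc_b = tokenizer(prompt_b, return_tensors="pt")

    # BERT2BERT typically starts decoding from [CLS] and stops at [SEP].
    bos_id = model_a.config.decoder_start_token_id or tokenizer.cls_token_id
    eos_id = tokenizer.sep_token_id
    decoder_ids = torch.tensor([[bos_id]])

    for _ in range(max_new_tokens):
        # Last-position logits from each model (encoder is recomputed every step
        # here for simplicity; caching encoder_outputs would be faster).
        logits_a = model_a(
            input_ids=enc_a.input_ids,
            attention_mask=enc_a.attention_mask,
            decoder_input_ids=decoder_ids,
        ).logits[:, -1, :]
        logits_b = model_b(
            input_ids=enc_b.input_ids,
            attention_mask=enc_b.attention_mask,
            decoder_input_ids=decoder_ids,
        ).logits[:, -1, :]

        # Ensemble by averaging per-step log-probabilities and picking greedily.
        log_probs = (logits_a.log_softmax(-1) + logits_b.log_softmax(-1)) / 2
        next_id = log_probs.argmax(-1, keepdim=True)

        decoder_ids = torch.cat([decoder_ids, next_id], dim=-1)
        if next_id.item() == eos_id:
            break

    return tokenizer.decode(decoder_ids[0], skip_special_tokens=True)


print(ensemble_greedy_decode("prompt type A: ...", "prompt type B: ..."))
```

Is there built-in support for something like this (e.g. via `generate()`), or is a custom decoding loop like the above the way to go?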