I’ve trained two BERT2BERT models that generate text from two different prompt types. Is it possible to decode generations using both models at the same time, i.e., combine their predictions during generation?
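One common reading of "decoding with both models at the same time" is ensembling: at each decoding step, average the next-token logits of the two models and pick the next token from the combined distribution. `generate()` doesn't ensemble two models out of the box, but a manual greedy loop is straightforward. Below is a minimal sketch, assuming both checkpoints are Hugging Face `EncoderDecoderModel` (BERT2BERT) models sharing one BERT tokenizer; the checkpoint paths, prompt, length limit, and the choice of plain logit averaging are all illustrative assumptions, not a prescribed method.

```python
# Hypothetical sketch: ensemble greedy decoding over two BERT2BERT
# checkpoints by averaging next-token logits at each step.
# Checkpoint paths below are placeholders.
import torch
from transformers import BertTokenizer, EncoderDecoderModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model_a = EncoderDecoderModel.from_pretrained("path/to/bert2bert-prompt-type-a")
model_b = EncoderDecoderModel.from_pretrained("path/to/bert2bert-prompt-type-b")
model_a.eval()
model_b.eval()

# Encode the prompt once; both models read the same encoder input here
# (adapt if each model expects its own prompt format).
enc = tokenizer("some input text", return_tensors="pt")

# BERT2BERT decoders conventionally start from [CLS] and stop at [SEP];
# adjust if your models were trained with different special tokens.
decoder_ids = torch.tensor([[tokenizer.cls_token_id]])

with torch.no_grad():
    for _ in range(50):  # max new tokens (arbitrary cap)
        logits_a = model_a(**enc, decoder_input_ids=decoder_ids).logits[:, -1, :]
        logits_b = model_b(**enc, decoder_input_ids=decoder_ids).logits[:, -1, :]
        # Average in logit space; averaging log-softmax outputs is a
        # common alternative that weights the models as distributions.
        next_id = (logits_a + logits_b).div(2).argmax(dim=-1, keepdim=True)
        decoder_ids = torch.cat([decoder_ids, next_id], dim=-1)
        if next_id.item() == tokenizer.sep_token_id:
            break

print(tokenizer.decode(decoder_ids[0], skip_special_tokens=True))
```

This only works cleanly if the two decoders share a vocabulary, since the logits must be element-wise comparable; with different tokenizers you'd instead have to generate separately and combine at the text level (e.g., rerank candidates from each model).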