I’ve trained two BERT2BERT models to generate text from two different prompt types. Is it possible to decode generations with both models at the same time, i.e., combine their predictions during generation?
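To make the question concrete, here is a toy sketch of one way "decoding with both models at the same time" could work: an ensemble greedy-decode loop that averages the next-token scores from two models at every step. The two "models" below are stand-in scoring functions, not real BERT2BERT checkpoints, and all names are illustrative rather than any Transformers API; with real models you would call each model's forward pass (or write a custom `LogitsProcessor`) to get the per-step scores instead.

```python
# Toy illustration (not a Transformers API): ensemble greedy decoding that
# averages the next-token scores of two models at each step. Stand-in
# scorer functions replace real model forward passes.

def ensemble_greedy_decode(models, bos_id, eos_id, vocab_size, max_len=10):
    tokens = [bos_id]
    for _ in range(max_len):
        # Average the score each model assigns to every candidate token.
        avg = [0.0] * vocab_size
        for score_fn in models:
            scores = score_fn(tokens)
            for i, s in enumerate(scores):
                avg[i] += s / len(models)
        # Greedily pick the token with the highest averaged score.
        next_id = max(range(vocab_size), key=lambda i: avg[i])
        tokens.append(next_id)
        if next_id == eos_id:
            break
    return tokens

# Two toy scorers that disagree early on; averaging blends their preferences.
def model_a(prefix):
    return [0.0, 0.9, 0.1, 0.0] if len(prefix) < 2 else [0.0, 0.0, 0.0, 1.0]

def model_b(prefix):
    return [0.0, 0.6, 0.4, 0.0] if len(prefix) < 2 else [0.0, 0.0, 0.0, 1.0]

print(ensemble_greedy_decode([model_a, model_b],
                             bos_id=0, eos_id=3, vocab_size=4))  # → [0, 1, 3]
```

Note that `model.generate()` in Transformers runs one model at a time, so a joint scheme like this would require a custom decoding loop (or a custom logits processor that queries the second model).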