Hi,
I’m trying to train a model to perform text generation conditioned on tables.
Since TAPAS can encode the semi-structured content of tables, I figured it would be a good choice as the encoder, with, say, GPT2 (or any other causal LM) as the decoder.
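Here is roughly how I set things up (a minimal sketch; the checkpoints and the toy table are just placeholders):

```python
import pandas as pd
from transformers import EncoderDecoderModel, TapasTokenizer

# Tie a TAPAS encoder to a GPT2 decoder (checkpoint names are just examples)
model = EncoderDecoderModel.from_encoder_decoder_pretrained("google/tapas-base", "gpt2")

encoder_tokenizer = TapasTokenizer.from_pretrained("google/tapas-base")

# Toy table just for illustration; TAPAS expects all cells as strings
table = pd.DataFrame({"City": ["Paris", "London"], "Population": ["2.1M", "8.9M"]})
inputs = encoder_tokenizer(table=table, queries=["Describe this table"], return_tensors="pt")

# Unlike BERT-style models, TAPAS token_type_ids are 3D: (batch_size, seq_len, 7)
print(inputs["token_type_ids"].shape)
```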
However, I ran into a problem when trying to generate from that EncoderDecoder model:
I guess this happens because model.generate() for EncoderDecoder does not expect the extra dimension of the token_type_ids that TAPAS uses (shape (batch_size, seq_len, 7) rather than (batch_size, seq_len)).
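One workaround I was considering (completely untested) is to run the TAPAS encoder myself and hand its output to generate() via encoder_outputs, so that the 3D token_type_ids never go through generate() at all. Continuing from the setup above:

```python
# Untested idea: encode the table once, outside of generate()
encoder_outputs = model.encoder(
    input_ids=inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    token_type_ids=inputs["token_type_ids"],
    return_dict=True,
)

generated_ids = model.generate(
    encoder_outputs=encoder_outputs,
    attention_mask=inputs["attention_mask"],
    # GPT2 has no dedicated decoder start token, so I pass its BOS token here;
    # not sure this is the right choice
    decoder_start_token_id=model.config.decoder.bos_token_id,
    max_length=64,
)
```

I haven’t been able to verify whether generate() handles precomputed encoder_outputs cleanly in this setup, or whether the decoder start token choice is sensible.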
Can anyone think of a way I can make this work?
Thanks!