I want to work on question generation using T5, where the model will receive an answer and a context as inputs.
I will use the TensorFlow variant of T5ForConditionalGeneration, so I begin with:
from transformers import TFT5ForConditionalGeneration, T5Tokenizer

model = TFT5ForConditionalGeneration.from_pretrained("google/t5-v1_1-small")
tokenizer = T5Tokenizer.from_pretrained("google/t5-v1_1-small")
I am not sure how to structure my inputs, since I need to pass both the answer and the context.
Can I pass the following string to the tokenizer:
qg answer: my answer context: my context blah blah blah
or do I need to use a special token to separate the two input elements:
qg answer: my answer </s> context: my context blah blah blah
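To make it concrete, here is a minimal sketch of how I would compare the two variants, just printing the tokens the tokenizer produces (the strings are placeholders and the inspection code is only illustrative, not my actual training pipeline):

from transformers import T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("google/t5-v1_1-small")

# Option 1: plain text prefixes only
option_1 = "qg answer: my answer context: my context blah blah blah"
# Option 2: an explicit </s> between the answer and the context
option_2 = "qg answer: my answer </s> context: my context blah blah blah"

# Inspect the tokens each variant actually produces
for text in (option_1, option_2):
    ids = tokenizer(text).input_ids
    print(tokenizer.convert_ids_to_tokens(ids))

Is one of these formats the recommended way to combine two input fields for T5, or does it not matter as long as I am consistent between training and inference?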