How can I put multiple questions in the same context at once with question answering? (I'm using RoBERTa)

How can I ask multiple questions about the same context at once? The model only processes one question, but I need to ask several questions about a single context.
I tried doing it this way, but it did not work:
from transformers import AutoTokenizer, AutoModelWithLMHead, pipeline

model_name = "MaRiOrOsSi/t5-base-finetuned-question-answering"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelWithLMHead.from_pretrained(model_name)
questions = ["Question1", "Question2"]
context = "context"
input = f"questions: {questions} context: {context}"
encoded_input = tokenizer([input],
                          return_tensors="pt",
                          max_length=4000,
                          truncation=True)
output = model.generate(input_ids=encoded_input.input_ids,
                        attention_mask=encoded_input.attention_mask)
output = tokenizer.decode(output[0], skip_special_tokens=True)
print(output)


Hi, did you solve it? If not, please comment below.
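In case it helps: the model generates one answer per input string, so passing a Python list inside a single f-string will not give you one answer per question. One approach is to build a separate "question: ... context: ..." input for each question and batch them. Below is a minimal sketch; the exact "question:"/"context:" prefix format is an assumption about this checkpoint, so please verify it against the model card, and the generation settings are left at their defaults.

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "MaRiOrOsSi/t5-base-finetuned-question-answering"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# AutoModelForSeq2SeqLM is the non-deprecated class for T5-style models
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

context = "context"
questions = ["Question1", "Question2"]

# One input string per question, all sharing the same context.
# The "question: ... context: ..." layout is an assumption; check the model card.
inputs = [f"question: {q} context: {context}" for q in questions]

encoded = tokenizer(inputs,
                    return_tensors="pt",
                    padding=True,
                    truncation=True,
                    max_length=512)

outputs = model.generate(input_ids=encoded.input_ids,
                         attention_mask=encoded.attention_mask)

# Decode the whole batch: one answer per question.
answers = tokenizer.batch_decode(outputs, skip_special_tokens=True)
for question, answer in zip(questions, answers):
    print(question, "->", answer)

If batching is not important for you, a plain Python loop that calls generate once per question gives the same result and is easier to debug.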