How can I put multiple questions in the same context at once using the question-answering technique (I'm using BERT)?

Is that possible? If so, how can I do that?

Yes, that's possible, like so:

from transformers import BertTokenizer, BertForQuestionAnswering
import torch

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
# Note: 'bert-base-uncased' is not fine-tuned for QA, so the predicted spans
# will be essentially random; for meaningful answers, use a QA checkpoint such
# as 'bert-large-uncased-whole-word-masking-finetuned-squad'.
model = BertForQuestionAnswering.from_pretrained('bert-base-uncased')

context = "Jim Henson was a nice puppet"
questions = ["Who was Jim Henson?", "What is Jim's last name?"]

# Pair each question with the (repeated) context; the tokenizer builds one
# [CLS] question [SEP] context [SEP] sequence per pair and pads them to equal length.
inputs = tokenizer(questions, [context] * len(questions), padding=True, return_tensors='pt')

with torch.no_grad():  # inference only, no gradients needed
    outputs = model(**inputs)
start_scores = outputs.start_logits  # shape: (num_questions, sequence_length)
end_scores = outputs.end_logits

We just build several [CLS] question [SEP] context [SEP] [PAD] [PAD] ... examples and forward them through the model as a single batch.
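If you then want to turn the logits back into answer strings, a minimal sketch is to take the argmax of the start and end logits for each example and decode the corresponding token span (again assuming a QA fine-tuned checkpoint, otherwise the spans won't be meaningful):

start_idx = torch.argmax(start_scores, dim=1)  # best start position per example
end_idx = torch.argmax(end_scores, dim=1)      # best end position per example

for i, question in enumerate(questions):
    answer_ids = inputs["input_ids"][i][start_idx[i] : end_idx[i] + 1]
    print(question, "->", tokenizer.decode(answer_ids, skip_special_tokens=True))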


Thank you so much!