I’m doing research on NLI, which is a two-sentence classification task. I have already used BERT and its BertForSequenceClassification
class successfully: I feed in the two sentences as a single input string of the form [CLS] sent1 [SEP] sent2 [SEP]
and then perform classification.
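For reference, here is roughly what my BERT setup looks like. This is only a minimal sketch of what I described above; the model name ("bert-base-uncased"), the example sentence pair, and the three-way label count are placeholders for illustration:

```python
import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Placeholder checkpoint and label count (NLI is typically 3-way: entailment / neutral / contradiction)
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=3)

premise = "A man is playing a guitar."      # example sentence 1
hypothesis = "A man is performing music."   # example sentence 2

# Passing both sentences to the tokenizer builds the
# [CLS] sent1 [SEP] sent2 [SEP] string and sets token_type_ids for me.
inputs = tokenizer(premise, hypothesis, return_tensors="pt")

with torch.no_grad():
    # On recent transformers versions the output object exposes .logits
    logits = model(**inputs).logits

predicted_class = logits.argmax(dim=-1).item()
print(predicted_class)
```

So for BERT, the sentence-pair handling is essentially done by the tokenizer when it is given two strings.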
I’d like to do the same with the other available models, such as GPT2, XLNet, and RoBERTa. However, I can’t seem to find any example code that takes two sentences as input for those models. Do they use the same type of input string as BERT, or does each one have its own format? Can someone point me to the relevant documentation or examples?
Thank you for any help.