BART for sequence classification

facebook/bart-large-cnn is pre-trained on a summarization task. Is it possible to fine-tune it on a classification task?
Syntactically it doesn't cause any issue, but I'm wondering how it fares in terms of results.
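For context, here is roughly what I mean by "syntactically it works": the summarization checkpoint loads into `BartForSequenceClassification` with a freshly initialized classification head (the `num_labels` value and the input text below are just placeholders, not from any real task):

```python
import torch
from transformers import BartTokenizer, BartForSequenceClassification

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large-cnn")
# Loads the encoder/decoder weights from the summarization checkpoint;
# the classification head is newly initialized (transformers warns about this).
model = BartForSequenceClassification.from_pretrained(
    "facebook/bart-large-cnn",
    num_labels=4,  # placeholder number of classes
)

inputs = tokenizer("An example sentence to classify.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.shape)  # torch.Size([1, 4]) -- one score per class
```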

In Google's QUEST challenge, which is a multi-label classification competition, the 1st-place winner used a fine-tuned pretrained BART as one of their core models, so that might partially answer your question.
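For a multi-label setup like QUEST, a minimal sketch would set `problem_type="multi_label_classification"`, which makes `BartForSequenceClassification` use a BCE-with-logits loss over multi-hot float targets. The label count and data below are placeholders, not taken from the winning solution:

```python
import torch
from transformers import BartTokenizer, BartForSequenceClassification

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large-cnn")
model = BartForSequenceClassification.from_pretrained(
    "facebook/bart-large-cnn",
    num_labels=30,                              # placeholder; QUEST has 30 target columns
    problem_type="multi_label_classification",  # switches the loss to BCEWithLogitsLoss
)

inputs = tokenizer("Example question text to label.", return_tensors="pt")
labels = torch.zeros(1, 30)  # multi-hot float targets, placeholder values
outputs = model(**inputs, labels=labels)
print(outputs.loss)  # loss you would backpropagate during fine-tuning
```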
