BART finetuning for summarization without seq2seq trainer

Hello, I am looking for example code for BART fine-tuning with Hugging Face Transformers (not fairseq), without using the Seq2SeqTrainer.
Any guidance here?

Maybe you can try this one:
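
Below is a minimal sketch of a plain PyTorch training loop for BART summarization fine-tuning, i.e. without Seq2SeqTrainer. The checkpoint (`facebook/bart-base`), the CNN/DailyMail dataset, the sequence lengths, and the hyperparameters are all placeholder assumptions you would adapt to your own data, not a definitive recipe.

```python
# Minimal sketch: fine-tune BART for summarization with a manual PyTorch loop.
# Model name, dataset, and hyperparameters below are illustrative assumptions.
import torch
from torch.utils.data import DataLoader
from torch.optim import AdamW
from datasets import load_dataset
from transformers import BartTokenizer, BartForConditionalGeneration

device = "cuda" if torch.cuda.is_available() else "cpu"

model_name = "facebook/bart-base"  # assumption: any BART checkpoint works here
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name).to(device)

# Assumption: a dataset with "article" and "highlights" columns (CNN/DailyMail).
raw = load_dataset("cnn_dailymail", "3.0.0", split="train[:1%]")

def preprocess(batch):
    inputs = tokenizer(batch["article"], max_length=512,
                       truncation=True, padding="max_length")
    targets = tokenizer(text_target=batch["highlights"], max_length=128,
                        truncation=True, padding="max_length")
    # Replace pad token ids in the labels with -100 so they are ignored by the loss.
    inputs["labels"] = [
        [(t if t != tokenizer.pad_token_id else -100) for t in seq]
        for seq in targets["input_ids"]
    ]
    return inputs

dataset = raw.map(preprocess, batched=True, remove_columns=raw.column_names)
dataset.set_format(type="torch", columns=["input_ids", "attention_mask", "labels"])
loader = DataLoader(dataset, batch_size=4, shuffle=True)

optimizer = AdamW(model.parameters(), lr=3e-5)
model.train()
for epoch in range(3):
    for batch in loader:
        batch = {k: v.to(device) for k, v in batch.items()}
        # When labels are passed, the model returns the cross-entropy loss itself.
        outputs = model(**batch)
        outputs.loss.backward()
        optimizer.step()
        optimizer.zero_grad()
    print(f"epoch {epoch} - last batch loss {outputs.loss.item():.4f}")

model.save_pretrained("bart-summarization-finetuned")
tokenizer.save_pretrained("bart-summarization-finetuned")
```

The key point is that `BartForConditionalGeneration` computes the loss internally when `labels` are supplied, so the loop only needs the usual backward/step/zero_grad steps; everything Seq2SeqTrainer would otherwise handle (padding, label masking with -100, optimization) is done by hand here.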