Finetuning BERT as seq2seq for Relation extraction with multiple outputs

Hi, I’m trying to fine-tune an mBERT model for relation extraction using a seq2seq approach.

I’m trying to overfit the model to see if it can learn the relations from just two samples that I repeat N times. So far I have succeeded in extracting one relation from a given input, where the input is a text containing multiple triplets and I expect to extract all of them.

My code is on my GitHub if you want to check it out: RE-finetune/bert_to_bert.py at main · Maximiliano-Villanueva/RE-finetune · GitHub

My question is whether this approach is feasible for extracting multiple relations from a single text.
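For context on what I mean by multiple outputs: one common way to make a seq2seq model emit several relations (e.g. as in REBEL) is to linearize all triplets into a single target string with marker tokens, so the decoder generates them in one pass. A minimal sketch, assuming made-up `<triplet>`/`<subj>`/`<obj>` markers (not from my actual code):

```python
# Sketch: linearize multiple (head, relation, tail) triplets into one
# target string for seq2seq training. The marker tokens below are an
# assumption for illustration; they would be added to the tokenizer as
# special tokens in a real setup.

def linearize_triplets(triplets):
    """Join all triplets into a single decoder target string."""
    parts = []
    for head, relation, tail in triplets:
        parts.append(f"<triplet> {head} <subj> {relation} <obj> {tail}")
    return " ".join(parts)

def parse_triplets(generated):
    """Split a generated string back into (head, relation, tail) triplets."""
    triplets = []
    for chunk in generated.split("<triplet>"):
        chunk = chunk.strip()
        if not chunk:
            continue
        head, _, rest = chunk.partition("<subj>")
        relation, _, tail = rest.partition("<obj>")
        triplets.append((head.strip(), relation.strip(), tail.strip()))
    return triplets

triplets = [
    ("Barack Obama", "born in", "Hawaii"),
    ("Barack Obama", "president of", "United States"),
]
target = linearize_triplets(triplets)
# target == "<triplet> Barack Obama <subj> born in <obj> Hawaii "
#           "<triplet> Barack Obama <subj> president of <obj> United States"
assert parse_triplets(target) == triplets
```

With this format the model doesn’t need any architectural change to return several relations; the number of triplets is just the number of `<triplet>` markers it decides to generate.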