Ensemble Learning with various BERT models


I have fine-tuned several different BERT models for a binary text classification task using the Trainer API. Now I want to try whether ensemble learning (i.e. combining the different BERT models) helps to improve the results.
My question is whether there is a tutorial or script showing how this could be done with Trainer. I have never done ensemble learning and don’t really know where to start. I am also not sure how to predict new samples with an ensemble of models.
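One common starting point (not tied to the Trainer API itself) is soft voting: run each fine-tuned model separately, e.g. via `Trainer.predict(...)`, which returns per-class logits, then average the softmax probabilities across models. Here is a minimal sketch of that averaging step; the toy logits are made up for illustration, in practice they would come from your models:

```python
import numpy as np

def softmax(logits):
    # numerically stable softmax over the last axis
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def soft_vote(all_logits):
    # all_logits: list of (n_samples, n_classes) arrays, one per model,
    # e.g. each taken from Trainer.predict(test_dataset).predictions
    probs = np.mean([softmax(l) for l in all_logits], axis=0)
    return probs.argmax(axis=-1), probs

# toy logits for two "models" and three samples (binary task)
logits_model_a = np.array([[2.0, -1.0], [0.1, 0.3], [-1.5, 2.5]])
logits_model_b = np.array([[1.0, 0.0], [0.4, -0.2], [-0.5, 1.0]])

preds, probs = soft_vote([logits_model_a, logits_model_b])
print(preds)  # one class index per sample
```

Predicting a new sample works the same way: tokenize it once per model, collect each model's logits, and feed them through `soft_vote`.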

Thanks in advance,

I’m not sure this is possible with the Trainer API directly. You could build your own ensemble on top of the trained models, for example by fitting a decision tree on their outputs. The Multi-Model-Endpoint tutorial on SageMaker might also help.
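The decision-tree idea above is a form of stacking: each fine-tuned BERT model scores a sample, and a small meta-learner is trained on those scores. A rough sketch with scikit-learn, where the base-model probabilities are simulated with random data purely for illustration (in practice they would be the positive-class softmax probabilities from each model on a held-out validation set):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Simulated stand-in for real data: one column per base model,
# each a noisy probability around the true binary label.
n_samples = 200
y = rng.integers(0, 2, size=n_samples)
base_probs = np.column_stack([
    np.clip(y + rng.normal(0, 0.35, n_samples), 0, 1) for _ in range(3)
])

# Fit the decision-tree meta-learner on the base models' probabilities.
meta = DecisionTreeClassifier(max_depth=3, random_state=0)
meta.fit(base_probs, y)

# To predict a new sample: run it through every BERT model first,
# then feed the stacked probabilities to the tree.
new_probs = np.array([[0.9, 0.8, 0.85]])
print(meta.predict(new_probs))
```

Averaging (soft voting) needs no extra training, while stacking can learn to weight stronger models more heavily, but it requires a held-out set so the meta-learner is not fit on probabilities the base models produced for their own training data.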