Implement k-fold cross-validation for hyperparameter tuning

I’m currently using the Trainer class to fine-tune a GPT-2 model for a regression task, and I would like to tune the hyperparameters of this model.
Since my dataset is relatively small (about 3,500 samples), I wanted to use k-fold cross-validation for this. Unfortunately, I couldn’t find this functionality in the Transformers documentation (or in the documentation of any of the supported hyperparameter search backends).

It seems like I would have to override many methods in the Trainer class to achieve this. I’m not sure that’s a good idea, since I would also have to override internal methods like _inner_training_loop().

Is there any way to accomplish this?
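
For reference, the workaround I’ve been considering is to leave the Trainer itself untouched and instead run it inside an outer loop over the folds, re-creating the model and Trainer for each fold and averaging the eval loss. Below is a rough sketch of that idea; `tokenized_dataset` and the hyperparameter values are just placeholders for my actual setup, and the fold splitting uses scikit-learn’s KFold rather than anything from Transformers:

```python
import numpy as np
from sklearn.model_selection import KFold
from transformers import (
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)


def cross_validate(tokenized_dataset, learning_rate, num_train_epochs, k=5):
    """Mean eval loss over k folds for one hyperparameter setting (sketch)."""
    kfold = KFold(n_splits=k, shuffle=True, random_state=42)
    indices = np.arange(len(tokenized_dataset))
    fold_losses = []

    for fold, (train_idx, eval_idx) in enumerate(kfold.split(indices)):
        # Index-based subsetting of a datasets.Dataset
        train_split = tokenized_dataset.select(train_idx)
        eval_split = tokenized_dataset.select(eval_idx)

        # Fresh model per fold; num_labels=1 gives a regression head (MSE loss)
        model = AutoModelForSequenceClassification.from_pretrained(
            "gpt2", num_labels=1
        )
        model.config.pad_token_id = model.config.eos_token_id  # GPT-2 has no pad token

        args = TrainingArguments(
            output_dir=f"cv-fold-{fold}",          # placeholder output path
            learning_rate=learning_rate,
            num_train_epochs=num_train_epochs,
            per_device_train_batch_size=8,          # placeholder batch size
            report_to="none",
        )
        trainer = Trainer(
            model=model,
            args=args,
            train_dataset=train_split,
            eval_dataset=eval_split,
        )
        trainer.train()
        fold_losses.append(trainer.evaluate()["eval_loss"])

    return float(np.mean(fold_losses))
```

The idea would be to call this once per hyperparameter combination (e.g. from a simple grid search or an Optuna study) and keep the setting with the lowest mean loss, but I don’t know whether this is the intended way to combine cross-validation with the Trainer.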

Thanks in advance.