PyTorch-like loop for fine-tuning Whisper

Hello! I was wondering if there is any PyTorch-style, loop-based training code for fine-tuning Whisper. I have checked and tried the references mentioned in other topics (one trained on Japanese kana), but I haven't found one that replicates the results of Fine-Tune Whisper For Multilingual ASR with 🤗 Transformers, which uses the Trainer. I have to perform some very intricate operations on the model (registering hooks, adding decoders, etc.), and the Trainer object isn't conducive to them. Please let me know how I can train using a plain PyTorch loop that replicates the Trainer's results to within a fair margin of error.
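For context, the kind of loop structure I'm after looks roughly like this. It's only a minimal sketch: a dummy `nn.Linear` and random tensors stand in for Whisper and the real dataloader, and the learning rate, warmup length, and gradient-clipping value are placeholder assumptions meant to mirror the Trainer's AdamW + linear-warmup defaults, not values from the blog post:

```python
import torch
from torch.optim import AdamW
from torch.optim.lr_scheduler import LambdaLR

# Stand-ins so the loop structure is clear; in practice this would be
# WhisperForConditionalGeneration and a DataLoader yielding padded
# log-mel input features and label ids.
model = torch.nn.Linear(4, 2)
data = [(torch.randn(8, 4), torch.randint(0, 2, (8,))) for _ in range(10)]

optimizer = AdamW(model.parameters(), lr=1e-5)  # assumed lr, as in the blog post's Trainer config
num_steps = len(data)
warmup = 2  # assumed warmup steps, mimicking Trainer's linear warmup-then-decay schedule
scheduler = LambdaLR(
    optimizer,
    lambda s: s / warmup if s < warmup
    else max(0.0, (num_steps - s) / (num_steps - warmup)),
)
loss_fn = torch.nn.CrossEntropyLoss()

model.train()
for step, (features, labels) in enumerate(data):
    logits = model(features)           # with Whisper: outputs = model(input_features, labels=labels)
    loss = loss_fn(logits, labels)     # with Whisper: loss = outputs.loss
    loss.backward()
    # Trainer clips gradients by default (max_grad_norm=1.0)
    torch.nn.utils.clip_grad_norm_(model.parameters(), 1.0)
    optimizer.step()
    scheduler.step()
    optimizer.zero_grad()
```

Is matching the Trainer mainly a matter of reproducing these pieces (optimizer, schedule, clipping, accumulation), or is there more hidden behavior I'd need to replicate?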