Cannot load fine-tuned whisper model

I fine-tuned Whisper multilingual models for several languages. I have the checkpoints and exported models, produced via:

train_result = trainer.train(resume_from_checkpoint=maybe_resume)

Now I want to use these fine-tuned models in another script to evaluate them against a test set with whisper.transcribe(...).

When I try to load the model from the export or checkpoint directories, it fails with a KeyError on "dims":

Traceback (most recent call last):
  File "D:\Anaconda\Anaconda3\envs\whisper\lib\concurrent\futures\", line 246, in _process_worker
    r = call_item.fn(*call_item.args, **call_item.kwargs)
  File "D:\GITREPO\_HK_GITHUB\cv-whisper-finetune\", line 108, in test_process
    WMODEL: whisper.Whisper = load_whisper_model(shared["MODEL_PATH"], shared["CACHE_DIR"], shared["USE_GPU"])        
  File "D:\GITREPO\_HK_GITHUB\cv-whisper-finetune\", line 98, in load_whisper_model
    model: whisper.Whisper = whisper.load_model(name=MODEL_PATH, device=DeviceMode, download_root=cache_dir)
  File "D:\Anaconda\Anaconda3\envs\whisper\lib\site-packages\whisper\", line 147, in load_model
    dims = ModelDimensions(**checkpoint["dims"])
KeyError: 'dims'
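For anyone wondering where the KeyError comes from: whisper.load_model expects a checkpoint dict with top-level "dims" and "model_state_dict" keys, while the Trainer's pytorch_model.bin is just a flat state dict of weights. A minimal sketch (plain dicts standing in for the real tensors, so no model files needed):

```python
# openai/whisper checkpoints look like this (subset of ModelDimensions fields):
openai_ckpt = {
    "dims": {"n_mels": 80, "n_vocab": 51865},
    "model_state_dict": {"encoder.conv1.weight": "..."},
}

# An HF export's pytorch_model.bin is a flat state dict of weights only:
hf_state_dict = {
    "model.encoder.conv1.weight": "...",
}

# whisper.load_model effectively does checkpoint["dims"], so:
assert "dims" in openai_ckpt       # fine for openai/whisper files
assert "dims" not in hf_state_dict # -> KeyError: 'dims' for HF files
```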

I’m on Windows, using Anaconda, and also using multiprocessing.
If I use the stock pretrained models, everything works flawlessly.

Here are the contents of the checkpoint and export directories, respectively:



What am I doing wrong?

Entirely my bad, of course. I was not aware that HF-Whisper and openai/whisper model files use different layer (and other parameter) names, so the HF .bin files cannot be loaded with Whisper's load_model.
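To illustrate the naming mismatch, here are a few of the parameter-name differences (illustrative only; a real conversion script must map every layer, and the stripped-"model." fallback below is my simplification, not the full rule):

```python
# A few HF -> openai/whisper parameter-name correspondences:
RENAMES = {
    "model.encoder.conv1.weight": "encoder.conv1.weight",
    "model.decoder.embed_tokens.weight": "decoder.token_embedding.weight",
    "model.encoder.embed_positions.weight": "encoder.positional_embedding",
}

def to_openai_key(hf_key: str) -> str:
    """Map an HF parameter name to its openai/whisper equivalent (sketch)."""
    # Fallback just strips the "model." prefix; the real mapping has more cases.
    return RENAMES.get(hf_key, hf_key.removeprefix("model."))

print(to_openai_key("model.decoder.embed_tokens.weight"))
# -> decoder.token_embedding.weight
```

So even after pulling the flat state dict out of pytorch_model.bin, the keys would not line up with what whisper's model classes expect.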

If somebody else hits this: the HF checkpoint has to be converted to the openai/whisper format (or loaded through the transformers API) before whisper.load_model will accept it.