Hi everyone,
I am fine-tuning Whisper models on some internal data, but I want Whisper to retain its other abilities beyond just transcribing text.
How do I fine-tune Whisper while still retaining its timestamp decoding and language detection abilities?
Furthermore, I am also trying to do multilingual fine-tuning, where I take multiple languages and fine-tune on them together. How should I go about doing this?
@StephennFernandes Great question. It would be great if @sanchit-gandhi could help us with this.
I have the answer to it:
it seems that if you do LoRA fine-tuning on Whisper and prompt it to decode text with timestamps post fine-tuning, the timestamps are still retained. Because LoRA keeps the base weights frozen, fine-tuning doesn't drift away from Whisper's timestamp decoding capabilities, while still generalizing well on the newly fine-tuned data.
happy training!