I fine-tuned the whisper-small checkpoint, but at inference time I get this warning: "Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained." How can I avoid this warning, or how should I add the special tokens so that it does not appear?