Deleting tokens from a Seq2Seq model

I’m using WhisperForConditionalGeneration and I’d like to experiment with the memory savings that come from deleting tokens from the decoder vocabulary. My domain is fairly specific, so a large fraction (>50%) of Whisper’s tokens are unnecessary, and I’d like to delete them from the decoder vocabulary embeddings. What is the most straightforward way to do that, if any?
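For context, here is the core operation I have in mind, sketched on a bare embedding/output-projection pair rather than the full Whisper model (in Whisper, the decoder’s `embed_tokens.weight` is tied to `proj_out.weight`). The kept token ids below are hypothetical placeholders:

```python
import torch
import torch.nn as nn

# Toy stand-in for the decoder's tied input embedding / output projection.
old_vocab, d_model = 1000, 64
embed = nn.Embedding(old_vocab, d_model)
proj_out = nn.Linear(d_model, old_vocab, bias=False)
proj_out.weight = embed.weight  # weight tying, as in Whisper

# Hypothetical set of token ids to keep (domain tokens plus the
# special tokens, which would always have to be retained).
keep_ids = torch.tensor(sorted({0, 1, 2, 50, 51, 999}))

# Slice the kept rows and rebuild both modules with the smaller vocab.
new_embed = nn.Embedding(len(keep_ids), d_model)
new_embed.weight.data.copy_(embed.weight.data[keep_ids])
new_proj = nn.Linear(d_model, len(keep_ids), bias=False)
new_proj.weight = new_embed.weight  # re-tie

# Old-id -> new-id map, needed to remap tokenizer output and labels.
old_to_new = {int(o): n for n, o in enumerate(keep_ids.tolist())}

print(new_embed.weight.shape)  # torch.Size([6, 64])
```

On the real model I assume this would mean swapping in the new modules (something like `model.model.decoder.embed_tokens = new_embed` and `model.proj_out = new_proj`) and updating `config.vocab_size` plus the special token ids in the generation config, but I’m not sure whether that’s the intended route.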