Create a DPR tokenizer for a non-BERT model

Hello everyone, is there a way to create a non-BERT tokenizer for a DPR model? And how much effort would it take to create one?

Maybe someone has experience converting models from the original library to transformers (I know there is a script for this, but it doesn't show what to do with the tokenizers).

Just to add more context: I've been trying to create DPR for my language, but now I can't use it with transformers because DPRContextEncoderTokenizer inherits specifically from BertTokenizer, not PreTrainedTokenizer. I think it would be helpful to change it to something more general. Can someone help me figure out the class hierarchy, or maybe design some specific behaviour, for example holding the actual tokenizer inside DPRTokenizer and then just overriding from_pretrained? A rough sketch of what I mean is below.