Tokenizer from own vocab

Hi everyone.

I would like to build a tokenizer whose vocabulary is a list of predefined tokens I already have. Is that possible with the Transformers library, and if so, how could I do it?
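For context, here is roughly what I am hoping to achieve. This is just a sketch, assuming the `tokenizers` library's `WordLevel` model (wrapped in `PreTrainedTokenizerFast`) is the right tool for a fixed vocabulary; the vocabulary below is a toy stand-in for my real token list:

```python
from tokenizers import Tokenizer
from tokenizers.models import WordLevel
from tokenizers.pre_tokenizers import Whitespace
from transformers import PreTrainedTokenizerFast

# Toy stand-in for my predefined token list (the real one is much larger).
vocab = {"[UNK]": 0, "hello": 1, "world": 2}

# Build a word-level tokenizer directly from the fixed vocabulary,
# mapping anything outside it to [UNK].
tokenizer = Tokenizer(WordLevel(vocab, unk_token="[UNK]"))
tokenizer.pre_tokenizer = Whitespace()

# Wrap it so it behaves like any other Transformers tokenizer.
fast_tokenizer = PreTrainedTokenizerFast(
    tokenizer_object=tokenizer,
    unk_token="[UNK]",
)

print(fast_tokenizer.tokenize("hello world"))
print(fast_tokenizer.encode("hello world"))
```

Is this the intended approach, or is there a more direct way to load a plain vocabulary file into one of the existing tokenizer classes?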

Thanks.