Writing a custom tokenizer and wrapping it in a tokenizer object

How do I write my own tokenizer and wrap it in a Hugging Face tokenizer object?
We want to follow this procedure instead of using any existing pre-tokenizer, and still end up with a tokenizer object that behaves the same way as one loaded from a pretrained checkpoint.
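For reference, the usual route is to build a `tokenizers.Tokenizer` with your own rules and then wrap it with `transformers.PreTrainedTokenizerFast(tokenizer_object=...)`, which gives you the familiar pretrained-tokenizer interface. The pure-Python sketch below (class name `SimpleWhitespaceTokenizer` is made up for illustration, not a library API) only shows the minimal encode/decode logic such a custom tokenizer needs, without any pre-tokenizer:

```python
# Minimal sketch of a custom tokenizer (illustrative only).
# In practice you would express these rules as a `tokenizers.Tokenizer`
# and wrap it via `transformers.PreTrainedTokenizerFast(tokenizer_object=...)`.

class SimpleWhitespaceTokenizer:  # hypothetical name, for illustration
    def __init__(self, vocab):
        self.vocab = dict(vocab)                  # token -> id
        self.unk_token = "[UNK]"
        self.vocab.setdefault(self.unk_token, len(self.vocab))
        self.ids_to_tokens = {i: t for t, i in self.vocab.items()}

    def tokenize(self, text):
        # The "custom" rule here: plain whitespace splitting, no pre-tokenizer.
        return text.split()

    def encode(self, text):
        unk_id = self.vocab[self.unk_token]
        return [self.vocab.get(tok, unk_id) for tok in self.tokenize(text)]

    def decode(self, ids):
        return " ".join(self.ids_to_tokens.get(i, self.unk_token) for i in ids)


tok = SimpleWhitespaceTokenizer({"hello": 0, "world": 1})
print(tok.encode("hello world unknown"))  # -> [0, 1, 2]  ("[UNK]" got id 2)
print(tok.decode([0, 1]))                 # -> "hello world"
```

Once the same logic is expressed through the `tokenizers` library, the `PreTrainedTokenizerFast` wrapper supplies `__call__`, padding, truncation, and `save_pretrained` for free.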

Did you find a solution?

No. Nobody seems to be interested in it, and I'm kind of disappointed. Let me know if you find something; I'm still stuck on this issue.