Pushing a custom tokenizer to the Hub

Hello everyone, I’ve trained a custom tokenizer on a custom dataset, following the code in the documentation. Is there a way to push this tokenizer to the Hub and then load it like any other tokenizer, i.e. with the AutoTokenizer.from_pretrained() function? If that’s not possible, how can I use the tokenizer to train a custom model from scratch?
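For reference, the workflow I have in mind looks roughly like this (a minimal sketch of my setup; the BPE configuration, the `PreTrainedTokenizerFast` wrapper, and the repo id are my own assumptions, not code copied from the docs):

```python
# Sketch: train a small tokenizer with the `tokenizers` library,
# wrap it for the Transformers API, then (hopefully) push it to the Hub.
from tokenizers import Tokenizer, models, pre_tokenizers, trainers
from transformers import PreTrainedTokenizerFast

# Train a tiny BPE tokenizer on toy data (stand-in for my real dataset).
tokenizer = Tokenizer(models.BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = pre_tokenizers.Whitespace()
trainer = trainers.BpeTrainer(vocab_size=100, special_tokens=["[UNK]", "[PAD]"])
tokenizer.train_from_iterator(["hello world", "hello hub"], trainer=trainer)

# Wrap the raw tokenizer so it behaves like any Transformers tokenizer.
fast_tokenizer = PreTrainedTokenizerFast(
    tokenizer_object=tokenizer,
    unk_token="[UNK]",
    pad_token="[PAD]",
)

# Is something like this the right way to publish it?
# fast_tokenizer.push_to_hub("my-username/my-custom-tokenizer")  # hypothetical repo id
```

After that I would expect `AutoTokenizer.from_pretrained("my-username/my-custom-tokenizer")` to work, but I’m not sure this is the intended path.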
Thanks for your help!!!