How do I save my tokenizer using save_pretrained?

I have just followed this tutorial on how to train my own tokenizer.

Now that I have trained my tokenizer, I have wrapped it in a transformers tokenizer object so that I can use it with the transformers library:

from transformers import BertTokenizerFast

new_tokenizer = BertTokenizerFast(tokenizer_object=tokenizer)

Then I try to save my tokenizer with this code:

tokenizer.save_pretrained('/content/drive/MyDrive/Tokenzier')

However, executing the code above gives me this error:

AttributeError: 'tokenizers.Tokenizer' object has no attribute 'save_pretrained'
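To confirm where the error comes from, I checked what the raw tokenizers.Tokenizer object actually exposes. This is a minimal sketch using a throwaway WordPiece model in place of my trained tokenizer (the model choice is just for illustration):

```python
from tokenizers import Tokenizer
from tokenizers.models import WordPiece

# Stand-in for the tokenizer trained in the tutorial
tokenizer = Tokenizer(WordPiece(unk_token="[UNK]"))

# The raw tokenizers.Tokenizer has no save_pretrained method,
# which matches the AttributeError I get
print(hasattr(tokenizer, "save_pretrained"))  # False

# It does have a plain save() method, though
print(hasattr(tokenizer, "save"))  # True
```

So save_pretrained does not exist on the tokenizers.Tokenizer class itself.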

Am I saving the tokenizer incorrectly?

If so, what is the correct way to save it to my local files so that I can load and use it later?