Hugging Face T5 models seem not to download a tokenizer file

If I run this code, I get an error:

from transformers import PreTrainedModel, T5ForConditionalGeneration, T5Tokenizer

model_name = 'yhavinga/t5-base-dutch'
#model_name = 'flax-community/t5-base-dutch'
model: PreTrainedModel = T5ForConditionalGeneration.from_pretrained(model_name)
tokenizer = T5Tokenizer.from_pretrained(model_name)

I have debugged the code and I can see that no resolved filename is passed to the underlying SentencePiece tokenizer, and that tokenizer crashes if it is not initialized with a file.
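For reference, this is how I checked which tokenizer files the repo actually contains (a quick sketch using huggingface_hub; checking for 'spiece.model' vs. 'tokenizer.json' is just my assumption about what the slow and fast tokenizers expect):

from huggingface_hub import list_repo_files

# List every file hosted in the model repo so I can see whether the
# SentencePiece vocab file that the slow T5Tokenizer needs is there,
# or only the fast-tokenizer file.
files = list_repo_files('yhavinga/t5-base-dutch')
print(files)
print('spiece.model present:', 'spiece.model' in files)
print('tokenizer.json present:', 'tokenizer.json' in files)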

So is this a bug in the model files? If not, how do I get the same tokenization the model was trained with on my local machine?
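The only workaround I have tried so far is loading the fast tokenizer instead, which appears to build itself from tokenizer.json rather than the missing SentencePiece file. A sketch of what I mean (I have not verified that this produces exactly the same token ids the model was trained with, which is really what I am asking):

from transformers import AutoTokenizer

# Fall back to the fast (Rust-based) tokenizer, which loads from
# tokenizer.json instead of the missing spiece.model.
# I am not sure this is guaranteed to match the training tokenization.
tokenizer = AutoTokenizer.from_pretrained('yhavinga/t5-base-dutch', use_fast=True)
print(tokenizer('Dit is een test.'))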