Can't load tokenizer for 'facebook/wav2vec2-large-robust'

I’m attempting to run inference with the latest “robust” model, roughly following the steps laid out in this notebook.

When attempting to load the tokenizer for wav2vec2-large-robust, I keep hitting the following error:

OSError: Can't load tokenizer for 'facebook/wav2vec2-large-robust'. Make sure that:

- 'facebook/wav2vec2-large-robust' is a correct model identifier listed on 'https://huggingface.co/models'

- or 'facebook/wav2vec2-large-robust' is the correct path to a directory containing relevant tokenizer files

I tried AutoTokenizer, Wav2Vec2CTCTokenizer, and Wav2Vec2Tokenizer, but none of them works (minimal repro below).

When I use the same code but swap the pretrained model name to, say, “facebook/wav2vec2-base-960h”, it works.

I tried retyping the name a few times and also recopied it directly from the model page. I’ve also reproduced the issue in a Colab notebook, linked below.

Colab notebook replicating the issue: https://colab.research.google.com/drive/1tZ3oEtkg9_74nCLnnRq9lpLktyfAAS6I?usp=sharing

Anyone got any suggestions for what I should try?