Inference API Web Widget for tons of public models: Can't load tokenizer using from_pretrained

I'm still getting this error on a huge number of public models in the Inference API widgets.

Can't load tokenizer using from_pretrained, please update its configuration: Can't load tokenizer for 'aidan-o-brien/recipe-improver'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'aidan-o-brien/recipe-improver' is the correct path to a directory containing all relevant files for a AlbertTokenizerFast tokenizer.

I have seen several threads recommending adding a tokenizer.json or making various edits to the code, but these are not my models, and I would really like to be able to test them with the Inference API widget before actually cloning anything or writing any code.
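For what it's worth, the quickest sanity check (without cloning anything) is to look at a repo's file listing and see whether the standard tokenizer artifacts are there at all. A rough sketch of that check; the helper name is my own, and the file-name sets are just the usual Hugging Face tokenizer artifacts, not an exhaustive list:

```python
# Sketch: given a repo's file listing, guess whether transformers could
# load a tokenizer from it. The helper itself is hypothetical; the file
# names are the common Hugging Face tokenizer artifacts.

FAST_TOKENIZER_FILES = {"tokenizer.json"}
SLOW_TOKENIZER_FILES = {
    "spiece.model",             # SentencePiece (ALBERT, T5, ...)
    "vocab.txt",                # WordPiece (BERT, ...)
    "vocab.json",               # BPE (GPT-2, RoBERTa, ...)
    "sentencepiece.bpe.model",  # SentencePiece BPE (XLM-R, ...)
}

def has_loadable_tokenizer(repo_files):
    """True if the listing contains a fast-tokenizer file or a
    slow-tokenizer vocabulary file."""
    names = set(repo_files)
    return bool(names & (FAST_TOKENIZER_FILES | SLOW_TOKENIZER_FILES))
```

With `huggingface_hub` installed, something like `list_repo_files("aidan-o-brien/recipe-improver")` would produce the listing to feed in; for the models hitting this error, I'd expect the check to come back False, matching what the widget reports.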

Here are a few examples, but I've encountered many more, as well as a bunch of models whose inference API widget always returns nothing as output (with no errors). If you look at recipe-improver, the most recent commit deletes app.py, which contained load_model_tokenizer! Why does the web page still show an inference API widget if the needed code is gone?