HF Inference API has been returning the same 404 error for all models for the last few minutes

I think (and hope) that HF has already noticed and is working on a fix, without the need for a report this time…

If you do need to report it, the GitHub issues or the HF Discord are the quicker ways to do so.
