HF Inference API has been returning the same 404 exception for all models for the last few minutes

I think it's due to server errors/issues; I'm getting this now as well instead of the 404.
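
In case it helps others hitting this, here is a minimal sketch for logging the actual status code and retrying on what looks like a transient outage. It assumes a plain `requests` call to the `api-inference.huggingface.co` endpoint; the model ID, token, and retry settings are placeholders, not a confirmed fix:

```python
import time
import requests

# Placeholder model ID and token, for illustration only.
MODEL_ID = "gpt2"
API_URL = f"https://api-inference.huggingface.co/models/{MODEL_ID}"
HEADERS = {"Authorization": "Bearer hf_xxx"}  # replace with your own token


def query(payload, retries=3, backoff=5):
    """POST to the Inference API, retrying on 404/5xx responses."""
    for attempt in range(retries):
        response = requests.post(API_URL, headers=HEADERS, json=payload)
        if response.status_code == 200:
            return response.json()
        # During an outage a 404 here may not mean a bad model ID;
        # 5xx usually indicates a server-side problem worth retrying.
        print(f"Attempt {attempt + 1}: HTTP {response.status_code} - {response.text[:200]}")
        if response.status_code == 404 or response.status_code >= 500:
            time.sleep(backoff)
            continue
        response.raise_for_status()
    raise RuntimeError("Inference API still failing after retries")


if __name__ == "__main__":
    print(query({"inputs": "Hello, world!"}))
```

Printing the response body alongside the status code at least makes it clear whether you are seeing the 404 others report or a different server error.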
