My model doesn't seem to load in the Inference API

Is the Inference API down? When I use the model card's "Hosted inference API" widget, I get "Internal Server Error". When I call the API directly, the request hangs indefinitely. I've also tried other models and get the same issues. Are the servers down, or am I missing something?
SebastianS/MetalSam_v2 · Hugging Face
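
For reference, this is roughly how I'm calling the API. A minimal sketch: the token is a placeholder and the input is just a dummy string, so adjust both for your setup. The timeout at least stops the request from hanging forever.

```python
import requests

API_TOKEN = "hf_xxx"  # placeholder; replace with your own HF access token
MODEL_ID = "SebastianS/MetalSam_v2"  # model from the post
API_URL = f"https://api-inference.huggingface.co/models/{MODEL_ID}"

headers = {"Authorization": f"Bearer {API_TOKEN}"}
payload = {"inputs": "test input"}  # dummy input for illustration

# A timeout prevents the call from blocking indefinitely while the model loads.
response = requests.post(API_URL, headers=headers, json=payload, timeout=60)
print(response.status_code)
print(response.text)  # may be model output, or an error body like a 503 while loading
```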
