This model could not be loaded by the inference API

Why is the error “This model could not be loaded by the inference API” shown on the “Hosted inference API” page?

Cheers, Frood

I am seeing this error as well.