"This model is not currently available via any of the supported Inference Providers."

For context: I'm aware of the situation, but I don't know anything about the background or the reasons behind it. I only know what anyone can see from the outside…

To summarize the current situation:

  1. My uploaded models are returning 400 errors. I don't know the reason or the background, and I haven't actively pursued it either
  2. There are currently no other restrictions on my actions
  3. The Serverless Inference API is also frequently returning 400 and other errors for models other than mine, e.g.:
    Request failed: 500
  4. The Serverless Inference API is currently being completely revamped
  5. In particular, it has been announced that there will be major changes to Inference Providers in the near future
    Inference Providers: 3 cents per request? - #3 by julien-c
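Since the errors above are plain HTTP status codes, a small triage helper can make the distinctions concrete. This is a minimal sketch based on general HTTP semantics plus the commonly observed 503 "model loading" case; it is not an official error taxonomy from Hugging Face, and the function name `triage` is hypothetical.

```python
# Sketch: triaging responses from the Serverless Inference API.
# Status-code meanings here are assumed from general HTTP semantics,
# not from any official Hugging Face error taxonomy.

def triage(status_code: int) -> str:
    """Classify an Inference API response status for simple retry logic."""
    if status_code == 200:
        return "ok"
    if status_code == 503:
        # Model is often still loading in this case; wait and retry.
        return "retry"
    if 500 <= status_code < 600:
        # e.g. "Request failed: 500" above; transient or backend issue.
        return "server-error"
    if 400 <= status_code < 500:
        # e.g. the 400s described above; check whether the model is
        # actually available via a supported Inference Provider.
        return "client-error"
    return "unknown"

for code in (200, 400, 500, 503):
    print(code, triage(code))
```

In practice, a 503 is usually worth retrying after a delay, while a persistent 400 suggests the model simply isn't being served by any provider at the moment.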