Incidentally, I understand the situation, but I don’t know the background or the reasons behind it. I only know what anyone can see on the surface…
To summarize the current situation:
- The models I’ve uploaded are returning 400 errors. I don’t know the reason or the background, and I haven’t really been pursuing it either
- There are currently no other restrictions on my actions
- The Serverless Inference API is frequently returning 400 and other errors for models other than mine
Request failed: 500
- The Serverless Inference API is currently being completely revamped
- In particular, it has been announced that there will be major changes to the Inference Provider in the near future
Inference Providers: 3 cents per request? - #3 by julien-c