Why does my model have no Inference Providers option available? I really don’t understand how this feature works. I managed to create a Space with this model, and even an Inference Endpoint worked, but I can’t get an inference widget to appear on the model card. Can someone help me figure out what the problem is? I would like to put a test inference app on the model card, as I see on some models, but the option is not available to me. The model is: marcelovidigal/ModernBERT-base-2-contract-sections-classification-v4-10-max. Thank you for your help.
I see a lot of questions about this on the HF Discord, the forum, Posts, and the Hub.
I think almost no users, myself included, have the big picture on the future of Hugging Face’s Serverless Inference API and Inference Providers…
There have been no announcements, and at the moment all of the models on my account are returning 404 errors…
Other people’s models are also frequently returning 50x errors…
No one knows what’s going on anymore…
Support: website@huggingface.co
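In the meantime, one way to sanity-check a specific model is to ask the Hub API whether any inference provider currently serves it; as far as I understand, the widget on the model card only has something to run against when at least one provider does. This is a rough sketch, not official guidance: the `expand[]=inferenceProviderMapping` query key is my assumption based on the public Hub API, so double-check the Hub API docs if it errors out.

```python
# Sketch: query the public Hub API for a model's inference-provider mapping.
# The "inferenceProviderMapping" expand key is an assumption; verify it
# against the Hub API documentation if the request fails.
import json
import urllib.request
from urllib.parse import quote

def get_inference_providers(model_id: str):
    """Return the provider mapping for a model, or None on any error
    (404, rate limit, no network) or when no provider serves the model."""
    url = (
        "https://huggingface.co/api/models/"
        + quote(model_id, safe="/")
        + "?expand[]=inferenceProviderMapping"
    )
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            data = json.load(resp)
    except Exception:
        return None
    # An empty or missing mapping means no provider currently hosts the
    # model, so the card widget has nothing to call.
    return data.get("inferenceProviderMapping") or None

if __name__ == "__main__":
    providers = get_inference_providers(
        "marcelovidigal/ModernBERT-base-2-contract-sections-classification-v4-10-max"
    )
    print(providers)
```

If this prints `None` for your model, the likely explanation is simply that no provider has picked it up, which would be why the widget option doesn’t show, independent of any Space or dedicated Inference Endpoint you created.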