Hosted Inference API overloaded

Hi all. I have a problem with the Hosted Inference API for my model, GeoBERT
[botryan96/GeoBERT · Hugging Face]

It always says "Overloaded", or sometimes "Internal Server Error", every time I try to check it.

I am kind of a newbie here, so please help me out. Is the problem on my side?
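In case it helps, this is roughly how I am querying the model. It is just a minimal sketch against the public Inference API endpoint for the model card above: the token is a placeholder, and the retry on HTTP 503 is my guess at handling the "Overloaded" response while the model loads.

```python
import json
import time
import urllib.request
import urllib.error

API_URL = "https://api-inference.huggingface.co/models/botryan96/GeoBERT"
TOKEN = "hf_xxx"  # placeholder -- replace with your own access token


def should_retry(status_code: int) -> bool:
    # 503 is what the API returns while the model is loading or overloaded
    return status_code == 503


def query(payload: dict, max_retries: int = 5, wait: float = 20.0):
    """POST the payload, retrying a few times on 503 before giving up."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
    )
    for _ in range(max_retries):
        try:
            with urllib.request.urlopen(req) as resp:
                return json.load(resp)
        except urllib.error.HTTPError as err:
            if not should_retry(err.code):
                raise  # e.g. 500 "Internal Server Error" -- retrying won't help
            time.sleep(wait)  # give the model time to load, then try again
    raise RuntimeError("API still overloaded after retries")
```

If the model is only cold-starting, the retry usually resolves the 503 after a while; a persistent 500 would suggest something else is wrong.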

Thanks
