Hosted Inference API overloaded

Hi all. I have a problem with the Hosted Inference API for my model, GeoBERT:
[botryan96/GeoBERT · Hugging Face]

Every time I try to test it, it says “Overloaded” or sometimes “Internal Server Error”.

I’m kind of a newbie here, so please help me out. Is the problem on my side?
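For what it’s worth, here is how I’m calling it. This is just a minimal sketch against the standard Hosted Inference API endpoint; the token and the example sentence are placeholders, and I’m passing `"wait_for_model": True` in the options since the docs say that makes the request wait while the model loads instead of erroring out:

```python
API_URL = "https://api-inference.huggingface.co/models/botryan96/GeoBERT"


def build_request(text, token):
    """Build headers and payload for the Hosted Inference API.

    "wait_for_model": True asks the API to hold the request until the
    model has finished loading, instead of immediately returning a
    503-style "loading"/overloaded error.
    """
    headers = {"Authorization": f"Bearer {token}"}
    payload = {"inputs": text, "options": {"wait_for_model": True}}
    return headers, payload


# Actual call (requires the `requests` package and a valid access token):
# import requests
# headers, payload = build_request("Sandstone overlies the shale unit.", "hf_xxx")
# response = requests.post(API_URL, headers=headers, json=payload)
# print(response.status_code, response.json())
```

Even with `wait_for_model` set, I still hit the errors, so I’m not sure if it’s a load issue on the API side or something wrong in my setup.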