My inference endpoint has been stuck on "Load balancer not ready yet" status for more than 1h, what is happening?

Hi, I deployed an inference endpoint for a Diffusers model trained with DreamBooth, and it has been stuck in the initializing state for more than an hour with the message “Load balancer not ready yet”. I had to shut the endpoint down because it was billing me during that time. Do you know what is happening / how I could solve it?
