Inference endpoint taking forever to initialize

My inference endpoint used to initialize in under 2 minutes, but now it just gets stuck during initialization. Has anyone run into this?