Cannot Set Up Mixtral Models and Other Models on Inference Endpoints

Here is a link to the error logs and the discussion thread covering my issue; many others report the same problem:

I am encountering the same issue with other models on my account as well. Any ideas for possible resolutions? The Inference Endpoints service is not working for me at all right now.

Having the same problem. I keep getting sharding errors, similar to the ones reported in these issues:

  1. cognitivecomputations/dolphin-2.5-mixtral-8x7b · HuggingFace Inference Endpoints Issue (Detailed Information)
  2. tiiuae/falcon-7b-instruct · Trying to deploy this using Inference Endpoints