Help using inference endpoint with Llama 3.1 405B Instruct

I have been trying to query the model, but I have been getting the same error:
Error code: 503 - {'error': 'Service Unavailable'}. The same code works fine for Llama 3.1 8B and 70B. I have access to the models and also have a Pro account on HF.
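
For reference, this is roughly the kind of call I am making, via huggingface_hub's InferenceClient (a simplified sketch: the token and prompt are placeholders, and the 8B/70B runs differ only in the model id):

```python
from huggingface_hub import InferenceClient

# Placeholder token; the real one is redacted.
client = InferenceClient(
    model="meta-llama/Meta-Llama-3.1-405B-Instruct",  # swapping this to the 8B/70B ids works fine
    token="hf_xxx",
)

# This call raises the 503 "Service Unavailable" error for the 405B model.
response = client.chat_completion(
    messages=[{"role": "user", "content": "Hello, world"}],
    max_tokens=64,
)
print(response.choices[0].message.content)
```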