Error hosting endpoint when deploying model in SageMaker

I'm attempting to deploy the Salesforce/SFR-Embedding-Mistral model on SageMaker, but I'm encountering an error that I'm unable to resolve.

I set up a Jupyter notebook on SageMaker using an ml.g4dn.2xlarge instance.

I copied the deployment script from Hugging Face and updated it with my token.

```python
import json
import sagemaker
import boto3
from sagemaker.huggingface import HuggingFaceModel, get_huggingface_llm_image_uri

try:
	role = sagemaker.get_execution_role()
except ValueError:
	iam = boto3.client('iam')
	role = iam.get_role(RoleName='sagemaker_execution_role')['Role']['Arn']

# Hub Model configuration. https://huggingface.co/models
hub = {
	'HF_MODEL_ID':'Salesforce/SFR-Embedding-Mistral'
}


# create the Hugging Face Model, using the Text Embeddings Inference (TEI) container
huggingface_model = HuggingFaceModel(
	image_uri=get_huggingface_llm_image_uri("huggingface-tei", version="1.2.3"),
	env=hub,
	role=role,
)

# deploy model to SageMaker Inference
predictor = huggingface_model.deploy(
	initial_instance_count=1,
	instance_type="ml.g4dn.2xlarge",
)
  
# send request
predictor.predict({
	"inputs": "My name is Clara and I am",
})
```

I am getting the following error:

UnexpectedStatusException: Error hosting endpoint huggingface-pytorch-tgi-inference-2023-09-06-16-46-01-586: Failed. Reason: The primary container for production variant AllTraffic did not pass the ping health check. Please check CloudWatch logs for this endpoint..
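In case it helps, this is roughly how I have been trying to pull the container logs for the endpoint from the notebook (a sketch, assuming SageMaker's default `/aws/sagemaker/Endpoints/<endpoint-name>` log group; the endpoint name is copied from the error message above):

```python
import boto3

# Sketch: read the container logs for the failed endpoint from CloudWatch.
# SageMaker writes endpoint container logs to the log group
# /aws/sagemaker/Endpoints/<endpoint-name> by default.
logs = boto3.client("logs")
log_group = "/aws/sagemaker/Endpoints/huggingface-pytorch-tgi-inference-2023-09-06-16-46-01-586"

# list the log streams for this endpoint, newest first
streams = logs.describe_log_streams(
    logGroupName=log_group,
    orderBy="LastEventTime",
    descending=True,
)["logStreams"]

# print the messages from each stream
for stream in streams:
    events = logs.get_log_events(
        logGroupName=log_group,
        logStreamName=stream["logStreamName"],
    )["events"]
    for event in events:
        print(event["message"])
```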

I cannot identify what causes the error. Any help is greatly appreciated.