Thank you, Heiko, for the response and references; they're really helpful. I tried the examples you shared, and what actually worked was both updating the library versions and specifying the task via the `HF_TASK` environment variable:
huggingface_model = HuggingFaceModel(
    model_data="s3://smbdata-development/models/MiniLM-L6-H384-uncased/model.tar.gz",  # path to your trained SageMaker model
    role=get_role(),  # IAM role with permissions to create an endpoint
    sagemaker_session=session,
    transformers_version="4.17.0",  # transformers version used
    pytorch_version="1.10.2",  # PyTorch version used
    env={"HF_TASK": "feature-extraction"},
    py_version="py38",  # Python version of the DLC
)
However, when I try doing the same for another model, where I've overridden some of the functions, it doesn't work:
huggingface_model = HuggingFaceModel(
    model_data="s3://smbdata-development/models/all-MiniLM-L6-v2/model.tar.gz",  # path to your trained SageMaker model
    role=get_role(),  # IAM role with permissions to create an endpoint
    sagemaker_session=session,
    transformers_version="4.17.0",  # transformers version used
    pytorch_version="1.10.2",  # PyTorch version used
    env={"HF_TASK": "feature-extraction"},
    py_version="py38",  # Python version of the DLC
)
This gives me:

"message": "Can't load config for '/.sagemaker/mms/models/model'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure '/.sagemaker/mms/models/model' is the correct path to a directory containing a config.json file"