How to deploy SageMaker multi-model endpoints on GPU?

Hello,

I’m trying to deploy a multi-model endpoint by following this notebook: AWS-SageMaker-Examples/03_MultiModelEndpointWithHuggingFace/huggingface-sagemaker-multi-model-endpoint.ipynb (main branch of the vinayak-shanawad/AWS-SageMaker-Examples repo on GitHub).

I can deploy each model separately on either CPU or GPU, and I can deploy both models together on a multi-model endpoint on CPU. But as soon as I try to deploy them on a GPU instance, I get the following error:

ClientError: An error occurred (ValidationException) when calling the CreateModel operation: Your Ecr Image 763104351884.dkr.ecr.eu-central-1.amazonaws.com/huggingface-pytorch-inference:1.13.1-transformers4.26.0-gpu-py39-cu117-ubuntu20.04 does not contain required com.amazonaws.sagemaker.capabilities.multi-models=true Docker label(s)
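For reference, this is roughly the deployment code that triggers the error, following the notebook above. The model name, S3 prefix, and instance types are placeholders, not my exact values:

```python
# Rough sketch of the multi-model deployment (names, bucket, and role are placeholders)
import sagemaker
from sagemaker.huggingface import HuggingFaceModel
from sagemaker.multidatamodel import MultiDataModel

role = sagemaker.get_execution_role()

# Base HuggingFace model that supplies the inference container image
hub_model = HuggingFaceModel(
    role=role,
    transformers_version="4.26.0",
    pytorch_version="1.13.1",
    py_version="py39",
)

# Multi-model wrapper pointing at the S3 prefix that holds the model archives
mme = MultiDataModel(
    name="hf-multi-model",                       # placeholder name
    model_data_prefix="s3://my-bucket/models/",  # placeholder prefix
    model=hub_model,
)

# Works with a CPU instance type (e.g. ml.m5.xlarge); the ValidationException
# above is raised as soon as I switch to a GPU type (e.g. ml.g4dn.xlarge)
predictor = mme.deploy(
    initial_instance_count=1,
    instance_type="ml.g4dn.xlarge",
)
```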

Any help would be appreciated!