Skip model repacking in Batch Transform

Hi,

I have a Hugging Face model whose artifact (a tar.gz) I've already uploaded to S3 and registered with SageMaker. Every time I instantiate a batch transform job, the SDK seemingly repacks the model, which costs ~5 minutes of `Repacking model artifact` before the console tells me it's `Using already existing model: topic-model`.

Does anyone know if there's a way to skip the model repacking step and go straight to using the existing model?

Here’s the code I’m using to instantiate (within a custom class, hence self):

        self.model = HuggingFaceModel(
            env=hub,
            model_data=s3_path_join('s3://', self.bucket, f"{self.model_type}_model.tar.gz"),
            entry_point=entry_point,
            role=self.role,
            transformers_version="4.28.1",
            pytorch_version="2.0.0",
            py_version="py310",
            name=f"{self.model_type}-model"
        )
...
        batch_job = self.model.transformer(
            instance_count=instance_count,
            instance_type=instance_type,
            output_path=s3_path_join('s3://', self.bucket, 'output'),
            strategy='MultiRecord',
            assemble_with='Line',
        )
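One thing I've been considering (untested, so I'd appreciate confirmation): since repacking seems to be triggered by `Model.transformer()` rebuilding the artifact, constructing a `Transformer` directly against the already-created SageMaker model by name might bypass it entirely. The names below mirror my setup, with `topic-model` being the model name from the `HuggingFaceModel` above:

```python
from sagemaker.transformer import Transformer

# Sketch (unverified): reuse the existing SageMaker model by name instead of
# calling self.model.transformer(), so no artifact is rebuilt or repacked.
batch_job = Transformer(
    model_name="topic-model",  # name of the already-created model
    instance_count=instance_count,
    instance_type=instance_type,
    output_path=s3_path_join('s3://', self.bucket, 'output'),
    strategy='MultiRecord',
    assemble_with='Line',
)
```

I've also seen suggestions that repacking only happens because `entry_point` is passed (the SDK has to inject the script into a new tar.gz), so bundling the inference script inside the tar.gz under `code/` and omitting `entry_point` might avoid it as well, but I haven't verified that either.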