Hello @philschmid,
I’d like to deploy a custom classification model from S3 to a serverless endpoint through Terraform with the huggingface_sagemaker
module: Terraform Registry
Is it possible to use it for a serverless endpoint, or does it only work for real-time inference?
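For context, here is roughly how I’m calling the module for a real-time endpoint today (a minimal sketch; the bucket path and name prefix are placeholders, and I’m quoting variable names like `model_data` and `hf_task` from memory of the README, so they may be slightly off):

```hcl
module "huggingface_sagemaker" {
  source               = "philschmid/sagemaker-huggingface/aws"
  # pin to whatever module version you are actually using
  name_prefix          = "custom-classifier"            # placeholder name
  pytorch_version      = "1.9.1"
  transformers_version = "4.12.3"
  instance_type        = "ml.m5.xlarge"                 # real-time instance
  model_data           = "s3://my-bucket/model.tar.gz"  # placeholder S3 path to the custom model
  hf_task              = "text-classification"
}
```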
Thank you in advance 
Hello @YannAgora,
That’s currently not supported, but the module is openly available at: GitHub - philschmid/terraform-aws-sagemaker-huggingface
It would be nice if you could add a PR with the feature. It should be quite similar to the existing support for async inference.
Hey @philschmid 
Happy to see that the PR for serverless support has been accepted and released!
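For anyone finding this thread later, this is roughly what my config looks like now (a sketch; I’m writing `serverless_config` and its fields `max_concurrency` / `memory_size_in_mb` from memory, based on the AWS serverless inference config, so double-check the module README for the exact interface):

```hcl
module "huggingface_sagemaker_serverless" {
  source               = "philschmid/sagemaker-huggingface/aws"
  # use a module version that includes the serverless feature
  name_prefix          = "custom-classifier-serverless"  # placeholder name
  pytorch_version      = "1.9.1"
  transformers_version = "4.12.3"
  model_data           = "s3://my-bucket/model.tar.gz"   # placeholder S3 path to the custom model
  hf_task              = "text-classification"

  # Serverless inference instead of a real-time instance; field names assumed
  serverless_config = {
    max_concurrency   = 4     # concurrent invocations before throttling
    memory_size_in_mb = 4096  # allowed values are 1024–6144 MB in 1024 MB steps
  }
}
```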
Thank you for your great contribution! 