How do I deploy a transformers CLIPModel on SageMaker Serverless Inference?

I’ve read most of the tutorials provided by Hugging Face, but I can’t find any information on how to deploy to a SageMaker serverless endpoint when using `pipeline` isn’t possible.

I’ve tried this code for the “fashion-clip” model, but I can’t get it to return just a vector embedding to store for later use; the zero-shot pipeline always needs candidate labels to compare against.

classifier = pipeline(
    "zero-shot-image-classification", model="patrickjohncyh/fashion-clip"
)

So I converted it to use CLIPModel and CLIPProcessor, which works locally.

from PIL import Image
from transformers import CLIPProcessor, CLIPModel
import numpy as np

....
# Load the model and processor directly instead of going through a pipeline
model = CLIPModel.from_pretrained("patrickjohncyh/fashion-clip")
processor = CLIPProcessor.from_pretrained("patrickjohncyh/fashion-clip")

# Preprocess the image and extract the raw image embedding
image = Image.open(image_path)
inputs = processor(images=image, return_tensors="pt", padding=True)
image_vector = model.get_image_features(**inputs).squeeze().detach().numpy()
print(image_vector)
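Not a definitive answer, just a sketch of the pattern I understand SageMaker’s Hugging Face Inference Toolkit supports when `pipeline` doesn’t fit: ship a custom `code/inference.py` inside the model artifact and override `model_fn`/`input_fn`/`predict_fn`/`output_fn`. The request format below (a JSON body with a base64-encoded `"image"` field) is my own assumption, not part of the toolkit; adapt it to your client.

```python
# code/inference.py -- hypothetical custom handler for a SageMaker
# Hugging Face endpoint (serverless or real-time). The toolkit calls
# model_fn once at startup, then input_fn -> predict_fn -> output_fn
# for each request.
import base64
import io
import json


def model_fn(model_dir):
    # Heavy imports are deferred into the handlers so the module itself
    # imports without transformers installed.
    from transformers import CLIPModel, CLIPProcessor

    model = CLIPModel.from_pretrained(model_dir)
    processor = CLIPProcessor.from_pretrained(model_dir)
    return model, processor


def input_fn(request_body, content_type):
    # Assumed request shape: {"image": "<base64-encoded image bytes>"}.
    from PIL import Image

    if content_type != "application/json":
        raise ValueError(f"Unsupported content type: {content_type}")
    payload = json.loads(request_body)
    image_bytes = base64.b64decode(payload["image"])
    return Image.open(io.BytesIO(image_bytes))


def predict_fn(image, model_and_processor):
    model, processor = model_and_processor
    inputs = processor(images=image, return_tensors="pt", padding=True)
    features = model.get_image_features(**inputs)
    # Return a plain Python list so output_fn can serialize it as JSON.
    return features.squeeze().detach().numpy().tolist()


def output_fn(prediction, accept):
    # Serialize the embedding for the caller to store as-is.
    return json.dumps({"embedding": prediction})
```

If I’ve understood the docs right, you’d package this under a `code/` directory in the `model.tar.gz` alongside the model weights, then deploy with the SageMaker Python SDK’s `HuggingFaceModel` plus a `ServerlessInferenceConfig`.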

I would really appreciate any help.

I’m looking for the same thing. Did you find a way to do it? It would be great if you could share your solution.