RuntimeError during inference on Mask2Former model

Hi everyone,

I am having some trouble running inference on a Mask2Former model which I successfully trained before.

I followed this tutorial here: , replaced the MaskFormer model with the Mask2Former model, and trained it on my custom dataset. Unfortunately, when I try to run inference with the trained model, an error shows up:

RuntimeError: Input type (torch.cuda.ByteTensor) and weight type (torch.cuda.FloatTensor) should be the same

This is my code:

import random

import torch
from transformers import Mask2FormerForUniversalSegmentation, Mask2FormerImageProcessor

# Load model
# Use GPU if available
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Grab the trained model and processor
model = Mask2FormerForUniversalSegmentation.from_pretrained("model.torch").to(device)
processor = Mask2FormerImageProcessor.from_pretrained("pretrained/preprocessor_config.json")

# Use a random test image (randint is inclusive on both ends)
index = random.randint(0, len(test) - 1)
image = test[index]["image"].convert("RGB")
target_size = image.size[::-1]

# Preprocess image
inputs = processor(images=image, return_tensors="pt").to(device)

# Inference
with torch.no_grad():
    outputs = model(**inputs)

Any help would be appreciated. Thanks a lot.

It’s basically a data type mismatch. I don’t know PyTorch, but casting the inputs to Float might solve the issue.

Hi Sandy1857,

thanks for your reply. I have tried to cast the inputs to Float, but I don’t see how to do it. Casting with

inputs = inputs.float()

or

inputs = float(inputs)

doesn’t work. The error message is:

TypeError: float() argument must be a string or a number, not ‘BatchFeature’

Any suggestions?

I don’t know PyTorch, but could you try inputs.float() before moving it to CUDA? Also, is the inputs variable really of dtype ByteTensor, even when you load the images via OpenCV or PIL?
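For future readers: BatchFeature is a dict-like container, so it has no .float() method itself; you cast the tensor stored under "pixel_values" instead. A minimal sketch of that, using a blank placeholder image and a default-configured processor as stand-ins for the real files from the original post:

```python
import torch
from PIL import Image
from transformers import Mask2FormerImageProcessor

# Hypothetical stand-ins: a blank RGB image instead of test[index]["image"],
# and a default-configured processor instead of the one loaded from disk.
image = Image.new("RGB", (64, 64))
processor = Mask2FormerImageProcessor(do_resize=False)

inputs = processor(images=image, return_tensors="pt")

# BatchFeature has no .float(); cast the tensor inside it instead,
# then move everything to the target device.
inputs["pixel_values"] = inputs["pixel_values"].float()
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
inputs = inputs.to(device)
```

Note the order: cast first, then move to the device, as suggested above.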

Hi Sandy1857,

thank you very much again. I realized that I had forgotten to implement a few lines of code shown in the tutorial I linked above; now it works and no longer throws any errors. I have to apologize, I somehow just overlooked them. Thanks a lot for your help.

No issues. We’re here to help each other.
