Unable to create tensor, you should probably activate padding with 'padding=True' to have batched tensors with the same length. (PaliGemma)

What fixed it for me:

In feature_extraction_utils.py of the transformers library, at line 141, I changed

return torch.tensor(value)

to

return torch.tensor(value, dtype=torch.float16)

to fix the following error:

PaliGemma\.venv\Lib\site-packages\transformers\feature_extraction_utils.py", line 142, in as_tensor
    return torch.tensor(value)
           ^^^^^^^^^^^^^^^^^^^
RuntimeError: Could not infer dtype of numpy.float32
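
For context, here is roughly what that helper looks like with the change applied. This is a hedged sketch, not the verbatim library source, and the exact line number differs between transformers versions; look for the `as_tensor` helper inside `BatchFeature.convert_to_tensors`:

```python
import numpy as np
import torch

# Rough sketch of the as_tensor helper in transformers/feature_extraction_utils.py
# (the real code may differ slightly between versions).
def as_tensor(value):
    # Lists/tuples of numpy arrays are stacked into a single array first.
    if isinstance(value, (list, tuple)) and len(value) > 0 and isinstance(value[0], np.ndarray):
        value = np.array(value)
    # original: return torch.tensor(value)
    # workaround: force the dtype so torch does not have to infer it
    return torch.tensor(value, dtype=torch.float16)
```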

This isn't a permanent fix, but it works for the PaliGemma model card examples.
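
For reference, what I tested is essentially the PaliGemma quickstart. A minimal sketch (assuming the `google/paligemma-3b-mix-224` checkpoint and a hypothetical local image path) looks like this:

```python
import torch
from PIL import Image
from transformers import AutoProcessor, PaliGemmaForConditionalGeneration

model_id = "google/paligemma-3b-mix-224"  # assumption: the mix-224 checkpoint
processor = AutoProcessor.from_pretrained(model_id)
model = PaliGemmaForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.float16
).eval()

image = Image.open("some_image.jpg")  # hypothetical local image
prompt = "caption en"

# The RuntimeError above is raised inside this call, when the processor
# converts the numpy pixel values to torch tensors.
inputs = processor(text=prompt, images=image, return_tensors="pt")

with torch.inference_mode():
    output = model.generate(**inputs, max_new_tokens=20, do_sample=False)

input_len = inputs["input_ids"].shape[-1]
print(processor.decode(output[0][input_len:], skip_special_tokens=True))
```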
