In my code I am computing the gradient of the image given as input to a ResNet. I use AutoImageProcessor from the transformers library. Is it that the gradient will not flow through the AutoImageProcessor? For example, in the snippet below:
from transformers import AutoImageProcessor, AutoModelForImageClassification
image_processor = AutoImageProcessor.from_pretrained("microsoft/resnet-18")
resnet = AutoModelForImageClassification.from_pretrained("microsoft/resnet-18")
model_input = image_processor(image, return_tensors="pt")
outputs = resnet(**model_input, output_hidden_states=True)
When I backpropagate the loss from the model, I obtain a gradient at the processed input (via model_input["pixel_values"].grad), but image.grad raises the warning: "The .grad attribute of a Tensor that is not a leaf Tensor is being accessed. Its .grad attribute won't be populated during autograd.backward(). If you indeed want the .grad field to be populated for a non-leaf Tensor, use .retain_grad() on the non-leaf Tensor."
Is there a way to get the gradient for image in the above snippet?
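For reference, here is a minimal pure-PyTorch sketch (independent of transformers) of the leaf vs. non-leaf .grad behavior that the warning describes; the tensor names are made up for illustration:

```python
import torch

# Leaf tensor: created directly by the user with requires_grad=True.
x = torch.ones(3, requires_grad=True)

# Non-leaf tensor: the result of an operation on x. By default its
# .grad is not populated by backward(); retain_grad() opts in.
y = x * 2
y.retain_grad()

loss = y.sum()
loss.backward()

print(x.grad)  # leaf: populated automatically -> tensor([2., 2., 2.])
print(y.grad)  # non-leaf: populated only because of retain_grad() -> tensor([1., 1., 1.])
```

So in my snippet, I suspect the question is whether image is still a leaf of the graph that reaches the loss at all, since the image processor may create new tensors rather than differentiable views of the input.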