Compute log probabilities of any sequence provided

Hey there!

I’m using the allenai/unifiedqa-t5-small model to obtain the log probabilities of a given sequence (which is not necessarily the one generated by the model). In particular, I’m interested in the probability distribution over each token conditioned on the previous tokens in the sequence.

So far, I’ve been using the forward method and passing the sentence I want to obtain the logits for as the labels argument. However, I’m not 100% confident that the resulting logits are conditioned on the sentence I provided as labels (i.e., does the forward method work in a teacher-forcing fashion, and thus guarantee that the log probabilities are conditioned on the labels parameter rather than on the argmax of the previous step?).
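
For reference, here is a minimal sketch of what I’m doing; it assumes that passing labels makes the decoder run with teacher forcing (i.e., the logits at position t are conditioned on the gold label tokens up to t-1), which is exactly the behaviour I’d like to confirm:

```python
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("allenai/unifiedqa-t5-small")
model = T5ForConditionalGeneration.from_pretrained("allenai/unifiedqa-t5-small")
model.eval()

question = "which is heavier, a feather or a rock?"  # example input
target = "a rock"                                     # sequence I want scored

input_ids = tokenizer(question, return_tensors="pt").input_ids
labels = tokenizer(target, return_tensors="pt").input_ids

with torch.no_grad():
    outputs = model(input_ids=input_ids, labels=labels)

# logits: (batch, target_len, vocab_size); position t should be conditioned
# on labels[:, :t] if the forward pass uses teacher forcing
log_probs = torch.log_softmax(outputs.logits, dim=-1)

# log probability of each target token, and of the whole sequence
token_log_probs = log_probs.gather(-1, labels.unsqueeze(-1)).squeeze(-1)
sequence_log_prob = token_log_probs.sum(-1)
print(token_log_probs, sequence_log_prob)
```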

A second question is whether it is necessary to provide the pad_token_id as the first token in the labels argument.
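
One check I’ve considered (continuing from the snippet above, and assuming the internal shifting behaviour works the way I suspect): compare the logits from passing labels alone with the logits from passing explicit decoder_input_ids built by shifting the labels right and prepending pad_token_id. If they agree, the shift happens inside the model and I shouldn’t prepend anything to labels myself:

```python
with torch.no_grad():
    # labels-only call: the model builds decoder_input_ids internally
    out_labels_only = model(input_ids=input_ids, labels=labels)

    # explicit decoder inputs: labels shifted right, pad_token_id prepended
    decoder_input_ids = torch.cat(
        [torch.full((labels.shape[0], 1), model.config.pad_token_id), labels[:, :-1]],
        dim=-1,
    )
    out_explicit = model(input_ids=input_ids, decoder_input_ids=decoder_input_ids)

# if the logits agree, labels should NOT start with the pad token
print(torch.allclose(out_labels_only.logits, out_explicit.logits, atol=1e-5))
```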

Hey @PastelBelem8,

Would this PR: [Generation] Fix Transition probs by patrickvonplaten · Pull Request #17311 · huggingface/transformers · GitHub and this forum post: Generation Probabilities: How to compute probabilities of output scores for GPT2 help?
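
For the generated sequence itself, the rough idea from that forum post, adapted to a seq2seq model, looks something like this (an untested sketch; variable names are just illustrative): generate with `return_dict_in_generate=True` and `output_scores=True`, then apply `log_softmax` to the per-step scores to get per-token log probabilities:

```python
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("allenai/unifiedqa-t5-small")
model = T5ForConditionalGeneration.from_pretrained("allenai/unifiedqa-t5-small")

input_ids = tokenizer(
    "which is heavier, a feather or a rock?", return_tensors="pt"
).input_ids

gen = model.generate(
    input_ids,
    max_length=20,
    return_dict_in_generate=True,
    output_scores=True,
)

# gen.scores is a tuple with one (batch, vocab) tensor per generated step;
# gen.sequences starts with the decoder start token, so skip it when aligning
generated_tokens = gen.sequences[:, 1:]
log_probs = torch.stack(gen.scores, dim=1).log_softmax(-1)
token_log_probs = log_probs.gather(-1, generated_tokens.unsqueeze(-1)).squeeze(-1)
print(token_log_probs)
```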