Truncating sequence -- within a pipeline

Try adding something like the following:

from torch.nn import Softmax

smax = Softmax(dim=-1)

# pt_outputs.logits is the raw model output; softmax turns it into probabilities
probs0 = smax(pt_outputs.logits)
# flatten the batch dimension and convert to a plain numpy array
probs0 = probs0.flatten().detach().numpy()

# index 1 is the positive class
prob_pos = probs0[1]

Now prob_pos should be the probability that the sentence is positive.
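In case the surrounding context is the issue, here is a minimal end-to-end sketch, assuming a binary sentiment model loaded with AutoModelForSequenceClassification (the checkpoint name, the example sentence, and the pt_inputs/pt_outputs variable names are just placeholders for whatever you have in your script):

import torch
from torch.nn import Softmax
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Example checkpoint -- swap in the model you are actually using
model_name = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

text = "I really enjoyed this movie."
# truncation=True clips sequences longer than the model's max length
pt_inputs = tokenizer(text, truncation=True, return_tensors="pt")

with torch.no_grad():
    pt_outputs = model(**pt_inputs)

smax = Softmax(dim=-1)
probs0 = smax(pt_outputs.logits).flatten().numpy()
prob_pos = probs0[1]  # probability of the positive class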

Does that work/make sense?