Hi, I am training the model with the Trainer API, and in the forward method the return value is a SequenceClassifierOutput:
def forward(self, **kwargs):
    ...
    return SequenceClassifierOutput(
        loss=loss,
        logits=logits,
        hidden_states=outputs.hidden_states,
        attentions=outputs.attentions,
    )
Training completes without problems. But when I run inference, what I get is something like this:
model = BertPrefixForSequenceClassification.from_pretrained(model_path)
model(**inputs)
The result is:
SequenceClassifierOutput(loss=None, logits=tensor([[-3.1394, -3.2234, -2.6458, -2.6055, -2.0099, -3.1522, -2.5696, -2.3495,
-1.8810, -2.7378, -3.1791, -2.0395, -2.7319, -3.0667, -3.2499, -2.8271,
-2.0556, -2.5394, -2.6604, -2.5847, -3.4659, -2.2318]],
grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)
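For reference, the per-class sigmoid values can be computed from the logits tensor alone; here is a minimal sketch with a hand-made tensor standing in for the `logits` attribute of the output above:

```python
import torch

# Stand-in for the logits tensor inside the SequenceClassifierOutput
logits = torch.tensor([[-3.1394, -3.2234, -2.6458]])

# Element-wise sigmoid: maps each logit to a probability in (0, 1)
probs = torch.sigmoid(logits)
print(probs)
```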
I am confused about how to get ONLY the sigmoid values at inference time, rather than the whole SequenceClassifierOutput.
It seems quite strange if I do this:
class NeuralNetwork(nn.Module):
    def __init__(self):
        super(NeuralNetwork, self).__init__()
        self.model = BertPrefixForSequenceClassification.from_pretrained(model_path)

    def forward(self, input_ids, token_type_ids, attention_mask):
        x = self.model(input_ids=input_ids, token_type_ids=token_type_ids, attention_mask=attention_mask)
        # x is a SequenceClassifierOutput, so the sigmoid has to be applied to x.logits
        logits = torch.sigmoid(x.logits)
        return logits
and when I tried that, something went wrong in torch.jit.save as well.
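For what it's worth, torch.jit generally expects the traced forward to return plain tensors rather than a ModelOutput. A minimal sketch of the trace-and-save pattern, with a toy linear classifier standing in for BertPrefixForSequenceClassification (shapes are made up for illustration):

```python
import torch
from torch import nn

class Wrapper(nn.Module):
    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, x):
        # With the real model you would take .logits from the ModelOutput
        # here; the toy model below already returns a plain tensor, which
        # is the kind of output TorchScript can handle.
        logits = self.model(x)
        return torch.sigmoid(logits)

# Toy stand-in for the real classifier
toy = nn.Linear(8, 3)
wrapped = Wrapper(toy)

example = torch.randn(1, 8)
traced = torch.jit.trace(wrapped, example)
torch.jit.save(traced, "wrapped.pt")
```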