BertForSequenceClassification fine-tuning: training loss and accuracy look wrong

Model I am using: BERT (BertTokenizer, BertForSequenceClassification)

The problem arises when using:
the Trainer to fine-tune on the training set: the training loss shown in the log is either "No log" or stuck around 0.696694, and the accuracy is always 0.X00000

The task I am working on is:
An official GLUE task: sst2, loaded with the Hugging Face datasets package

The details:
For the Trainer setup, I followed examples/text_classification.ipynb to build the compute_metrics function and the tokenize mapping function, but the training loss and accuracy still look wrong.
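For reference, the tokenize mapping function and compute_metrics follow the notebook's pattern, roughly like this (the checkpoint name and padding options here are simplified, not necessarily the exact values in my script):

```python
import numpy as np
from datasets import load_dataset, load_metric
from transformers import BertTokenizer

# Load the GLUE SST-2 dataset and its metric (accuracy for sst2)
raw_datasets = load_dataset("glue", "sst2")
metric = load_metric("glue", "sst2")

# Checkpoint name is just an example of a BERT tokenizer/model pair
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

def tokenize_function(examples):
    # SST-2 has a single text column called "sentence"
    return tokenizer(examples["sentence"], truncation=True, padding="max_length")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return metric.compute(predictions=predictions, references=labels)

# In my script the splits go through a few more preprocessing steps before
# they become encoded_training_no_mask / encoded_val_set; this is only the
# tokenization part.
encoded_training_no_mask = raw_datasets["train"].map(tokenize_function, batched=True)
encoded_val_set = raw_datasets["validation"].map(tokenize_function, batched=True)
```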

My trainer and training arguments:
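In outline (the output directory and hyperparameters below are representative placeholders, not necessarily the exact values I used):

```python
from transformers import BertForSequenceClassification, Trainer, TrainingArguments

# Two labels for SST-2 (negative/positive); checkpoint name is an example
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

training_args = TrainingArguments(
    output_dir="./results",            # placeholder path
    evaluation_strategy="epoch",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=3,
    logging_steps=100,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=encoded_training_no_mask,
    eval_dataset=encoded_val_set,
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,
)

trainer.train()
```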

encoded_training_no_mask and encoded_val_set are datasets in the following format:
Dataset({
    features: ['attention_mask', 'idx', 'input_ids', 'label', 'labels', 'sentence', 'token_type_ids'],
    num_rows: XXX
})

The content of the 'labels' column is identical to the 'label' column.
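For completeness, the duplicate column was created with a map call along these lines (the exact line in my script may differ slightly), and a quick check confirms the two columns match:

```python
# Copy 'label' into 'labels' so the Trainer finds the column name it expects
encoded_training_no_mask = encoded_training_no_mask.map(
    lambda example: {"labels": example["label"]}
)

# Sanity check: both columns hold identical values
assert encoded_training_no_mask["label"] == encoded_training_no_mask["labels"]
```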