Replacing the last layer of a fine-tuned model to use a different set of labels

I’m trying to fine-tune dslim/bert-base-NER (any better models?) using the wnut_17 dataset.
Since the number of NER labels differs (9 in the dslim/bert-base-NER checkpoint vs. 13 in wnut_17), I manually patched these attributes on the model to get rid of the size-mismatch error:

```python
model.config.id2label = my_id2label
model.config.label2id = my_label2id
model.config._num_labels = len(my_id2label)  # replacing 9 with 13
```
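
For completeness, my label mappings and the model are set up roughly like this before the patching above (a minimal sketch; the variable names match the snippet, the rest is the standard datasets/transformers loading code):

```python
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForTokenClassification

wnut = load_dataset("wnut_17")
label_list = wnut["train"].features["ner_tags"].feature.names  # the 13 wnut_17 labels
my_id2label = {i: label for i, label in enumerate(label_list)}
my_label2id = {label: i for i, label in enumerate(label_list)}

tokenizer = AutoTokenizer.from_pretrained("dslim/bert-base-NER")
# The checkpoint still comes with its original 9-way classification head
model = AutoModelForTokenClassification.from_pretrained("dslim/bert-base-NER")
```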

However, I now get the following error, which I don’t know how to handle:
Expected input batch_size (1456) to match target batch_size (1008).

Has anyone handled this manually, e.g. by swapping out the classification head along the lines of the sketch below?
@sgugger It would be great if we could have a solid function that handles head replacement for fine-tuning.
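
What I have in mind by “manually” is something like the following, continuing from the objects above (an untested sketch; I’m not sure whether swapping the classifier layer and these attributes is enough):

```python
import torch.nn as nn

# Replace the 9-way classifier with a freshly initialized 13-way linear head
model.classifier = nn.Linear(model.config.hidden_size, len(my_id2label))
model.num_labels = len(my_id2label)  # BertForTokenClassification uses this when reshaping logits for the loss
model.config.num_labels = len(my_id2label)
model.config.id2label = my_id2label
model.config.label2id = my_label2id
```

Or is passing num_labels=13 together with ignore_mismatched_sizes=True to from_pretrained the intended way to get a reinitialized head?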

tokenized_wnut['train'].shape = (3394, 7)
tokenized_wnut['validation'].shape = (1009, 7)
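
The preprocessing that produces these shapes follows the usual token-classification recipe of tokenizing with is_split_into_words=True and aligning the word-level ner_tags to sub-word tokens (a sketch of what I’m doing, in case the batch-size mismatch actually comes from this step):

```python
# wnut and tokenizer are the objects created further above
def tokenize_and_align_labels(examples):
    tokenized = tokenizer(
        examples["tokens"], truncation=True, is_split_into_words=True
    )
    all_labels = []
    for i, word_labels in enumerate(examples["ner_tags"]):
        word_ids = tokenized.word_ids(batch_index=i)
        previous_word_id = None
        labels = []
        for word_id in word_ids:
            if word_id is None:
                labels.append(-100)  # special tokens are ignored by the loss
            elif word_id != previous_word_id:
                labels.append(word_labels[word_id])
            else:
                labels.append(-100)  # only label the first sub-token of each word
            previous_word_id = word_id
        all_labels.append(labels)
    tokenized["labels"] = all_labels
    return tokenized

tokenized_wnut = wnut.map(tokenize_and_align_labels, batched=True)
```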