How to fine-tune BERT model for NER if forward method doesn't have "labels" argument

I want to fine-tune a basic German BERT model on the task of NER. But when I load it like this
model = AutoModel.from_pretrained("dbmdz/bert-base-german-cased", num_labels=len(unique_tags))
and then give it the training data like this
outputs = model(b_input_ids, token_type_ids=None, attention_mask=b_input_mask, labels=b_labels)
I get this error:
TypeError: forward() got an unexpected keyword argument 'labels'

When I check which class AutoModel actually instantiates, it is the bare BertModel (just the encoder, without a task-specific head), which makes sense. So the forward method of this class doesn’t take labels. But how can I then fine-tune it for this specific task?
I don’t want to use a model from the Hub that is already fine-tuned for token classification or NER, because I have my own set of labels that I want to use for specific tokens. Therefore something like this isn’t an option, I guess.
Thank you very much!

Instead of AutoModel you’ll probably just need AutoModelForTokenClassification. You can also have a look at the existing NER examples: transformers/examples/pytorch/token-classification at master · huggingface/transformers · GitHub
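
For example, a minimal sketch (assuming `unique_tags` is your list of custom label names and that `b_input_ids`, `b_input_mask`, `b_labels` come from your own training loop, as in your snippet):

```python
from transformers import AutoModelForTokenClassification

# Map your own tag names to ids (assumption: unique_tags is your list of custom labels)
label2id = {tag: i for i, tag in enumerate(unique_tags)}
id2label = {i: tag for tag, i in label2id.items()}

model = AutoModelForTokenClassification.from_pretrained(
    "dbmdz/bert-base-german-cased",
    num_labels=len(unique_tags),
    id2label=id2label,
    label2id=label2id,
)

# labels is now a valid keyword argument; the model returns the token-classification loss
outputs = model(
    b_input_ids,
    attention_mask=b_input_mask,
    labels=b_labels,  # shape (batch_size, seq_len); use -100 for positions to ignore (e.g. sub-word pieces, padding)
)
loss = outputs.loss
loss.backward()
```

This class puts a fresh token-classification head on top of the pretrained encoder, so your custom labels are fine; only the head is initialized randomly and gets learned during fine-tuning.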

Ah yes, that works, thank you! Haven’t seen this model loader yet. :sweat_smile: