Unexpected input type after export

Hello,
I exported this model: dbmdz/bert-large-cased-finetuned-conll03-english
to ONNX using this code:

import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

# Load model and tokenizer
model_id = "dbmdz/bert-large-cased-finetuned-conll03-english"
model = AutoModelForTokenClassification.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)
dummy_model_input = tokenizer("JJ and his friends are going to the Park later.", return_tensors="pt")

# Export to ONNX
torch.onnx.export(
    model,
    tuple(dummy_model_input.values()),
    f="dbmdz/bert_large_cased_finetuned_conll03_english_onnx/model.onnx",
    input_names=['input_ids', 'attention_mask'],
    output_names=['logits'],
    dynamic_axes={'input_ids': {0: 'batch_size', 1: 'sequence'},
                  'attention_mask': {0: 'batch_size', 1: 'sequence'},
                  'logits': {0: 'batch_size', 1: 'sequence'}},
    do_constant_folding=True,
    opset_version=13,
)
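
For context, printing the tokenizer output shows which tensors get passed positionally into the export via tuple(dummy_model_input.values()) (a quick check I ran separately from the script above):

print(dummy_model_input.keys())
# dict_keys(['input_ids', 'token_type_ids', 'attention_mask'])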

After export, my .onnx model has 'input_ids', 'attention_mask', and 'input.3' inputs, and I am unsure how or why I got the 'input.3' input.
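
For reference, this is how I listed the exported model's inputs (a minimal sketch using the onnx package; the file path is the one assumed in the export call above):

import onnx

# Load the exported graph and print the name of each declared input.
onnx_model = onnx.load("dbmdz/bert_large_cased_finetuned_conll03_english_onnx/model.onnx")
for inp in onnx_model.graph.input:
    print(inp.name)
# input_ids
# attention_mask
# input.3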