I've been following the optimum README example, trying to get optimum to work with a Longformer model.
However, I hit this error message when loading the model into an NER pipeline:
ValueError: Model requires 3 inputs. Input Feed contains 2
The error is raised from onnxruntime/onnxruntime_inference_collection.py (microsoft/onnxruntime on GitHub).
I believe there's one required input (input_ids) and two optional ones (attention_mask and global_attention_mask). The tokenizer outputs input_ids and attention_mask, but not global_attention_mask.
How do I get around this? Is there a way to make ONNX Runtime treat the input as optional, or to remove this expected input from the exported model?
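One workaround I'm considering is building the missing input myself before calling the session, since Longformer conventionally puts global attention on the first ([CLS]) token. A minimal sketch (the helper name and the example token IDs are mine, not from optimum):

```python
import numpy as np

def add_global_attention_mask(feed):
    """Add a global_attention_mask shaped like input_ids, with global
    attention on the first token (Longformer's usual [CLS] convention)."""
    mask = np.zeros_like(feed["input_ids"])
    mask[:, 0] = 1
    return {**feed, "global_attention_mask": mask}

# Illustrative tokenizer output (dummy token IDs)
feed = {
    "input_ids": np.array([[0, 713, 16, 10, 1296, 2]], dtype=np.int64),
    "attention_mask": np.array([[1, 1, 1, 1, 1, 1]], dtype=np.int64),
}
feed = add_global_attention_mask(feed)
# session.run(None, feed)  # session: onnxruntime.InferenceSession
```

That satisfies the "3 inputs" check, but it only works when I call the session directly; I don't see how to inject the extra input into the NER pipeline.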