Longformer Optimum ONNX bug: "ValueError: Model requires 3 inputs. Input Feed contains 2"

I've been following the optimum README example, trying to get optimum to work with a LongFormer model.

However, I hit this error message when loading the model into an NER pipeline:

ValueError: Model requires 3 inputs. Input Feed contains 2

The error is raised from onnxruntime/onnxruntime_inference_collection.py in the microsoft/onnxruntime repository on GitHub.

I believe there's one required input (input_ids) and two optional ones (attention_mask and global_attention_mask).

The tokenizer outputs input_ids and attention_mask, but not global_attention_mask.
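For reference, here is a quick check that shows the mismatch. This is only a sketch: the model path (`model.onnx`) and the checkpoint name are placeholders for whatever you exported and tokenized with.

```python
import onnxruntime as ort
from transformers import AutoTokenizer

# Placeholders: point these at your exported ONNX file and your checkpoint.
session = ort.InferenceSession("model.onnx")
tokenizer = AutoTokenizer.from_pretrained("allenai/longformer-base-4096")

# Inputs the exported graph expects.
print([inp.name for inp in session.get_inputs()])
# e.g. ['input_ids', 'attention_mask', 'global_attention_mask']

# Keys the tokenizer actually produces.
encoding = tokenizer("Hugging Face is based in New York City.", return_tensors="np")
print(list(encoding.keys()))
# ['input_ids', 'attention_mask']  <- no global_attention_mask
```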

How do I get around this? Is there a way for ONNX to recognize optional inputs, or to remove this expected input?

Hi @benhamner! Unfortunately, LongFormer is currently not supported in the ORTModel class because of the global_attention_mask input, hence the error you got with an NER pipeline.
We are thinking about how to properly manage custom inputs. For LongFormer, you can follow the progress here.
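In the meantime, one possible workaround is to skip the pipeline and call the exported model through onnxruntime directly, building the global_attention_mask by hand. This is a minimal sketch, not an Optimum API: the input names, model path, checkpoint, and the choice of putting global attention only on the first token are assumptions you may need to adjust for your export.

```python
import numpy as np
import onnxruntime as ort
from transformers import AutoTokenizer

# Placeholders: substitute your exported model path and checkpoint.
session = ort.InferenceSession("model.onnx")
tokenizer = AutoTokenizer.from_pretrained("allenai/longformer-base-4096")

text = "Hugging Face is based in New York City."
encoding = tokenizer(text, return_tensors="np")

# Longformer needs a global_attention_mask; a common choice is to give
# global attention to the first (<s>/CLS) token only.
global_attention_mask = np.zeros_like(encoding["input_ids"])
global_attention_mask[:, 0] = 1

# Feed all three inputs the graph expects (names assumed from the export).
outputs = session.run(
    None,
    {
        "input_ids": encoding["input_ids"],
        "attention_mask": encoding["attention_mask"],
        "global_attention_mask": global_attention_mask,
    },
)
logits = outputs[0]  # for a token-classification export, per-token logits
print(logits.shape)
```

You would then have to map the logits back to entity labels yourself, since this bypasses the post-processing the NER pipeline normally does.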