Longformer Optimum ONNX bug: "ValueError: Model requires 3 inputs. Input Feed contains 2"

I’ve been following the optimum README example, trying to get optimum to work with a Longformer model.
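
Roughly what I’m running (the model id below is a placeholder for the Longformer NER checkpoint I’m actually using, and the export flag may differ by optimum version):

```python
from optimum.onnxruntime import ORTModelForTokenClassification
from transformers import AutoTokenizer, pipeline

# Placeholder id -- substitute the Longformer NER checkpoint you actually use.
model_id = "allenai/longformer-base-4096"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Export the PyTorch checkpoint to ONNX on the fly
# (older optimum versions use from_transformers=True instead of export=True).
model = ORTModelForTokenClassification.from_pretrained(model_id, export=True)

ner = pipeline("token-classification", model=model, tokenizer=tokenizer)
ner("My name is Wolfgang and I live in Berlin.")
```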

However, I hit this error message when loading the model into an NER pipeline:

```
ValueError: Model requires 3 inputs. Input Feed contains 2
```

Error from https://github.com/microsoft/onnxruntime/blob/main/onnxruntime/python/onnxruntime_inference_collection.py#L196

I believe there should be one required input (input_ids) and two optional ones (attention_mask and global_attention_mask), but the exported ONNX model appears to treat all three as required.

The tokenizer outputs input_ids and attention_mask, but not global_attention_mask.
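
For reference, the mismatch is visible if I inspect the exported model and the tokenizer output directly (the paths below are placeholders for my export):

```python
import onnxruntime
from transformers import AutoTokenizer

# Placeholder path -- wherever the exported ONNX graph was written.
session = onnxruntime.InferenceSession(
    "onnx/model.onnx", providers=["CPUExecutionProvider"]
)
print([inp.name for inp in session.get_inputs()])
# -> ['input_ids', 'attention_mask', 'global_attention_mask']

tokenizer = AutoTokenizer.from_pretrained("allenai/longformer-base-4096")
print(tokenizer("some text").keys())
# -> dict_keys(['input_ids', 'attention_mask'])
```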

How do I get around this? Is there a way to make ONNX Runtime treat global_attention_mask as optional, or to remove this expected input from the exported graph?

Hi @benhamner! Unfortunately, Longformer is not currently supported in the ORTModel class because of its extra global_attention_mask input, which is why you get this error in the NER pipeline.
We are thinking about how to properly handle such custom inputs. For Longformer, you can follow the progress here.
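
In the meantime, a possible workaround outside of ORTModel is to run the exported graph with onnxruntime directly and build the global_attention_mask yourself. A rough sketch, where the paths, the input names, and the choice of global attention on the first token only are assumptions to adapt to your export:

```python
import numpy as np
import onnxruntime
from transformers import AutoTokenizer

# Placeholder path/id -- adjust to your exported model and checkpoint.
tokenizer = AutoTokenizer.from_pretrained("allenai/longformer-base-4096")
session = onnxruntime.InferenceSession(
    "onnx/model.onnx", providers=["CPUExecutionProvider"]
)

enc = tokenizer("My name is Wolfgang and I live in Berlin.", return_tensors="np")

# Longformer expects a global_attention_mask with the same shape as attention_mask:
# 1 marks tokens that attend globally (here only the first/<s> token), 0 elsewhere.
global_attention_mask = np.zeros_like(enc["attention_mask"])
global_attention_mask[:, 0] = 1

outputs = session.run(
    None,
    {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],
        "global_attention_mask": global_attention_mask,
    },
)
logits = outputs[0]
```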