Support ONNX opset 9 for T5 & GPT-NeoX

Dear team,

T5 and GPT-NeoX models offer "small" LMs, i.e., models with <1B parameters. However, it is currently not possible to export them with opset=9.
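For reference, this is roughly the kind of command that fails for me (the model name and output directory here are just illustrative examples; the key part is `--opset 9`):

```shell
# Attempt to export a GPT-NeoX-architecture model to ONNX with opset 9.
# With current optimum this raises an error because the exporter requires
# a higher minimum opset for these architectures.
optimum-cli export onnx \
  --model EleutherAI/pythia-70m \
  --opset 9 \
  onnx_output/
```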

Kindly guide me on how to address the issue filed on GitHub in case you decide that it is out of scope for optimum 🙂

Linking to the corresponding GitHub issue that contains more information: LLM or any model cannot be exported to onnx with opset 9 · Issue #1092 · huggingface/optimum · GitHub