I have been trying to export the google-t5/t5-small model to ONNX format using optimum, and I have tried two approaches.
First, I used the ORTModelForSeq2SeqLM class from optimum and then called its save_pretrained method (roughly the snippet shown after this list). This seemed to work swimmingly, but it only outputs:
- encoder_model.onnx
- decoder_model.onnx
- decoder_with_past_model.onnx
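For reference, the export code was something like this (give or take the exact arguments; `export=True` is what triggers the ONNX conversion):

```python
from optimum.onnxruntime import ORTModelForSeq2SeqLM

# export=True converts the PyTorch checkpoint to ONNX while loading
model = ORTModelForSeq2SeqLM.from_pretrained("google-t5/t5-small", export=True)

# Writes the ONNX files (encoder/decoder/decoder_with_past) to this directory
model.save_pretrained("onnx_model")
```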
But I need decoder_model_merged.onnx. I then tried the optimum-cli:
```
optimum-cli export onnx --model google-t5/t5-small onnx_model
```
This throws the error:
```
Exception: An error occured during validation, but the model was saved nonetheless at onnx_model. Detailed error: [ONNXRuntimeError] : 1 : FAIL : Load model from onnx_model/decoder_model_merged.onnx failed:/Users/runner/work/1/s/onnxruntime/core/graph/model.cc:180 onnxruntime::Model::Model(ModelProto &&, const PathString &, const IOnnxRuntimeOpSchemaRegistryList *, const logging::Logger &, const ModelOptions &) Unsupported model IR version: 10, max supported IR version: 9
```
My package versions are:
```
optimum==1.18.0
onnxruntime==1.17.3
onnx==1.16.0
transformers==4.39.3
```
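From the error, my guess is that onnx 1.16.0 serializes models with IR version 10, while onnxruntime 1.17.3 can only load up to IR version 9. I could probably patch the saved file by forcing the IR version down, along the lines of the sketch below, but that feels fragile and I am not sure it is actually safe:

```python
import onnx

# Rough workaround sketch (untested): rewrite the IR version so an older
# onnxruntime build will load the file. Assumes the graph's opsets are
# otherwise supported by onnxruntime 1.17.
model = onnx.load("onnx_model/decoder_model_merged.onnx")
model.ir_version = 9
onnx.save(model, "onnx_model/decoder_model_merged.onnx")
```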
Anyone have any ideas?