Error exporting T5 model to ONNX with optimum-cli

I am trying to export the google-t5/t5-small model to ONNX format using optimum. I have tried two approaches.

The first was to use the ORTModelForSeq2SeqLM class from optimum and then call its save_pretrained method (see the sketch after the list below). This seemed to work swimmingly, but it only outputs:

  1. encoder_model.onnx
  2. decoder_model.onnx
  3. decoder_with_past_model.onnx
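For reference, this is roughly what the first approach looked like (a minimal sketch; the export=True flag and the output directory name are illustrative rather than my exact invocation):

    from optimum.onnxruntime import ORTModelForSeq2SeqLM

    # Export the PyTorch checkpoint to ONNX on the fly (export=True),
    # then write the resulting ONNX files to the output directory.
    model = ORTModelForSeq2SeqLM.from_pretrained("google-t5/t5-small", export=True)
    model.save_pretrained("onnx_model")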

And I need the decoder_model_merged.onnx. I then tried the optimum-cli:

optimum-cli export onnx --model google-t5/t5-small onnx_model

This throws the error:

Exception: An error occured during validation, but the model was saved nonetheless at onnx_model. Detailed error: [ONNXRuntimeError] : 1 : FAIL : Load model from onnx_model/decoder_model_merged.onnx failed:/Users/runner/work/1/s/onnxruntime/core/graph/model.cc:180 onnxruntime::Model::Model(ModelProto &&, const PathString &, const IOnnxRuntimeOpSchemaRegistryList *, const logging::Logger &, const ModelOptions &) Unsupported model IR version: 10, max supported IR version: 9

My package versions are:

optimum==1.18.0
onnxruntime==1.17.3
onnx==1.16.0
transformers==4.39.3

If I am reading the error right, the exported decoder_model_merged.onnx was written with ONNX IR version 10, but onnxruntime 1.17.3 can only load up to IR version 9. Anyone have any ideas?

@JamesXanda Can you update Optimum with pip install -U optimum and try again please?

optimum-cli export onnx --model google-t5/t5-small onnx_model

runs well on my laptop with the latest stable release of Optimum.
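If updating is not an option, one possible stopgap (a minimal sketch, untested on this exact export) is to patch the saved file's declared IR version down so that onnxruntime 1.17.x will load it. This only rewrites the header field, so it assumes the graph uses no features actually introduced in IR version 10:

    import onnx

    # Load the merged decoder that failed validation, lower its declared
    # IR version from 10 to 9, and write it back in place.
    model = onnx.load("onnx_model/decoder_model_merged.onnx")
    model.ir_version = 9
    onnx.save(model, "onnx_model/decoder_model_merged.onnx")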

This worked, thanks.
