Error while converting hf model to onnx

Hi everyone,

I’m converting a finetuned bert model from huggingface to onnx (following this post).

Transformers version: 4.10.2
When I run this on a terminal

python -m transformers.onnx --model=/path/to/checkpoint output=/tmp

I get this error:

 line 143, in validate_model_outputs
    from onnxruntime import InferenceSession, SessionOptions
ModuleNotFoundError: No module named 'onnxruntime'
I tried installing onnxruntime-tools, but I still get the same error.
Any ideas?

The package to install was onnxruntime

Please refer to the ONNX Runtime website for more information on installation. The package you’re looking for is onnxruntime, and it can be installed with (for the default CPU installation):

pip install onnxruntime

If you want ONNX Runtime with CUDA hardware acceleration instead:

pip install onnxruntime-gpu
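After installing, you can verify that Python can actually resolve the module before re-running the export. A minimal sketch using only the standard library (`importlib.util.find_spec` returns `None` when a package is missing, which is exactly the condition behind the `ModuleNotFoundError` above):

```python
import importlib.util

# find_spec returns None if the module cannot be imported from this environment
spec = importlib.util.find_spec("onnxruntime")
if spec is None:
    print("onnxruntime is not installed in this environment; run: pip install onnxruntime")
else:
    print("onnxruntime found at:", spec.origin)
```

This also helps catch the common case where `pip` installed the package into a different environment (e.g. a different virtualenv or conda env) than the one running `python -m transformers.onnx`.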