LayoutLMv3 ONNX Conversion

Apologies if I am missing something very obvious, but I am just getting started with ONNX and don't yet understand many of the errors and problems that come up when converting fine-tuned local models to ONNX format.

I am currently trying to export a fine-tuned layoutlmv3-base model, but I encounter an error similar to the one mentioned in this post: #23166
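
For context, the export is invoked with the legacy transformers.onnx CLI, roughly as below (the model path, feature, and output directory are placeholders for my local setup, not the exact command):

    python -m transformers.onnx --model=path/to/finetuned-layoutlmv3 --feature=token-classification onnx_output/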

x----------------------------------Configuration Details----------------------------------x
transformers = 4.28.1
Python = 3.8.10
torch = 1.12.0
onnx = 1.13.1
CPU Only
x----------------------------------Error Trace----------------------------------x
Local PyTorch model found.
Framework not requested. Using torch to export to ONNX.
Using framework PyTorch: 1.12.0+cu102
Traceback (most recent call last):
  File "/usr/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/usr/local/lib/python3.8/dist-packages/transformers/onnx/__main__.py", line 240, in <module>
    main()
  File "/usr/local/lib/python3.8/dist-packages/transformers/onnx/__main__.py", line 232, in main
    export_with_transformers(args)
  File "/usr/local/lib/python3.8/dist-packages/transformers/onnx/__main__.py", line 165, in export_with_transformers
    onnx_inputs, onnx_outputs = export(
  File "/usr/local/lib/python3.8/dist-packages/transformers/onnx/convert.py", line 346, in export
    return export_pytorch(preprocessor, model, config, opset, output, tokenizer=tokenizer, device=device)
  File "/usr/local/lib/python3.8/dist-packages/transformers/onnx/convert.py", line 143, in export_pytorch
    model_inputs = config.generate_dummy_inputs(preprocessor, framework=TensorType.PYTORCH)
  File "/usr/local/lib/python3.8/dist-packages/transformers/models/layoutlmv3/configuration_layoutlmv3.py", line 263, in generate_dummy_inputs
    setattr(processor.feature_extractor, "apply_ocr", False)
AttributeError: 'LayoutLMv3TokenizerFast' object has no attribute 'feature_extractor'

Am I missing something very obvious?

The traceback shows the cause: generate_dummy_inputs received a LayoutLMv3TokenizerFast, while the LayoutLMv3 ONNX config expects a full LayoutLMv3Processor, which is what exposes the feature_extractor attribute it tries to configure. The issue is solved by using the recommended optimum.exporters.onnx package instead; details on using this library for exporting can be found at the following Optimum Link
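
For completeness, a minimal sketch of the Optimum route, assuming a token-classification fine-tune and placeholder paths (adjust the task and directories to match your model):

    optimum-cli export onnx --model path/to/finetuned-layoutlmv3 --task token-classification onnx_output/

The same export can be driven from Python via optimum.exporters.onnx.main_export, again with placeholder paths:

    from optimum.exporters.onnx import main_export

    # Export the local fine-tuned checkpoint to ONNX.
    # Paths and task below are placeholders; adjust them to your checkpoint.
    main_export(
        "path/to/finetuned-layoutlmv3",   # local fine-tuned checkpoint directory
        output="onnx_output",             # directory that will receive the exported model.onnx
        task="token-classification",      # pick the task that matches your fine-tune
    )

Either way, Optimum builds the dummy inputs for LayoutLMv3 itself, so the tokenizer-versus-processor mismatch from the traceback above should not come up.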