Export a BetterTransformer to ONNX


I was wondering whether it’s possible to export a BetterTransformer model to ONNX in order to benefit from the optimizations of both. I gave it a shot and ran into the error below:

UnsupportedOperatorError: Exporting the operator 'aten::_nested_tensor_from_mask' to ONNX opset version 13 is not supported. Please feel free to request support or submit a pull request on PyTorch GitHub: https://github.com/pytorch/pytorch/issues

There don’t seem to be many discussions about this online, so I wanted to post a question here. Thanks!
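For anyone who wants to reproduce this without a full Hugging Face model: here is a minimal pure-PyTorch sketch using `torch.nn.TransformerEncoder`, whose nested-tensor fastpath is the mechanism BetterTransformer builds on. The dimensions, opset version, and the `WithMask` wrapper are illustrative choices of mine, not from the original post; whether the export raises `UnsupportedOperatorError` or fails differently depends on the PyTorch version, so the sketch just reports what happened.

```python
import io

import torch
import torch.nn as nn

# Toy encoder standing in for what BetterTransformer lowers a model to.
# All dimensions here are illustrative.
layer = nn.TransformerEncoderLayer(d_model=32, nhead=4, batch_first=True)
encoder = nn.TransformerEncoder(layer, num_layers=2, enable_nested_tensor=True)
encoder.eval()

src = torch.randn(2, 10, 32)                 # (batch, seq, d_model)
pad_mask = torch.zeros(2, 10, dtype=torch.bool)
pad_mask[0, 7:] = True                       # mark the tail of sample 0 as padding

with torch.no_grad():
    out = encoder(src, src_key_padding_mask=pad_mask)
print(out.shape)  # torch.Size([2, 10, 32]) -- eager execution works fine


class WithMask(nn.Module):
    """Small wrapper so the padding mask is a positional export input."""

    def __init__(self, enc):
        super().__init__()
        self.enc = enc

    def forward(self, x, mask):
        return self.enc(x, src_key_padding_mask=mask)


# Exporting the masked fastpath is where aten::_nested_tensor_from_mask
# can appear in the traced graph; the outcome is version-dependent.
try:
    torch.onnx.export(WithMask(encoder), (src, pad_mask), io.BytesIO(),
                      opset_version=13)
    print("export succeeded")
except Exception as exc:
    print(f"export failed: {type(exc).__name__}")
```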


Hi @dfangish, here is the list of ONNX-supported ATen operators: ONNX supported TorchScript operators — PyTorch 2.0 documentation

Searching that page for _nested_tensor_from_mask shows that it is not supported yet (the same goes for _transformer_encoder_layer_fwd, which would also be needed).

They provide a small guide if you would like to add support for these operators: torch.onnx — PyTorch 2.0 documentation
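For reference, the registration mechanism described in that guide looks roughly like the following. To be clear, this is only a sketch of the API, not a working lowering: ONNX has no nested-tensor type to map the op onto, and the argument list of the symbolic function is my guess at the op's schema, so the pass-through body is purely hypothetical.

```python
import torch
from torch.onnx import register_custom_op_symbolic


# Hypothetical symbolic: the signature (t, mask, mask_check) is an assumed
# schema for aten::_nested_tensor_from_mask, and simply returning the dense
# input drops the mask semantics -- this only demonstrates the mechanism.
def nested_tensor_from_mask(g, t, mask, mask_check):
    return t


# Tell the exporter to use our symbolic when it meets the op at opset 13.
register_custom_op_symbolic(
    "aten::_nested_tensor_from_mask", nested_tensor_from_mask, 13
)
print("symbolic registered")
```

Registering the symbolic is the easy part; the real difficulty is that the fastpath ops have no faithful ONNX representation, which is why this route does not lead anywhere for BetterTransformer.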

Hi @dfangish, I am facing the same issue. Did you manage to solve it (e.g., by writing a custom op)?

Hi @dfangish @EmreOzkose, this is unfortunately not possible. BetterTransformer relies on PyTorch optimizations implemented as custom CUDA kernels, which cannot be exported to ONNX.
