Exporting GPT-J model to ONNX is not supported

Hi, I am trying to export GPT-J to ONNX using the following command.

python -m transformers.onnx --model=EleutherAI/gpt-j-6B onnx/

But I am getting an error: ValueError(f"Unsupported model type: {config.model_type}")

But as per this documentation, Export 🤗 Transformers Models, the export method supports the GPT-J model too. Any idea what is going wrong would be very helpful.
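In case it helps, this is a minimal sketch of how I tried to check whether my installed transformers version lists "gpt-j" among the exportable model types (assuming the FeaturesManager helper from transformers.onnx is available in that version; the exact API may differ across releases):

```python
# Sketch: check the installed transformers version and whether "gpt-j"
# is listed as a supported model type for the ONNX export.
import transformers
from transformers.onnx import FeaturesManager

print("transformers version:", transformers.__version__)

try:
    # Returns the ONNX features (tasks) registered for the given model type,
    # or raises if the model type is not supported by this version.
    features = FeaturesManager.get_supported_features_for_model_type("gpt-j")
    print("Supported ONNX features for gpt-j:", list(features.keys()))
except KeyError:
    print("gpt-j is not a supported model type in this transformers version")
```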

Thanks :)