Exporting the GPT-J model to ONNX is not supported

Hi, I am trying to export GPT-J to ONNX using the following command:

python -m transformers.onnx --model=EleutherAI/gpt-j-6B onnx/

But I am getting the following error: ValueError(f"Unsupported model type: {config.model_type}")

But as per this documentation, Export 🤗 Transformers Models,

the export method supports the GPT-J model too. Any idea of what is going wrong would be very helpful.
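
For reference, a minimal sketch that checks whether the installed transformers version registers gptj as a supported model type at all (this assumes transformers >= 4.12, where the transformers.onnx package and its FeaturesManager exist):

import transformers
from transformers.onnx import FeaturesManager

print("transformers version:", transformers.__version__)
try:
    # List the export features (default, causal-lm, ...) registered for gptj.
    features = FeaturesManager.get_supported_features_for_model_type("gptj")
    print("gptj supported features:", sorted(features.keys()))
except KeyError:
    print("gptj is not a supported model type in this transformers version")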

Thanks :)


Could you check if this is still the case? There was some work going on with GPT-J ONNX conversion around the time of your post; it should be working as intended now.
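
In case it helps, here is a minimal sketch of the programmatic export path for the same model, assuming a transformers release in which gptj is registered with the ONNX exporter (the causal-lm feature name and the output path are illustrative):

from pathlib import Path

from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.onnx import FeaturesManager, export

model_id = "EleutherAI/gpt-j-6B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Raises a KeyError if gptj is still unsupported in the installed version.
model_kind, onnx_config_ctor = FeaturesManager.check_supported_model_or_raise(
    model, feature="causal-lm"
)
onnx_config = onnx_config_ctor(model.config)

# Export to onnx/model.onnx using the config's default opset.
onnx_inputs, onnx_outputs = export(
    preprocessor=tokenizer,
    model=model,
    config=onnx_config,
    opset=onnx_config.default_onnx_opset,
    output=Path("onnx/model.onnx"),
)

If that raises, upgrading transformers (pip install -U transformers) and retrying the CLI command from your first post would be the next thing to check.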