Hi, I am trying to export GPT-J to ONNX using the following command.
python -m transformers.onnx --model=EleutherAI/gpt-j-6B onnx/
But instead I am getting this error: ValueError(f"Unsupported model type: {config.model_type}")
However, as per the documentation page "Export 🤗 Transformers Models", the export method supports the GPT-J model too. Any idea about what is going wrong would be very helpful.
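In case it is relevant, here is a small check I plan to run to see whether my installed transformers release actually registers GPT-J for ONNX export. This is just a sketch, assuming a version that ships transformers.onnx.features.FeaturesManager; the names may differ in older releases.

```python
# Sketch of a diagnostic check (assumes transformers.onnx.features.FeaturesManager
# is available in the installed release).
import transformers
from transformers import AutoConfig
from transformers.onnx.features import FeaturesManager

print("transformers version:", transformers.__version__)

# Only downloads config.json, not the 6B checkpoint.
config = AutoConfig.from_pretrained("EleutherAI/gpt-j-6B")
print("model_type from the config:", config.model_type)

# Raises KeyError if this model type is not registered for ONNX export.
features = FeaturesManager.get_supported_features_for_model_type(config.model_type)
print("Supported ONNX features:", list(features.keys()))
```

If that raises a KeyError, my guess is that my installed version is simply older than the docs page listing GPT-J support and that upgrading transformers would fix it, but I would appreciate confirmation.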
Thanks :)