from transformers import GPT2LMHeadModel

model = GPT2LMHeadModel.from_pretrained('gpt2-xl')
I would like to export this model to ONNX using the Optimum utility, but that utility requires the model ID on the Hugging Face Hub. Here is the syntax. How do I find the ID? Thanks.
optimum-cli export onnx --model <model ID on hugging face>
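The model ID is simply the same string passed to `from_pretrained` — the repository name on the Hugging Face Hub, here `gpt2-xl`. A sketch of the resulting command (the trailing argument is the output directory, named `gpt2_xl_onnx/` here only as an example):

```shell
# The model ID is the repo name on the Hub, i.e. the string
# given to from_pretrained('gpt2-xl').
optimum-cli export onnx --model gpt2-xl gpt2_xl_onnx/
```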
That works, but it exports the base gpt2-xl model. What I want is the model with the language-modeling head (GPT2LMHeadModel). Passing the name of the language-modeling head class in the optimum command does not work.
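If I read the Optimum docs correctly, the head is selected with the `--task` option, not the class name: for a causal language-modeling head the task is `text-generation` (older Optimum releases called it `causal-lm`, and `text-generation-with-past` also exports the KV cache). A sketch, assuming a recent Optimum version:

```shell
# Select GPT2LMHeadModel via the task name rather than the class name.
# Task names vary by Optimum version; run
#   optimum-cli export onnx --help
# to see the ones your install accepts.
optimum-cli export onnx --model gpt2-xl --task text-generation gpt2_xl_onnx/
```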
I’ve come up with a workaround: download the model with the head to my local drive, then point optimum at the local directory.
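A minimal sketch of that workaround, assuming `--model` accepts a local directory containing a `config.json` and weights (the directory names are placeholders):

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Save the model *with* its language-modeling head to a local directory.
model = GPT2LMHeadModel.from_pretrained('gpt2-xl')
tokenizer = GPT2Tokenizer.from_pretrained('gpt2-xl')
model.save_pretrained('./gpt2-xl-lm-head')
tokenizer.save_pretrained('./gpt2-xl-lm-head')

# Then, from the shell, point optimum at that directory instead of a Hub ID:
#   optimum-cli export onnx --model ./gpt2-xl-lm-head gpt2_xl_onnx/
```

Since `save_pretrained` writes the config for the class it was called on, optimum should pick up the LM-head architecture from the saved `config.json`.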