Which model/class should I use to fine-tune GPT-2 for text classification?

Hi, I’m planning to fine-tune a pre-trained GPT-2 model on our own private data to build a text classifier. I noticed that there are several classes for GPT-2 pre-trained models in the transformers library, namely GPT2Model, GPT2LMHeadModel, and GPT2ForSequenceClassification. Which one should I use for my use case?
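
For context, here is a minimal sketch of what I was thinking of trying with GPT2ForSequenceClassification, in case that helps frame the question (the "gpt2" checkpoint, num_labels=2, and the pad-token workaround are just my assumptions from reading the docs, not something I'm sure is right):

```python
from transformers import GPT2Tokenizer, GPT2ForSequenceClassification

# Load the pre-trained GPT-2 checkpoint with a classification head on top.
# num_labels=2 is an assumption for a binary classifier; adjust for your task.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2ForSequenceClassification.from_pretrained("gpt2", num_labels=2)

# GPT-2 has no padding token by default; reusing the EOS token is a common
# workaround I've seen suggested, but I'm not certain it's the intended approach.
tokenizer.pad_token = tokenizer.eos_token
model.config.pad_token_id = model.config.eos_token_id

# Quick forward pass to check the output shape before setting up fine-tuning.
inputs = tokenizer("example text to classify", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # expected: (batch_size, num_labels)
```

Is this roughly the right direction, or should I be building a classifier on top of GPT2Model or GPT2LMHeadModel instead?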