Getting the CLS token from ViTMAEForPreTraining

Hello, I'm using the ViT MAE implementation from HuggingFace for a self-supervised task, and I'd like to use the CLS token for downstream tasks such as classification. I can see the CLS token being used throughout the implementation, but it is discarded at the end when the output hidden states are produced. Is there some way (like an input argument) to keep the CLS token in the model's output hidden states? There is an argument here, and the CLS token is discarded here.
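
In case it helps, here is a minimal sketch of the workaround I'm experimenting with: calling the underlying encoder (the `vit` attribute of `ViTMAEForPreTraining`) directly, since its `last_hidden_state` still seems to carry the CLS token at position 0 before the decoder drops it. The `facebook/vit-mae-base` checkpoint and the dummy image are just for illustration.

```python
import numpy as np
import torch
from transformers import AutoImageProcessor, ViTMAEForPreTraining

processor = AutoImageProcessor.from_pretrained("facebook/vit-mae-base")
model = ViTMAEForPreTraining.from_pretrained("facebook/vit-mae-base")
model.eval()

# Dummy image standing in for real data
image = np.uint8(np.random.rand(224, 224, 3) * 255)
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    # Call the encoder directly instead of the full pre-training model.
    # Note that random masking is still applied here (mask_ratio from the
    # config), so the set of visible patches changes from call to call.
    encoder_outputs = model.vit(pixel_values=inputs.pixel_values)

# CLS token appears to sit at position 0 of the encoder's last_hidden_state
cls_token = encoder_outputs.last_hidden_state[:, 0]  # (batch_size, hidden_size)
print(cls_token.shape)
```

Is something like this a reasonable way to get the CLS token, or is there a cleaner argument on `ViTMAEForPreTraining` itself that I'm missing?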