How can I use cached models from Hugging Face?

I want to use the model from https://huggingface.co/ARTeLab/mbart-summarization-mlsum in offline mode, meaning that after downloading it from Hugging Face once, it is saved locally and I can use it without an internet connection. However, I don't know how to do this. If anyone has already figured this out, please advise me. I use these lines to download the model:

from transformers import MBartTokenizer, MBartForConditionalGeneration
tokenizer = MBartTokenizer.from_pretrained("ARTeLab/mbart-summarization-mlsum")
model = MBartForConditionalGeneration.from_pretrained("ARTeLab/mbart-summarization-mlsum")
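Ideally, I would like to end up with something like the sketch below: save everything to a local folder once while online, then load from that folder without touching the Hub. I am guessing at the save_pretrained / local_files_only part, and ./mbart-mlsum-local is just a placeholder path I made up:

from transformers import MBartTokenizer, MBartForConditionalGeneration

# While online: download once, then write all needed files to a local folder
tokenizer = MBartTokenizer.from_pretrained("ARTeLab/mbart-summarization-mlsum")
model = MBartForConditionalGeneration.from_pretrained("ARTeLab/mbart-summarization-mlsum")
tokenizer.save_pretrained("./mbart-mlsum-local")  # placeholder path
model.save_pretrained("./mbart-mlsum-local")

# Later, offline: load from the local folder only, without contacting the Hub
tokenizer = MBartTokenizer.from_pretrained("./mbart-mlsum-local", local_files_only=True)
model = MBartForConditionalGeneration.from_pretrained("./mbart-mlsum-local", local_files_only=True)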

The problem is that when I run these lines, several files are downloaded from the repository at once, and I don't know which of them is then used for tokenization.

If all of them are used, how can I find out what each one is for?
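One thing I was thinking of trying is to let the tokenizer itself report which files it needs, roughly like this (I am assuming save_pretrained returns the paths of the files it writes; ./mbart-mlsum-tokenizer is again just a placeholder):

from transformers import MBartTokenizer

tokenizer = MBartTokenizer.from_pretrained("ARTeLab/mbart-summarization-mlsum")
# Write the tokenizer files to a folder and print the list of saved files,
# which should be the set of files the tokenizer actually relies on
saved_files = tokenizer.save_pretrained("./mbart-mlsum-tokenizer")
print(saved_files)

Is that a reasonable way to check, or is there a better approach?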