Difference between checkpoints

Guys, I am confused trying to figure out the difference between different checkpoints.

So I see some checkpoints like ‘bert-base-uncased’ and ‘facebook/bart-base’. Are these the base BERT and BART transformers? Are the tokenizers I get when I use these checkpoints untrained, and the models not fine-tuned?

And then there are checkpoints like ‘Helsinki-NLP/opus-mt-en-fr’. Are the model and tokenizer that come with this checkpoint already trained and fine-tuned to a certain extent?
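For context, this is how I am loading the two kinds of checkpoints. A minimal sketch assuming the Hugging Face `transformers` library and its `Auto*` classes (the checkpoint names are the ones I mentioned above):

```python
from transformers import AutoModel, AutoModelForSeq2SeqLM, AutoTokenizer

# A "base" checkpoint: weights come from pretraining only,
# with no fine-tuning for a downstream task.
bert_tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert_model = AutoModel.from_pretrained("bert-base-uncased")

# A task-specific checkpoint: a seq2seq model trained for
# English-to-French translation.
mt_tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-en-fr")
mt_model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-en-fr")
```

Both calls download trained weights, so I am not sure in what sense the base ones would be "untrained".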

I have been trying for a day to understand the difference, but have not been successful. Any answer would be helpful. Thank you.