[Bart] Why do BART models have different embedding shapes?

Hi, I’m implementing a fine-tuned BART model for summarization, so I’m deciding between ‘facebook/bart-large’ and ‘facebook/bart-large-cnn’. But when I inspect the layers of both models, I see that the shapes of their embedding layers differ. Is this a special trick?

Code to reproduce:

from transformers import BartModel, BartForConditionalGeneration

BARTmodel = BartModel.from_pretrained('facebook/bart-large')
CGmodel = BartForConditionalGeneration.from_pretrained('facebook/bart-large-cnn')

print(BARTmodel.shared)
# Embedding(50265, 1024, padding_idx=1)
print(CGmodel.model.shared)
# Embedding(50264, 1024, padding_idx=1)
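
In case it helps anyone digging into this, here is a minimal sketch (my own addition, not from the original post) that compares the vocabulary size each checkpoint’s config declares against the size of its tokenizer, using the standard AutoConfig and BartTokenizer APIs from transformers:

from transformers import AutoConfig, BartTokenizer

# Compare the embedding vocab size each config declares with the
# number of tokens the matching tokenizer actually knows about.
for name in ('facebook/bart-large', 'facebook/bart-large-cnn'):
    config = AutoConfig.from_pretrained(name)
    tokenizer = BartTokenizer.from_pretrained(name)
    print(f'{name}: config.vocab_size={config.vocab_size}, '
          f'len(tokenizer)={len(tokenizer)}')

If the two numbers disagree for a checkpoint, that would suggest the fine-tuned model’s embedding matrix was resized relative to its tokenizer; if the mismatch matters for your own fine-tuning, model.resize_token_embeddings(len(tokenizer)) is the standard transformers call for realigning them.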