BartForConditionalGeneration : lm_head layer dimension change

Hi,

I’m trying to create a simple seq2seq generation model based on BartForConditionalGeneration.

The original input dimension of the lm_head layer is 1024.

But I have changed it to 1224 for my purposes,

like this:

model.lm_head
Linear(in_features=1224, out_features=50266, bias=False)
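For reference, here is roughly what I did, as a minimal sketch on a tiny randomly initialized BART so it runs quickly (d_model=16 and head input 24 stand in for the real 1024 and 1224):

```python
import torch
import torch.nn as nn
from transformers import BartConfig, BartForConditionalGeneration

# Tiny randomly initialized BART so the example is fast to run;
# d_model=16 stands in for bart-large's 1024.
config = BartConfig(
    d_model=16, vocab_size=100,
    encoder_layers=1, decoder_layers=1,
    encoder_attention_heads=2, decoder_attention_heads=2,
    encoder_ffn_dim=32, decoder_ffn_dim=32,
)
model = BartForConditionalGeneration(config)

# Swap in a head with a wider input dimension (24 stands in for 1224).
model.lm_head = nn.Linear(24, config.vocab_size, bias=False)

input_ids = torch.tensor([[0, 5, 6, 2]])
err = None
try:
    model(input_ids=input_ids)  # decoder still emits d_model-sized states
except RuntimeError as e:
    err = e
    print("forward failed:", e)
```

Replacing lm_head only changes that one layer's declared shape; the decoder output that feeds it is still d_model wide, so the matrix multiply fails.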

But when I call the forward() method, I run into a dimension-mismatch error.

As the error message shows, the expected input dimension of lm_head is still 1024.

How can I solve this?
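One workaround I have been considering (just a sketch, not validated on my real setup): keep the decoder untouched and insert a learned projection from d_model up to the new head dimension before lm_head. The class name BartWithWideHead and the tiny config sizes are my own, and I disable weight tying since the new head no longer matches the embedding shape:

```python
import torch
import torch.nn as nn
from transformers import BartConfig, BartForConditionalGeneration

class BartWithWideHead(BartForConditionalGeneration):
    """Hypothetical wrapper: project decoder states from d_model to a
    wider dimension before a replaced lm_head."""

    def __init__(self, config, head_dim):
        super().__init__(config)
        self.proj = nn.Linear(config.d_model, head_dim)
        self.lm_head = nn.Linear(head_dim, config.vocab_size, bias=False)

    def forward(self, input_ids=None, decoder_input_ids=None, **kwargs):
        outputs = self.model(
            input_ids=input_ids,
            decoder_input_ids=decoder_input_ids,
            **kwargs,
        )
        hidden = self.proj(outputs.last_hidden_state)  # (batch, seq, head_dim)
        return self.lm_head(hidden)                    # (batch, seq, vocab)

# Tiny config for illustration; head_dim=24 stands in for 1224.
# tie_word_embeddings=False because the wider head can no longer
# share weights with the token embedding.
config = BartConfig(
    d_model=16, vocab_size=100,
    encoder_layers=1, decoder_layers=1,
    encoder_attention_heads=2, decoder_attention_heads=2,
    encoder_ffn_dim=32, decoder_ffn_dim=32,
    tie_word_embeddings=False,
)
model = BartWithWideHead(config, head_dim=24)

ids = torch.tensor([[0, 5, 6, 2]])
logits = model(input_ids=ids, decoder_input_ids=ids)
print(logits.shape)  # (1, 4, vocab_size)
```

Is this the right direction, or is there a cleaner way to widen lm_head?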