Unable to create tensor, you should probably activate truncation and/or padding with 'padding=True' 'truncation=True' to have batched tensors with the same length

To train a BERT LM, I modified the DataCollatorForLanguageModeling class so that it creates an additional field, new_input_ids, with the same shape as input_ids. The model, based on the BertForPreTraining class, accepts both input_ids and new_input_ids. However, when the batches are built I get this error:

'Unable to create tensor, you should probably activate truncation and/or padding with padding=True truncation=True to have batched tensors with the same length'

The error is raised at line 699 of this file.
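For context, the model side looks roughly like this (a simplified sketch; the class name and the way new_input_ids is actually consumed are placeholders):

```python
from transformers import BertForPreTraining

# Sketch only: the real forward uses new_input_ids alongside input_ids.
class BertForPreTrainingWithNewInputs(BertForPreTraining):
    def forward(self, input_ids=None, new_input_ids=None, **kwargs):
        # new_input_ids has the same shape as input_ids; here it is only
        # accepted so the data collator can pass it through.
        return super().forward(input_ids=input_ids, **kwargs)
```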

How can I keep new_input_ids the same size as input_ids in every batch?
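For example, would something along these lines inside the collator be the right way to keep them aligned? (Just a sketch, assuming each example is a dict whose new_input_ids is a list of token ids.)

```python
import torch
from transformers import DataCollatorForLanguageModeling

class MyDataCollator(DataCollatorForLanguageModeling):
    def torch_call(self, examples):
        # Remove new_input_ids before the parent collator runs, because
        # tokenizer.pad() only pads the keys it knows about (input_ids,
        # attention_mask, ...) and fails on ragged extra keys.
        new_ids = [ex.pop("new_input_ids") for ex in examples]
        batch = super().torch_call(examples)

        # Pad (or truncate) new_input_ids to the padded batch length so it
        # always matches input_ids.
        max_len = batch["input_ids"].shape[1]
        pad_id = self.tokenizer.pad_token_id
        padded = torch.full((len(new_ids), max_len), pad_id, dtype=torch.long)
        for i, ids in enumerate(new_ids):
            ids = torch.as_tensor(ids[:max_len], dtype=torch.long)
            padded[i, : ids.shape[0]] = ids
        batch["new_input_ids"] = padded
        return batch
```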