Error in Seq2SeqTrainingArguments

I am training a simple encoder-decoder transformer model on a text-to-text translation task, using the Seq2SeqTrainingArguments class. Training works fine when I do not use the label_smoothing argument, but when I add label_smoothing=0.1, I get the following error:

ValueError                                Traceback (most recent call last)
Cell In[29], line 81
     74 trainer = MySeq2SeqTrainer(
     75   model=model,
     76   args=trainer_args,
     77   train_dataset=train_dataset,
     78   eval_dataset=dev_dataset
     79 )
-> 81 trainer.train()

File /data/envs/hugface/lib/python3.8/site-packages/transformers/, in Trainer.train(self, resume_from_checkpoint, trial, ignore_keys_for_eval, **kwargs)
   1496     self.model_wrapped = self.model
   1498 inner_training_loop = find_executable_batch_size(
   1499     self._inner_training_loop, self._train_batch_size, args.auto_find_batch_size
   1500 )
-> 1501 return inner_training_loop(
   1502     args=args,
   1503     resume_from_checkpoint=resume_from_checkpoint,
   1504     trial=trial,
   1505     ignore_keys_for_eval=ignore_keys_for_eval,
   1506 )

File /data/envs/hugface/lib/python3.8/site-packages/transformers/, in Trainer._inner_training_loop(self, batch_size, args, resume_from_checkpoint, trial, ignore_keys_for_eval)
    return forward_call(*input, **kwargs)
  File "/data/envs/hugface/lib/python3.8/site-packages/transformers/models/bert/", line 966, in forward
    raise ValueError("You have to specify either input_ids or inputs_embeds")
ValueError: You have to specify either input_ids or inputs_embeds

I am not able to figure out what is causing this issue.

Hi! The parameter name is label_smoothing_factor. Can you try that and see if it works? :slight_smile:

Sorry, that was a typo in my post. I have in fact set label_smoothing_factor.

I am running into the same issue. Have you resolved it?
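A likely explanation, assuming recent transformers behavior: when label_smoothing_factor is set, the Trainer pops "labels" out of the batch and computes the smoothed loss itself, so the model's forward pass never sees them. An encoder-decoder model that builds decoder_input_ids by shifting the labels then receives neither labels nor decoder_input_ids, and its decoder raises exactly the error in the traceback. The toy sketch below (plain Python, no transformers dependency; the forward function is a simplified stand-in, not the real model code) illustrates the mechanism:

```python
def forward(input_ids=None, labels=None, decoder_input_ids=None):
    """Simplified stand-in for an encoder-decoder forward pass."""
    # Many seq2seq models derive decoder inputs by shifting labels right.
    if decoder_input_ids is None and labels is not None:
        decoder_input_ids = [0] + labels[:-1]
    # Without labels, the decoder has no input at all and complains.
    if decoder_input_ids is None:
        raise ValueError("You have to specify either input_ids or inputs_embeds")
    return decoder_input_ids

batch = {"input_ids": [5, 6, 7], "labels": [8, 9, 10]}

# No label smoothing: labels stay in the batch, the forward pass succeeds.
assert forward(**batch) == [0, 8, 9]

# With label smoothing the trainer pops the labels to smooth the loss itself...
labels = batch.pop("labels")

# ...and the forward call now fails the same way as in the traceback.
try:
    forward(**batch)
    raised = False
except ValueError:
    raised = True
assert raised
```

If this is indeed the mechanism at play, a commonly suggested workaround is to have the collator create decoder_input_ids up front, e.g. DataCollatorForSeq2Seq(tokenizer, model=model), so the forward pass no longer depends on labels being present.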