SageMaker doesn’t support argparse actions

This means you cannot use parser.add_argument("--args", action="store_true"), since SageMaker passes every hyperparameter to the script as a --key value pair rather than as a bare flag.
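A common workaround is to accept the boolean as a value instead of a flag. This is a minimal sketch; the str2bool helper and the --do_eval argument name are illustrative, not part of any SageMaker API:

```python
import argparse

def str2bool(value):
    # Hypothetical helper: SageMaker passes hyperparameters as strings,
    # so we interpret common truthy spellings ourselves.
    return str(value).lower() in ("true", "1", "yes")

parser = argparse.ArgumentParser()
# Instead of action="store_true", take an explicit value.
parser.add_argument("--do_eval", type=str2bool, default=False)

# Simulate what SageMaker would pass for hyperparameters={"do_eval": True}.
args = parser.parse_args(["--do_eval", "True"])
print(args.do_eval)  # True
```

In the notebook you would then set the hyperparameter to a plain value (e.g. `"do_eval": True`) rather than relying on the presence or absence of a flag.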


The HfArgumentParser is a custom implementation on top of argparse that makes it easy to create Python scripts for Transformers. You can use the HfArgumentParser if you want and feel confident with it; for example, with the HfArgumentParser you don't need to define the TrainingArguments via parser.add_argument, since they are added behind the scenes.
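To illustrate the "behind the scenes" part, here is a small sketch using HfArgumentParser with a custom dataclass; the ScriptArguments class and its fields are made up for the example, and in a real script you would typically pass TrainingArguments alongside it in the same tuple:

```python
from dataclasses import dataclass, field
from transformers import HfArgumentParser

@dataclass
class ScriptArguments:
    # Each dataclass field becomes an add_argument call behind the scenes.
    model_name: str = field(default="bert-base-uncased")
    epochs: int = field(default=3)

# A real train.py would usually do HfArgumentParser((ScriptArguments, TrainingArguments)).
parser = HfArgumentParser(ScriptArguments)
(script_args,) = parser.parse_args_into_dataclasses(["--epochs", "5"])
print(script_args.epochs)  # 5
```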
For the SageMaker examples we went with the default argparse, since it is easier and faster to get started with for non-Transformers experts, and it might have been difficult to understand that you don't need to define per_device_train_batch_size in train.py but can use it as a hyperparameter in the notebook.
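The plain-argparse pattern can be sketched like this; the argument names mirror the Transformers ones, but the simulated command line stands in for what SageMaker would generate from the notebook's hyperparameters dict:

```python
import argparse

# Sketch of a SageMaker-style train.py entry point: hyperparameters set
# in the notebook (e.g. {"per_device_train_batch_size": 8}) arrive at the
# script as command-line arguments, so plain argparse is enough.
parser = argparse.ArgumentParser()
parser.add_argument("--per_device_train_batch_size", type=int, default=32)
parser.add_argument("--model_name", type=str, default="bert-base-uncased")

# Simulate the command line SageMaker would build from the hyperparameters.
args = parser.parse_args(["--per_device_train_batch_size", "8"])
print(args.per_device_train_batch_size)  # 8
```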

Could someone explain the differences, and whether for SageMaker we must rephrase the arguments section of the scripts in transformers/examples/pytorch/ as formulated in "Prepare a :hugs: Transformers fine-tuning script", or not?

No, you don't need to rephrase them. Since the HfArgumentParser creates the add_argument calls behind the scenes, it works with SageMaker, so you can decide how you would like to structure your script.
