[Possibly] Forgotten TODO Comment for `TrainingArguments.default_optim`

While reading the metadata and default values of the `TrainingArguments` attributes, I noticed the following comment about `default_optim` (permalink):

```python
    default_optim = "adamw_torch"
    # XXX: enable when pytorch==2.0.1 comes out - we want to give it time to get all the bugs sorted out
    # if is_torch_available() and version.parse(version.parse(torch.__version__).base_version) >= version.parse("2.1.0"):
    #     default_optim = "adamw_torch_fused"
    # and update the doc above to:
    # optim (`str` or [`training_args.OptimizerNames`], *optional*, defaults to `"adamw_torch_fused"` (for torch<2.1.0 `"adamw_torch"`):
    optim: Union[OptimizerNames, str] = field(
        default=default_optim,
        metadata={"help": "The optimizer to use."},
    )
```

My question is whether this comment is now outdated and should be removed, or whether it is still relevant and should finally be applied, since the condition it was waiting for (torch >= 2.1.0) has long been met.

NB: I figured it was worth drawing attention to this TODO.


It’s almost time for the release of PyTorch 2.7… :sweat_smile:
You might want to raise an issue on the Transformers GitHub repository.
