TrainingArguments now immutable. Why?

The TrainingArguments class was once mutable after creation. Now it is not:

transformers/training_args.py

    def __setattr__(self, name, value):
        # Once fully through the `__post_init__`, `TrainingArguments` are immutable
        if not name.startswith("_") and getattr(self, "_frozen", False):
            raise FrozenInstanceError(f"cannot assign to field {name}")
        else:
            super().__setattr__(name, value)
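
For context, the pattern is easy to reproduce in isolation (a minimal sketch, not the actual transformers code): the _frozen flag is only set at the end of __post_init__, so the dataclass-generated __init__ can still populate all the fields first.

    from dataclasses import FrozenInstanceError, dataclass

    @dataclass
    class FrozenAfterInit:
        output_dir: str = "out"

        def __post_init__(self):
            # Set last, so the generated __init__ above runs unhindered;
            # the leading underscore exempts the flag itself from the check
            self._frozen = True

        def __setattr__(self, name, value):
            if not name.startswith("_") and getattr(self, "_frozen", False):
                raise FrozenInstanceError(f"cannot assign to field {name}")
            super().__setattr__(name, value)

    args = FrozenAfterInit()
    try:
        args.output_dir = "foo/bar/"
    except FrozenInstanceError as e:
        print(e)  # cannot assign to field output_dir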

I configure most of my arguments from the command line, but I determine the output_dir dynamically at runtime, and in certain scenarios I also need to adjust the batch size on the fly. None of this is possible anymore. For now, my workaround is

    # Bypass the frozen check by calling object.__setattr__ directly
    object.__setattr__(training_args, "output_dir", "foo/bar/")

but this is rather annoying.
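
To override several fields at once at runtime, the same escape hatch can be looped (the field values below are made up):

    # Runtime overrides determined elsewhere; values are illustrative
    overrides = {"output_dir": "runs/exp1", "per_device_train_batch_size": 8}
    for name, value in overrides.items():
        object.__setattr__(training_args, name, value)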

Why was this change made? Does anyone have a better solution?


I am also not fully sure why this change was made.
The PR for the change is here:

The recommended workaround seems to be to use dataclasses.replace, as was done here:

    from dataclasses import replace
    ...
    training_args = replace(training_args, **kwargs_to_set)
    ...
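
For instance, a minimal sketch (the override values here are made up; replace returns a new, still-frozen instance rather than mutating the original):

    from dataclasses import replace

    from transformers import TrainingArguments

    training_args = TrainingArguments(output_dir="initial/dir")

    # Build a fresh TrainingArguments with the overridden fields;
    # the original instance is left untouched
    training_args = replace(
        training_args,
        output_dir="foo/bar/",
        per_device_train_batch_size=16,
    )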

The TrainingArguments were never meant to be mutable, so if you truly need to change them, the dataclasses.replace solution above is the right one.

Thanks for the dataclasses tip!

Is the immutability meant to make multiprocessing safer? Just curious, but I don’t see any reason off the top of my head why this is necessary.

It's a design paradigm: users really shouldn't be messing with these at all, except under specific circumstances.