I am using Hugging Face Transformers with TrainingArguments(report_to="wandb", ...). This adds the trainer arguments to the run config on wandb.ai. Is there a way to change this behaviour so the values are grouped under nested keys instead? For example:
TokenizerArguments(...) ---> config.hf.tokenizer.<here>
TrainingArguments(...) ---> config.hf.trainer.<here>
ModelConfig() ---> config.hf.model.<here>
Is that possible?
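To make it concrete, this is roughly what I imagine doing by hand (just a sketch: I initialise the run myself, turn off the built-in W&B reporting, and push nested dicts into wandb.config; the project/model names are placeholders, and to_dict()/init_kwargs are only my guess at how to pull the values out):

```python
import wandb
from transformers import AutoConfig, AutoTokenizer, TrainingArguments

# Placeholder names, for illustration only.
model_name = "bert-base-uncased"
training_args = TrainingArguments(
    output_dir="out",
    report_to="none",  # so the Trainer doesn't also log its flat config
)
model_config = AutoConfig.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

run = wandb.init(project="my-project")
run.config.update({
    "hf": {
        "trainer": training_args.to_dict(),   # would appear as config.hf.trainer.*
        "model": model_config.to_dict(),      # would appear as config.hf.model.*
        "tokenizer": tokenizer.init_kwargs,   # would appear as config.hf.tokenizer.*
    }
})
```

Is there a supported way to get this kind of nesting out of the Trainer's W&B integration directly, rather than doing it manually like above?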