When I tried to use wandb for hyperparameter search, I got these warnings:
[WARNING|trainer.py:1456] 2024-05-07 10:40:15,466 >> Trying to set _wandb in the hyperparameter search but there is no corresponding field in `TrainingArguments`.
[WARNING|trainer.py:1456] 2024-05-07 10:40:15,466 >> Trying to set assignments in the hyperparameter search but there is no corresponding field in `TrainingArguments`.
[WARNING|trainer.py:1456] 2024-05-07 10:40:15,466 >> Trying to set metric in the hyperparameter search but there is no corresponding field in `TrainingArguments`.
[INFO|trainer.py:1472] 2024-05-07 10:40:15,467 >> W&B Sweep parameters: {'_wandb': {}, 'learning_rate': 9.844105079620968e-06, 'per_device_train_batch_size': 64, 'assignments': {}, 'metric': 'eval/loss'}
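For reference, here is a quick check (a hypothetical snippet, not part of my training script) of which keys from the sweep parameters in the INFO line above actually exist as fields on TrainingArguments:

import dataclasses
from transformers import TrainingArguments

# Hypothetical check: list the dataclass fields of TrainingArguments and see
# which of the sweep keys reported above are among them.
field_names = {f.name for f in dataclasses.fields(TrainingArguments)}
for key in ["_wandb", "learning_rate", "per_device_train_batch_size", "assignments", "metric"]:
    print(key, key in field_names)
# learning_rate and per_device_train_batch_size are real fields; _wandb,
# assignments and metric are not, which matches the three warnings.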
This is the code that I am using:
from transformers import CLIPModel, Trainer

def model_init(trial):
    return CLIPModel.from_pretrained("openai/clip-vit-base-patch32")

# 8. Initialize our trainer
trainer = Trainer(
    args=training_args,
    train_dataset=train_dataset if training_args.do_train else None,
    eval_dataset=eval_dataset if training_args.do_eval else None,
    data_collator=collate_fn,
    model_init=model_init,
)
def wandb_hp_space(trial):
    return {
        "method": "random",
        "metric": {"name": "eval_loss", "goal": "minimize"},
        "parameters": {
            "learning_rate": {"distribution": "uniform", "min": 1e-6, "max": 1e-4},
            "per_device_train_batch_size": {"values": [16, 32, 64]},
        },
        "run_cap": 8,
    }

# Define compute objective function
def compute_objective(metrics):
    return metrics["eval_loss"]

best_trial = trainer.hyperparameter_search(
    direction="minimize",
    backend="wandb",
    hp_space=wandb_hp_space,
    n_trials=16,
    compute_objective=compute_objective,
)
The search did run and produce output, but I still do not understand these warnings or whether they affect the sweep. Thank you for helping me understand.