Using hyperparameter search in Trainer

Here’s a recent blog post by @matteopilotto about using W&B Sweeps with HF Transformers: http://wandb.me/hf-sweeps

You can either call hyperparameter_search(backend='wandb'...) on your Trainer, or use the W&B logger and let Sweeps control the search. The Sweeps dashboard then gives you a parallel-coordinates plot to understand how each hyperparameter affects your metrics:
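A minimal sketch of the first option, assuming you built your Trainer with a model_init function; the search-space function name wandb_hp_space and the parameter ranges are illustrative, not from the post:

```python
# Hypothetical sketch: a W&B-style search space for Trainer.hyperparameter_search.
# With backend="wandb", the hp_space callable takes a trial argument and
# returns a W&B sweep configuration dict.
def wandb_hp_space(trial):
    return {
        "method": "random",  # could also be "grid" or "bayes"
        "metric": {"name": "objective", "goal": "minimize"},
        "parameters": {
            "learning_rate": {"distribution": "uniform", "min": 1e-6, "max": 1e-4},
            "per_device_train_batch_size": {"values": [16, 32, 64]},
        },
    }

# best_trial = trainer.hyperparameter_search(
#     direction="minimize", backend="wandb", hp_space=wandb_hp_space, n_trials=20
# )
```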

To use W&B Sweeps, you define a config with your search params and then create the sweep with wandb.sweep(config, project='your-project-name').
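For the Sweeps route, a minimal config might look like this; the parameter names and ranges here are illustrative, but they must match whatever your training function reads from wandb.config:

```python
# Illustrative sweep configuration for wandb.sweep().
# "method" selects the search strategy; "parameters" defines the space.
sweep_config = {
    "method": "random",
    "metric": {"name": "eval/loss", "goal": "minimize"},
    "parameters": {
        "epochs": {"values": [3, 5]},
        "learning_rate": {"distribution": "log_uniform_values", "min": 1e-5, "max": 1e-3},
        "weight_decay": {"values": [0.0, 0.01, 0.1]},
        "batch_size": {"values": [16, 32]},
    },
}

# sweep_id = wandb.sweep(sweep_config, project="your-project-name")
```

Once the sweep is created, you launch the search with wandb.agent(sweep_id, function=train), where train is a function like the one below.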

import wandb
from transformers import TrainingArguments

def train(config=None):
    # Each sweep run calls this function; wandb.init() pulls the
    # hyperparameter values chosen by the sweep controller into wandb.config.
    with wandb.init(config=config):
        # set sweep configuration
        config = wandb.config
        training_args = TrainingArguments(
            output_dir='vit-sweeps',
            report_to='wandb',  # turn on Weights & Biases logging
            num_train_epochs=config.epochs,
            learning_rate=config.learning_rate,
            weight_decay=config.weight_decay,
            per_device_train_batch_size=config.batch_size,
            ...
        )
