Wandb.watch in accelerate library

When I’m using wandb normally, their documentation for the PyTorch integration suggests a call to the watch method:

wandb.watch(my_model, log='all', log_freq=8)

With the following environment variable set:

export WANDB_WATCH="all"

I’m able to record the gradients and parameter values throughout training. Is there a way to do the same when using the accelerate library?

Did you try calling wandb.watch() just after initialising the wandb run in Accelerate? It might still work as long as it’s called before you start logging in your script.


For code of what @morgan means, it’d look something like this:

accelerator = Accelerator(log_with="wandb")
