When I’m using wandb normally, their documentation for the PyTorch integration suggests a call to the watch method:
wandb.watch(my_model, log='all', log_freq=8)
With the following environment variable set:
export WANDB_WATCH="all"
I’m able to record the values of the gradients and the parameters throughout training. Is there an equivalent way to record gradients and parameter values throughout training when using the accelerate library?
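
For reference, here is a minimal, self-contained sketch of the plain-PyTorch setup I’m describing; the model, optimizer, data, and project name are all placeholders standing in for my real training code:

```python
import torch
import torch.nn as nn
import wandb

# Placeholder model, optimizer, and loss standing in for my real setup.
my_model = nn.Linear(16, 1)
optimizer = torch.optim.SGD(my_model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

wandb.init(project="my-project")  # project name is a placeholder
# Log gradient and parameter histograms every 8 steps.
wandb.watch(my_model, log="all", log_freq=8)

for step in range(32):
    x, y = torch.randn(8, 16), torch.randn(8, 1)  # dummy batch
    loss = loss_fn(my_model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    wandb.log({"loss": loss.item()})

wandb.finish()
```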