How to turn wandb off in Trainer?

I am trying to use the Trainer to fine-tune a BERT model, but it keeps trying to connect to wandb. I don't know what that is and just want it off. Is there a config option I am missing?

1 Like

import os
os.environ["WANDB_DISABLED"] = "true"
This works for me.
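One note: the variable needs to be set before training starts, so putting it at the very top of the script is the safe option. A minimal sketch (output_dir is a placeholder):

import os

# Set this before any Trainer/wandb code runs
os.environ["WANDB_DISABLED"] = "true"

from transformers import TrainingArguments

args = TrainingArguments(output_dir="out")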

8 Likes

Alternatively, you can disable the Weights & Biases (wandb) callback in the TrainingArguments directly:

# None disables all integrations
args = TrainingArguments(report_to=None, ...)
9 Likes

It should be the string "none", not None.
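For reference, a minimal corrected sketch (output_dir is a placeholder):

from transformers import TrainingArguments

# The string "none" disables all logging integrations, including wandb
args = TrainingArguments(output_dir="out", report_to="none")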

19 Likes

Hi, after doing that I get the following error: "Error: You must call wandb.init() before wandb.log()"

1 Like

Hi @hiramcho, check out the docs on the logger to solve that issue. You just need to call wandb.init(project='your_project_name') somewhere before you start using the logger. https://docs.wandb.ai/guides/integrations/huggingface
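For example (the project name is a placeholder):

import wandb

# Initialize the run once, before training; the Trainer's wandb
# integration then logs to this run instead of raising the error
wandb.init(project="your_project_name")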

1 Like

In my case, I do want wandb, but I don't like the many prints during training. Even though I set the logging level to WARNING, I still get many of these:

Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.0"
}

How do I turn this print off? (The print is in configuration_utils.py:543.)

2 Likes

I get these annoying prints when running on Kaggle:

Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.0"
}

When the code starts evaluation, it prints this at each step. How do I turn it off, please?

1 Like

I also had the same issue with the annoying "Generate config GenerationConfig" lines. In my case, upgrading from transformers 4.26.0 to 4.27.2 solved it.
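If upgrading is not an option, raising the transformers log level to ERROR may also hide it, since the message comes from the transformers logger, which is configured separately from Python's root logger. A sketch I have not verified on 4.26.0 specifically:

from transformers.utils import logging

# Show only errors; INFO-level messages such as the
# GenerationConfig dump are suppressed
logging.set_verbosity_error()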

1 Like

I also faced the same issue; it is caused by transformers/configuration_utils.py at main · huggingface/transformers · GitHub.

Commenting out that part should solve the issue.

import wandb
wandb.init(mode="disabled")
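(mode="disabled" turns every wandb call into a no-op, so nothing is logged and nothing tries to connect.)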

1 Like

Thanks! This is so weird. I was banging my head on it for too long lol.