Trainer.train() prints values like loss, grad_norm, etc. to the console but not to a log file

import os
from transformers import TrainingArguments

trainer_args = TrainingArguments(
    output_dir=output_dir,
    warmup_steps=5,
    per_device_train_batch_size=batch_size,
    gradient_accumulation_steps=4,
    learning_rate=learning_rate,
    logging_steps=25,
    optim="paged_adamw_8bit",
    logging_dir=os.path.join(output_dir, "logs"),
    save_strategy="epoch",  # keep this as "epoch"
    save_total_limit=1,
    evaluation_strategy="steps",
    eval_steps=5000,
    do_eval=True,
    report_to="none",
    num_train_epochs=num_train_epochs,
    gradient_checkpointing=True,
    max_steps=30,
)

Is it possible to get the Trainer to write these values to a specific log file, rather than to wandb or another logging service?
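
For context, one possible approach (a sketch, not necessarily the canonical way) is a custom TrainerCallback whose on_log hook appends each log dict to a file; the logs argument it receives is the same dict the Trainer prints to the console, so report_to="none" can stay as-is. The class name FileLoggingCallback and the log file path below are just illustrative.

    import json
    from transformers import TrainerCallback

    class FileLoggingCallback(TrainerCallback):
        """Append every log dict (loss, grad_norm, learning_rate, ...) to a text file."""

        def __init__(self, log_path):
            self.log_path = log_path

        def on_log(self, args, state, control, logs=None, **kwargs):
            # logs is the same dict the Trainer would print to the console
            if logs is not None:
                with open(self.log_path, "a") as f:
                    f.write(f"step {state.global_step}: {json.dumps(logs)}\n")

    # Hypothetical usage: attach the callback to an existing Trainer instance.
    # trainer.add_callback(FileLoggingCallback(os.path.join(output_dir, "logs", "train_log.txt")))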
