Wav2Vec2 fine-tuning - loss goes to zero at some point

Hi. I'm fine-tuning the wav2vec2 model on my own data, which ranges from very short to very long audio files.

After some epochs the training loss collapses to zero for no apparent reason.
To clarify: the model had already passed over the full dataset a few times before this happened.
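Since the clip lengths vary so much, here is the kind of duration filter that could trim the extremes before training (a minimal sketch; the `audio` field, the 16 kHz rate, and the 1 s / 30 s thresholds are my own assumptions, not part of my actual pipeline):

```python
# Sketch: drop clips that are extremely short or long before training.
# Assumes each example is a dict holding a raw "audio" sample array;
# the 16 kHz rate and the 1 s / 30 s bounds are hypothetical choices.

SAMPLING_RATE = 16_000  # wav2vec2 checkpoints typically expect 16 kHz audio

def duration_s(example):
    """Clip length in seconds, derived from the raw sample count."""
    return len(example["audio"]) / SAMPLING_RATE

def keep(example, min_s=1.0, max_s=30.0):
    """True if the clip falls inside the allowed duration window."""
    return min_s <= duration_s(example) <= max_s

# Synthetic example: three fake clips of 0.5 s, 10 s, and 60 s.
clips = [{"audio": [0.0] * int(SAMPLING_RATE * s)} for s in (0.5, 10, 60)]
filtered = [c for c in clips if keep(c)]  # only the 10 s clip survives
```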

My settings:

```python
training_args = TrainingArguments(
    optim="adamw_torch",
    push_to_hub=True,
    log_level='info',
    hub_token='TOKEN',
    output_dir=DIR,
    group_by_length=True,
    per_device_train_batch_size=batch_size,
    per_device_eval_batch_size=batch_size,
    evaluation_strategy="steps",
    num_train_epochs=epochs,
    fp16=True,
    gradient_checkpointing=True,
    save_steps=5000,
    eval_steps=5000,
    logging_steps=500,
    learning_rate=0.0001,
    weight_decay=0.0001,
    warmup_steps=2000,
    save_total_limit=50,
    overwrite_output_dir=True,
    ignore_data_skip=False,
    resume_from_checkpoint=True,
    load_best_model_at_end=True,
    metric_for_best_model='loss',
)
trainer = Trainer(
    model=model,
    data_collator=data_collator,
    args=training_args,
    compute_metrics=compute_metrics,
    train_dataset=x_train,
    eval_dataset=x_test,
    tokenizer=processor.feature_extractor,
    callbacks=[EarlyStoppingCallback(early_stopping_patience=6)],
)
```
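To pin down the exact step where the collapse happens, the `log_history` that `Trainer` saves under `trainer_state.json` in every checkpoint can be scanned (a minimal sketch; the checkpoint path and the zero tolerance are my own assumptions):

```python
import json

def find_loss_collapse(log_history, tol=1e-6):
    """Return the first logged step whose training loss is ~zero, or None.

    `log_history` is the list stored under "log_history" in the
    trainer_state.json that Trainer writes into each checkpoint.
    """
    for entry in log_history:
        if "loss" in entry and entry["loss"] <= tol:
            return entry["step"]
    return None

# With a real checkpoint this would look like (path is hypothetical):
# state = json.load(open("DIR/checkpoint-5000/trainer_state.json"))
# step = find_loss_collapse(state["log_history"])

# Synthetic example: loss collapses at step 1500.
history = [
    {"step": 500, "loss": 3.2},
    {"step": 1000, "loss": 1.1},
    {"step": 1500, "loss": 0.0},
]
collapse_step = find_loss_collapse(history)
```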