Passing gradient_checkpointing to a config initialization is deprecated

When initializing a wav2vec2 model, as follows:

from transformers import Wav2Vec2Processor, Wav2Vec2Model

feature_extractor = Wav2Vec2Processor.from_pretrained('facebook/wav2vec2-base')
wav_to_vec_model = Wav2Vec2Model.from_pretrained('facebook/wav2vec2-base')

I get the following warning:

UserWarning: Passing gradient_checkpointing to a config initialization is deprecated and will be removed in v5 Transformers. Using model.gradient_checkpointing_enable() instead, or if you are using the Trainer API, pass gradient_checkpointing=True in your TrainingArguments.
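
For reference, the Trainer path the warning mentions is just the gradient_checkpointing flag on TrainingArguments; a minimal sketch (output_dir is a placeholder):

from transformers import TrainingArguments

# Only relevant when fine-tuning through the Trainer API; the flag now lives
# on TrainingArguments rather than in the model config.
training_args = TrainingArguments(
    output_dir='./wav2vec2-finetuned',  # placeholder path
    gradient_checkpointing=True,
)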

I’m not using the Trainer API, so I tried adding:

wav_to_vec_model.gradient_checkpointing_enable()

This doesn’t work. What am I doing wrong? Thanks
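
As far as I can tell, gradient_checkpointing_enable() itself does work; the warning is emitted earlier, while from_pretrained parses the checkpoint's saved config, so calling it afterwards can't make the warning go away. A minimal sketch separating the two concerns (load-time warning vs. actually using checkpointing when fine-tuning):

import torch
from transformers import Wav2Vec2Model

# The deprecation warning (if any) is raised here, while the checkpoint's
# config is parsed -- not because gradient_checkpointing_enable() is missing.
model = Wav2Vec2Model.from_pretrained('facebook/wav2vec2-base')

# Only needed if you fine-tune and want to trade extra compute for memory.
model.gradient_checkpointing_enable()
model.train()

# For pure feature extraction / inference, checkpointing is irrelevant.
model.gradient_checkpointing_disable()
model.eval()
with torch.no_grad():
    features = model(torch.zeros(1, 16000)).last_hidden_state  # 1 s of silence at 16 kHz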

Not sure why this pretrained model has gradient_checkpointing enabled in its config @patrickvonplaten? It will make everyone who wants to fine-tune it use gradient checkpointing by default, which is not something we want.
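
Until the hub config is cleaned up, a possible workaround sketch (that the flag shows up as a plain attribute on the loaded config is an assumption on my side): clear it on the config before building the model, or simply switch checkpointing off on the model explicitly:

from transformers import Wav2Vec2Config, Wav2Vec2Model

# Option 1: drop the deprecated flag from the config before building the model.
# (Assumes the flag surfaces as a plain attribute on the loaded config.)
config = Wav2Vec2Config.from_pretrained('facebook/wav2vec2-base')
if getattr(config, 'gradient_checkpointing', False):
    config.gradient_checkpointing = False
model = Wav2Vec2Model.from_pretrained('facebook/wav2vec2-base', config=config)

# Option 2: load as usual and explicitly turn checkpointing off on the model.
model = Wav2Vec2Model.from_pretrained('facebook/wav2vec2-base')
model.gradient_checkpointing_disable()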

That’s unfortunate, I’ve run into this too.

I have encountered the same problem. What is the reason for the warning, and what should I do if I only want to use the pretrained model?