Wav2vec2 finetuning custom dataset

Hello,

Thank you for sharing such a nice model on this framework again.

I am trying to fine-tune a wav2vec2 model on a custom dataset (i.e., not one from the Hugging Face datasets package). I have tried to follow these two tutorials:

I also encountered a memory issue on the GPU (16 GB) with a base wav2vec2 model, even with batch size = 1. What is the maximum batch size for a base and a large model on 16 GB, and with what sample length (using fp16)?

Thank you for the help.


If your custom dataset is a pandas DataFrame, you can convert it into the Hugging Face format like this:

from datasets import Dataset

# Convert pandas DataFrames into Hugging Face Dataset objects
train_data = Dataset.from_pandas(train_df)
test_data = Dataset.from_pandas(test_df)

If you are facing a memory-related problem, pass num_proc=1 to Dataset.map so preprocessing runs in a single process. That solved my problem.


Hello, I want to fine-tune this model on an Arabic tartil dataset. What changes should I make?
