How much data was used to pre-train facebook/wav2vec2-base

The original problem is here
I changed the pre-training model from facebook/wav2vec2-base-100k-voxpopuli to facebook/wav2vec2-base, and it works; everything is OK. However, facebook/wav2vec2-base-100k-voxpopuli was pretrained on 100k hours of data, and I don't know how much data was used to pretrain facebook/wav2vec2-base. Maybe there is something wrong with facebook/wav2vec2-base-100k-voxpopuli?

Hey @zzuczy,

We could indeed have explained it better on the model card. As you can see in the official paper (https://arxiv.org/pdf/2006.11477.pdf), the model was pretrained on the 960 hours of LibriSpeech.
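
In case it helps, here is a minimal sketch of loading either checkpoint with `transformers`, so swapping between them only means changing the model id (the surrounding fine-tuning code from your script is assumed and stays the same):

```python
# Minimal sketch: loading either pre-trained encoder with transformers.
# Assumes `pip install transformers torch`; only the checkpoint id differs.
from transformers import Wav2Vec2Model

# Pretrained on ~100k hours of unlabeled VoxPopuli speech.
voxpopuli = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-base-100k-voxpopuli")

# Pretrained on the 960 hours of LibriSpeech (per the wav2vec 2.0 paper).
base = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-base")
```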