Transformers module - parameter count and size

Normally, the model size / parameter count of a PyTorch model can be calculated like this:

pytorch_total_params = sum(p.numel() for p in model.parameters())
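For instance, on a plain `torch.nn.Module` (a toy `nn.Linear` here, just to illustrate the idiom) this sums the element count of every weight and bias tensor:

```python
import torch.nn as nn

# numel() gives the number of elements in one parameter tensor;
# summing over model.parameters() gives the total parameter count.
model = nn.Linear(10, 5)  # weight: 10 * 5 = 50 elements, bias: 5 elements
total = sum(p.numel() for p in model.parameters())
print(total)  # 55
```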

But with the transformers module, the usual approach doesn't work:

from transformers import Wav2Vec2Processor
processor = Wav2Vec2Processor.from_pretrained("facebook/hubert-xlarge-ls960-ft")
# processor.parameters() -> error (has no attribute parameters)
# Wav2Vec2Processor.parameters() -> error (has no attribute parameters)
# Wav2Vec2Processor.feature_extractor.parameters() -> error (has no attribute parameters) 

Is there any way to get the model size / parameter count of a transformers model?
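One likely explanation: `Wav2Vec2Processor` only bundles the tokenizer and feature extractor for preprocessing, so it carries no weights and is not an `nn.Module`. The weights live in a separate model class. A sketch, assuming what's wanted is the parameter count of the underlying acoustic model (a randomly initialised default-config `HubertModel` stands in here to avoid the multi-gigabyte checkpoint download):

```python
from transformers import HubertConfig, HubertModel

# For the real checkpoint you would load the model class, not the processor:
#   model = HubertModel.from_pretrained("facebook/hubert-xlarge-ls960-ft")
# Here a randomly initialised default-config model is used instead.
model = HubertModel(HubertConfig())

# The usual PyTorch idiom now works, since the model is an nn.Module.
total = sum(p.numel() for p in model.parameters())

# Approximate in-memory size: elements * bytes per element.
size_mb = sum(p.numel() * p.element_size() for p in model.parameters()) / 1024**2

print(total, round(size_mb, 1))
```

Transformers models also expose a built-in helper, `model.num_parameters()`, which returns the same count.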