Attention heads of a PEFT fine-tuned Whisper model

Hi,

I'm referring to this statement:

“If you fine-tuned your own checkpoint, you may need to inspect the cross-attention weights to find the appropriate layers and attention heads”

I'm looking for a way to identify the cross-attention heads of a PEFT fine-tuned Whisper model. I need them as a list of [layer, head] pairs, in this format:
[[10, 12], [13, 17], [16, 11], [16, 12], [16, 13], [17, 15], [17, 16], [18, 4], [18, 11], [18, 19], [19, 11], [21, 2], [21, 3], [22, 3], [22, 9], [22, 12], [23, 5], [23, 7], [23, 13], [25, 5], [26, 1], [26, 12], [27, 15]]
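To make the question concrete, here is roughly what I'm attempting. It's only a sketch, not a working solution: the checkpoint names (`openai/whisper-large-v2`, `my-peft-adapter`) are placeholders for my own models, I'm assuming a LoRA-style adapter that can be merged back into the base model, and the head-scoring heuristic (how monotonically each head's attention peak moves across the audio frames while decoding) is just my own guess at what "inspecting the cross-attention weights" might mean.

```python
import torch
from datasets import load_dataset
from transformers import WhisperProcessor, WhisperForConditionalGeneration
from peft import PeftModel

# Placeholders for my base model and PEFT adapter; merge the adapter
# back into the base model so it behaves like a plain Whisper checkpoint
# (assumes a LoRA-style adapter that supports merge_and_unload).
base = WhisperForConditionalGeneration.from_pretrained("openai/whisper-large-v2")
model = PeftModel.from_pretrained(base, "my-peft-adapter").merge_and_unload()
processor = WhisperProcessor.from_pretrained("openai/whisper-large-v2")

# Any 16 kHz sample clip will do; this dummy dataset is just for the example.
ds = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
audio = ds[0]["audio"]["array"]
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    out = model.generate(
        inputs.input_features,
        return_dict_in_generate=True,
        output_attentions=True,
    )

# out.cross_attentions: tuple (per generated token) of tuples (per decoder layer)
# of tensors shaped [batch, heads, query_len, audio_frames]
num_layers = model.config.decoder_layers
num_heads = model.config.decoder_attention_heads

scores = torch.zeros(num_layers, num_heads)
for layer in range(num_layers):
    # Take the last query position of each decoding step -> [tokens, heads, frames]
    attn = torch.cat(
        [step[layer][0, :, -1, :].unsqueeze(0) for step in out.cross_attentions]
    )
    peaks = attn.argmax(dim=-1).float()           # peak audio frame per token, [tokens, heads]
    diffs = peaks[1:] - peaks[:-1]                # frame movement between consecutive tokens
    scores[layer] = (diffs >= 0).float().mean(0)  # fraction of monotonic steps per head

# My guess: keep the most "monotonic" heads and print them as [[layer, head], ...]
flat = scores.flatten().topk(20).indices
alignment_heads = [[int(i // num_heads), int(i % num_heads)] for i in flat]
print(alignment_heads)
```

Is something like this (merging the adapter and scoring heads by how monotonic their cross-attention is) a reasonable way to get these indices, or is there an established procedure for a fine-tuned checkpoint?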

Thanks