Understanding output_attentions=True

I have a question about output_attentions=True. I need to make a heatmap of the attention from the final layer of a BERT model, but I do not know whether output_attentions[0] corresponds to the first or the last layer. I tried checking the documentation but could not find an answer.
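
Here is a minimal sketch of what I am trying. It assumes bert-base-uncased and that the attentions tuple is ordered first layer to last layer, which is exactly the assumption I would like to confirm:

```python
import torch
from transformers import BertTokenizer, BertModel
import matplotlib.pyplot as plt

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

text = "The quick brown fox jumps over the lazy dog."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Tuple with one tensor per layer, each of shape
# (batch_size, num_heads, seq_len, seq_len)
attentions = outputs.attentions

# Assumption: the tuple is ordered first -> last, so [-1] would be the
# final layer and [0] the first layer.
last_layer = attentions[-1][0]       # (num_heads, seq_len, seq_len)
head_avg = last_layer.mean(dim=0)    # average over heads for the heatmap

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
plt.imshow(head_avg.numpy(), cmap="viridis")
plt.xticks(range(len(tokens)), tokens, rotation=90)
plt.yticks(range(len(tokens)), tokens)
plt.colorbar()
plt.title("BERT final-layer attention (head average)")
plt.tight_layout()
plt.show()
```

Is indexing with [-1] the right way to get the final layer, or should I be using [0]?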