Replacing the default causal mask of T5 with a custom mask

I’m a beginner in NLP, and I’m using T5 for my experiment. My goal is to modify the mask of the T5 decoder; as the official docs say, T5 uses a default causal mask in the decoder.
transformers/src/transformers/modeling_utils.py at main · huggingface/transformers · GitHub
I wonder how I can modify this default mask and define it myself.

Hey @yoohell, you can create your own decoder_attention_mask and pass it to T5 instead of using the default one.
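For example, a minimal sketch (the shapes are made up for illustration, and the model call is commented out since it needs pretrained weights). Note that a 2D mask of this shape only marks which positions are padding; T5 still applies its causal mask on top of it:

```python
import torch

# Toy 2D decoder_attention_mask of shape
# (batch_size, target_sequence_length); 1 = attend, 0 = masked.
decoder_attention_mask = torch.tensor([
    [1, 1, 1, 0, 0],   # sequence 1: last two positions are padding
    [1, 1, 1, 1, 1],   # sequence 2: nothing masked
])

# Passed to the model like:
# outputs = model(input_ids=input_ids,
#                 decoder_input_ids=decoder_input_ids,
#                 decoder_attention_mask=decoder_attention_mask)
```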

Do you mean the decoder_attention_mask of the forward() function?
I found it in the T5 docs, which say:
decoder_attention_mask (torch.BoolTensor of shape (batch_size, target_sequence_length), optional) — Default behavior: generate a tensor that ignores pad tokens in decoder_input_ids. Causal mask will also be used by default.
and the causal mask matrix is built by a built-in function; is there any way I can replace it with my custom matrix?
Thanks for replying!
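One possibility, based on get_extended_attention_mask in the modeling_utils.py file linked above: when the mask you pass is 2D, the causal mask is built for you, but a 3D mask of shape (batch_size, target_len, target_len) is used as-is, so passing a 3D matrix is one way to replace the built-in causal mask. A minimal sketch (the custom edit is just an illustration, and the model call is commented out since it needs pretrained weights):

```python
import torch

batch_size, tgt_len = 2, 5

# Start from the standard causal (lower-triangular) mask ...
custom_mask = torch.tril(torch.ones(tgt_len, tgt_len, dtype=torch.long))

# ... then edit it however you like; e.g. let position 3 also
# attend to the future position 4 (purely as an illustration):
custom_mask[3, 4] = 1

# Expand to (batch_size, tgt_len, tgt_len): a 3D mask is taken
# as-is, so no causal mask is added on top of it.
custom_mask = custom_mask.unsqueeze(0).expand(batch_size, -1, -1)

# Then pass it as decoder_attention_mask, e.g.:
# from transformers import T5ForConditionalGeneration
# model = T5ForConditionalGeneration.from_pretrained("t5-small")
# outputs = model(input_ids=input_ids,
#                 attention_mask=attention_mask,
#                 decoder_input_ids=decoder_input_ids,
#                 decoder_attention_mask=custom_mask)
```

It may be worth double-checking against the version of transformers you are running, since this relies on the 3D branch of get_extended_attention_mask.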