Limit length of output sequence in Seq2Seq model

Hi, I’m developing a Seq2Seq model with BERT. However, I’ve run into a problem: the output sequence is usually very long, and the words toward the end tend to be rubbish. How can I limit the length of the output sequence? Could you give an example with code? Thanks in advance.
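
For concreteness, here is a minimal sketch of the kind of length cap I have in mind, assuming a plain PyTorch greedy decoding loop. `ToyDecoder`, the special-token ids, and `MAX_LEN` are placeholders standing in for my real model, not actual code from it:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the real decoder: maps the tokens generated
# so far to next-token logits over the vocabulary.
class ToyDecoder(nn.Module):
    def __init__(self, vocab_size=100, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.rnn = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, tokens):
        h, _ = self.rnn(self.embed(tokens))
        return self.out(h[:, -1])  # logits for the next token only

BOS_ID, EOS_ID, MAX_LEN = 1, 2, 20  # placeholder special-token ids and cap

decoder = ToyDecoder()
tokens = torch.tensor([[BOS_ID]])  # start the sequence with BOS

with torch.no_grad():
    for _ in range(MAX_LEN):        # hard cap on the output length
        logits = decoder(tokens)
        next_id = logits.argmax(dim=-1, keepdim=True)  # greedy choice
        tokens = torch.cat([tokens, next_id], dim=1)
        if next_id.item() == EOS_ID:  # stop early at end-of-sequence
            break

print(tokens.squeeze(0).tolist())
```

If it matters, I know that with Hugging Face `transformers` one can pass `max_length` (or `max_new_tokens`) and `eos_token_id` to `model.generate(...)` to get the same effect, but I’d like to understand how to do it in a hand-written decoding loop like the one above.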