What value should the sequence_length parameter be when converting to TFLite?

I am converting a model hosted by HuggingFace to a TensorFlow Lite model by using HuggingFace’s Optimum exporter.

In their documentation they use the following example:

optimum-cli export tflite --model google-bert/bert-base-uncased --sequence_length 128 bert_tflite/

As you can see, they specify sequence_length as 128, which the help output describes as:

Sequence length that the TFLite exported model will be able to take as input.

When converting another model, what sequence_length should I use? Shouldn't it simply match the maximum sequence length of the original model, and if so, how do I find that value?
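
For reference, this is how I assumed one could look up the original model's maximum length (a sketch using the transformers library; I'm not certain either of these values is what the exporter actually expects for sequence_length):

```python
from transformers import AutoConfig, AutoTokenizer

model_id = "google-bert/bert-base-uncased"

# The model's architectural limit on input length
# (for BERT this comes from its positional embeddings).
config = AutoConfig.from_pretrained(model_id)
print("max_position_embeddings:", config.max_position_embeddings)

# The tokenizer's configured maximum; often the same value, but not always.
tokenizer = AutoTokenizer.from_pretrained(model_id)
print("tokenizer.model_max_length:", tokenizer.model_max_length)
```

Both of these report 512 for bert-base-uncased, yet the documentation example exports with 128, which is what prompted my question.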