Sequence_length vs context_length in Autoformer

Hello,
I’m trying the AutoformerModel for the first time. Looking at the example provided here, I’m confused about the difference between the sequence_length (the second dimension of the past_values) and the context_length in the config file.
In the example above, the context_length is 24, while the sequence_length is 61.

Thank you,

Hello,

The context_length is the number of past time steps the model looks at to make a prediction, i.e. the amount of historical data the model uses to understand the context before forecasting. The sequence_length of past_values, on the other hand, must equal context_length + max(config.lags_sequence): the model builds lagged features internally, so it needs extra history beyond the context window to compute the largest lag. That’s why in your example 61 = 24 + 37, meaning max(config.lags_sequence) is 37.

More generally, though, “sequence length” as used when preparing a dataset often refers to the total length of each sequence or sample, which includes both the historical data (context) and the future data points you want to predict.
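To make the shape arithmetic concrete, here is a minimal sketch building the model from a config instead of the pretrained checkpoint in the docs example. The lags_sequence values here are an assumption (typical monthly lags), chosen so that max(lags) = 37 and the relationship 61 = 24 + 37 from your example holds; num_time_features=1 is also an assumption just to give the time-feature tensors a concrete last dimension:

```python
import torch
from transformers import AutoformerConfig, AutoformerModel

# Assumed lags for monthly data; max(lags) = 37 so that
# sequence_length = context_length + max(lags) = 24 + 37 = 61
lags = [1, 2, 3, 4, 5, 6, 7, 11, 12, 13, 23, 24, 25, 35, 36, 37]

config = AutoformerConfig(
    prediction_length=24,
    context_length=24,
    lags_sequence=lags,
    num_time_features=1,  # assumption: a single time feature (e.g. month index)
)
model = AutoformerModel(config)

# past_values must cover the context window plus the largest lag
sequence_length = config.context_length + max(config.lags_sequence)
print(sequence_length)  # 61

batch_size = 2
outputs = model(
    past_values=torch.randn(batch_size, sequence_length),
    past_time_features=torch.randn(batch_size, sequence_length, config.num_time_features),
    past_observed_mask=torch.ones(batch_size, sequence_length),
    future_values=torch.randn(batch_size, config.prediction_length),
    future_time_features=torch.randn(batch_size, config.prediction_length, config.num_time_features),
)
print(outputs.last_hidden_state.shape)
```

If past_values were only context_length (24) steps long, the model couldn’t look back 37 steps to build the lagged features, which is exactly why the batch in the docs example carries 61 steps.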