Time Series Transformer for prediction: generate() raises RuntimeError

Hello everyone,

I trained a Time Series Transformer for prediction.

I am attempting something a bit unorthodox: my interest is not really forecasting, but rather learning the conditional distribution associated with a stationary time series, with the objective of precisely extrapolating its long-term autocorrelation function.

For this purpose, my inputs are 3-d historical trajectories of context_length=128 steps with no lags, and I am trying to predict the subsequent first difference (the variation Delta X_t, if you wish) instead of the absolute value X_t. The prediction length is therefore only 1.
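Concretely, here is a toy sketch of how one training pair looks under this setup (the random trajectory and all names below are illustrative placeholders, not my actual pipeline):

import torch

# Illustrative (T, 3) stationary series standing in for real data.
trajectory = torch.randn(1000, 3)
context_length = 128
t = 500  # an arbitrary prediction point

past_values = trajectory[t - context_length : t]                  # (128, 3) history window
future_values = (trajectory[t] - trajectory[t - 1]).unsqueeze(0)  # (1, 3) target: Delta X_t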

The NLL looks good: it becomes negative (which is expected for a continuous distribution, since the loss is a negative log density) and stabilizes to a certain value after around 100 epochs.

My problem is that when I try to use the generate method, it raises an error.

The architecture config dict is the following:

TimeSeriesTransformerConfig {
  "activation_dropout": 0.1,
  "activation_function": "gelu",
  "attention_dropout": 0.1,
  "cardinality": [
    0
  ],
  "context_length": 128,
  "d_model": 32,
  "decoder_attention_heads": 1,
  "decoder_ffn_dim": 32,
  "decoder_layerdrop": 0.1,
  "decoder_layers": 1,
  "distribution_output": "normal",
  "dropout": 0.1,
  "embedding_dimension": [
    0
  ],
  "encoder_attention_heads": 1,
  "encoder_ffn_dim": 32,
  "encoder_layerdrop": 0.1,
  "encoder_layers": 1,
  "feature_size": 11,
  "init_std": 0.02,
  "input_size": 3,
  "is_encoder_decoder": true,
  "lags_sequence": [
    0
  ],
  "loss": "nll",
  "model_type": "time_series_transformer",
  "num_dynamic_real_features": 0,
  "num_parallel_samples": 1,
  "num_static_categorical_features": 0,
  "num_static_real_features": 1,
  "num_time_features": 1,
  "prediction_length": 1,
  "scaler": false,
  "scaling": "mean",
  "transformers_version": "4.37.2",
  "use_cache": true
}
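For reference, a minimal sketch of how an equivalent config could be built with the standard TimeSeriesTransformerConfig keywords (feature_size is derived by the library from the other settings rather than passed in):

from transformers import (
    TimeSeriesTransformerConfig,
    TimeSeriesTransformerForPrediction,
)

config = TimeSeriesTransformerConfig(
    prediction_length=1,
    context_length=128,
    input_size=3,
    lags_sequence=[0],        # note: the library default is [1, 2, 3, 4, 5, 6, 7]
    distribution_output="normal",
    loss="nll",
    scaling="mean",
    num_time_features=1,
    num_static_real_features=1,
    d_model=32,
    encoder_layers=1,
    decoder_layers=1,
    encoder_attention_heads=1,
    decoder_attention_heads=1,
    encoder_ffn_dim=32,
    decoder_ffn_dim=32,
    num_parallel_samples=1,
)
model = TimeSeriesTransformerForPrediction(config)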

At inference time:

def sample(self, history_input, features):
    # Time features: a 1..128 step counter over the history window.
    past_time_features = (
        torch.ones(history_input.size(0), history_input.size(1), 1)
        .cumsum(dim=1)
        .to(history_input)
    )
    # All past observations are valid; note that .to(history_input) casts
    # the bool mask to the history's dtype and device.
    past_obs_mask = torch.ones(
        history_input.size(0), history_input.size(1), 3, dtype=torch.bool
    ).to(history_input)
    # Time feature for the single step to be predicted (step 129).
    future_time_features = 129 * torch.ones(history_input.size(0), 1, 1).to(history_input)
    outputs = self.timeseries_transformer.generate(
        past_values=history_input,
        past_time_features=past_time_features,
        past_observed_mask=past_obs_mask,
        static_real_features=features[:, : self.modes + self.temp],
        future_time_features=future_time_features,
    )
    return outputs
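For what it's worth, the documented expectation is that past_values spans context_length + max(lags_sequence) steps, and a pre-flight check like the following (a hypothetical helper, not part of my actual code) passes with my shapes:

def check_past_len(history_input, config):
    # past_values should span context_length + max(lags_sequence) steps;
    # with the config above that is 128 + 0 = 128.
    expected = config.context_length + max(config.lags_sequence)
    actual = history_input.shape[1]
    assert actual == expected, f"past_values spans {actual} steps, expected {expected}"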

And the following error appears:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/home/gmcherch/.cache/pypoetry/virtualenvs/neural-ar-p_BuPOL2-py3.9/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/gmcherch/.cache/pypoetry/virtualenvs/neural-ar-p_BuPOL2-py3.9/lib/python3.9/site-packages/transformers/models/time_series_transformer/modeling_time_series_transformer.py", line 1767, in generate
    decoder_input = torch.cat((reshaped_lagged_sequence, repeated_features[:, : k + 1]), dim=-1)
RuntimeError: Sizes of tensors must match except in dimension 2. Expected size 128 but got size 1 for tensor number 1 in the list.

I don’t understand why this error appears, since I am feeding what look like the correct tensors as inputs.
Is there something I didn’t set properly in the configuration phase?
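For what it's worth, my reading of the library's lag slicing around that line (a stripped-down reconstruction, not verbatim source) suggests where the 128 comes from when a lag of 0 meets the internal shift of 1 at the first decoding step:

import torch

sequence = torch.randn(4, 128, 3)  # stands in for the repeated past values
subsequences_length = 1            # k + 1 at the first decoding step
for lag in [0]:                    # lags_sequence from the config above
    lag_index = lag - 1            # generate slices with shift=1
    begin = -lag_index - subsequences_length   # = 0 when lag == 0
    end = -lag_index if lag_index > 0 else None
    print(sequence[:, begin:end].shape)        # torch.Size([4, 128, 3]), not length 1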

Thanks in advance to anyone who can provide useful advice.

[EDIT]

The issue was already filed and answered here: