Setting up a time series transformer

I’ve been trying to set up the TimeSeriesTransformerForPrediction model with a time series dataset I’ve preprocessed for it.
After a painstaking process, I’ve gotten to the point in the script where I’m just about ready to start training the model.

However, I am thoroughly stuck on this issue:

```
  File "C:\Users\difrf\ASSEMBLE TRANSFORMERS\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "C:\Users\difrf\ASSEMBLE TRANSFORMERS\lib\site-packages\transformers\models\time_series_transformer\modeling_time_series_transformer.py", line 1430, in forward
    transformer_inputs, loc, scale, static_feat = self.create_network_inputs(
  File "C:\Users\difrf\ASSEMBLE TRANSFORMERS\lib\site-packages\transformers\models\time_series_transformer\modeling_time_series_transformer.py", line 1303, in create_network_inputs
    torch.cat(
TypeError: expected Tensor as element 1 in argument 0, but got NoneType
  0%|          | 0/81910 [00:00<?, ?it/s]
```

After triple-checking that I’m passing past_values, future_values, past_time_features, and past_observed_mask as tensors, I’m still receiving this error.

I’m starting to suspect there might be another required positional argument I’m not aware of, or maybe I’m just hopelessly stupid. Any help or feedback on how to get past this issue would be greatly appreciated.

You need to also pass in future_time_features.

I just got an Informer model to start training and had to solve this same problem. I figured it out by looking at the library code around where torch.cat() is called.
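For reference, here’s a minimal sketch of a forward pass that includes it. The config values and random tensors are made up purely for illustration; your dataset dictates the real shapes:

```python
import torch
from transformers import TimeSeriesTransformerConfig, TimeSeriesTransformerForPrediction

# Illustrative config; use your own prediction/context lengths and feature counts.
config = TimeSeriesTransformerConfig(
    prediction_length=24,
    context_length=48,
    num_time_features=2,
)
model = TimeSeriesTransformerForPrediction(config)

batch_size = 4
# past_values must cover the context window plus the largest lag.
past_length = config.context_length + max(config.lags_sequence)

outputs = model(
    past_values=torch.randn(batch_size, past_length),
    past_time_features=torch.randn(batch_size, past_length, config.num_time_features),
    past_observed_mask=torch.ones(batch_size, past_length),
    future_values=torch.randn(batch_size, config.prediction_length),
    # Without this argument, create_network_inputs() tries to torch.cat() a None:
    future_time_features=torch.randn(batch_size, config.prediction_length, config.num_time_features),
)
loss = outputs.loss
```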

I’m now encountering the rather odd behavior of negative loss values showing up during training. I’m curious whether you see the same behavior once your training starts. https://discuss.huggingface.co/t/positive-loss-value-changes-to-negative-loss-while-training-informer-or-timeseriestransformer-model/37304


I had the same problem with the Time Series Transformer.
It’s a bug in an older version of the transformers library. Upgrading to 4.28 solved the problem for me.
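If you’re unsure which version you’re running, you can check it from Python before upgrading:

```python
import transformers

print(transformers.__version__)
# If it's older than the release containing the fix, upgrade from a shell:
#   pip install --upgrade transformers
```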

Hi,

I am not acquainted with the Informer network, but if the Informer is optimised through maximum likelihood on a continuous probability density (Student-t or normal), like the Time Series Transformer for prediction, a negative loss can happen.
It means the model is assigning very little spread to its predictions: a sharply peaked density can exceed 1 at the observed value, so the log-likelihood becomes positive and the negative log-likelihood goes negative.
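A quick way to convince yourself of this is to evaluate the negative log-likelihood of a narrow normal distribution directly (the numbers here are made up for illustration):

```python
import torch
from torch.distributions import Normal

# A sharply peaked predictive distribution: its density near the mean
# exceeds 1, so log_prob is positive and the NLL is negative.
dist = Normal(loc=torch.tensor(0.0), scale=torch.tensor(0.05))
nll = -dist.log_prob(torch.tensor(0.01))
print(nll.item())  # ≈ -2.06: a negative loss from a confident, well-fitting model
```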