Test data size error in TimeSeriesTransformer

I am trying to apply TimeSeriesTransformer to the ETT dataset, which, unlike the Hugging Face example, has num_dynamic_real_features = 6. When I run the following part of the code:
```python
model.eval()

forecasts = []

for batch in test_dataloader:
    outputs = model.generate(
        static_categorical_features=batch["static_categorical_features"].to(device)
        if config.num_static_categorical_features > 0
        else None,
        static_real_features=batch["static_real_features"].to(device)
        if config.num_static_real_features > 0
        else None,
        past_time_features=batch["past_time_features"].to(device),
        past_values=batch["past_values"].to(device),
        future_time_features=batch["future_time_features"].to(device),
        past_observed_mask=batch["past_observed_mask"].to(device),
    )
    forecasts.append(outputs.sequences.cpu().numpy())
```

I get this error:

```
ValueError: all the input array dimensions except for the concatenation axis must match exactly, but along dimension 1, the array at index 0 has size 8664 and the array at index 2 has size 8640
```

The size of the test_dataset is 8640. The size difference is always a multiple of prediction_length, which in this case is 24.
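For what it's worth, the mismatch can be reproduced outside the model. Below is a minimal sketch with made-up shapes (scaled down from the real 8664 vs. 8640 window counts) showing that NumPy's concatenation fails when the per-series forecast arrays differ along axis 1 by prediction_length, plus a pragmatic workaround of trimming every array to the shortest window count before stacking. This trim is my own assumption about how to align the arrays, not an official fix:

```python
import numpy as np

# Illustrative stand-in for the real arrays: two per-series forecast
# arrays whose window dimension (axis 1) differs by prediction_length,
# just like the 8664-vs-8640 mismatch in the traceback.
prediction_length = 24
a = np.zeros((4, 48 + prediction_length, prediction_length))  # extra windows
b = np.zeros((4, 48, prediction_length))

try:
    np.vstack([a, b])  # concatenates along axis 0, so axis 1 must match
except ValueError as e:
    print("reproduced:", e)

# Workaround (assumption): drop the surplus windows so all arrays share
# the smallest window count, then stack.
min_windows = min(arr.shape[1] for arr in (a, b))
stacked = np.vstack([arr[:, :min_windows] for arr in (a, b)])
print(stacked.shape)  # (8, 48, 24)
```

Whether dropping those extra prediction_length windows is correct depends on how the test split was built, so this is only a diagnostic aid.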

Hope someone can help me.