Variable length batch decoding

As this is an autoregressive model that predicts the next token based on the previous tokens, it might not generate correct tokens when there are EOS tokens in the text.
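
To make that concrete, here is a minimal sketch of the problem (assuming GPT-2 via a recent `transformers` release; the prompts are only illustrative): if the shorter sequence in a batch is right-padded with EOS, the position used for the next-token prediction comes after the padding, so that prompt's continuation is conditioned on EOS tokens rather than on its real last token.

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompts = ["My name is", "The capital of France is Paris and"]
encoded = [tokenizer.encode(p) for p in prompts]
max_len = max(len(ids) for ids in encoded)
eos_id = tokenizer.eos_token_id

# Right-pad the shorter prompt with EOS so the batch is rectangular.
padded = [ids + [eos_id] * (max_len - len(ids)) for ids in encoded]
input_ids = torch.tensor(padded)

with torch.no_grad():
    logits = model(input_ids).logits  # (batch, seq_len, vocab_size)

# Taking logits[:, -1] as the "next token" reads the last *padded* position,
# so the shorter prompt's prediction follows trailing EOS tokens instead of
# its actual last prompt token.
next_tokens = logits[:, -1].argmax(dim=-1)
print(tokenizer.convert_ids_to_tokens(next_tokens.tolist()))
```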

I thought you were asking about batching at training time. Sorry about the misleading answer.

Right now, `generate` does not support batched generation for GPT-2.
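
Until it does, a workaround is to loop over the examples and call `generate` one sequence at a time, which avoids padding altogether. A rough sketch (the prompts and `max_length` value here are only illustrative):

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompts = ["My name is", "The capital of France is"]
completions = []
for text in prompts:
    # Encode and generate one prompt at a time, so no padding is needed.
    input_ids = tokenizer.encode(text, return_tensors="pt")
    with torch.no_grad():
        output_ids = model.generate(input_ids, max_length=30)
    completions.append(tokenizer.decode(output_ids[0], skip_special_tokens=True))

print(completions)
```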

Pinging @lysandre
