Generation utils for PyTorch transformers

Hello,
I have an encoder-decoder transformer model built from standard PyTorch modules (nn.TransformerEncoder, nn.TransformerDecoder), which I have managed to train. I am hoping to use Hugging Face's implementation of beam search with this model.

So the question is: is there a way to attach the generation utilities described here: Generation? If not, what are the Hugging Face equivalents of nn.TransformerEncoder, nn.TransformerEncoderLayer, nn.TransformerDecoder, and nn.TransformerDecoderLayer, so that I can use the generate function from Hugging Face Transformers instead?
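
For context, here is a rough, untested sketch of the kind of wrapper I imagine would be needed: subclass PreTrainedModel (and, on newer transformers versions, GenerationMixin explicitly) around the existing nn.Transformer modules so that generate() can drive beam search. All names here (CustomSeq2SeqConfig, CustomSeq2SeqForGeneration, EncoderWrapper) and the hyperparameter values are placeholders, positional encodings are omitted, and the exact hooks generate() relies on (get_encoder, prepare_inputs_for_generation, is_encoder_decoder) may differ between transformers versions, so please treat this as a starting point rather than a recipe:

```python
import torch
import torch.nn as nn
from transformers import PretrainedConfig, PreTrainedModel
from transformers.generation import GenerationMixin  # older versions: PreTrainedModel already mixes this in
from transformers.modeling_outputs import BaseModelOutput, Seq2SeqLMOutput


class CustomSeq2SeqConfig(PretrainedConfig):
    """Hypothetical config; fields/values are placeholders for the real model's hyperparameters."""
    model_type = "custom_seq2seq"

    def __init__(self, vocab_size=1000, d_model=256, nhead=4, num_layers=2,
                 pad_token_id=0, bos_token_id=1, eos_token_id=2,
                 decoder_start_token_id=1, is_encoder_decoder=True, **kwargs):
        self.vocab_size = vocab_size
        self.d_model = d_model
        self.nhead = nhead
        self.num_layers = num_layers
        super().__init__(
            pad_token_id=pad_token_id, bos_token_id=bos_token_id, eos_token_id=eos_token_id,
            decoder_start_token_id=decoder_start_token_id,
            is_encoder_decoder=is_encoder_decoder, **kwargs,
        )


class EncoderWrapper(nn.Module):
    """Exposes the nn.TransformerEncoder with the interface generate() expects from get_encoder()."""

    def __init__(self, embed, encoder):
        super().__init__()
        self.embed = embed
        self.encoder = encoder

    def forward(self, input_ids=None, attention_mask=None, **kwargs):
        src = self.embed(input_ids)
        pad_mask = None if attention_mask is None else (attention_mask == 0)
        memory = self.encoder(src, src_key_padding_mask=pad_mask)
        return BaseModelOutput(last_hidden_state=memory)


class CustomSeq2SeqForGeneration(PreTrainedModel, GenerationMixin):
    config_class = CustomSeq2SeqConfig

    def __init__(self, config):
        super().__init__(config)
        # A real model would also add positional encodings here
        self.shared = nn.Embedding(config.vocab_size, config.d_model)
        enc_layer = nn.TransformerEncoderLayer(config.d_model, config.nhead, batch_first=True)
        dec_layer = nn.TransformerDecoderLayer(config.d_model, config.nhead, batch_first=True)
        self.encoder = EncoderWrapper(self.shared, nn.TransformerEncoder(enc_layer, config.num_layers))
        self.decoder = nn.TransformerDecoder(dec_layer, config.num_layers)
        self.lm_head = nn.Linear(config.d_model, config.vocab_size)

    def get_encoder(self):
        # generate() runs the encoder once up front and reuses encoder_outputs at every step
        return self.encoder

    def prepare_inputs_for_generation(self, decoder_input_ids, encoder_outputs=None,
                                      attention_mask=None, **kwargs):
        # No decoder cache here, so the full prefix is re-decoded at every step (correct but slow)
        return {
            "decoder_input_ids": decoder_input_ids,
            "encoder_outputs": encoder_outputs,
            "attention_mask": attention_mask,
        }

    def forward(self, input_ids=None, attention_mask=None, decoder_input_ids=None,
                encoder_outputs=None, **kwargs):
        if encoder_outputs is None:  # training / direct calls
            encoder_outputs = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        memory = encoder_outputs.last_hidden_state
        tgt = self.shared(decoder_input_ids)
        tgt_mask = nn.Transformer.generate_square_subsequent_mask(tgt.size(1)).to(tgt.device)
        mem_pad = None if attention_mask is None else (attention_mask == 0)
        hidden = self.decoder(tgt, memory, tgt_mask=tgt_mask, memory_key_padding_mask=mem_pad)
        return Seq2SeqLMOutput(logits=self.lm_head(hidden), encoder_last_hidden_state=memory)


if __name__ == "__main__":
    model = CustomSeq2SeqForGeneration(CustomSeq2SeqConfig()).eval()
    src = torch.randint(3, 1000, (2, 16))                   # dummy source batch
    out = model.generate(src, num_beams=4, max_length=20)   # beam search via GenerationMixin
    print(out.shape)
```

Since this sketch returns no past_key_values, every generation step re-runs the decoder over the whole prefix, so it would be slower than the built-in Hugging Face seq2seq models, but beam search itself should behave the same way.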


I am looking for something similar. Were you able to find a solution for this?