Stop generation at a certain token while using `past` in GPT-2

I am writing custom backend support for a game using GPT-2. I am trying to tokenize `\n` so that generation stops when a new line is reached. The encoded token appears blank, with nothing visible in it. Is there a way, while using `past`, to stop generation at a certain token?

This is a modified example.

Code 1

`EndTokens = enc.encode(EndToken)`

`EndToken` is `\n`.

Code 2

        if len(EndTokens) > 0 and prev[0, 0] == EndTokens[0]:

`prev` is an output from GPT-2. This code runs inside a loop that generates a set number of tokens.