Error with <|endoftext|> in Tokenizer GPT2

Sorry, I realized my mistake. It should have been like this:
tokenizer = AutoTokenizer.from_pretrained…
model = AutoModelWithLMHead.from_pretrained…
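
For anyone landing here, a minimal runnable sketch of what those two lines expand to, assuming the Hugging Face transformers library and the stock "gpt2" checkpoint (the checkpoint name and the printed checks are assumptions on my part, not from the original snippet):

    # Minimal sketch: load the GPT-2 tokenizer and LM-head model via the Auto classes.
    # The "gpt2" checkpoint name is an assumption; substitute your own model id.
    from transformers import AutoTokenizer, AutoModelWithLMHead

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelWithLMHead.from_pretrained("gpt2")

    # <|endoftext|> is GPT-2's end-of-text (EOS) token and is already part of the
    # vocabulary, so the tokenizer maps it to a single id.
    print(tokenizer.eos_token)     # <|endoftext|>
    print(tokenizer.eos_token_id)  # 50256

Note that AutoModelWithLMHead is deprecated in recent transformers releases; for GPT-2 the usual replacement is AutoModelForCausalLM.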


“If nothing works, then read the instructions”