How to obtain GPT-2 tokens without the Transformers library?

Atm, I’m using GPT2TokenizerFast from transformers to restrict input length for OpenAI’s GPT-3 (GPT-3 and GPT-2 use the same tokenizer). It feels wasteful to install transformers on production K8s pods when all I need is the tokenizer, yet I was unable to find a pretrained GPT-2 tokenizer in the 🤗 Tokenizers library.
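
For context, here is a minimal sketch of what I’m doing today (the 2048-token limit and the `count_tokens` helper are just illustrative, not exact production code):

```python
# Current approach: pull in the full transformers package just for the tokenizer.
from transformers import GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")

def count_tokens(text: str) -> int:
    """Return the number of GPT-2 BPE tokens in `text`."""
    return len(tokenizer.encode(text))

prompt = "Example prompt sent to the OpenAI completion endpoint."
if count_tokens(prompt) > 2048:  # illustrative context-window limit
    raise ValueError("Prompt exceeds the model's context window.")
```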

Can anyone please recommend a more efficient approach for obtaining GPT-2 tokens that doesn’t require installing the entire transformers library?