Moving tokenizers between machines

Because of some restrictions I have to work with, I need to use the Hugging Face libraries on a server without network access. This is extremely cumbersome.

What I am trying to do is download each model, dataset, and tokenizer locally, save them, and copy them over to the server. But I am running into an issue when moving tokenizers, because the saved files still contain references to local paths. For instance, I tried this:

from transformers import BertTokenizer

# Download from the Hub, then save everything to a local directory
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
tokenizer.save_pretrained('/Users/andrea/data/models/bert-base-uncased')

I thought the contents of /Users/andrea/data/models/bert-base-uncased would be enough to reload the tokenizer elsewhere, but unfortunately that is not the case. In particular, /Users/andrea/data/models/bert-base-uncased/tokenizer_config.json contains

{
  "tokenizer_file": "/Users/andrea/.cache/huggingface/transformers/534479488c54aeaf9c3406f647aa2ec13648c06771ffe269edabebd4c412da1d.7f2721073f19841be16f41b0a70b600ca6b880c8f3df6f3535cbc704371bdfa4"
}

which is a path on my local machine.
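
For context, on the server I then try to reload the tokenizer from the copied directory, roughly like this (the destination path is just an example of where I put the files):

from transformers import BertTokenizer

# Hypothetical path on the server where I copied the saved directory
tokenizer = BertTokenizer.from_pretrained('/data/models/bert-base-uncased')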

Now, I could rewrite that entry by hand, copy the cached file over as well, and so on, but this is inconvenient and I am not sure whether I would still be missing something else. Is there a reliable way to move a tokenizer to a different machine so that it is usable there without internet access? (For reference, the manual fix I have in mind is sketched below.)
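
Just to be concrete, the manual rewrite I am picturing would be something like the following: drop the "tokenizer_file" entry so that from_pretrained falls back to the vocab files that save_pretrained wrote into the same directory. I have not verified that this covers every case, which is exactly why I am asking.

import json
from pathlib import Path

# Path to the tokenizer_config.json produced by save_pretrained above
config_path = Path('/Users/andrea/data/models/bert-base-uncased/tokenizer_config.json')

config = json.loads(config_path.read_text())
# Remove the absolute cache path that only exists on my local machine
config.pop('tokenizer_file', None)
config_path.write_text(json.dumps(config, indent=2))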