Loading local tokenizer (RobertaTokenizerFast.from_pretrained)

Hello, I have been following this tutorial: Google Colab. However, I cannot get around an issue with loading my locally saved vocab and merges files for the tokenizer.

When I use:

from transformers import RobertaTokenizerFast

tokenizer = RobertaTokenizerFast.from_pretrained(r"C:\Users\folder", max_len=512)

I get:

OSError: Can't load tokenizer for 'C:\Users\folder'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'C:\Users\folder' is the correct path to a directory containing all relevant files for a RobertaTokenizerFast tokenizer.
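For context, the vocab and merges files were produced roughly like this, following the tutorial (a sketch; the corpus filename is a placeholder):

from tokenizers import ByteLevelBPETokenizer

# Train a byte-level BPE tokenizer, as in the tutorial (corpus path is a placeholder).
tokenizer = ByteLevelBPETokenizer()
tokenizer.train(
    files=["corpus.txt"],
    vocab_size=52_000,
    min_frequency=2,
    special_tokens=["<s>", "<pad>", "</s>", "<unk>", "<mask>"],
)

# save_model writes vocab.json and merges.txt into the target directory.
tokenizer.save_model(r"C:\Users\folder")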

The directory contains a vocab file and a merges file; however, whatever I try, I cannot load the tokenizer. I'd appreciate any help anyone can provide!
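In case it helps narrow things down: I believe from_pretrained looks for files named exactly vocab.json and merges.txt (or a serialized tokenizer.json) inside the directory. For reference, here is the fallback I would expect to work, passing the two files to the constructor directly (a sketch, assuming those filenames):

from transformers import RobertaTokenizerFast

# Build the tokenizer from the two files instead of the directory
# (filenames are assumptions; adjust to whatever the folder actually contains).
tokenizer = RobertaTokenizerFast(
    vocab_file=r"C:\Users\folder\vocab.json",
    merges_file=r"C:\Users\folder\merges.txt",
    model_max_length=512,
)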