Can't load tokenizer for 'rukaiyaaaah/fine-tuned'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name

I fine-tuned a model and uploaded it to Hugging Face, but I'm not able to load it back. I made two commits to the same model repo, which may have contributed to the problem. The error I get is:
OSError: Can't load tokenizer for 'rukaiyaaaah/fine-tuned'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'rukaiyaaaah/fine-tuned' is the correct path to a directory containing all relevant files for a GPT2TokenizerFast tokenizer.
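
For reference, the call that triggers the error looks roughly like this (a sketch assuming the standard `AutoTokenizer` / `AutoModelForCausalLM` loading path from the Hub; the exact loading code isn't shown above):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

repo_id = "rukaiyaaaah/fine-tuned"

# This is the line that raises the OSError quoted above.
tokenizer = AutoTokenizer.from_pretrained(repo_id)

# Loading the model weights themselves works via the same repo id.
model = AutoModelForCausalLM.from_pretrained(repo_id)
```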

Any help would be deeply appreciated. Thanks!