HF Datasets not working with Language Modeling

I am trying to use this notebook (https://colab.research.google.com/github/huggingface/notebooks/blob/master/examples/language_modeling.ipynb) to finetune and generate text with GPT-Neo using my own custom dataset. I uploaded my own text file to my dataset repo, but when I try to use it, I get this error:


FileNotFoundError Traceback (most recent call last)
/usr/local/lib/python3.7/dist-packages/datasets/load.py in prepare_module(path, script_version, download_config, download_mode, dataset, force_local_path, dynamic_modules_path, return_resolved_file_path, **download_kwargs)
354 try:
---> 355 local_path = cached_path(file_path, download_config=download_config)
356 except FileNotFoundError:

4 frames
FileNotFoundError: Couldn't find file at htts://huggingface.co/datasets/Trainmaster9977/zbakuman/resolve/main/zbakuman.py

During handling of the above exception, another exception occurred:

FileNotFoundError Traceback (most recent call last)
/usr/local/lib/python3.7/dist-packages/datasets/load.py in prepare_module(path, script_version, download_config, download_mode, dataset, force_local_path, dynamic_modules_path, return_resolved_file_path, **download_kwargs)
357 raise FileNotFoundError(
358 "Couldn't find file locally at {}, or remotely at {}. Please provide a valid {} name".format(
---> 359 combined_path, file_path, "dataset" if dataset else "metric"
360 )
361 )

FileNotFoundError: Couldn't find file locally at Trainmaster9977/zbakuman/zbakuman.py, or remotely at htts://huggingface.co/datasets/Trainmaster9977/zbakuman/resolve/main/zbakuman.py. Please provide a valid dataset name

(I slightly modified the beginning of the URL because new users can only include two links per post.)

This is the code I used to try to load it:

from datasets import load_dataset

dataset = load_dataset("Trainmaster9977/zbakuman")