While running the code below in Google Colab:
```python
from datasets import load_dataset

raw_datasets = load_dataset("kde4", lang1="en", lang2="fr")
```
I always get the following error:
```
/usr/local/lib/python3.11/dist-packages/huggingface_hub/utils/_auth.py:94: UserWarning:
The secret `HF_TOKEN` does not exist in your Colab secrets.
To authenticate with the Hugging Face Hub, create a token in your settings tab (https://huggingface.co/settings/tokens), set it as secret in your Google Colab and restart your session.
You will be able to reuse this secret in all of your notebooks.
Please note that authentication is recommended but still optional to access public models or datasets.
  warnings.warn(
Downloading builder script: 4.25k/? [00:00<00:00, 114kB/s]
Downloading readme:         5.10k/? [00:00<00:00, 146kB/s]
Downloading data:     100% 7.05M/7.05M [00:01<00:00, 7.19MB/s]
Generating train split: 100% 210173/210173 [00:09<00:00, 15036.91 examples/s]
```
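The `HF_TOKEN` warning looks unrelated: kde4 is a public dataset and, as the output shows, the download and train split generation both complete. If you want to silence the warning anyway, a minimal sketch (assuming you have created a Colab secret named `HF_TOKEN` containing a Hugging Face access token and granted this notebook access to it) would be:

```python
from google.colab import userdata
from huggingface_hub import login

# Read the token from Colab's secret store and authenticate the session.
# Assumes a Colab secret named HF_TOKEN exists with notebook access enabled.
login(token=userdata.get("HF_TOKEN"))
```

The run then fails only after the split has been generated: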
```
NotImplementedError                       Traceback (most recent call last)
/tmp/ipython-input-2-4272550563.py in <cell line: 0>()
      1 from datasets import load_dataset
      2
----> 3 raw_datasets = load_dataset("kde4", lang1="en", lang2="fr")

1 frames
/usr/local/lib/python3.11/dist-packages/datasets/builder.py in as_dataset(self, split, run_post_process, verification_mode, ignore_verifications, in_memory)
   1171         is_local = not is_remote_filesystem(self._fs)
   1172         if not is_local:
-> 1173             raise NotImplementedError(f"Loading a dataset cached in a {type(self._fs).__name__} is not supported.")
   1174         if not os.path.exists(self._output_dir):
   1175             raise FileNotFoundError(

NotImplementedError: Loading a dataset cached in a LocalFileSystem is not supported.
```
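From the traceback, `as_dataset` decides whether the freshly built cache is local via `is_remote_filesystem(self._fs)`, and here that check misclassifies a `LocalFileSystem` as remote. In similar reports this pointed to a version mismatch between `datasets` and `fsspec`; is that the case here? One thing I could try (an assumption on my part, not a confirmed fix) is aligning the two packages and restarting the runtime:

```python
# Possible workaround (assumption: datasets/fsspec version mismatch).
# Restart the Colab runtime afterwards before re-running load_dataset.
!pip install -U datasets fsspec
```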