Downloading dataset files locally

Due to proxies and various other restrictions and policies, I cannot download the data using the APIs like:

from datasets import load_dataset
raw_datasets = load_dataset("glue", "mrpc")

I had the same problem when downloading pretrained models, but there is an alternative: download the model files and load the model locally, for example:

git lfs install
git clone

Then I can use:

model = AutoModelForSequenceClassification.from_pretrained("path/to/locally/downloaded/model/files")

Can I download the dataset files directly in a similar fashion and then, for example, use the following? If yes, how?

raw_datasets = load_dataset("path/to/locally/downloaded/dataset/files")

You can use the wget command followed by the file’s URL, which should have the following format: <HUB_REPO_URL>/resolve/main/<FILE_NAME>. If you are unsure about the exact URL, you can just go to the “Files and versions” section and right-click the little arrow next to the file size to select the “Copy link address” option.
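In other words, the direct URL can be built programmatically as well. Here is a minimal Python sketch of that pattern; the repo id and file name below are placeholder assumptions, not taken from this thread:

```python
# Build the <HUB_REPO_URL>/resolve/main/<FILE_NAME> download URL for a
# file in a Hub dataset repo. repo_id and filename are placeholders.
def resolve_url(repo_id: str, filename: str, revision: str = "main") -> str:
    return f"https://huggingface.co/datasets/{repo_id}/resolve/{revision}/{filename}"

url = resolve_url("some-dataset", "data/train.csv")  # hypothetical names
print(url)

# The file could then be fetched without wget, e.g.:
# import urllib.request
# urllib.request.urlretrieve(url, "train.csv")
```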

For instance, this would be a way to download the MRPC corpus that you mention:


And then you can enter python and do:

from datasets import load_dataset
mrpc = load_dataset("./", "mrpc")



It does not work.

First of all, it could not download the dataset directly. Second, even the above code does not work.

For instance, after downloading the files, I use the following code to try to load the XSUM dataset.

from datasets import load_dataset
raw_datasets = load_dataset("./", "raw_datasets", split="train")

It shows the error as follows.

FileNotFoundError: Local file data/XSUM-EMNLP18-Summary-Data-Original.tar.gz doesn't exist

I found this line of code in the dataset loading script:

# From
_URL_DATA = "data/XSUM-EMNLP18-Summary-Data-Original.tar.gz"
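A short illustration of what is likely going on (the base URL below is my assumption about how the Hub would resolve the script's relative path; `urljoin` is from the standard library): when the script lives on the Hub, the relative `_URL_DATA` is joined to the repo's base URL, but with `load_dataset("./", ...)` the same relative path is looked up on the local disk, hence the FileNotFoundError.

```python
from urllib.parse import urljoin

# On the Hub, a relative _URL_DATA is resolved against the script repo's
# base URL (the base URL below is an assumed example, not from the thread):
base = "https://huggingface.co/datasets/xsum/resolve/main/"
relative = "data/XSUM-EMNLP18-Summary-Data-Original.tar.gz"
resolved = urljoin(base, relative)
print(resolved)

# Loaded locally, the loader instead expects the archive to already exist
# at ./data/XSUM-EMNLP18-Summary-Data-Original.tar.gz on disk.
```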

Option 1:

It can download the dataset, but it raises a ReadError while "Generating train split", if I replace the above `_URL_DATA` line with the following code:


Option 2:

After adding the following SSL workaround, the original code works:

import ssl

try:
    _create_unverified_https_context = ssl._create_unverified_context
except AttributeError:
    pass
else:
    ssl._create_default_https_context = _create_unverified_https_context

# The xsum dataset is stored in .cache/huggingface/datasets/xsum
from datasets import load_dataset
raw_datasets = load_dataset("xsum", split="train")

Anyway, this method is not a direct one: it does not save the dataset files locally (they only end up in the cache).

A direct download method is as follows.

$ wget --no-check-certificate

However, it is not easy to find such a download link for every dataset on the Hugging Face Hub.
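As a sketch of a more systematic alternative (this assumes a recent version of the `huggingface_hub` library, which provides `HfApi.list_repo_files` and `hf_hub_download`; the repo id in the usage note is just an example), one can list a dataset repo's files programmatically and download each of them instead of hand-copying links:

```python
from huggingface_hub import HfApi, hf_hub_download

def download_dataset_repo(repo_id: str, local_dir: str = ".") -> None:
    """Download every file of a Hub dataset repo into local_dir."""
    api = HfApi()
    for filename in api.list_repo_files(repo_id, repo_type="dataset"):
        # hf_hub_download fetches a single file and returns its local path
        hf_hub_download(repo_id=repo_id, filename=filename,
                        repo_type="dataset", local_dir=local_dir)

# Usage (requires network access):
# download_dataset_repo("xsum", local_dir="./xsum_files")
```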