Specifying download directory for custom dataset loading script

I have a question.
How can I change my cache dir to cloud storage like an S3 bucket? Right now I have this code:

from botocore.session import Session
import s3fs
from datasets import load_dataset_builder

# Credentials come from a named AWS profile
# (alternatively: storage_options = {"key": "XXX", "secret": "XXX"})
s3_session = Session(profile="hf2S3")
storage_options = {"session": s3_session}

fs = s3fs.S3FileSystem(**storage_options)
output_dir = "s3://path/to/my/bucket/"

builder = load_dataset_builder("SLPL/naab-raw")
builder.download_and_prepare(output_dir, storage_options=storage_options, file_format="parquet")
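As I understand it, this writes the dataset to the bucket as parquet shards, which can later be streamed back with the generic parquet loader. A minimal sketch (the credentials are placeholders, and the `**/*.parquet` glob is my assumption about the output layout):

```python
# Hypothetical credentials and bucket path -- substitute your own values.
storage_options = {"key": "XXX", "secret": "XXX"}
data_files = "s3://path/to/my/bucket/**/*.parquet"

if __name__ == "__main__":
    # Guarded because this needs valid S3 credentials and network access.
    import s3fs  # noqa: F401  (needed so fsspec can resolve s3:// URLs)
    from datasets import load_dataset

    # Read the prepared parquet shards back without re-downloading the raw data.
    ds = load_dataset("parquet", data_files=data_files, storage_options=storage_options)
    print(ds)
```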

I adapted this code from this post: Cloud storage. But my dataset is huge (~130 GB) and I wanted to move the cache to S3 as well, so I modified the code like this:

builder = load_dataset_builder("SLPL/naab-raw", cache_dir=output_dir)

but I got this error:

AttributeError: 'S3' object has no attribute '__aenter__'. Did you mean: '__delattr__'?

which, according to the searching I've done, might be a network issue.
Do you have a solution to this problem?
Thanks in advance :slightly_smiling_face: