GPU memory usage varies with the size of the dataset

Hi,

Does GPU memory usage vary with the size of the dataset? I am experimenting with the code from here.

I observed that GPU memory usage varies with (though is not proportional to) the size of the dataset. I'm trying to understand why. Could anyone help explain?

The author of the code helped me figure it out. It was because the Dataset class implemented in the code loads all of the encodings in its __init__ function, so the memory footprint grows with the number of examples.
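
For anyone who hits the same thing, here is a minimal sketch of the pattern, assuming a Hugging Face-style tokenizer. The class and variable names (`EagerDataset`, `LazyDataset`, `texts`, `tokenizer`) are illustrative, not the original code:

```python
from torch.utils.data import Dataset


class EagerDataset(Dataset):
    """Encodes every example up front in __init__, so memory
    grows with the size of the dataset (the pattern described above)."""

    def __init__(self, texts, tokenizer):
        # All encodings are materialized here, at construction time.
        self.encodings = tokenizer(
            texts, truncation=True, padding=True, return_tensors="pt"
        )

    def __len__(self):
        return self.encodings["input_ids"].size(0)

    def __getitem__(self, idx):
        return {k: v[idx] for k, v in self.encodings.items()}


class LazyDataset(Dataset):
    """Encodes one example at a time in __getitem__, so per-item
    memory stays roughly constant regardless of dataset size."""

    def __init__(self, texts, tokenizer):
        self.texts = texts  # only the raw strings are held in memory
        self.tokenizer = tokenizer

    def __len__(self):
        return len(self.texts)

    def __getitem__(self, idx):
        enc = self.tokenizer(
            self.texts[idx],
            truncation=True,
            padding="max_length",
            return_tensors="pt",
        )
        return {k: v.squeeze(0) for k, v in enc.items()}
```

Note that eager encoding by itself mainly inflates host RAM; GPU memory would track dataset size if those pre-built tensors are also placed on (or moved to) the device, which I assume is what happened in my case.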

Yes, I have a similar doubt. Any suggestions?