Double expected memory usage

I’ve used a few different models through Hugging Face and have consistently noticed roughly double the expected memory usage. For example, the code below loads a model whose weights file is about 1.5 GB, so I expect memory usage to increase by about 1.5 GB. Instead, it increases by about 3 GB. Is there something I’m misunderstanding or doing incorrectly?

from transformers import pipeline
from time import sleep

# Load the model; its weights file on disk is ~1.5 GB
classifier = pipeline('zero-shot-classification', model='facebook/bart-large-mnli')
print('Done loading!')
# Keep the process alive so memory usage can be inspected externally
sleep(10)
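For reference, here is roughly how I'm measuring the increase. This sketch isn't from my original script; it uses only the stdlib `resource` module (Unix-only) and allocates a plain 100 MB buffer instead of loading the model, just to show the measurement approach:

```python
import resource
import sys

def peak_rss_mb():
    # ru_maxrss is reported in kilobytes on Linux, bytes on macOS
    usage = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    return usage / 1024 if sys.platform != "darwin" else usage / (1024 * 1024)

before = peak_rss_mb()
# Stand-in for the model load: touch ~100 MB of memory
buf = b"x" * (100 * 1024 * 1024)
after = peak_rss_mb()
print(f"delta: {after - before:.0f} MB")
```

With the real pipeline load in place of the buffer, the delta I see is around 3000 MB rather than the ~1500 MB I'd expect from the file size.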