How to get model size?

Hi, everyone!
Is there a quick way to get model size for an arbitrary transformers model?

Do you mean the size in MB or the number of parameters?

Regarding size, check this:

Regarding the number of parameters, in PyTorch you can use:
sum(p.numel() for p in model.parameters())
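To see what that expression actually sums: `p.numel()` is just the product of each parameter tensor's shape dimensions. A minimal sketch with hypothetical parameter shapes for a tiny two-layer MLP (illustrative values, not from any real model):

```python
from math import prod

# Hypothetical parameter shapes for a tiny 2-layer MLP:
# weight (128, 64), bias (128,), weight (10, 128), bias (10,)
param_shapes = [(128, 64), (128,), (10, 128), (10,)]

# Equivalent to sum(p.numel() for p in model.parameters()) in PyTorch:
# numel() is the product of a tensor's shape dimensions.
n_params = sum(prod(shape) for shape in param_shapes)
print(n_params)  # 9610
```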


For me, the simplest way is to go to the “Files and versions” tab of a given model on the hub, and then check the size in MB/GB of the pytorch_model.bin file (or equivalently, the Flax/Tensorflow model file).

e.g. GPT-J-6B is 22.5 GB, as can be seen here:
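The file size on the Hub lines up with a simple back-of-the-envelope estimate: size ≈ parameter count × bytes per parameter. A rough sketch, assuming ~6e9 parameters stored in fp32 (4 bytes each):

```python
# Rough size estimate: parameter count × bytes per parameter.
# Assumed round numbers for illustration, not exact GPT-J figures.
n_params = 6_000_000_000
bytes_per_param = 4  # fp32; fp16/bf16 checkpoints would use 2

size_gib = n_params * bytes_per_param / 1024**3
print(f"{size_gib:.1f} GiB")  # ≈ 22.4 GiB, close to the 22.5 GB shown on the Hub
```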


Hi Kirill!
Thank you!
Inspired by the number-of-parameters part of your answer, I figured I could actually get it in a slightly simpler way:


which returns the same number as your solution.
Thanks for the help!


Hi Niels!
Thank you!!

Is there a way to sort or filter models by file size?


Is there a way to get the size of pytorch_model.bin without actually downloading the file? mteb/leaderboard · Add size and language to table?

Nevermind, figured it out:

git clone --no-checkout
cd multilingual-e5-large
λ git lfs ls-files -s
020afdebf2 - model.safetensors (2.2 GB)
bb5a52503a - onnx/model.onnx (546 KB)
0cf1883fee - onnx/model.onnx_data (2.2 GB)
cfc8146abe - onnx/sentencepiece.bpe.model (5.1 MB)
62c24cdc13 - onnx/tokenizer.json (17 MB)
9aaa222c5a - pytorch_model.bin (2.2 GB)
cfc8146abe - sentencepiece.bpe.model (5.1 MB)
62c24cdc13 - tokenizer.json (17 MB)
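If you want a total for the repo, the `git lfs ls-files -s` output is easy to parse. A small sketch that sums the human-readable sizes (assuming decimal KB/MB/GB units as a rough estimate; the `listing` string is a shortened copy of the output above):

```python
import re

# Shortened sample of `git lfs ls-files -s` output from the listing above
listing = """\
020afdebf2 - model.safetensors (2.2 GB)
9aaa222c5a - pytorch_model.bin (2.2 GB)
62c24cdc13 - tokenizer.json (17 MB)
"""

# Assumes decimal (SI) units; treat the result as an estimate only
UNITS = {"KB": 1e3, "MB": 1e6, "GB": 1e9}

def total_bytes(text):
    """Sum the human-readable sizes reported by git-lfs."""
    total = 0.0
    for value, unit in re.findall(r"\(([\d.]+) (KB|MB|GB)\)", text):
        total += float(value) * UNITS[unit]
    return total

print(f"{total_bytes(listing) / 1e9:.3f} GB")  # 4.417 GB
```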

Even better:

λ git lfs ls-files -s --include=pytorch_model.bin
9aaa222c5a - pytorch_model.bin (2.2 GB)