Hi, everyone!
Is there a quick way to get model size for an arbitrary transformers model?
Hi🤗
Do you mean size in MB or the number of parameters used?
Regarding size, check this:
Regarding the number of parameters, in PyTorch you can use:
sum(p.numel() for p in model.parameters())
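Put together as a minimal runnable sketch (bert-base-uncased is used purely as an illustrative checkpoint; any transformers model works the same way):
from transformers import AutoModel

model = AutoModel.from_pretrained("bert-base-uncased")

# total number of parameters
total = sum(p.numel() for p in model.parameters())

# only the trainable parameters
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)

print(f"total: {total:,} | trainable: {trainable:,}")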
For me, the simplest way is to go to the “Files and versions” tab of a given model on the Hub and then check the size in MB/GB of the pytorch_model.bin file (or, equivalently, the Flax/TensorFlow model file).
For example, GPT-J-6B is 22.5 GB, as can be seen here:
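If you prefer to check this programmatically rather than in the browser, a rough sketch using the huggingface_hub client (EleutherAI/gpt-j-6B is assumed here as the repo id for the example above):
from huggingface_hub import HfApi

api = HfApi()
info = api.model_info("EleutherAI/gpt-j-6B", files_metadata=True)
for f in info.siblings:
    # f.size is in bytes and may be None for some entries
    if f.size is not None:
        print(f"{f.rfilename}: {f.size / 1e9:.2f} GB")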
Hi Kirill!
Thank you!
Inspired by the number-of-parameters part of your answer, I figured I could actually get it in a slightly simpler way:
model.num_parameters()
which returns the same number as your solution.
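For reference, a quick sketch comparing the two (continuing with the model loaded earlier; num_parameters() also accepts an optional only_trainable argument):
assert model.num_parameters() == sum(p.numel() for p in model.parameters())

# count only the trainable parameters
print(model.num_parameters(only_trainable=True))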
Thanks for the help!
Hi Niels!
Thank you!!