Is there a quick way to get model size for an arbitrary transformers model?
Do you mean the size in MB, or the number of parameters?
Regarding size, check this:
For the number of parameters in PyTorch, you can use:
sum(p.numel() for p in model.parameters())
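To see that expression in action, here is a minimal sketch using a toy torch model as a stand-in for a real checkpoint (the same one-liner works on any transformers model, since they are nn.Modules):

```python
import torch.nn as nn

# Toy model standing in for an arbitrary transformers model.
model = nn.Sequential(nn.Linear(10, 5), nn.Linear(5, 2))

# Total parameter count: every element of every parameter tensor.
total = sum(p.numel() for p in model.parameters())

# Variant restricted to trainable parameters only.
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)

print(total)  # (10*5 + 5) + (5*2 + 2) = 67
```

Filtering on `p.requires_grad` matters if parts of the model are frozen, e.g. during fine-tuning with frozen embeddings.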
For me, the simplest way is to go to the “Files and versions” tab of a given model on the Hub and check the size in MB/GB of the
pytorch_model.bin file (or, equivalently, the Flax/TensorFlow model file).
e.g. GPT-J-6B is 22.5 GB, as can be seen here:
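The two answers are directly related: a float32 checkpoint stores roughly 4 bytes per parameter, plus a little state-dict metadata. A back-of-the-envelope sketch, assuming a round figure of about 6.05 billion parameters for GPT-J-6B (an approximation, not an exact count):

```python
# Estimate checkpoint size from parameter count and dtype width.
n_params = 6_050_000_000  # assumed approximate parameter count for GPT-J-6B
bytes_per_param = 4       # float32; would be 2 for float16/bfloat16

size_bytes = n_params * bytes_per_param
size_gib = size_bytes / 2**30  # file listings often label GiB as "GB"

print(f"{size_gib:.1f} GiB")  # ~22.5, matching the file size on the Hub
```

The same arithmetic run in reverse lets you estimate a model's parameter count from its file size when only the checkpoint is visible.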
Inspired by the number-of-parameters part of your answer, I figured I could actually get it in a slightly simpler way:
which returns the same number as your solution.
Thanks for the help!