System requirements for running a given model locally

Hi all,
I am experimenting with models for RAG using my official documents, and for this I would like to run a model on my local machine.

But as I am new to the LLM world, I keep hitting roadblocks because some models have specific requirements that are not explicitly mentioned on the model page. For example, I was trying to run Llama 2 on my M1 Mac, only to realize that the setup I was following needed a CUDA-capable GPU (a rough sketch of what I tried is below).
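For context, this is roughly what I was running. It is a minimal sketch rather than my exact code: I'm assuming the meta-llama/Llama-2-7b-chat-hf checkpoint and the 4-bit bitsandbytes path from the tutorial I followed, which is where the CUDA requirement seems to come in:

```python
# Minimal sketch of my attempt (assumed checkpoint and quantization settings,
# not necessarily what the model page recommends).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Llama-2-7b-chat-hf"  # assumption: the 7B chat checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    # 4-bit loading via bitsandbytes -- this is the part that expects a CUDA GPU,
    # which my M1 Mac does not have.
    quantization_config=BitsAndBytesConfig(load_in_4bit=True),
    device_map="auto",
)
```

On the Mac this fails complaining about CUDA/bitsandbytes, which is what made me wonder how I am supposed to know such constraints up front.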

Is there any way to understand the minimum system requirements, from a hardware and software perspective, to run a given LLM?
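My only rough mental model so far is a back-of-envelope estimate of weights-only memory from the parameter count and dtype, but I don't know if that is the right way to think about it (the helper and numbers below are just my own guess, not from any model page):

```python
# Back-of-envelope estimate I've been using -- weights only, ignoring the
# KV cache, activations and framework overhead, so it may be too naive.
def estimate_weights_memory_gb(n_params_billions: float, bytes_per_param: float = 2) -> float:
    """Approximate memory for the weights alone (fp16 = 2 bytes per parameter)."""
    return n_params_billions * 1e9 * bytes_per_param / 1024**3

print(estimate_weights_memory_gb(7))       # Llama 2 7B in fp16 -> ~13 GB
print(estimate_weights_memory_gb(7, 0.5))  # 4-bit quantized    -> ~3.3 GB
```

If something like this is roughly how people size their hardware, that would be good to know; otherwise, a pointer to where per-model requirements are documented would help a lot.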

Thanks