Local Installs and GPUs

Is there a way to sort or filter projects/models by GPU requirements, to see what is best suited for a given system?

I am getting an RTX 4070 and hoping to do some prep work first, so I don't get my hopes up.
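For the prep work, the first question is usually whether a model's weights even fit in VRAM (the 4070 ships with 12 GB). A minimal back-of-the-envelope sketch, assuming inference only and a rough 20% overhead for activations and CUDA context (both numbers are my assumptions, not official figures):

```python
def fits_in_vram(n_params: float, bytes_per_param: float,
                 vram_gb: float = 12.0, overhead: float = 1.2) -> bool:
    """Rough inference-only check: weight size (plus ~20% overhead)
    against available VRAM. Training needs far more memory on top of
    this (gradients, optimizer states), so treat a 'fits' here as a
    best case, not a guarantee."""
    weights_gb = n_params * bytes_per_param / 1e9
    return weights_gb * overhead <= vram_gb

# A 7B-parameter model in fp16 (2 bytes/param) is ~14 GB of weights
# alone, so it won't fit in 12 GB; the same model quantized to 4-bit
# (~0.5 bytes/param) is ~3.5 GB and fits comfortably.
print(fits_in_vram(7e9, 2.0))   # fp16
print(fits_in_vram(7e9, 0.5))   # 4-bit
```

This is only arithmetic on the weights; actual usage depends on batch size, resolution (for image models), and sequence length.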

My focus: text-to-image training and text-to-video, mostly for learning.

I would also like to build a 'health-gpt' by loading my lab results and other personal information into an LLM, but without any of it going out to the web. I am definitely happy to help others, but not at the risk of my privacy.
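For the privacy side, what I have in mind is something like forcing the Hugging Face libraries into offline mode before loading anything, so nothing can phone home to the Hub. A minimal sketch (`HF_HUB_OFFLINE` and `TRANSFORMERS_OFFLINE` are real environment variables the Hub libraries honor; the commented-out load is just an illustration):

```python
import os

# Set these BEFORE importing transformers/diffusers: they block all
# network calls to the Hugging Face Hub, so only models already cached
# on disk can be loaded.
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"

# Then a load like this would use only the local cache and error out
# rather than reach the network (model name is just an example):
# from transformers import AutoModelForCausalLM
# model = AutoModelForCausalLM.from_pretrained(
#     "some-local-model", local_files_only=True)
```

Whether that is enough for genuinely sensitive health data, I don't know; it keeps the model loading offline but says nothing about the rest of the stack.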

Thoughts about this?

Thank you, Huggers!