How to run an LLM on a local computer or VPS with an external GPU?

I would like to run an LLM on my local computer or (even better) a Linux VPS, but tools like oobabooga don't really work for me: I only have a 3 GB GPU locally, and my VPS has just a basic onboard GPU.
So my plan is to rent a GPU hourly to run the LLM.
What’s the best way to do it?
Ideally it would run on a Linux VPS, so other people could use it too.
I would like to run this model: mosaicml/mpt-7b-storywriter (on Hugging Face).
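
For reference, here is roughly how I expect loading it to look once a rented GPU is available. This is just a minimal sketch, assuming the rented machine has PyTorch, transformers, and accelerate installed; the prompt and generation settings are placeholders:

```python
# Minimal sketch: load mosaicml/mpt-7b-storywriter for inference.
# Assumes a machine with PyTorch, transformers, and accelerate installed.
# The 7B weights take roughly 13 GB in bfloat16, so a single GPU with
# 16 GB+ of memory should be enough for inference.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mosaicml/mpt-7b-storywriter"

# MPT ships custom modeling code, so trust_remote_code must be enabled.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,   # use torch.float16 on GPUs without bf16
    trust_remote_code=True,
    device_map="auto",            # lets accelerate place layers on the GPU
)

prompt = "It was a dark and stormy night"  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=100, do_sample=True)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```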

Interested too!