Hardware requirements for using sentence-transformers/all-MiniLM-L6-v2

Hi,
Can someone please advise me on the hardware requirements for running sentence-transformers/all-MiniLM-L6-v2?
I downloaded the model locally and am using it to generate embeddings, then using util.pytorch_cos_sim to calculate similarity scores between 2 sentences. Everything was working well locally on my Mac Pro (2.4 GHz 8-core Intel Core i9, 32 GB memory); but after I moved the model to a container with 1 CPU core and 4 GB RAM (within my company firewall), the code takes at least 15-20 times longer to produce the cosine similarity score.
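For reference, this is roughly what the code looks like (the model path and sentences below are placeholders, not my actual data):

```python
from sentence_transformers import SentenceTransformer, util

# Load the locally downloaded copy of all-MiniLM-L6-v2 (placeholder path)
model = SentenceTransformer("/path/to/all-MiniLM-L6-v2")

sentence1 = "The new policy takes effect next month."
sentence2 = "The updated policy will start next month."

# Encode both sentences into embeddings (runs on CPU in the container)
emb1 = model.encode(sentence1, convert_to_tensor=True)
emb2 = model.encode(sentence2, convert_to_tensor=True)

# Cosine similarity between the two embeddings
score = util.pytorch_cos_sim(emb1, emb2)
print(score.item())
```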

Has anyone faced a similar situation? Kindly advise.
Thank you in advance for the help!