Low-Dim Embeddings from Similarity Transformer Models

I’m looking for a similarity model capable of creating low-dimensional embeddings for a dataset of approximately 50 thousand sentences. Right now I’m using the paraphrase-MiniLM-L3-v2 model, which outputs vectors of dim 384. This is just too much for my objective. I was wondering if there was a good model that would output embeddings of dim around 100, or less…

If there isn’t one, does anyone know whether dimensionality reduction techniques like PCA lose too much information? Any recommendations for a dimensionality reduction technique?
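For context, here is a minimal sketch of what I mean by applying PCA on top of the model's output, using scikit-learn. The random matrix is just a stand-in with the same shape as the real embeddings; in practice they would come from something like `SentenceTransformer("paraphrase-MiniLM-L3-v2").encode(sentences)`.

```python
import numpy as np
from sklearn.decomposition import PCA

# Stand-in for real embeddings: 50k sentences x 384 dims
# (in practice: model.encode(sentences) from sentence-transformers).
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(50_000, 384))

# Fit PCA once on the corpus, then reuse it to project new sentences.
pca = PCA(n_components=100)
reduced = pca.fit_transform(embeddings)

print(reduced.shape)  # (50000, 100)
# Fraction of variance retained by the 100 components; for real
# embeddings (unlike random noise) this is typically much higher.
print(pca.explained_variance_ratio_.sum())
```

The same fitted `pca` object can then `transform()` any future batch of embeddings so old and new vectors live in the same 100-dim space.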

What models can I find on HF that have a low embedding dim, like 50-100?
Please paste the links below.