How to find models that work on low memory/CPU edge devices

I am working on a device with 1 GB of RAM, running headless Ubuntu on a quad-core ARM CPU.

I would like to know how I can search for models that would run successfully given these limitations.

Ideally, I would like something that can do simple translation (say, English to French), but I want the option to use other languages and other models as well.

I do not wish to train a model, just use something that already exists.

Currently, when searching Hugging Face, I cannot see any way to filter models by whether they would fit within these resource constraints.

Thank you for any and all help!