My google-fu has failed me: I cannot figure out how to import a model into the cache. I am working with hundreds of Jupyter notebooks, and I end up downloading tons of models.
Sometimes via git, sometimes via the HF hub, sometimes via oobabooga.
Q1: if I have a local git-lfs clone of a model, how do I make it so that the huggingface libraries do not re-download that model over and over?
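For context, what I think "importing into the cache" would mean is recreating the hub cache layout (`models--{org}--{name}/snapshots/{revision}`) and symlinking my existing clone's files into it. This is just my understanding from poking around `~/.cache/huggingface/hub`; the org, name, and revision below are made up, and the dummy file stands in for a real clone:

```python
import tempfile
from pathlib import Path

# Sketch: place a local git-lfs clone into the hub cache layout,
# as I understand it from inspecting ~/.cache/huggingface/hub.
cache_root = Path(tempfile.mkdtemp()) / "hub"
repo_dir = cache_root / "models--example-org--example-model"  # made-up repo id
revision = "0" * 40  # placeholder commit hash
snapshot = repo_dir / "snapshots" / revision
snapshot.mkdir(parents=True)

# Record which commit "main" points at, mirroring the refs/ dir I see
# in my real cache.
refs = repo_dir / "refs"
refs.mkdir()
(refs / "main").write_text(revision)

# Symlink the files of an existing local clone into the snapshot so
# nothing gets copied; here a dummy config.json stands in for the clone.
clone = Path(tempfile.mkdtemp())
(clone / "config.json").write_text("{}")
for f in clone.iterdir():
    (snapshot / f.name).symlink_to(f)

print((snapshot / "config.json").read_text())
```

Is hand-building this layout sane, or is there an official API for it that I'm missing?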
Q2: if I have used oobabooga’s text-generation-webui to download models, how do I import them?
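For Q2, the workaround I'm considering is to skip the cache entirely and enumerate the webui's download folders so I can pass them straight to `from_pretrained()`. As far as I can tell, text-generation-webui stores each model under `models/<name>/`; the helper name and the fake tree below are just for illustration:

```python
import tempfile
from pathlib import Path

# Hypothetical helper: list the model folders that text-generation-webui
# downloaded, assuming its models/<name>/ layout, so I can point
# from_pretrained() at them directly instead of re-downloading.
def list_webui_models(webui_root):
    models_dir = Path(webui_root) / "models"
    return sorted(p for p in models_dir.iterdir() if p.is_dir())

# Fake webui tree standing in for my real install:
root = Path(tempfile.mkdtemp())
(root / "models" / "example-model").mkdir(parents=True)
print([p.name for p in list_webui_models(root)])
```

If that is the intended approach, great, but I'd rather have these models show up in the shared cache so all my notebooks find them.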
If this part of the code needs help to improve, I would be grateful if you could point me to a specific post where cache improvements are being discussed.