Hey everybody, I’m a newbie. Yesterday I installed a Phi-3 model locally, but it runs extremely slowly because I don’t have access to my main machine and have to use a much weaker device. I’d like to know whether it’s possible to interact with models through a free API instead. I read the article about the Inference API (there was something about the Transformers library), but I don’t quite understand: do I still need to install the model locally?
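For context, this is roughly what remote inference looks like: the model stays on Hugging Face's servers and nothing is downloaded to your machine. This is a minimal sketch assuming the `huggingface_hub` library, a (free-tier) account token in the `HF_TOKEN` environment variable, and an illustrative model ID — the exact model availability on the free tier may differ.

```python
# Hedged sketch: querying a hosted model through the Hugging Face
# Inference API via huggingface_hub's InferenceClient.
# Nothing is installed or run locally; the request goes to HF's servers.
import os
from huggingface_hub import InferenceClient

MODEL_ID = "microsoft/Phi-3-mini-4k-instruct"  # illustrative model ID

def ask(prompt: str, token: str) -> str:
    """Send one chat request to the hosted model and return its reply."""
    client = InferenceClient(model=MODEL_ID, token=token)
    response = client.chat_completion(
        messages=[{"role": "user", "content": prompt}],
        max_tokens=128,
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    token = os.environ.get("HF_TOKEN")
    if token:  # only call out to the API when a token is configured
        print(ask("Explain what an inference API is in one sentence.", token))
```

The same endpoint can also be reached with plain HTTP `POST` requests; the client library just wraps that and handles authentication for you.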