Starting around Feb 8th, 19:15 PST, I began seeing some bizarre network connectivity issues while downloading model files with wget:
Download rates dropped precipitously to between 100 KB/s and 500 KB/s, occasionally stalling.
After downloading a few gigabytes, the connection would close, and upon attempting to resume the download I would see a “403: Forbidden” error.
Restarting the download a few minutes later works, but again with a low download rate, and only for a few more gigabytes before it disconnects and 403’s again.
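In case the exact invocation matters, the downloads are plain wget with resume; roughly this, with a placeholder URL and retry flags I’ve been experimenting with:

    # Placeholder URL; the real ones are resolve/CDN links from a few different model repos
    URL="https://huggingface.co/some-org/some-model/resolve/main/model-00001-of-00010.safetensors"

    # -c resumes the partial file, --tries/--waitretry retry transient failures;
    # --retry-on-http-error needs a reasonably recent wget (1.19.1+), otherwise
    # I just rerun the command by hand a few minutes after the 403.
    wget -c --tries=20 --waitretry=60 --retry-on-http-error=403,503 "$URL"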
Have I just been downloading too much data and run into a rate-limiting policy? Is there a monthly download cap? I’ve looked for an official policy to that effect, but haven’t found anything relevant.
I don’t think anything like that has been clearly stated, and I doubt a limit would kick in at only a few GB…
I think it’s also possible that there are restrictions in place at your ISP or on your company’s internal network.
Thanks for verifying that I’m not overlooking anything well-known.
I wondered if it might be a problem at the ISP or with my own network, but the occurrence of 403 replies seems to contradict that. A 403 has to be sent deliberately by the web server on the remote end (or perhaps a proxy in front of it).
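Next time it happens I want to check which layer is actually returning the 403 by dumping the response headers along the redirect chain; something like this (placeholder URL again, and the exact header names will depend on whatever CDN is in front):

    # -s quiet, -L follow redirects, -D - dump response headers to stdout,
    # -o /dev/null discard the body; -r 0-0 requests a single byte so the
    # check stays cheap even when the file is huge.
    curl -s -L -D - -o /dev/null -r 0-0 \
      "https://huggingface.co/some-org/some-model/resolve/main/model.safetensors"
    # Then look at Server / Via / X-Cache style headers on the 403 response
    # to see whether it comes from the origin or from a CDN edge.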
The problem does not seem particular to any specific model, either; it has shown up when downloading files from a handful of different models.
If other people aren’t seeing the problem, it makes me think my IP address (108.206.101.129) has found its way to a graylist or maybe an IPVS load-balancer is persistently mapping my IP to a misbehaving server, or something. I have been downloading quite a bit of data every day for several months, which made me wonder about HF download limits.
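To test the graylist / sticky-mapping theory, my plan is to note which remote IP each attempt actually connects to, and to repeat a download from a different network (phone hotspot or VPN) to see whether the slowness follows my IP. A rough sketch, where the hostname is just my guess at the CDN host (replace it with whatever the resolve URL redirects to) and the URL is a placeholder:

    # Which edge IPs does my resolver hand back for the download host?
    dig +short cdn-lfs.huggingface.co

    # Log the remote IP and status code of each attempt:
    curl -s -o /dev/null -w 'connected to %{remote_ip}, HTTP %{http_code}\n' \
      "https://huggingface.co/some-org/some-model/resolve/main/model.safetensors"

If the same download is fast from another network, that points at my IP or route rather than the repos themselves.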
I downloaded about 5 TB this month without tokens and the speed never dropped, so I don’t think it’s a rate restriction; more likely a problem somewhere along the route (a CDN, or a cable connecting countries…).
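If it is the route, running something like mtr toward the CDN host from each of our networks might show the hop where the loss or latency starts; a rough sketch (the hostname is a placeholder, and mtr usually needs root or setuid to send its probes):

    # Report mode (-r), wide output (-w), 100 probes per hop (-c 100);
    # look for the first hop where packet loss or latency jumps.
    mtr -r -w -c 100 cdn-lfs.huggingface.co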