Will Home Video Surveillance Force ISPs To Change Upload Bandwidth Amounts?

Here is my ISP's pricing plan. It follows a typical structure, where users pay more per month for more bandwidth, but notice how asymmetrical the plans are in terms of download vs. upload speeds.

15 Mbps down vs only 2 Mbps up? Even the best/most expensive plan offers 300 Mbps down vs 30 Mbps up, a ratio of 10:1.

I wonder if the popularity of cloud video services like Nest, Canary, and countless others is going to force changes here. The density of cloud-based systems in homes is growing, and more than just cameras may add to the demand.

Even a modest system with 2 - 3 HD cameras can quickly exceed 10 Mbps of upload, while a 150 Mbps download pipe isn't really in danger of being reached, much less pegged, by a few Netflix streams.
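For a rough sanity check, here's the kind of back-of-the-envelope math I mean. The per-camera bitrates below are my own assumptions for typical 1080p continuous recording, not any vendor's published figures:

```python
# Back-of-the-envelope upload estimate for a small cloud camera setup.
# Per-camera bitrates are assumed typical H.264 1080p continuous-record rates, not vendor specs.
cameras = [
    {"name": "front door", "mbps": 4.0},  # busier scene, more motion = higher bitrate
    {"name": "driveway",   "mbps": 3.0},
    {"name": "backyard",   "mbps": 3.0},
]

total_mbps = sum(c["mbps"] for c in cameras)
upload_cap_mbps = 10  # a common mid-tier cable upload allowance

print(f"Total camera upload: {total_mbps:.1f} Mbps")
print(f"Share of a {upload_cap_mbps} Mbps upload cap: {total_mbps / upload_cap_mbps:.0%}")
```

Three cameras at those rates already eat the whole upstream pipe before anything else in the house tries to upload.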

Has your ISP adapted to the cloud-based home surveillance trend?


One ISP has a clever commercial regarding this:

They claim to offer 'balanced' connections with equal upload and download rates.

Verizon FiOS and other fiber-based services seem to have moved to symmetrical bandwidth. I've got 75 Mbps up / 75 Mbps down, and based on speed tests, it seems accurate.

The cable companies could make speeds more symmetrical, as DOCSIS 3.0 clearly allows it. But in the quest for ever-greater download speeds, they seem to be ignoring it. It's better marketing to say 300 MEG DOWNLOADS and add "sorry, your upload is only 10 Mbps" as a footnote, I guess.
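For context on why the spec itself isn't the bottleneck, here's a rough sketch of DOCSIS 3.0 channel bonding. The per-channel rates are approximate (256-QAM downstream, 64-QAM upstream) and the channel counts are just an example bonding group, not any particular operator's configuration:

```python
# Approximate DOCSIS 3.0 per-channel capacities (rounded; real throughput varies with overhead).
DOWNSTREAM_MBPS_PER_CHANNEL = 38  # 6 MHz channel, 256-QAM
UPSTREAM_MBPS_PER_CHANNEL = 27    # 6.4 MHz channel, 64-QAM

# Example bonding group, weighted toward downstream the way most operators run it.
downstream_channels = 16
upstream_channels = 4

down = downstream_channels * DOWNSTREAM_MBPS_PER_CHANNEL  # ~608 Mbps
up = upstream_channels * UPSTREAM_MBPS_PER_CHANNEL        # ~108 Mbps

print(f"Downstream: ~{down} Mbps, Upstream: ~{up} Mbps, ratio ~{down / up:.1f}:1")
```

Bonding more upstream channels narrows the ratio, within the limits of the return-path spectrum discussed further down.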

So are symmetrical plans more common from fiber-based providers?

If so, that's an interesting aspect I did not realize. My ISP is a cable company, and I don't think a fiber ISP is available in my area.

I don't know if it's significantly more common, but I know it was the case everywhere with FiOS when I worked for Verizon, and from discussions with my friends up North. I've never found a network engineer who can explain to me why it hasn't changed at all in the last 5 years on cable networks, other than, "who cares / why bother?".

For CATV ISPs, everything is a trade-off of frequencies.

I haven't kept up with CATV systems in the last several years, but about 8 years ago a high-end system was roughly 1.2 GHz (meaning the highest frequency it could carry was 1.2 GHz), IIRC.

Within the bandwidth that the system is designed and certified for, the cable company can carve out frequencies for regular channels, pay-per-view, data download, and data upload. It's all a trade-off.

I would suspect modern cable plants can support much higher frequencies (and thereby throughput) today, but older systems may still be limited.

In order to accommodate higher upload speeds, something has to give: downstream speed, PPV channels, etc. For most consumers, upstream demand has been low, so the cable company can "steal" some bandwidth there to enable higher download speeds, provide more PPV channels, or add five extra HBOs.
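To put some illustrative numbers on that trade-off, here's a toy spectrum budget for a hypothetical 750 MHz plant. The band edges and channel widths are standard North American values, but the plant size and channel counts are made up for illustration:

```python
# Toy spectrum budget for a hypothetical 750 MHz cable plant (illustrative, not any real operator).
# Upstream traditionally lives in the narrow 5-42 MHz return band; everything above ~54 MHz
# is downstream, shared between broadcast video, PPV, and DOCSIS data.
SLICE_MHZ = 6  # one downstream slice = one QAM video channel or one DOCSIS data channel

def downstream_slices(plant_top_mhz, downstream_start_mhz):
    return (plant_top_mhz - downstream_start_mhz) // SLICE_MHZ

def upstream_channels(return_top_mhz, return_bottom_mhz=5, channel_mhz=6.4):
    return int((return_top_mhz - return_bottom_mhz) // channel_mhz)

# Legacy split: 5-42 MHz return path, downstream starts at 54 MHz.
legacy_up = upstream_channels(42)          # ~5 upstream channels
legacy_down = downstream_slices(750, 54)   # ~116 downstream slices

# "Mid-split" style upgrade: return widened to 85 MHz, downstream now starts ~108 MHz.
widened_up = upstream_channels(85)         # ~12 upstream channels
widened_down = downstream_slices(750, 108) # ~107 downstream slices

print(f"Legacy split: {legacy_up} upstream channels, {legacy_down} downstream slices")
print(f"Widened return: {widened_up} upstream channels, {widened_down} downstream slices")
```

Widening the return path roughly doubles the upstream channel count, but it costs downstream slices and, in practice, a plant upgrade, which is why "who cares / why bother?" tends to win.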