Bandwidth consumption can vary tremendously, by 50% to 80% in our tests, even at the same resolution, frame rate, compression, and scene. This is because cameras process video differently (e.g., gain control applied, noise reduction technology used, etc.).
We tested 16 cameras head to head to determine which are best and worst at minimizing bandwidth consumption / storage costs.
In the 1MP category, we set these 6 cameras to 720p / 10fps:
In the 2MP category, we set these 4 cameras to 1080p / 10fps:
Additionally, we tested 6 integrated IR cameras to match up against these.
The Wrong Way
The simple, naive, and wrong way is to put cameras side by side and measure bandwidth at the same resolution and frame rate.
Why? Default compression settings vary widely by manufacturer. Though compression is often overlooked, every H.264 camera must choose a Q, or quantization, level (from 0 to 51). The lower the number chosen, the less the compression and the higher the bandwidth (to learn more, see our IP Camera Manufacturer Compression Comparison). This does not make one camera 'better' or 'worse', as it can be adjusted by the user.
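To make the relationship concrete, here is a minimal sketch based on the common rule of thumb that H.264 bitrate roughly halves for every 6-point increase in Q (the quantizer step size doubles every 6 QP). The function and the example values are our own illustration, not part of the test:

```python
def relative_bitrate(qp_a: int, qp_b: int) -> float:
    """Approximate bitrate at qp_a relative to the same stream at qp_b,
    using the rule of thumb that bitrate halves for every +6 QP.
    This is a rough approximation, not an exact model."""
    return 2 ** ((qp_b - qp_a) / 6)

# Example: a camera defaulting to Q=24 vs. one defaulting to Q=30 would
# be expected to use roughly 2x the bandwidth for a similar scene.
print(relative_bitrate(24, 30))  # ~2.0
```

Under this approximation, even a modest gap in default Q settings can easily produce the large bandwidth differences described below.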
Indeed, in IPVM testing, default compression variances routinely accounted for a massive 50%+ difference in bandwidth consumption.
To measure bandwidth truly and fairly, compression levels must be normalized, that is, set the same across cameras. Using a stream analyzer, IPVM determined each camera's actual quantization level and adjusted the manufacturers' compression scales so that all cameras streamed at the same quantization level (Q=28) for our tests.
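Once streams are normalized, the remaining step is measuring each camera's average bitrate. As one way to make that comparison repeatable, here is a sketch that sums packet sizes reported by ffprobe for a recorded clip; the file names are hypothetical and this is our own illustration, not IPVM's tooling:

```python
import json
import subprocess

def average_bitrate_kbps(path: str) -> float:
    """Compute a clip's average video bitrate (kbit/s) by summing the
    packet sizes ffprobe reports and dividing by the clip duration."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_packets", "-show_format", "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    data = json.loads(out)
    total_bytes = sum(int(p["size"]) for p in data["packets"])
    duration = float(data["format"]["duration"])
    return total_bytes * 8 / duration / 1000

# Compare two cameras' recordings captured at the same normalized Q=28
# (hypothetical file names):
# print(average_bitrate_kbps("camera_a_q28.mp4"))
# print(average_bitrate_kbps("camera_b_q28.mp4"))
```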
Test Questions and Answers
This IPVM test helped identify:
- The 1 camera that clearly had the lowest bandwidth consumption
- The 1 camera that clearly had the highest bandwidth consumption
- The 4 camera manufacturers whose default settings make them look far worse than they are
- The tradeoff between non-IR and IR cameras and how it varied with low light levels
- The impact of VMD (video motion detection)