Commonly, an HD IP camera (30FPS at 720p/1080p, 15FPS at 3MP, or 8FPS at 5MP) will deliver ~9Mbps of data to a typical VMS (8Mbps main stream + 1Mbps sub stream).
Now, according to the 70% throughput design rule, one can expect to reliably run 7 such cameras on a 100Mbps switch dedicated to the cameras, utilizing <70Mbps of bandwidth.
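For reference, here's the arithmetic behind that claim as a minimal Python sketch; the per-camera figure and the 70% ceiling are just the numbers from above:

```python
# Back-of-envelope check of the 70% design rule on a 100Mbps switch.
# Assumption: each camera contributes ~9Mbps (8Mbps main + 1Mbps sub stream).

LINK_MBPS = 100          # nominal switch port speed
DESIGN_CEILING = 0.70    # conventional 70% utilization rule
CAMERA_MBPS = 9          # per-camera load

usable_mbps = LINK_MBPS * DESIGN_CEILING
max_cameras = int(usable_mbps // CAMERA_MBPS)

print(f"Usable bandwidth: {usable_mbps:.0f}Mbps")
print(f"Cameras supported: {max_cameras} ({max_cameras * CAMERA_MBPS}Mbps)")
# -> Usable bandwidth: 70Mbps
# -> Cameras supported: 7 (63Mbps)
```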
However, during our internal tests, we found typical 100Mbps switches struggle to sustain even 50Mbps of camera traffic.
In other words, we noticed frame drops, stuttering, and freezing of video feeds with just 5 cameras, i.e. ~45Mbps of load (we set them to CBR streams and monitored the network throughput to verify there were no spikes).
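For anyone who wants to reproduce the monitoring side, this is roughly what we did, sketched with the psutil library (pip install psutil); "eth0" is a placeholder for whatever NIC faces the camera switch:

```python
# Sample per-second receive throughput and drop counts on one interface,
# to confirm the CBR streams are steady and see when the switch starts
# dropping packets. A sketch, not our exact tooling.
import time
import psutil

IFACE = "eth0"      # placeholder: the NIC facing the camera switch
INTERVAL = 1.0      # seconds between samples

prev = psutil.net_io_counters(pernic=True)[IFACE]
while True:
    time.sleep(INTERVAL)
    cur = psutil.net_io_counters(pernic=True)[IFACE]
    rx_mbps = (cur.bytes_recv - prev.bytes_recv) * 8 / INTERVAL / 1e6
    drops = cur.dropin - prev.dropin  # inbound packets dropped since last sample
    print(f"rx: {rx_mbps:6.1f}Mbps  dropped: {drops}")
    prev = cur
```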
Are there any networking experts out there who can shed light on this issue?
If we were to extrapolate this to gigabit networks, does that mean we can only expect to achieve ~500Mbps on the average switch?
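To be explicit, the extrapolation we're asking about is just this arithmetic (assuming, and this is the open question, that the ~50% effective ceiling we observed scales linearly with link speed):

```python
# Naive linear extrapolation of the observed ~50% effective ceiling.
# Whether the limitation actually scales this way is exactly the question.
CAMERA_MBPS = 9
OBSERVED_CEILING = 0.50  # ~50Mbps achieved on the 100Mbps switch

for link_mbps in (100, 1000):
    effective = link_mbps * OBSERVED_CEILING
    cameras = int(effective // CAMERA_MBPS)
    print(f"{link_mbps}Mbps link -> ~{effective:.0f}Mbps effective, ~{cameras} cameras")
# -> 100Mbps link -> ~50Mbps effective, ~5 cameras
# -> 1000Mbps link -> ~500Mbps effective, ~55 cameras
```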