I agree with John: 4Mbps is more typical for 1080P30 nowadays. Main profile helps keep bandwidth more consistent, as does a broadcast-grade encoder (like Ambarella).
Some of the mfgs that have ‘rolled their own’ encoders will have higher bandwidths than normal (or worse video for the same bitrate). Modern and professional grade cameras also now have pretty good temporal noise filtering which helps to keep bitrates lower in low light conditions.
Setting H.264 to VBR helps in real-world scenarios: if there is no motion, the stream can settle into a quiescent state, which frees up bandwidth budget for the other ports on the switch.
But enough about that, the question was regarding network gear.
Consumer grade switches 'saturate' pretty quickly - especially with all the ports running at a constant high data rate.
They're really designed for more intermittent data, like that of web surfing and the occasional file transfer over the network. Every penny has been squeezed out of the cheaper switches and the manufacturers know that most of them sit idle anyway.
Facebook or IPVM doesn't generate that much data, and only 1 or 2 ports at most will be carrying Netflix or other high-bandwidth traffic. That, and the manufacturers don't care... Most consumers will never know how many packets are dropped by their crappy switch.
Gigabit switches are of course faster and have a higher speed backbone (internal network that connects all the ports together) but the primary gain is realized by having a faster port to the VMS / PC that is recording all the devices. More professional 10/100 switches will have gigabit uplinks that can be connected to the server.
One other thing to note is that H.264 traffic is very asymmetric. IDR / I frames are much larger than P frames or B frames (the latter not common in security). Because of this, H.264 can overwhelm a 10/100 switch when the instantaneous bandwidth exceeds 50-60Mbps as an IDR frame is transmitted.
If the network is huffing and puffing, packets can be dropped from the IDR, which causes the most damage to the decoded stream: all subsequent P frames will have errors, causing tearing in the stream. Normal encoders output 1 IDR frame and 29 P frames per second (1 IDR every second for 1080P30), so if the IDR is damaged, it will take a second for everything to clear when the next IDR is delivered. Some encoders stretch out the IDR interval, which lowers the overall bitrate but makes the stream more prone to packet loss and decode errors.
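To put a rough number on the burst effect described above, here is a back-of-the-envelope sketch. The 4Mbps average, 30fps, 30-frame GOP, and the 10:1 IDR-to-P size ratio are all illustrative assumptions, not measurements from any particular encoder:

```python
# Sketch of the instantaneous bandwidth spike when an IDR frame is sent.
# Assumptions (illustrative): 4 Mbps average stream, 30 fps, GOP of 30
# (1 IDR + 29 P frames), and an IDR frame ~10x the size of a P frame.

def instantaneous_idr_rate_mbps(avg_mbps=4.0, fps=30, gop=30, idr_to_p_ratio=10.0):
    # Bits in one GOP (one second of video when gop == fps).
    bits_per_gop = avg_mbps * 1e6 * (gop / fps)
    # Solve: idr_to_p_ratio * p + (gop - 1) * p = bits_per_gop
    p_frame_bits = bits_per_gop / (idr_to_p_ratio + gop - 1)
    idr_bits = idr_to_p_ratio * p_frame_bits
    # If the whole IDR hits the wire within one frame interval (1/fps s),
    # the link sees this rate for that instant:
    return idr_bits * fps / 1e6

print(round(instantaneous_idr_rate_mbps(), 1))  # ~30.8 Mbps for ONE camera
```

So even a single 4Mbps camera can momentarily demand ~30Mbps under these assumptions, and two cameras bursting at the same time would already exceed the 50-60Mbps a 10/100 port can actually sustain.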
Gigabit switches can handle spikes of traffic better and will usually drop fewer frames as a result. Most 'consumer' Gigabit switches, however, can't do 'line speed' (1Gbps) and max out at around 500-600Mbps, which is still plenty fast for most security applications.
It should be noted that server PCs also often fall well short of the 'line speed' rating of their Network Interface Cards (NICs). Cheap NICs and cheap motherboards can have fairly low sustained throughput on the internal bus (usually PCI Express x1) and could actually be the limiting factor here.
There’s a reason why IT depts shell out huge dollars for enterprise class switches and servers. You really do get what you pay for.
All that said, you should be able to get far more than 5 modern 1080P30 cameras on one switch at a time.
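As a sanity check on that claim, here's a rough port-budget calculation using the ~500Mbps usable throughput figure for consumer gigabit switches mentioned earlier. The 50% headroom reserved for IDR bursts is an assumption I'm making for illustration, not a standard rule:

```python
# Rough camera-count budget for a consumer gigabit switch.
# Assumptions (illustrative): ~500 Mbps usable throughput, 4 Mbps average
# per camera, and 50% of capacity held back as headroom for IDR bursts.

def max_cameras(usable_mbps=500.0, avg_mbps=4.0, burst_headroom=0.5):
    # Budget after reserving headroom for instantaneous spikes.
    budget = usable_mbps * (1.0 - burst_headroom)
    return int(budget // avg_mbps)

print(max_cameras())  # 62 cameras -> way past 5, even being this conservative
```

Even with half the switch's real-world capacity held in reserve, the average-rate math supports dozens of cameras; the practical limit is usually the uplink to the recorder, not the edge ports.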
Cheers,
Ian.