Is It Worth Installing Cat 6 Ethernet Cable For Ultra HD 4K Cameras?

Last week at ISC West in Las Vegas, a number of vendors were showing off their first Ultra HD 4K resolution IP cameras. This week in Las Vegas, the NAB Show is full of Ultra HD 4K products for the film and video industries.

In the IP camera course, it was emphasized that high-resolution IP cameras have poor low-light performance, so one is often much better off with 720p or 1080p cameras, which also have much lower bandwidth requirements. On the other hand, it seems likely that the low-light capabilities of high-resolution cameras will improve over the next few years, as this is what the industry wants.

Knowing there are now Ultra HD 4K resolution IP cameras, should one consider installing Cat 6 Ethernet cable for IP cameras so as to future-proof an installation, or will Cat 5e suffice even for 4K?

My own feeling is that Cat 5e will be all that is needed for the next 5 years but I'd be interested to hear what others think. Thank you for your comments.

Since even a 4K camera is unlikely to go over 100Mb/s, I do not see why you would need to upgrade cabling.

A 1080p HD camera at full frame rate with H.264 typically runs 2 Mb/s to 10 Mb/s. 4K has 4x the pixels of 1080p, so assuming a linear relationship that would mean 8 Mb/s to 40 Mb/s (and linear is conservative, since bitrate tends to grow more slowly than pixel count because compression gains more from the larger frame).
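As a rough sketch of that scaling argument, here is the pixel-count arithmetic in Python. The 2-10 Mb/s baseline comes from the discussion above; treating the linear scaling as an upper bound is an assumption, since real encoders usually do better:

```python
# Hedged estimate: scale a 1080p H.264 bitrate to 4K by pixel-count ratio.
# Linear scaling is a conservative upper bound, not a measured figure.

def estimate_4k_bitrate(bitrate_1080p_mbps: float) -> float:
    """Scale a 1080p bitrate to UHD 4K by the ratio of pixel counts."""
    pixels_1080p = 1920 * 1080   # ~2.07 megapixels
    pixels_4k = 3840 * 2160      # ~8.29 megapixels (UHD "4K"), exactly 4x
    return bitrate_1080p_mbps * (pixels_4k / pixels_1080p)

# The 2-10 Mb/s range quoted above scales to 8-40 Mb/s at 4K.
low, high = estimate_4k_bitrate(2), estimate_4k_bitrate(10)
print(f"Estimated 4K range: {low:.0f}-{high:.0f} Mb/s")  # → 8-40 Mb/s
```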

Thanks John. I'm guessing that anyone using 4K might also be using multiple streams, e.g. so they can watch a few specific areas of the image, perhaps at 1080p resolution. I think Cat 5e should still be OK, but multiple streams could push a worst-case scenario very close to 70 Mb/s, at which point one should seriously start considering 1 GbE cabling and switches. Hence my reason for provoking some discussion on this topic.

I think it is highly unlikely that people will be using multiple full-frame-rate 4K streams at the same time. The typical multi-stream example is full resolution plus a quarter or lower resolution, which would be substantially lower than 70 Mb/s even in the worst case.

Also, my understanding is that 4K cameras are still going to ship with only 100 Mb/s Ethernet interfaces, so even if you were theoretically streaming out 70+ Mb/s, your #1 issue would be the camera's NIC before anything else.

Ahh! That's very interesting information about the NIC John and certainly answers my question if that's what the camera manufacturers choose to use.

The highest bitrate I saw on the ISC show floor last week was about 40 Mb/s. That was much higher than others (which I already thought were running higher than they had to). Most people I talked to said they felt 10-15 Mb/s is probably what you'll see for a "decent" stream.

So even in the worst case, pulling a full-resolution stream plus a second or third lower-resolution stream, it should be fine.

Cat 5e supports Gigabit Ethernet, i.e. 1000 Mb/s. 4K bandwidth requirements should be less than 40 Mb/s (and perhaps drastically less if H.265 ever shows up! :-) ). So I can't see any practical reason to use Cat 6 instead of Cat 5e.
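To put numbers on that headroom argument, a quick sanity check in Python. The figures are assumptions drawn from this thread (40 Mb/s worst-case 4K stream from the show floor, 10 Mb/s assumed 1080p substreams), not measurements:

```python
# Headroom check: worst-case camera traffic vs. common link speeds.
# All bitrates are assumptions taken from the discussion, in Mb/s.

WORST_CASE_4K = 40      # highest 4K bitrate reported from the ISC show floor
SUBSTREAM_1080P = 10    # assumed secondary 1080p stream
FAST_ETHERNET = 100     # 100BASE-TX (regular Cat 5, typical camera NIC)
GIGABIT = 1000          # 1000BASE-T (what Cat 5e already carries)

# Worst case: one full-resolution 4K stream plus two 1080p substreams.
total = WORST_CASE_4K + 2 * SUBSTREAM_1080P
print(f"Worst-case total: {total} Mb/s")            # → 60 Mb/s
print(f"Fits in Fast Ethernet: {total <= FAST_ETHERNET}")  # → True
print(f"Fits in Gigabit:       {total <= GIGABIT}")        # → True
```

Even this pessimistic aggregate sits comfortably under the 100 Mb/s NIC limit, which is the real bottleneck long before the cable category matters.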

Even regular Cat 5 at 100 Mb/s should be fine for 4K.