Best Of Both Worlds? HD-SDI

For analog customers, before they jump to IP altogether, we are considering HD-SDI, even though it has never really been picked up. But we are beginning to think that as HD-SDI DVR prices drop, this may be a feasible solution for the price-sensitive customers here.
Questions:
1. How can we determine whether the existing coaxial cable at a customer's site is still in good enough condition to be used for HD-SDI? I have only read that the coax is supposed to read 75 ohms for a 100-meter cable, but in reality we never know how long the installed cables are. How will the length affect the reading, and is there a mathematical formula for calculating that?
2. What kind of device can read these different lengths, and what is the maximum acceptable resistance in ohms for the coax cable?

1. Guess.

2. There isn't one.

As soon as the "re-use existing RG59" selling point for HD-SDI was floated a few years back, all the technical folks here and on the LinkedIn forums asked exactly what you're asking... and we are all still waiting for answers. :(

You would need a "cable length finder", such as the Fluke TS90 or TS100.

I'm one of those still waiting for the answer. It was mostly all I wanted to know! I would be willing to bet the reason the answers never came is that they were not favorable answers. So I finally looked into it myself, and went straight to the audio-visual world, since they've been dealing with HD-SDI since the '90s.

So here's what I CAN tell you:

  • HD-SDI is generally accepted to have a max distance of about 100m on RG-59 cable. Maximum, in a perfect world. Using old beat-up cables, you might only get 100'.
  • If you're doing a new install, you can use RG-6 or RG-11, but of course, they're not generally seen in retrofits, and they're bigger and more expensive. You can definitely run further if you use cables made specifically for HD-SDI, too... but again, these are more expensive.
  • Also, unlike composite video, which will actually run further on UTP, HD-SDI won't. You can get baluns, but you're still generally looking at a 100m limit.
  • Cable testers like the Fluke MicroScanner can test coax length. They'll also test for continuity and find shorts in the cable, which will be good for retrofit as well. There are testers that will run a TDR trace to locate abnormalities in impedance, too. But they're thousands of dollars, and may be unnecessary.
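On the "is there a formula" question: there is no simple pass/fail resistance number, but you can get a rough length budget from the cable's high-frequency attenuation rather than its DC resistance. As an illustration only, here's a back-of-envelope sketch. The 20 dB receiver equalization budget and the 18 dB/100m RG-59 attenuation figure are assumptions for the example; substitute your receiver's and cable's actual datasheet values. The sqrt(frequency) scaling is the standard skin-effect approximation for coax loss.

```python
import math

def max_hdsdi_run_m(atten_db_per_100m_ref, ref_freq_mhz,
                    loss_budget_db=20.0, hdsdi_half_clock_mhz=742.5):
    """Rough estimate of maximum HD-SDI run length for a coax cable.

    Skin-effect loss in coax grows roughly with sqrt(frequency), so we
    scale the datasheet attenuation (quoted at ref_freq_mhz) to the
    1.485 Gbps HD-SDI half-clock frequency (742.5 MHz), then divide an
    assumed receiver equalization budget by the per-metre loss.
    """
    scale = math.sqrt(hdsdi_half_clock_mhz / ref_freq_mhz)
    atten_db_per_m = (atten_db_per_100m_ref * scale) / 100.0
    return loss_budget_db / atten_db_per_m

# Hypothetical RG-59 spec: 18 dB per 100 m at 750 MHz
# (check the actual datasheet for your cable)
print(round(max_hdsdi_run_m(18.0, 750.0)))  # about 112 metres
```

Note this is a new-cable, perfect-termination ceiling; corroded or badly terminated cable will come in well under it, which is why the "plug a camera in and see" test below is still the practical answer.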

So with all that in mind, I think the unfortunate truth is that the only way to know whether the coax will support it is to get an HD-SDI monitor and an HD-SDI camera and plug them in. If it works, ok. I would also plan on investing in a quality compression tool, and plan on reterminating every cable. The way some (most) techs terminated RG-59 with crimp connectors, with the braid hanging out of the barrel, sheesh. Terrible.

What about the crappy twist-on connectors that fall off if you even look at them?

Heng, a couple other things to consider with SDI:

You're limited to 1080p maximum (2MP), while 1080p is pretty much the bottom end now for IP camera resolution (er, pixel count - sorry John, I keep forgetting ;)

Whereas with SDI you're limited to 100m runs *at best*, if you want to re-use existing coax, EoC converters will let you run IP for hundreds of meters. If even one camera on a site is on a run this long, it's probably worthwhile to prefer an IP option. I've used EoC on a number of sites, some with very old installations, and never had failed connections due to poor cable. Granted, I don't know if any of the cable WAS poor/damaged/etc., but then, I haven't had a need to test it, as the EoC has always just... worked.

Edit: Carl, you'd better not look at them then!

Matt: Which EoC adaptors have you used? Have you deployed the cheap 10Mbps passive ones that cost <$30 per pair like these?

Or other entry level 100Mbps ~$50 per channel ones like these?

Heng Dju Ong: This is the way I see it:

Assumption: CCTV solutions are chosen for the end-user by an expert (integrator/consultant)

Now there are some projects that may require the use of IP based systems, because:

a) Too many cameras for HD-SDI (like 48+)

b) Need for some/all cameras to be wireless - specifically Wi-Fi

c) Need to use existing network infrastructure for cost reasons (e.g. a large factory/warehouse with an under-utilised network between 2 offices at either end of the site - you can have just one server in one location and use the existing infrastructure to feed video direct from the cameras closer to the other end)

d) The need for more user friendly software like HD Witness, advanced integration like Milestone/Genetec, or video analytics like 3VR

e) The need for multi-sensor, high-detail panoramic image capture - like Arecont SurroundView cameras.

Assumption 2: IP CCTV is a topic requiring a lot of training, experience and dedication to master. Therefore, it would be unwise for an integrator to split his/her knowledge gathering and experience between HD-SDI (optimized for small systems only) and IP (universally applicable to any situation).

Thirdly, I would like to point out that many IP CCTV vendors (Hikvision, Dahua, DynaColor, Vivotek, etc.) now have plug-and-play NVRs with 4-16 integrated PoE ports that can be installed just like HD-SDI systems (i.e. just plug in the cable and it works), so the learning-curve advantage of HD-SDI is increasingly diminished.

Fourthly, I would like to point out that even though HD-SDI cameras are simpler in that they do not have to do H.264 encoding, the encoding hardware still has to be SOMEWHERE - in this case the HD-SDI DVR - so I believe the cost advantage of HD-SDI will not be that great in the longer term; end-game pricing would be more or less the same between the two solutions.

The counter-argument - that multichannel hardware encoder chips in an HD-SDI DVR are more efficient than single-channel encoder chips in each IP camera - is, in my opinion, not correct, because if that were the case, we would not be seeing quad-core ARM CPUs in phones (just make one fast CPU core!) or 8-core AMD CPUs in PCs. Multiple, smaller silicon chips are more practical for many reasons - the number one being heat dissipation capability (larger area to dissipate heat), which directly influences power efficiency (hotter chips waste more power due to increased electrical resistance) and reliability (hotter circuit boards fry themselves exponentially faster than cooler ones).

I personally see this like the HD-DVD/Blu-ray war: HD-DVD permits greater REUSE of manufacturing facilities (an analogy to reusing cable), whereas Blu-ray offers greater capacity for a better end-game result performance-wise.

@Marty, the problem is, you work for an all-IP-all-the-time company... yeah, that's it. If you believed more, the answer would present itself.

@Bohan, I get your point, I just don't think the analogy is right. HD-DVD actually offered better image quality on the same movies in head-to-head comparisons. Blu-ray won because Sony instantly put 20 million players on the street when the PS3 was released (at a price close to half of what the nearest BR player cost at that point).

This is DAT vs. CDs. http://en.wikipedia.org/wiki/Digital_Audio_Tape vs http://en.wikipedia.org/wiki/Compact_disc One has the perception of still being the same form factor, with the same manufacturing to build the product (tapes), but better quality. The other is a complete shift in technology and vision.

Frankly, I still believe a significant part of Blu-Ray's success is the catchy name...

Anyway... Bohan, I've used a range of EoCs, from the "original" Veracity Highwires to the cheap "non-passive" units very similar to the one you linked (same brand, older version). Only twice have I encountered issues: once where I'm pretty sure I had a ground loop going on between the DVR and two or three remote cameras that were using the cheap units (side note: the mini-receiver in those fits nicely inside the dome camera I was using), and one site where the same cheap adapters all used F-connectors, and several turned out to have weak "clips" for the center conductor, leading to many troubleshooting sessions.

Lately we're using the Altronix eBridge 1 adapters - around $120 cost, and they can send analog video over the line along with the IP (I have yet to actually find a use for this... but it's there nonetheless).

Matt: in this part of the world, even VGA resolution still rules. I know the rest of the world is moving up the mark into gigapixel, as I learn from IPVM, but development here is also very backward; even the internet is still expensive, for example. It's a far cry from neighbouring countries like Singapore, where they are providing fiber optic directly to some homes whether they ask for it or not, and even more so South Korea, which I think has the cheapest (and fastest) internet fees in the Asian region.

IP systems undoubtedly have their advantages in many ways, especially when talking resolution, as Bohan pointed out, along with many other options. But again, the bottom line is that these customers have limited budgets, and the IP cameras marketed here are STILL considered expensive.

That's why, even though I did not really consider HD-SDI in the past, now I have to re-think, as there are actually quite a few of these budget customers.

For other customers who are not really concerned about budget, we have the liberty to offer them better technology. For example, I had a chance to install about 30 units of Arecont panoramic cameras (AV8185DN), but even that was full of drama, as the AV8185DN gave me a lot of headaches and endless support calls due to hardware failures.

So all in all, I think it doesn't really matter to us whether we go across different platforms (analog, HD-SDI or IP), as long as we can fulfil the demands of the users at their monetary level. Though we don't go to the lowest tier of them all, as it's really nasty and, to borrow Carl's term, crappy. That literally exists here for real.