Heng Dju Ong: This is the way I see it:
Assumption: CCTV solutions are chosen for the end-user by an expert (integrator/consultant).
Now there are some projects that may require the use of IP-based systems, because:
a) Too many cameras for HD-SDI (like 48+)
b) Need for some/all cameras to be wireless - specifically Wi-Fi
c) Need to use existing network infrastructure for cost reasons (e.g. a large factory/warehouse with an under-utilised network between 2 offices at either end of the site - you can have just one server in one location and use the existing infrastructure to feed video direct from the cameras closer to the other end)
d) The need for more user-friendly software like HD Witness, advanced integration like Milestone/Genetec, or video analytics like 3VR
e) The need for multi-sensor, high-detail panoramic image capture - like Arecont SurroundView cameras.
Assumption 2: IP CCTV is a topic requiring a lot of training, experience and dedication to master. Therefore, it would be unwise for an integrator to split his/her knowledge gathering and experience between HD-SDI (optimized for small systems only) and IP (universally applicable to any situation).
Thirdly, I would like to point out that many IP CCTV vendors (Hikvision, Dahua, DynaColor, Vivotek, etc.) now have plug-and-play NVRs with 4-16 integrated PoE ports that can be installed just like HD-SDI systems (i.e. just plug in the cable and it works), so the learning-curve advantage of HD-SDI is becoming increasingly diminished.
Fourthly, I would like to point out that even though HD-SDI cameras are simpler in that they do not have to do H.264 encoding, the encoding hardware still has to be SOMEWHERE - in this case the HD-SDI DVR - so I believe the cost advantage of HD-SDI will not be that great in the longer term: end-game pricing would be more or less the same between the two solutions. The counter-argument to this - that multichannel hardware encoder chips in an HD-SDI DVR are more efficient than single-channel encoder chips in each IP camera - is, in my opinion, not correct, because if that were the case, we would not be seeing quad-core ARM CPUs in phones (just make 1 fast CPU core!) or 8-core AMD CPUs in PCs. Multiple, smaller silicon chips are more practical for many reasons - the number one being heat dissipation capability (larger total area to dissipate heat), which directly influences power efficiency (hotter chips waste more power due to increased electrical resistance) and reliability (hotter circuit boards fry themselves exponentially faster than cooler ones).
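To make the heat-dissipation point concrete, here is a back-of-envelope sketch comparing power density (watts per unit of package area) for one multichannel encoder chip versus the same workload spread across per-camera encoder chips. All the numbers (total power, chip footprints, channel count) are illustrative assumptions I picked for the example, not measured figures for any real product.

```python
# Back-of-envelope comparison: same 16-channel H.264 encoding workload,
# done by one multichannel chip in a DVR vs. one small chip per IP camera.
# All figures below are assumed for illustration only.

total_power_w = 8.0        # assumed total power to encode 16 HD channels
big_chip_area_cm2 = 1.0    # assumed footprint of one 16-channel encoder chip
small_chip_area_cm2 = 0.5  # assumed footprint of one single-channel encoder chip
num_cameras = 16

# One big chip must shed all the heat through one small area.
big_chip_density = total_power_w / big_chip_area_cm2

# Per-camera chips each handle 1/16 of the work, and the heat is spread
# over 16 separate packages in 16 separate enclosures.
small_chip_density = (total_power_w / num_cameras) / small_chip_area_cm2

print(f"multichannel DVR chip:   {big_chip_density:.1f} W/cm^2")
print(f"per-camera encoder chip: {small_chip_density:.1f} W/cm^2")
```

With these assumed numbers the single multichannel chip runs at 8x the power density of each per-camera chip, which is the essence of the argument: distributing the encoding makes each package cooler and easier to keep reliable.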
I personally see this like the HD-DVD/Blu-ray war: HD-DVD permitted greater REUSE of manufacturing facilities (analogous to reusing coax cable), whereas Blu-ray offered greater capacity for a better end-game result performance-wise.