One point Axis should make is that analog HD cameras are not drop-in replacements for analog cameras in the major VMSes, requiring additional hardware, usually a DVR, to decode.
The manufacturers of HD analog cameras might well point out that many (but not all) of their cameras can be switched between SD and HD analog output and can therefore be used as drop-in replacements for SD analog cameras.
So you can buy HD analog cameras now but use them as SD cameras with your existing SD-DVR. If you want to replace your SD cameras and SD-DVR with HD products at a later time, you won't have to replace all your cameras as you can just switch the HD analog models into HD mode.
We had a demo about 6 months ago with functional 4MP HD-CVI cameras (they are not released yet but I hear that they will be demoed at the ISC show in Vegas in April 2016). The same crew who showed us the 4MP HD-CVI cameras 6 months ago are saying that by the end of the year 2016, there will be 4K (8MP) HD-CVI cameras available.
If Axis does not believe that this is game-changing technology to which they MUST react and adapt, then they deserve to be left behind.
10 years ago, I heard the same kind of comments concerning the impact that IP cameras would have on the marketplace coming from Pelco's upper management...
Analog needs to die, like it should have long, long ago.
The whole world is IP, yet we've got these people wasting time with analog. Yes, they are doing wonderful things getting the last dying breath out of analog, but it's a lot like the guys squeezing slightly faster internet speeds out of copper: give up and move to fiber.
Hi Michael, I disagree. IP cameras and HD analog cameras both have a place depending upon the needs of the customer.
I come from the broadcast TV and film industry where uncompressed digital video is noticeably superior to uncompressed analog video because digital doesn’t contain analog noise. So digital is much better than analog for high end video.
However this doesn’t hold true with IP surveillance cameras. IP cameras don’t necessarily give a more pure image than HD analog cameras because both use compression and both show compression artefacts.
I therefore disagree with your statement that “Analog needs to die…” and that “people [are] wasting time with Analog.” Far from getting their last dying breath out of analog, analog has made a resurgence and is here to stay for some years.
One can also debate the differences between IP vs HD analog cameras including: complexity vs simplicity, potentially hackable over a network vs not hackable, video analytics on the camera vs no video analytics, slow power up times vs immediate power up, and many more.
For video surveillance cameras, I think it is wrong to say that digital is better than HD analog. The needs of the customer should be taken into account when determining which technology is the right one to use.
The only place for analog is where people have an existing system that they are too cheap to replace; everyone else needs to move on to 21st-century technology.
There is a huge difference in quality for Ethernet over analog. Where are your 12MP to 30MP cams for analog? With IP, you can always provide better devices at each end and improve the quality of what's already there.
I understand that there are plenty of people attempting to get the last dying breaths out of analog, but it has no future. How much bandwidth is available for analog? Digital in normal copper form already has 10 Gbit, and that's ignoring fiber. How long can you run your analog without converting it to digital and back again? Fibre has no issues with kilometres... Digital also has a huge amount of error checking and diagnostics available at the switch level, as well as the ability to expand networks almost infinitely. People still pushing analog where they are not forced into it by ancient systems are kidding themselves.
If the person doing the install does not know how to isolate IP networks (assuming the business is big enough to require it), they should not be doing the install, and should instead be getting in people who know what they are doing with VLANs and so forth.
Anything that can be connected to the internet can get hacked; it does not matter if it's IP or analog. Generally the IP stuff is more flexible, and being built around very solid operating systems, it's quite easy to log and trace issues. With most analog systems I've ever seen, you don't know something's going wrong until it stops dead (but I'm far from an analog expert, so...).
With simplicity vs complexity: PoE switches are cheap, and most people already have existing network cabling that can be re-used, so if you try to do a new analog install you are double cabling.
With power-up times: how often, and why, would you be regularly rebooting your system enough to worry about power-up times? Most IP stuff is virtualized, so if something happens you can force a restart, or automatically fail over to another server. Can you do that with analog without having to unplug and replug gear?
Don't get me wrong, analog has been a great thing in the past; we've had systems that cranked on for years without being touched. But technology has moved on; don't paint yourselves into a corner.
...how much bandwidth is available for analog, Digital in normal copper form already has 10gbit, that's ignoring fiber, how long can you run your Analog without converting it to digital and back again?
With all due respect, Michael, I think you have the nuts and bolts of analog vs digital backwards.
Analog will always have at least equal bandwidth to (and usually far greater than) digital.
Why? With digital, unless you are sending abstract 1s and 0s via telepathy, you are actually transmitting using analog waves! There's nothing 'on the wire' but an analog signal. There is ALWAYS an underlying analog transmission mechanism.
Digital is not a transmission format, it's an encoding format.
It's a convention that says, for example, that when the voltage is between 0.5 and 1 volts, that's a binary 1, and if it's less than that, it's a binary 0.
This, from an information standpoint, is extremely wasteful, since analog without digital encoding could transmit all the values in between.
But it has the advantage of allowing perfect copies of the signal, because its margin of error is so wide, while analog without digital encoding degrades.
This is what digital encoding is all about: being able to make error-free copies, not increasing bandwidth over analog.
Case in point: you think your 5MP IP camera is transmitting more information than a 2MP HD-CVI camera, just because it has a higher resolution?
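The encoding convention described above can be sketched in a few lines of Python. The voltage levels, the 0.5 V threshold, and the noise figure are illustrative assumptions, not any real signalling standard; the point is only how thresholding rejects noise that would permanently degrade a raw analog signal:

```python
# Sketch: why digital encoding survives noise that degrades raw analog.
# Assumed convention: voltage >= 0.5 decodes as binary 1, below as 0.
import random

random.seed(42)

def transmit(volts, noise=0.2):
    """Simulate an analog wire adding random noise to each sample."""
    return [v + random.uniform(-noise, noise) for v in volts]

def decode(volts, threshold=0.5):
    """Digital receiver: snap each noisy voltage back to a clean bit."""
    return [1 if v >= threshold else 0 for v in volts]

bits = [1, 0, 1, 1, 0, 0, 1, 0]
sent = [1.0 if b else 0.0 for b in bits]  # encode bits as voltages

noisy = transmit(sent)       # the analog values are now corrupted
recovered = decode(noisy)    # but the bits come back intact

print(recovered == bits)  # True: noise below the margin is rejected
```

Run the noisy samples through `decode` as many times as you like and the copy stays perfect, which is exactly the "error-free copies" trade-off described above.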
Sure, in theory, if they are using the same cable, both would be limited by the same ability to distinguish signal voltage from the background noise of the cable. However, most analog systems are coax, which has just a single signal conductor, not the eight your average Cat5/6/7 cable has.
The massive advantage Cat5/6 has is that over 100m distances its structure can carry far more than coax, including other network traffic, and, as mentioned, uplinks to other switches allow practically unlimited expansion (how many cams can you run off a single 10 Gbit connection? 10,000?).
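As a rough sanity check on that question, here's a quick back-of-envelope in Python. The per-camera bitrates and the 30% headroom are assumptions; real figures vary widely with codec, frame rate, scene activity and settings:

```python
# Back-of-envelope: how many cameras fit on a single 10 Gbit/s uplink?
# Bitrates below are assumed averages, not vendor specifications.

def cameras_per_link(link_gbps, cam_mbps, headroom=0.7):
    """Cameras that fit on a link, leaving ~30% for other traffic."""
    return int(link_gbps * 1000 * headroom / cam_mbps)

for label, mbps in [("1080p H.264", 4), ("4MP H.264", 8), ("4K H.265", 12)]:
    print(f"{label} @ {mbps} Mbit/s: ~{cameras_per_link(10, mbps)} cameras")
# Not quite 10,000 at these bitrates, but comfortably into the
# hundreds or low thousands per uplink.
```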
With regards to compression: don't the current HD sensors all produce a digital signal? If so, they both go through this encoding process you talk about, right? The digital gets compressed to fit into TCP/IP packets, and in the case of HD-CVI it gets converted to an analog signal, so neither of them is a perfect copy, right?
Not sure what you are getting at with the HD-CVI 2MP creating less data than a 5MP. In the case of analog, the amount of data created would surely be a function of the video recorder on the other end; in the case of digital, it depends on your VMS, but most do a lossless conversion that efficiently dumps to disk. What's your analog doing? Also converting to digital to store to disk? How much quality is lost there vs the original pure signal?
I assume that we can agree that a 12MP cam looks a world ahead of a 2MP of anything (assuming reasonable sensors/lenses), let alone the 16MP+ cams out there, regardless of what's being lost in the compression (which you have a reasonable amount of control over, can I say)...
"I assume that we can agree that a 12mp cam looks a world ahead of a 2mp of anything"
It's an interesting comparison since both 4CIF to 2MP and 2MP to 12MP are ~6x increases in pixel count
But we are reaching the point of diminishing returns.
The need for a single camera's resolution is constrained by barriers (e.g., walls, trees) and angles of incidence (e.g., the inability to capture facial details of a person walking at a 60-degree angle to the camera).
Going from 4CIF to 2MP is undoubtedly a major practical advance, simply because you go from struggling to cover even a mid-size room at any useful pixel density to roughly 6x that area.
But 2MP to 12MP is not as broadly applicable a practical advance, certainly not indoors, where most cameras are deployed. And that's not getting into the issues with low-light performance (even with the 'best' lenses), worse WDR, etc.
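The ~6x jumps mentioned above are easy to verify from the pixel counts. The 12MP frame size used here (4000x3000) is an assumed typical 4:3 sensor, since actual 12MP resolutions vary by model:

```python
# Pixel-count ratios for the resolution steps discussed above.
resolutions = {
    "4CIF": (704, 480),
    "2MP (1080p)": (1920, 1080),
    "12MP (assumed 4:3)": (4000, 3000),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["2MP (1080p)"] / pixels["4CIF"])                # ~6.1x
print(pixels["12MP (assumed 4:3)"] / pixels["2MP (1080p)"])  # ~5.8x
```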
I get what you mean by diminishing returns, but for a lot of people it means they can have a far wider-angle lens and still have good detail. This is not a good time to ask when you are going to review some more 12MP cams with the larger sensors, is it? ;)
On the claim that there is not much point, however, I'm going to have to disagree. It means we can clearly see license plates on trucks with cams that are mounted 5 metres high and still have a 90-degree-wide view. Indoors, it allows us to more accurately see what users are doing on the keyboard/screen. Even better, we can now far more clearly, and from more angles (since the distance to multiple cams drops the ppi), see people's expressions, and even where they are looking when something goes wrong.
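To put numbers on the license-plate claim: pixel density at a given distance falls straight out of the lens geometry. The 25-foot distance, the pixel counts, and the ~40 ppf plate-reading rule of thumb below are all assumptions for illustration, not test results:

```python
import math

def ppf(h_pixels, hfov_deg, distance_ft):
    """Pixels per foot of horizontal coverage at a given distance."""
    fov_width_ft = 2 * distance_ft * math.tan(math.radians(hfov_deg / 2))
    return h_pixels / fov_width_ft

# Same 90-degree wide view, two assumed horizontal pixel counts.
for label, h_px in [("2MP (1920 px)", 1920), ("12MP (4000 px)", 4000)]:
    print(f"{label}: {ppf(h_px, 90, 25):.0f} ppf at 25 ft")
# A common rule of thumb wants ~40+ ppf for reliable plate reads;
# at this distance the 12MP clears it while the 2MP falls just short.
```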
I, and the sites I look after, are blown away by those gigapixel images you see occasionally. Remember when Bill said X amount of RAM is all a computer would ever need?
Axis really screwed up with that cam you link; not sure what they were aiming for, but when Hik has better quality they should have just canned it and done a recall. On tech, China is hard to beat. I can't see any of the Western companies coming up with anything to beat China, considering their stuff comes from China in component form anyway, does it not? Avigilon wants $20k for their 34MP; what a joke, pretty much an insult.
I suspect that the biggest challenge for businesses going for higher pixel counts is not the camera pricing, nor the VMS or hardware, but actually the storage... I can't see it being too many more years before SSDs drop past the 50c/GB figure (for large drives), at which point we'll be off.
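To make the storage point concrete, here's a rough retention calculation in Python. The per-camera bitrates are assumptions; actual figures depend on codec, frame rate and scene:

```python
# Rough storage math for continuous recording at assumed bitrates.

def storage_tb(cam_mbps, days, cameras=1):
    """Average Mbit/s over a retention period -> terabytes of disk."""
    seconds = days * 24 * 3600
    total_bits = cam_mbps * 1e6 * seconds * cameras
    return total_bits / 8 / 1e12  # bits -> bytes -> TB

# One camera, 30 days of continuous recording:
print(f"1080p @ 4 Mbit/s:  {storage_tb(4, 30):.1f} TB")
print(f"12MP  @ 16 Mbit/s: {storage_tb(16, 30):.1f} TB")
```

Multiplying the higher figure across a few dozen cameras shows why storage, not camera cost, tends to dominate high-resolution deployments.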
Not disagreeing with you; I'm just really keen to see you guys start reviewing all the other 4K cams... somewhat excited, even! I quite like your testing methodology, so it'll be interesting to see if you find anything that might be more worth buying than the Dahua (yes, I know I'm getting irritating about it, but there are guys in the US who can now get them on special order).
My point about 12MP being great indoors is that the considerably increased detail allows us to see specifically what people are doing on their computers in an office environment. When we run 3MP, all but the closest PC screens are pretty blurry; now we have a fair chance of being able to see whether people are wasting time on their personal Gmail accounts or spending time doing company work, for example. Nothing is perfect, but from our perspective it's a big change.
My point of disagreement was that while they are great outdoors, indoors we're seeing a big improvement too.
"now we have a fair chance of being able to see if people are wasting time on their personal gmail accounts or spending time doing company work for example, nothing is perfect but from our perspective it's a big change"
Let's see how many people are interested in that level of detail.
...now we have a fair chance of being able to see if people are wasting time on personal gmail accounts...
In the U.S., at least, we have had this ability for 30 years; it's called endpoint security, and it is routine practice.
You get a time-stamped list of where, when and how long they were on gray sites. It sure beats trying to look over the shoulder of an employee from across the room and read their screen, no matter what the resolution.
It's far easier for management to just view the cams on their phones and, at a whim, zoom in on anyone's screen and see what's on it. Endpoint security tends not to be very good at filtering whether someone is actively using something or just has it open in the background. There are packages that do it, but they work out pretty expensive and pretty intensive to maintain, as opposed to just throwing in a higher-resolution cam.
It would not surprise me if a lot of integrators are still using Cat5/5e.
You're saying right now there are no cameras using a gigabit interface, so you think it's going to stay that way? This is not a great way to give yourself future sales. (Customer: "Can we fit x device?" Sales rep: "Yep, actually we can fit x, and y and z.") Everyone is moving away from coax; what percentage of devices released in the last year came standard with RJ45 connections? Can you even configure a camera through coax?
I say "in theory" because twisted pair has become the mainstay for loads and loads of devices, so there are heaps of applications out there where you can make use of the bandwidth. You're saying no one camera can use it, but that's my point: these days heaps of devices use it. Plug a server in, plug a NAS in, plug in a switch and a crapload of devices, and you'll soon use that bandwidth.
Why did they rip coax out of all the corporate networks decades ago?
Use it as a good pull cable for a pair of Cat6, lol. Or even better, use it as a pull cable to pull through another pull cable as well as a Cat6, and future-proof yourself :) in case you need to get more through whatever space/conduit it runs through in the future.
Hi Michael, thanks for your reply. I think that IP makes a lot of sense for bigger deployments, and your email suggests that is the main area in which you work, so I understand your comments.
For small businesses and SOHO installations, IP is not always better than HD analog. This may not be a market that concerns you but is a big part of my customer base. When I show smaller customers a side by side demo of IP vs HD analog with equivalent cameras and recorders, customers who previously assumed they wanted an IP system are often left wondering what the fuss is all about with IP. These customers typically only need 1080p cameras or less resolution. They are more concerned about WDR and low light than very high resolutions.
"Everyone else needs to move on to 21st century technology," sounds like the kind of spin I'd expect from an aggressive TV salesperson trying to sell me an Ultra HD 4K TV over an HD TV. That description doesn't provide me with any information that I would find helpful. It just sounds like a put-down.
In my opinion, the selected technology should suit the customer's needs and budget, and not just be chosen because it's 21st century technology. I still walk to my local bakery for bread because most of the time, it's faster than waiting in a queue of cars for the two sets of traffic lights to turn green. Walking is definitely not 21st century technology but most of the time, it's faster, cheaper and healthier for my needs.
Maybe the only way it's not better than IP is if you don't want to do some arrangement with your small customer to piggyback on their existing infrastructure? Maybe even bundle an upgrade for their existing infrastructure at the same time (say, if they are stuck on 100 Mbit switches). Or maybe they are just that painful, and prone to stuffing around and breaking things.
If you were buying a TV today that you could reasonably expect to last 10 years, I'm not sure why you'd say "I'll be happy with HD as opposed to 4K." Cheap 4Ks over in Aus are $800 for a pretty big one, and we have specials where a 65" 4K is $1200 from Aldi... so why would you not? I hope you're not going to come back with "there is no current 4K broadcast."
With all the new fads, I think walking might be 21st century! Or maybe running, or crossfitting (as opposed to driving), lol. My point is that all the standards are moving along; it's sort of like saying stick with cassettes vs CDs, or Betamax vs VHS.
I believe it's time for BOSCH, AD or VICON to create matrix cards to update old systems to accept HD-CVI/TVI/AHD as inputs alongside SD video, and then HD-CVI/TVI/AHD decoders for monitors, allowing complete reuse of cabling for jails, prisons, small casinos, and everywhere else they are still selling matrix switch components, even though that market was supposed to be dead long before analog.
Add video recording and playback from a keyboard, as some did, to provide a total local system without cable changes or new network infrastructure.
A partial or complete upgrade could be accomplished quickly to 720p/1080p, and soon 4MP and 4K.
I don't work for any of those guys, but I did work for one a while ago, and each year they forecast the matrix going away... we sold more.
Perhaps those manufacturers don't see much point in creating matrix cards given that external encoders already exist. The encoders take the HD analog signal and turn it into an IP signal, which some VMSes already accept. See Testing Dahua HDCVI Encoder With VMS's. While external encoder boxes cost more than matrix cards, they can be much more convenient, as there is more space for all the cable connections. This is good unless you want a really compact solution, in which case the cards would be preferable.