Why Megapixel Mis-Marketing Works
Both megapixel and video analytic manufacturers routinely mis-market their products, vastly overstating their capabilities. One market, analytics, has suffered greatly, while the other, megapixel, has thrived and perhaps even gained from the misleading marketing. Both can be outrageous; indeed, megapixel can be even worse, with claims of replacing dozens of cameras. Why has one created such ill will for vendors while the other has not?
The key difference is the frame of reference.
"Underestimation of storage requirements".
Matt, you said a mouthful. I finally wound up telling our prospective integrators exactly how much storage we wanted so there would be no doubt. Even the term "net storage" had to be specifically defined so that they wouldn't include parity drives, hot spare drives, and Windows overhead in the total.
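To make "net storage" unambiguous, the arithmetic can be sketched like this. The array size, drive capacity, and OS overhead figure below are hypothetical illustrations, not numbers from the discussion:

```python
# Hypothetical sketch: "net storage" = space actually usable for recording,
# after subtracting parity drives, hot spares, and OS/filesystem overhead.

def net_storage_tb(drives, drive_tb, parity_drives, hot_spares, os_overhead_tb=0.5):
    """Usable terabytes once RAID parity, spares, and system overhead are excluded."""
    data_drives = drives - parity_drives - hot_spares
    return data_drives * drive_tb - os_overhead_tb

# A 12-bay array of 4 TB drives in RAID-6 (2 parity) with 1 hot spare:
print(net_storage_tb(12, 4, parity_drives=2, hot_spares=1))  # 35.5 TB net, not 48 TB raw
```

The point of the spec language is exactly this gap: an integrator quoting "48 TB" raw is delivering roughly a quarter less than the net figure the end user actually asked for.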
Add bitrate to that list. Integrators have tried to convince me that my specific bitrate requests were misjudged, claiming they used lower bitrates on other jobs and the end users were perfectly happy. After seeing the video quality at some of those properties, I have to wonder what they were comparing the quality to.
Carl, I suspect you, in a casino, need to identify finer details than the typical surveillance user. That's probably why they can get away with lower bit rates (at modestly higher compression levels) than you can. Yes/no?
Maybe, except the other properties we visited were also casinos ;>)
I did a full sequence of bitrate tests on both encoders and IP cameras during Phase 1 and Phase 2 evaluations over the last year.
For analog cameras, I varied encoder bitrates from 1.5Mbps to 5Mbps. I noted substantial increases in picture quality in the first steps, followed by lesser and lesser improvements at the upper range. The image quality increase from 1.5Mbps to 2Mbps was immense and the improvement by going from there to 2.5Mbps was very apparent, even to untrained eyes. The improvement was much more subtle when going from 2.5Mbps to 3.0Mbps - it consisted mostly of a reduction in I-frame "pulsing" and edge noise. Above 3.0Mbps, improvements were not apparent.
The same was true of 720p IP cameras. Although many would consider 3Mbps "suitable", I noted a substantial reduction in compression noise going from 3.0Mbps to 3.5Mbps; less noticeable but still apparent noise reduction going from 3.5Mbps to 4.0Mbps and much more subtle improvements thereafter.
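For context on why integrators push back on these half-megabit steps, the storage cost per camera is easy to sketch. This assumes continuous recording at a constant bitrate and ignores container and filesystem overhead:

```python
# Rough sketch: storage consumed per camera per day at a constant bitrate.
def gb_per_day(bitrate_mbps):
    # Mbps -> MB/s (divide by 8) -> MB/day (x 86400 s) -> GB/day (divide by 1000)
    return bitrate_mbps / 8 * 86400 / 1000

print(round(gb_per_day(3.0), 1))  # 32.4 GB/day per camera
print(round(gb_per_day(5.0), 1))  # 54.0 GB/day per camera
```

Each 0.5 Mbps step adds about 5.4 GB per camera per day, which across hundreds of cameras and 30-day retention is where the real argument happens.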
The thing is, we could live with somewhat lower bitrates for recording, since recordings are viewed far less often than "live" monitoring in our application, but, as I mentioned in another discussion, maintaining 30fps while dual streaming can be hit or miss. I absolutely refuse to provide what I consider to be substandard image quality to our users. They are used to analog via matrix, and although IP cameras will provide better resolution, low noise is not typically their strong suit.
One of the weirdest issues I encountered was GOP size, aka GOV length. Manufacturers' defaults varied from one I-frame per second to one I-frame every four seconds (assuming 30fps). In some ways, I can understand the argument: the less frequent the I-frames, the less apparent the I-frame "pulsing" is. Still, increasing GOP size/length had readily apparent downsides, including a substantial increase in motion artifacts, to the extent of causing extreme problems during PTZ motion, and an increase in the actual magnitude of pulse noise on each I-frame.
I specifically called for a GOP size of 15 (500ms) for all sources and an increase in bitrate from 2.5Mbps to 3.0Mbps on encoders and from 4.0Mbps to 5.0Mbps on IP cameras. This cut the pulsing substantially, which allowed the lower GOP size (circular logic be damned).
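The relationship between GOP size (frames between I-frames) and I-frame interval is simple arithmetic, shown here for the values mentioned above:

```python
# Sketch: converting GOP size in frames to the time between I-frames.
def gop_duration_ms(gop_size, fps=30):
    """Milliseconds between consecutive I-frames for a given GOP size and frame rate."""
    return gop_size / fps * 1000

print(gop_duration_ms(15))   # 500.0 ms: an I-frame every half second, as specified above
print(gop_duration_ms(120))  # 4000.0 ms: the four-second default some vendors ship
```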
Carl, have you tried to analyze what the quantization level of each stream was? For any given scene, if all one changes is the bitrate, the camera will adjust the quantization level 'behind the scenes'. If you knew what the quantization level of each stream / camera was, it could help normalize the comparisons (i.e., this camera/stream was at q=27 but the other one was q=32, etc.).
No. I didn't have the tools or the time. The systems weren't here long enough.
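For anyone who does have the time, one rough proxy for comparing streams without per-block QP data is the size ratio between I-frames and P-frames, since larger I-frames relative to P-frames correlate with more visible "pulsing". Frame types and sizes can be dumped with a tool such as `ffprobe -show_frames`; the frame data below is made up for illustration:

```python
# Hypothetical sketch: estimate I-frame "pulse" magnitude from frame sizes.
# (frame_type, size_in_bytes) tuples, e.g. as parsed from ffprobe output.
frames = [
    ("I", 98000), ("P", 9000), ("P", 8500), ("P", 9200),
    ("I", 96000), ("P", 8800), ("P", 9100), ("P", 8700),
]

i_sizes = [size for ftype, size in frames if ftype == "I"]
p_sizes = [size for ftype, size in frames if ftype == "P"]

# Ratio of average I-frame size to average P-frame size.
ratio = (sum(i_sizes) / len(i_sizes)) / (sum(p_sizes) / len(p_sizes))
print(round(ratio, 1))  # 10.9 -- the larger this ratio, the more visible the pulse
```

This is not a substitute for knowing the actual quantization levels, but it is a cheap way to normalize comparisons between cameras at the same bitrate.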
Being a sales engineer, I often annoy our salespeople by pushing 1MP cameras with better sensors and low-light performance (that cost less) over a 5MP camera to a customer. It's my opinion that, with the right customer, if you explain "why" you're doing this, they will appreciate that you're the expert and not just pushing boxes, and they'll actually trust you more in the future when you have a situation that would benefit from 5MP.
I think in many cases it's manufacturers using marketing tactics to excite sales "opportunities" with salespeople who have just enough technical ability to know that 1MP < 5MP (...therefore 5MP is better), and salespeople being programmed to just push the next thing: bigger number, higher-margin box.
Yeah but then I need 5 of those 1MP cameras to equal a 5MP one!!!
OMG, thanks to IPVM, I learned a lot in just a few days. I have a handheld HD Sony camcorder at the same resolution as most CCTV camera manufacturers offer. I am not worried about storage or data transfer; I want a camera that records the same quality as my handheld camcorder. CCTV cameras are not cheaper than camcorders, but they provide much lower quality. The manufacturer says 30fps @ 1920*1080, or whatever, yet when I look at the recorded footage, it's skipping! One IPVM article measured a 60fps camera that recorded at most 30fps, and a 30fps camera that recorded at most 15fps. What a joke!
Should we start using camcorders instead of CCTV cameras??? What makes CCTV cameras so special compared to camcorders? (I am well aware of the differences.) I just cannot accept those misleading ads. That has to stop, I believe.
Undisclosed, it's a good question that is raised every so often. I started a new discussion to address this: Why are IP Cameras Much More Expensive Than Camcorders?
When I click the above link (Why are IP...), it goes to the GoPro website...is that on purpose?