In written technical standards, we consider the business needs. In this case, 2MP is my standard for indoor cameras where employees and visitors have already been identified by access control or visitor management. There are exceptions for manufacturing processes that require higher resolutions for monitoring details.
At critical entrances, docks, etc., exceptions are made for higher resolutions, depending on the scene, where pedestrian identification has not been confirmed.
This, along with thoughtful placement of cameras in parking areas and license plate capture at entrances, provides excellent layers of protection with video security.
2MP indoors where people are close to cameras. 5MP outdoors where object distance is a variable. 8/12MP in areas where a multi-sensor can't be installed within the center of a scene and the camera is far from objects.
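As a rough sketch of why object distance drives those resolution tiers: pixels per foot (PPF) falls linearly with distance for a fixed lens, so a farther scene needs more pixels to hold the same detail. A minimal calculation (the 90° lens and the distances are illustrative assumptions, not figures from this thread):

```python
import math

def ppf(h_pixels: int, distance_ft: float, hfov_deg: float) -> float:
    """Pixels per foot across the horizontal field of view at a given distance."""
    scene_width_ft = 2 * distance_ft * math.tan(math.radians(hfov_deg) / 2)
    return h_pixels / scene_width_ft

# Same 90-degree lens, subject at increasing distances (illustrative numbers):
print(round(ppf(1920, 8, 90)))   # 2MP indoors, subject ~8 ft away   -> 120 PPF
print(round(ppf(2592, 25, 90)))  # 5MP outdoors, subject ~25 ft away -> 52 PPF
print(round(ppf(3840, 50, 90)))  # 4K, subject ~50 ft away           -> 38 PPF
```

The point of the sketch: doubling the distance halves the PPF, which is why a 2MP camera that is ample at a doorway needs a 4K replacement (or a multi-sensor) when it is pushed deep into a large scene.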
I wonder if people claiming they use 4K+ do so because they consider multi-sensors to be 8MP when they are actually 2MP each, or 20MP when they are actually 5MP each, etc.
Interesting article - thanks for posting. I typically used 1.3MP mini dome cameras for interior single-door entrances; however, Avigilon seems to be moving toward 2MP as the base resolution on their H5 camera lines. Based on PPF in the small field of view, I tend to think this is overkill. If technology continues improving and price keeps dropping, are we going to be installing 4K cameras in hallways just because we can?
If technology continues improving and price keeps dropping, are we going to be installing 4K cameras in hallways just because we can?
Yes, generally speaking. It might not literally be 4K for some years but 4MP is getting close.
What will happen / is happening is that the marginal price of 4MP cameras will become so close to 1080p's, low-light performance will catch up, smart codecs / H.265 will make bandwidth relatively negligible, and 4MP will become 'standardized'.
A comparable example to share: maybe one's pixel density calculations say that a 720p camera is sufficient for an area. The reality is, very, very few would actually specify a 720p camera now, because 1080p pricing is generally negligibly higher and 1080p (in today's models) is generally better or roughly equal across the board.
I wonder what the true cost to a manufacturer is between resolutions. I imagine at this point they are just pricing to a perception. I can't imagine there is actually much difference between anything under 4MP anymore.
The problem is that resolution is probably the least effective way to rate the quality of a camera. PPF is not all created equal, and it doesn't take environmental conditions into account. A 4MP camera with a low-quality lens will look worse than a 1080p camera with a high-quality lens. There are various grades of CMOS sensors as well.
resolution is probably the least effective way to rate the quality of a camera
That's not our experience in testing. Resolution (i.e., pixel count) is highly correlated with daytime image quality. To the extent there are differences, lensing is not the most common issue. Imagers, SoCs, and IR optimization do make differences for WDR and low light, though even in WDR, differences have become much narrower than in years past.
More generally, I really think it's bad to say "resolution is probably the least effective". Resolution is one important element but people should still consider others (WDR, low-light, etc.)
You just did a 4K shootout between 3 cameras, and one of the manufacturers had noticeably worse image quality in a static daylight scene, yet all three would be calculated as the same PPF. I've done my own shootout between a competitor's 4MP camera and my 1080p camera.
one of the manufacturers had noticeably worse image quality in a static daylight scene
Do you have a reading comprehension problem?
resolution is probably the least effective way to rate the quality of a camera
And my response was:
Resolution (i.e., pixel count) is highly correlated with daytime image quality. To the extent there are differences, lensing is not the most common issue
On occasion, lensing makes a difference in daytime image quality, but overall it is not a key factor (i.e., why I said it was highly correlated and not always the case). I don't want people reading IPVM and coming away with your false claim that "resolution is probably the least effective way to rate the quality of a camera".
Resolution (meaning pixel counts) has limitations, people need to understand those limitations but anyone who comes away with "resolution is probably the least effective way to rate the quality of a camera" is doing themselves a disservice.
For 2022, we expect a continued gradual movement to higher resolutions. 1080p will drop to 30% or lower usage, while 4MP will become what 1080p was in 2018: the predominant resolution used. At the same time, usage of 6MP and 8MP / 4K will grow. However, resolution usage over 8MP will likely continue to be small. While 8K / 32MP options will grow, they will most certainly remain niche and relatively expensive in the next 2 years.
I'm surprised that 16:9 is so popular. I find the blind spot left under the camera by such a narrow vertical FOV difficult to work with in most applications and almost always go for a 3 or 5 MP 4:3 or similar to avoid this.
We didn't break this out but we are pretty sure Dahua and Hikvision drive the average resolution up because they sell higher resolution at lower prices and lower deltas than Axis or Avigilon. I think that's a good point to note.
I would generally tend to agree with this, but my feelings are beginning to sway. Hik and Dahua aren't the exceptions anymore; they are the norm, along with the many others like them. Low-cost, high-resolution cameras are available from many overseas OEMs. Companies like Raysharp even make Hik's prices seem high when speaking strictly about resolution. Then, when you add in the consumer lines (Nest, etc.), you quickly find that Axis and Avigilon are very much the outlying data points. Even Raspberry Pi-driven high-resolution cameras can be had for next to nothing.
Even among dealers, there are installers putting in a lot of consumer grade stuff.
But consumer lines are still mostly 1080p, right? Even the Nest '4K' option is just a 4K sensor, not actual 4K resolution.
It really depends on where you come at the market. If you are mostly using China products, for sure things like Axis and Avigilon look like outliers, but they are hardly 'outliers' when it comes to commercial usage.
For sure they are mostly 1080p still, but trending toward higher. (Though I believe the wireless delivery is probably halting that for the time being. Having to serve the general masses means building for poor wireless and low end Internet connections.)
In my area, commercial installers aren't using "dealer" lines. Everything is Chinese with most installers using junk like Qsee and other Amazon-grade products being most common. This is even true of national brand companies and schools in many cases. It's actually difficult to find a quality brand represented here and the manufacturers don't seem to have an interest in serving this market either. Kind of a shame, really.
Every project with more than a couple cameras starts with 1080p for me, with higher resolution added on a per-camera basis. 1080p provides a sweet spot between high resolution, effective performance, cost-value balance, and usability. Higher-resolution cameras tend to be difficult or impossible to deploy correctly so that you actually realize their advantages. This is especially true in larger systems.
Most systems I come across have settings tuned down below the advertised resolution anyway due to bandwidth or storage limitations.
Many 5MP or 8MP cameras have firmware that limits their framerates as well, because compression is difficult to achieve at those resolutions given current network and Internet bandwidth standards.
As a backend video processing and storage company, we find there to be HUGE disconnects between what the CAMERA resolution is and what the client actually RECORDS at. We are seeing a very large increase, very quickly, in 4-5MP cameras.
However, when push comes to money, the vast majority of those 4MP cameras record at 1080p and 15 FPS. We of course would love them all to record at 4MP and 30 FPS. We do, after all, sell video appliances.
So an even MORE interesting data set to look at for these numbers would be the resolution of the camera PURCHASED against the resolution the camera is RECORDED at.
That just doesn't make sense to me. The integrator could either save the customer money (or boost their own margin) by spec'ing and installing 1080p cameras if that's the resolution and framerate they're going to be working with. Why would anyone spec a higher-res camera and intentionally cripple it like that?
Yeap. Almost every project starts with "all cameras record at native resolution" and "40-60% of cameras are 4MP and above." THEN you show the processing and storage costs of that, and very quickly it gets cut to recording all cameras at 1080p 15 FPS except for these 6 crucial 4MP cameras. The software and camera guys all toss out the "that feature is free!" line because they can push the cost to hardware (network, CPU, storage, etc.).
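To make that storage delta concrete, here is a back-of-envelope sketch. The bitrates (roughly 8 Mbps for 4MP @ 30 FPS, roughly 2 Mbps for 1080p @ 15 FPS with a smart codec) and the 50-camera project are assumptions for illustration, not measured figures:

```python
def storage_tb_per_month(bitrate_mbps: float, camera_count: int, days: int = 30) -> float:
    """Continuous-recording storage in decimal TB for a fleet of cameras."""
    seconds = days * 24 * 3600
    megabits = bitrate_mbps * seconds * camera_count
    return megabits / 8 / 1e6  # megabits -> megabytes -> terabytes (decimal)

# Hypothetical 50-camera project:
print(storage_tb_per_month(8.0, 50))  # all 4MP @ 30 FPS   -> 129.6 TB/month
print(storage_tb_per_month(2.0, 50))  # all 1080p @ 15 FPS -> 32.4 TB/month
```

Under these assumed bitrates, dropping from 4MP/30 FPS to 1080p/15 FPS cuts monthly storage roughly fourfold, which is exactly the conversation described above once the hardware quote lands.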
If I may I'd like to throw in my $0.02 since I wasn't special enough to get the survey. :(
As a newcomer just getting things going in this industry, I spec primarily 4MP. My clients are mainly landlords with highly problematic properties with a lot of vandalism, drug abuse, vagrancy, etc. so they have a desire for high detail in order to identify problem individuals as accurately as possible.
The challenge is that due to the nature of these buildings their budget is often constrained, but we are still able to deliver them affordable solutions at 4MP with probably 95% of the equipment at that resolution. The other 5% is 2MP for very specific applications like inside the NVR room recording to an SD card in case the NVR is stolen, or long-lens 6mm+ applications where a specific spot is to be monitored and 2MP provides high enough PPF without wasting budget on a 4MP.
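For the long-lens case, simple PPF math shows why 2MP can suffice on a narrow field of view. A sketch, assuming a 6mm lens yields roughly a 43° horizontal FOV (that figure varies by sensor format and is an assumption here, as is the 30 ft distance):

```python
import math

def ppf(h_pixels: int, distance_ft: float, hfov_deg: float) -> float:
    """Pixels per foot across the horizontal field of view at a given distance."""
    scene_width_ft = 2 * distance_ft * math.tan(math.radians(hfov_deg) / 2)
    return h_pixels / scene_width_ft

# 2MP (1920 px wide) with an assumed ~43-degree HFOV, monitored spot 30 ft out:
print(round(ppf(1920, 30, 43)))  # -> 81 PPF on the target area
```

In other words, the narrow lens concentrates all 1920 horizontal pixels on a small scene, so the 2MP camera already delivers high pixel density on the monitored spot and a 4MP upgrade buys little there.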