As I mentioned in Hikvision ColorVu is Smart Marketing, these test results show that integrated white light illumination, while technologically not complex, provides value in very dark scenarios even against top-performing expensive low-light cameras.
I do not know about Exacq, but our software, for example, sets the quality/bitrate on the camera before it pulls the stream. I assume Exacq does the same, and the bitrate might differ by codec, manufacturer, etc. (it's up to Exacq's internal logic). So unless you verify that Exacq does not change settings differently for each camera (which you probably do), your test is about how well cameras work with Exacq specifically, not in general.
I'm very impressed with the work you have done and only say this for the sake of constructive improvement.
It's a fair concern and it's one of the reasons we use Exacq. It generally does not change camera settings when adding cameras. There are some exceptions, which we've seen throughout years of testing with it. The only time it changes bitrate or compression settings is if you manually change them in the client.
All of that aside, we also double check settings and re-check codec and other settings after adding to the VMS to confirm.
Related question: If I turn off the Nx Witness "automatically optimize settings" option, does it still make changes to the stream? In our tests it didn't seem so but curious to get a definitive answer.
I am glad members like this! Keep in mind we can only do so many of these reports, even with a 12,000 sq. ft. test lab and a team of half a dozen people testing, since it takes months to put this all together.
In other words, next time someone complains that IPVM writes too much about China or business, keep in mind those China or business posts take a few hours (or days) to write, while this one takes a few months.
Jamie, I am sure 'many' people subscribe just for the tests, but when we look at what people read, far more read the business reporting, so it seems that 'many' care more about the business reporting.
To be clear, I am not favoring one or the other; we are firmly committed to both. I am trying to help members understand the dynamics involved: the massive cost of doing such tests and the much lower read levels for them. All I ask is that when we do business reports, you try to keep this in mind if you are unhappy with or uninterested in those reports.
I am not saying you should not release the manufacturer of each tested camera. I was just suggesting perhaps delaying that release.
The spectrum of selected cameras is arranged to be tested in the same use case scenarios, but the sensors used in each camera are of different classes. Newer, better generations normally outperform earlier ones.
Any manufacturer who has an account here will come in and try to defend their own product if it is in the tested list. I believe you can see this from the ongoing discussion here.
If the shortlisted cameras are to be pitted against each other in the shootout, they should be compared within the same sensor class or budgetary type.
they should be compared in the same sensor class or budgetary type.
We've talked about sensor class before and the practical problem is that almost all manufacturers hide this information. Even if we could find it out, separating shootouts into sensor class would not make sense for most users as most want the best image quality, not a specific sensor class.
The newer and better gen normally outperform the earlier gen.
Sure, but it is what it is. What would be the point of saying that, e.g., Sony is the best performer among those using 2015 sensors? You may care, but I don't think most buyers are interested in that as much as in sheer performance.
As for budgetary type, we list the prices prominently in the report even in the ranking itself. Price is clearly a factor and users can contrast the price vs the performance.
The subject walked from ~50' to 0, directly past the cameras. I'd say within a few feet of the cameras. We have seen super low light overexposure at times in past tests and called it out, but we haven't looked at it as thoroughly as we do here.
What, if any, factor does resolution play in these outcomes? For instance, since light sensitivity per pixel doesn't seem to apply as it used to in the past, wouldn't one expect the 4K ULL model to outperform the 2MP ULL equivalent?
For instance, the Hikvision 2MP DS-2CD2725FHWD-IZS vs the Hikvision DS-2CD2347G1-L comparison. Since all cameras are rated as Darkfighter, wouldn't you naturally expect the 4MP version to be better at static images? (I'm not trying to account for motion here, for argument's sake.)
My point is, wouldn't it be more fair to compare all of the cameras at a standardized resolution at some point? That would negate the pixel superiority, just as you standardize on shutter. Just a suggestion.
Sensitivity definitely still matters, just less than it did in the past. There is still a sensitivity difference if you look at the same manufacturer's specs, most often. Or, in the case of the Hikvision Smart Series Darkfighters (5526 and 5546), they are both spec'd at the same minimum illumination (0.002 lux), but there is a noticeable, but not major, difference.
From a practical standpoint, yes, we could test at a standard resolution, but it's not very practically useful, since people are generally buying the resolution they need. An academic test of standardized resolution which doesn't show potential differences due to higher/lower pixel counts would not translate well into real world performance. Do you agree?
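For illustration, the per-pixel tradeoff being discussed can be sketched with rough numbers. This is a simplified sketch, hypothetically assuming the 2MP, 4MP, and 8MP models all share the same 1/2.8"-type sensor dimensions (~5.6mm x 3.2mm active area), which real product lines often do not, and ignoring microlenses, gaps between photosites, and processing:

```python
# Rough per-pixel light-gathering comparison for cameras sharing the same
# sensor format. The 1/2.8"-type dimensions below are approximate; actual
# dies vary by manufacturer.
SENSOR_W_MM, SENSOR_H_MM = 5.6, 3.2

RESOLUTIONS = {
    "2MP (1920x1080)": (1920, 1080),
    "4MP (2560x1440)": (2560, 1440),
    "8MP (3840x2160)": (3840, 2160),
}

def pixel_area_um2(h_px, v_px, w_mm=SENSOR_W_MM, h_mm=SENSOR_H_MM):
    """Area of a single photosite in square microns, ignoring gaps/microlenses."""
    return (w_mm * 1000 / h_px) * (h_mm * 1000 / v_px)

base = pixel_area_um2(*RESOLUTIONS["2MP (1920x1080)"])
for name, (h, v) in RESOLUTIONS.items():
    area = pixel_area_um2(h, v)
    print(f"{name}: {area:.2f} um^2 per pixel, {area / base:.2f}x the 2MP pixel")
```

On the same die, an 8MP pixel collects roughly a quarter of the light of a 2MP pixel, which is the per-pixel sensitivity gap the original question is probing, though as noted above, newer imagers and processing narrow the gap in practice.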
I guess the real issue is that you guys can't realistically test every single model of every brand for an exhaustive test. For example, my curiosity asks which of the Hikvision DS-2CD21XXFWD-I models has the best low light image quality. What is the actual difference between the 2MP, 4MP, and 8MP versions? How much quality, if any, do I lose in dark scenes with the 8MP vs the 2MP? I know what I gain in well lit scenes. And I'm not expecting IPVM to test to this level, because as soon as you do this for one brand, you set yourself up to do it for all brands. Instead you ask for input on certain models and democratically choose a bunch of units from varied brands. However, you will never satisfy everyone because you didn't pick "their model". It's the classic damned if you do, damned if you don't.
I am glad you did these tests, but it just makes me want more. 🤣
That's a good explanation! We actually have talked about doing bigger shootouts within a manufacturer's line. So in the Hikvision example, shooting out Value Express vs. Value vs. Performance vs. Smart, various resolutions, multiple form factors. I think it would be too large to add into this test alone but we are asked fairly frequently how camera A compares to camera B from a certain manufacturer. It's gotten especially muddy as the Axis P series has improved to be closer to Q and Hik Performance Series is more competitive with Smart. We'll keep it in mind for future tests.
I would love to see pixel size and lens F-number to see if (for the same type of sensor) that would take some of the mystery out of how cameras from the same manufacturer can appear at both the top and bottom of the list. Maybe there is a straightforward calculation off the datasheets that could help camera selection.
We can certainly gather imager size and lens F-stop. I don't think there is a straightforward calculation, though. I do think that, in general, a bigger imager and lower F-stop will show an obvious difference compared to a smaller imager and higher F-stop from the same manufacturer.
The Hanwha ExtraLux, for example, is a 1/2" sensor with a very low 0.94 F-stop, while the standard Wisenet X 6080 cameras are 1/2.8" with a more typical F1.4 lens. ExtraLux has a clear advantage in our tests in low light.
It's not this simple across manufacturers, though. A larger imager and lower F-stop definitely helps, but it's not a guarantee that one manufacturer will be better than another.
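As a back-of-envelope illustration of the Hanwha example above, here is the kind of rough comparison a datasheet allows: relative light gathering proportional to sensor area divided by the F-number squared. The active-area dimensions for optical "type" designations are nominal approximations, and this heuristic deliberately ignores pixel count, microlens efficiency, and processing, which is exactly where cross-manufacturer comparisons break down:

```python
# Back-of-envelope relative light gathering from datasheet specs:
# (sensor active area) / (f-number squared). Dimensions are approximate
# nominal values for each optical "type"; treat the ratio as a rough
# indicator only, not a performance prediction.
SENSOR_TYPES_MM = {          # approximate active-area width x height in mm
    '1/2"':   (6.40, 4.80),
    '1/2.8"': (5.37, 4.04),
}

def relative_light_gathering(sensor_type, f_number):
    w, h = SENSOR_TYPES_MM[sensor_type]
    return (w * h) / (f_number ** 2)

# ExtraLux-style spec (1/2", F0.94) vs. a typical 1/2.8" F1.4 camera
extralux = relative_light_gathering('1/2"', 0.94)
typical = relative_light_gathering('1/2.8"', 1.4)
print(f"ExtraLux-style optics gather roughly {extralux / typical:.1f}x the light")
```

The roughly 3x figure lines up with the "clear advantage" within one manufacturer's line, but across manufacturers the ignored factors can easily dominate.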
Great report guys - very informative and you've covered many important aspects of low light performance testing.
As the mobile sector of the market now represents a significant portion, it would be great to see a test of all the latest cameras that are rail rated (particularly to the EN50155 standards). Most of the leading manufacturers have these in both dome and wedge form factors. I know what I'm suggesting below means a combination of dynamic testing on an actual vehicle, or a complex simulation in your lab, but it could include test parameters such as:
- scene hot spot management inside a crowded public transit vehicle
- rapid light changes from sunlight to darkness, inside the same vehicle as it enters/exits tunnels
- how these cameras perform as dash-cams on a train or a bus (rendering traffic signal colours accurately is always a problem in poor ambient lighting)
- motion blur on track/roadside objects at various vehicle and shutter speeds
- managing rapidly flashing indicator lights and oncoming headlights against dark backgrounds
From experience, I know it's not an easy task to undertake and there may be other priorities for IPVM right now. Having said that, a lot of the dash-cam road tests can be done from a van or car (of course, obtaining permission to carry out tests on moving trains is much more difficult). However, the results from road tests will give a good indicator of how the camera will also perform on a track.
I think if anyone can do this type of test thoroughly without bias, it's you guys.
We did not, but it's on our list of things to do. All the lighting in this test is LED, actually. We have had plans to purchase a few different lighting sources for lab testing, but haven't pulled the trigger yet. If you have ideas on that, shoot me an email at firstname.lastname@example.org
Ethan, thanks for your reply. I have worked in video testing labs for years and have many great ideas for you. As an ASIS member for almost 25 years, I am dedicated to the industry and do not mind helping anyone improve for the better of this industry. I apologize for not reading all the comments, as I am sure I am duplicating many points that were already brought up, but I will list a few tidbits below:
1- Are cameras being tested out of box at Factory default?
2- Do you have spare cameras in case there is a weird result that needs to be verified?
3- Are light tests being conducted using cameras at widest lens setting?
4- Are camera views framed for the exact same scene from camera to camera?
5- Are all Cameras being tested in same position or as close to being side by side as possible?
6- Are all Cameras tested at same Lux levels? Do you use a light control board?
7- LEDs typically do not emit IR which will result in greater differences when cameras change from Day to Night mode.
8- Different LED color temperatures can produce different results.
9- Testing with sodium and mercury vapor lighting may show you how well the cameras are color balanced.
10- Incandescent (old but still available), fluorescent at low and high color temperatures, and good old sunlight should also be used. I think you will be surprised at some of the results you will find.
11- For Sunlight you can use electric window shades and conduct WDR, BLC testing as well.
12- Do you test for aliasing, barrel distortion, and complete 100% motion across all pixels?
Just a couple of tidbits to help improve your testing.
Thanks for the comment, some answers: We don't test cameras using purely default settings. Some settings are changed across the board:
Frame rate is set to 15 FPS
Compression is standardized to ~28-30 quantization
Exposure is set to a maximum of 1/30s or 33ms, depending on how the camera is configured
Slow shutter is disabled, if enabled (which is uncommon in 2019)
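As a sketch, the standardization described above could be captured in a simple checklist that flags deviations after a camera is added. The config dict format here is hypothetical for illustration, not a real camera or VMS API:

```python
# Hypothetical sketch of the standardized test settings described above,
# plus a check that a camera's reported configuration matches them.
# The config dict format is an assumption, not a real VMS/camera API.
STANDARD_SETTINGS = {
    "frame_rate_fps": 15,
    "quantization": range(28, 31),     # ~28-30 quantization
    "max_exposure_s": 1 / 30,          # 1/30s, i.e. ~33 ms
    "slow_shutter": False,             # disabled if present
}

def check_settings(config):
    """Return a list of deviations from the standardized test settings."""
    problems = []
    if config.get("frame_rate_fps") != STANDARD_SETTINGS["frame_rate_fps"]:
        problems.append("frame rate not 15 FPS")
    if config.get("quantization") not in STANDARD_SETTINGS["quantization"]:
        problems.append("quantization outside ~28-30")
    if config.get("max_exposure_s", 1.0) > STANDARD_SETTINGS["max_exposure_s"] + 1e-9:
        problems.append("max exposure slower than 1/30s")
    if config.get("slow_shutter", False):
        problems.append("slow shutter enabled")
    return problems
```

This mirrors the "double check settings after adding to the VMS" step mentioned earlier: each camera's configuration is re-verified against one shared baseline rather than trusted after setup.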
On these shootouts, manufacturers are shown preliminary images and given the opportunity to suggest settings changes. However, in this case, none of these adjustments really improved practical detail. Most suggestions amounted to decreasing noise reduction, which decreased some blur, but the added noise reduced details by practically the same amount.
As far as different light sources, we have tested mercury vapor, LED, sodium vapor, and others. See Camera Color Fidelity Shootout. We did not do extensive testing at multiple light levels in that test. However, while I think that there will be some color fidelity differences in brighter low light scenes under other lights, maybe 2 lux and up, in darker scenes below 1 lux, low light looks very similar. We have tested with different LED temperatures and seen this. Effectively, dark is dark, and at that point colors are poor for the vast majority of cameras.
Great report guys. Specifically in the section noted as Newer Vs. Older Camera Models: since you have newer models from most of the manufacturers, why are you comparing a newly released camera to a model that is 3 years old, instead of to similar models? It seems odd since electronics always improve with time, chipsets, and technology. I'd expect a new Axis model to outperform a 3 year old Panasonic or 2 year old Hanwha. Were all cameras viewed through a VMS (which one?) or directly through the browser? Did that make a difference?
you have newer models from most of the manufacturers why are you comparing a newly released camera to a model that is 3 years old
If there is a specific model that you think we should have tested, please let us know. For some manufacturers, an "older" model is the highest spec'd product available from that company. Which models that we did not test from those manufacturers should we have?
Those 3 companies are bigger than all the others and have bigger R&D budgets and release cameras more frequently.
Even if Axis, Dahua, and Hikvision had no expertise, the sheer fact that they release more frequently means they have the opportunity to use newer, better imagers and SoCs. Recall that someone objected to us including 2- or 3-year-old cameras in the test, but that is those manufacturers' fault for not being committed enough to release more frequently.
Does anyone know how the Sony SNC-VB770 "ultra-high sensitivity camera" compares in very low light? I have a couple of them, and they are better than my other cameras, but very expensive. Can I now get similar performance to the VB770 with a better price?
Based on our past tests, I do not think any of the cameras in this shootout are going to produce images as bright as the VB770 in very low light. For example, when we tested the VB770, we measured 0.01 lux at the point the subject is standing in this shot:
Now that lux level was measured facing the camera, there is obviously additional ambient light in the scene from other lights, but the lights you see on the right side of the FOV are actually from soccer fields ~1/2 mile away. There were no other lights on closer than that.
By contrast, here is one of the overall brightest images from our test, the Dahua N45EM63 at 0.01 lux measured inside.
There is very little ambient light, aside from a couple of cracks through doors and next to ceiling joists, so it's closer to a "true" 0.01 lux, but the image is obviously much darker, much noisier, and detection at longer range is difficult.
Even in the higher priced models which would theoretically compete more with the VB770, like the Q1798-LE, performance in very dark scenes is not really on the same level (full field of view below).
If your light level was a bit higher, something closer to 1 lux, you are much more likely to be able to get away with something lower cost, but if it is approaching 0, likely not.
I'm quite disappointed in the Panasonic WV-S2531L's performance. Some in my company move a lot of this model, on the order of millions of dollars per year. Panasonic has always touted that they have the best optics while acknowledging weakness in other categories. At this point, the poor results on a relatively new camera aren't backing that statement up.
I expected the Avigilon H5A to perform at a mediocre level. It seems like a slight processor bump over the H4A.
Additionally, we have been working on getting Uniview Super Starlight models for more than a month, they are supposed to have shipped and we will add them to this report when we receive them.
Did you ever receive and test the Uniview Starlight?
Especially the IPC3635ER3-DUPZ has my attention ;-)
Just curious to know what IPVM does with all the cameras after the test. Do you keep them around for further tests or do you re-sell them? Looks like not only a time-intensive test but an expensive one, too. Well done.