Samuel, we are trying to get more information on this. The only details we see so far are:
"The standard will be used to grade digital cameras on a scale of 1 to 5 for a number of characteristics including image sharpness, field-of-view confirmation, signal-to-noise ratio, TV distortion, relative illumination, color fidelity, dynamic range, maximum frame rate, gray level, sensitivity, bad pixels, veiling glare and housing tamper protection."
I am really curious to see how they are going to do such 1-to-5 ratings, as it's not very granular. Also, it has the potential for a ton of spec gamesmanship, e.g., only cameras with a field-of-view confirmation of 5 will be allowed.
I found and bought a copy of the draft (40 pages) for $105. Here are some key highlights:
A lower level number is better: 900+ points earns Level 1, 750 - 900 is Level 2, etc. There's no mention of releasing or sharing images from the test.
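For reference, that points-to-level mapping is just a threshold lookup. Only the Level 1 and Level 2 boundaries appear in the draft; the 150-point steps below 750 in this sketch are my own extrapolation, not from the standard:

```python
def ul2802_level(points: float) -> int:
    """Map a UL 2802 point total to a level (lower level = better).

    Only the 900+ (Level 1) and 750-900 (Level 2) boundaries come from
    the draft; the levels below 750 are assumed for illustration.
    """
    if points >= 900:
        return 1
    if points >= 750:
        return 2
    if points >= 600:  # hypothetical boundary
        return 3
    if points >= 450:  # hypothetical boundary
        return 4
    return 5           # hypothetical floor

print(ul2802_level(920))  # 1
print(ul2802_level(800))  # 2
```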
"Five cameras of the same model and design shall be tested", "Lowest of the sharpness score of the five tested cameras will be the model’s Image Sharpness Score"
Image sharpness will be measured in lw/ph (see this tutorial on image sharpness). For the center, 2500 lw/ph will be a perfect score and 350 the lowest; presumably all cameras, regardless of resolution, will be rated on the same scale.
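Assuming the score is a simple linear interpolation between the 350 lw/ph floor and the 2500 lw/ph ceiling (the draft doesn't say the curve is actually linear), center sharpness scoring might look like:

```python
def sharpness_score(lw_ph: float, floor: float = 350.0,
                    ceiling: float = 2500.0) -> float:
    """Map measured center sharpness (line widths per picture height)
    to a 0-100 score. Linear scaling is an assumption, not from the draft."""
    clamped = max(floor, min(lw_ph, ceiling))  # clip readings outside the range
    return 100.0 * (clamped - floor) / (ceiling - floor)

print(round(sharpness_score(2500)))  # 100 (perfect)
print(round(sharpness_score(1425)))  # 50 (midpoint of the range)
```

Rating every camera on the same absolute lw/ph scale, as the draft implies, would mean low-resolution models simply cannot reach the top scores.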
Dynamic range: "This specification characterizes the video camera dynamic range by measuring the video camera grey level under different lighting conditions. For example, if a video camera can work properly under a maximum illumination of 200 kLux and under a minimum illumination of 1 lux, the dynamic range of the video camera is 200 thousand times." This is weird because it doesn't really test WDR in the sense of handling a scene with both really bright and dark portions simultaneously. It seems that their relative illumination test might address that with "one diffused and uniformly illuminated light source in the center and one diffused, uniformly illuminated light sources in each of its four corners."
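The arithmetic in that dynamic range definition is just the ratio of the usable illumination extremes. Expressed also in dB and photographic stops (my conversions, not UL's), the quoted example works out to:

```python
import math

def dynamic_range(max_lux: float, min_lux: float):
    """Dynamic range per the UL 2802 draft definition: the ratio of the
    maximum to the minimum illumination the camera works under properly."""
    ratio = max_lux / min_lux
    db = 20 * math.log10(ratio)  # conventional dB conversion (my assumption)
    stops = math.log2(ratio)     # photographic stops (my assumption)
    return ratio, db, stops

ratio, db, stops = dynamic_range(200_000, 1)
print(f"{ratio:,.0f}x  ~{db:.0f} dB  ~{stops:.1f} stops")
```

Note this measures the range across separate scenes, which is exactly why it says nothing about WDR within a single scene.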
I didn't see anything about low light testing so either I missed something or that's a strange omission.
Btw, and this relates to IP vs non-IP: if you do a test for image sharpness, what do you do about compression level? If it's an IP camera, the stream is compressed (at least somewhat). Do you try to normalize all the compression levels or leave the manufacturer's defaults? Differences in compression level can impact visible sharpness.
Here's an article citing a UL sales person. This chart helps explain what they are doing. Note they have moved from a 0 to 5 scale to a 0 to 100, which is obviously much more granular.
This could be useful, though it depends on how many manufacturers pay to get their cameras scored and how much the scores vary (i.e., does everyone get a 90-something or will there be significant, meaningful variances among them?).
Dahua issued a press release saying one of its cameras passed UL2802, noting that "(Relative Illumination / Dynamic Range / Bad Pixel) were perfect scores, with others (Gray Level / Sensitivity / Veiling Glare) scoring strongly as well"
This actually makes sense for Dahua: since many will expect their quality to be lower anyway (perception or not), a 'good' score could help validate them.
Now the question is: do big brand manufacturers pay UL as well and risk their scores being the same as, or close to, those of companies who sell at half their price?
Got feedback from Dahua. Here's the UL2802 test report for their 2MP bullet camera. There are two columns: the 'performance score' (0 - 100) and the 'converted unit score', which is more useful since it's based on real metrics (like lux, fps, etc.).
I found one other test report, from Uniview. There are two cameras on that report, but most of the stats are fairly similar between them.
You can search for other or new ones on the UL Online Database form; just enter VCAM as the category code (as of this post, only 2 are listed).
I especially don't like the way they handled max frame rate. First, a score of 18 for 30 FPS is kind of odd, since very few security cameras go above 60 FPS. Second, FPS isn't a great metric for security anyway; motion blur would be better.
I would probably have done something like "X shutter speed, giving minimum Y IRE at Z lux."
Example: this camera required a shutter speed of 1/500 to achieve 50 IRE @ 100 lux, which resulted in a maximum relative subject speed of 50 pixels per second without significant motion blur (measured by some given formula).
That would actually give you some concrete, usable information.
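That proposed metric can be sketched numerically. Working backwards from the example, 50 px/s at a 1/500 s exposure implies a blur tolerance of 0.1 px per frame; both the tolerance value and the formula here are my assumptions, not anything from the standard:

```python
def max_subject_speed(shutter_s: float, blur_tolerance_px: float = 0.1) -> float:
    """Maximum relative subject speed (pixels/second) that keeps motion
    blur within blur_tolerance_px during one exposure.

    Blur smear (px) = speed (px/s) * exposure time (s), so the limit is
    the tolerance divided by the exposure time.
    """
    return blur_tolerance_px / shutter_s

print(max_subject_speed(1 / 500))  # 50.0 px/s, matching the example above
```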
True, but that's not something you should actually have to test - that's an inherent property of the camera that should be called out in the specifications. I don't need UL to tell me, for instance, that the maximum framerate of the Axis P3384 is 30FPS.
UL should be focusing on standardized tests for things that are commonly misrepresented or underreported in the industry. Not something everyone already agrees on.
Maybe if it were graded on the minimum illumination required to achieve the maximum stated frame rate (i.e., a shutter speed faster than the frame rate at a given IRE), it would be more helpful.
Update: the first Western manufacturer, IndigoVision, got UL listed.
What's weird is that Hikvision beat them on image resolution. I don't know if this is valid or not, but these are the types of results that will surely make bigger brands refuse to pay, because they can make them look bad.
LOL. First you criticize IV because their camera got lower scores on some of the UL factors, then you state that "bigger brands (will) refuse to pay because it can make them look bad." At least IV had the balls to submit their cameras to the UL for testing.
And in at least one area the Ultra 2k cameras blow Hikvision's away: Sensitivity. One thing missing in the UL's test is multi-stream frame rate. Hikvision just states up to 60fps and I assume that is on a single stream. I know from testing and experience that IV's cameras are capable of providing up to 17 simultaneous 30fps streams (8xTCP/IP plus 8xUDP plus 1x Multicast) and that they guarantee that capability.
"At least IV had the balls to submit their cameras to the UL for testing."
Once. My point is that IndigoVision will not be very motivated to do so a second time and that other bigger brands will look at this as a reason to avoid such tests.
I am not putting any stock in UL's metrics one way or the other. I am simply saying if a company needs to pay money to get rated, and then gets negative ratings, their motivation to repeat or continue will decrease.
OK, but have you perused IV's camera offerings? They don't have very many models, especially in their in-house line (the BX series is OEM'd). The Ultra 2k series consists only of one fixed dome with two lens options and a couple of mounting options, and one PTZ with the same basic mounting options. It's not like they have many more cameras to submit.
Besides, if I were to use UL certification results as the sole criterion for camera selection, the pickings would be mighty slim...
Why doesn't IPVM take this opportunity to either prove or disprove the validity of UL certification testing? I think it would make for an excellent article that would be very informative to IPVM's readers.
The other point is that the UL test comparison you chose was between an IV integrated dome and a Hikvision box camera with a separate CS-mount lens. IV uses an S-mount power zoom/focus lens, which is not noted for having the best specs, and the Ultra 2k camera's deficiencies (Image Resolution and TV Distortion) are clearly influenced by the lens. I would love to see a comparison between lens types. I would bet the results would prove my assertion.
I guess you could say that. Nevertheless, if you had wanted to compare apples to apples, you could have used the Merit Li-Lin ZR6122EX3's UL certification. Both it and the IV Ultra 2k are 1080p domes with power zoom/focus lenses. Comparing the two, they both have similar Image Resolution and TV Distortion, but the Merit Li-Lin camera falls down in Sensitivity. I believe the lens is a major factor in both cameras' shortcomings.
And yes, I do tend to defend IndigoVision. We are very happy with them as a company and with their products. They are not comparable in some respects to so-called "open architecture" companies like Genetec, Exacq, ONSSI and Milestone (and, to a lesser extent, Avigilon), and they tend to focus their efforts on limited verticals like casinos. But I believe that focus makes them better suited for those applications than systems that are jacks of all trades but masters of none. Our testing in 2012/2013 confirmed that NVR/VMS systems designed specifically for the casino vertical were a much better fit than the less focused systems.
In addition, unlike their predecessor (Honeywell), the company has been very responsive to our needs and Feature Requests. Customer Support has been excellent and the company has more than lived up to its promises to consider our suggestions for new or improved features and functions. In fact, by the time our system was being installed, the software already incorporated improvements based on our evaluation notes and each release since then (there are 3 per year) has added features and improvements we have suggested. Honeywell couldn't have cared less!
Even our comments upon testing their 9000 and 11000 series of cameras were considered when they designed the 12000 series (Ultra 2k).
So am I defensive when I see what appears to be an invalid criticism of IndigoVision? You bet! At least consider that my defensive posts come from the viewpoint of a highly satisfied client, rather than a company representative or others with a specific axe to grind.
I do understand that IPVM cannot test every product in the market place and, like CU, you tend to focus on products with a broader appeal. To do otherwise would be a waste of limited resources. However, like Consumer Reports' tests of Consumer Electronics, you tend to omit products that may be superior for their intended applications. I don't see CR testing Audiophile or Videophile-oriented components - the mass appeal just isn't there.
"Nevertheless, if you had wanted to compare apples to apples, you could have used the Merit Li-Lin ZR6122EX3's UL certification"
Why should we do a test of Merit Li-Lin vs IndigoVision? To analyze UL certification?
You are asking us to test 2 products very few people are interested in for a certification almost no one uses.
If there was some belief / claim that these products were significantly better than Axis, Sony, Avigilon, etc., etc., we would do it. But I don't hear such claims. I simply see a lot of work for a report that very few would be interested in reading.
It would have been a more accurate comparison if you had chosen the Merit Li-Lin dome camera, which has an integrated S-mount power zoom/focus lens, rather than the Hikvision camera with an obviously superior CS-mount lens. You didn't compare IPVM's tests; you compared UL's tests.
Those are UL's results. Hikvision beat IndigoVision in image resolution on UL's test. This is a fact. Not a criticism of anyone.
As I have now said 3 times, I do not accept UL's results one way or the other, nor are they important enough for us to spend a week testing to see if UL is right or wrong.
My criticism in this thread is about UL's chance of succeeding with 2802, both because of the low uptake and because manufacturers who market around image quality (like IndigoVision) are unlikely to continue to pay UL for results that show Hikvision beating them in key image metrics.
IndigoVision has issued a press release touting its UL2802 certification. Carl will be happy to know that I was wrong in my belief that IndigoVision would find the results to be a negative.
What IndigoVision was bragging about was 'perfect low light performance.' For sensitivity, they got a 100, the highest possible score, which is a bizarre scale because it means no camera, even one with better low light capabilities, can ever outscore IndigoVision.
When is the last time IPVM tested an IndigoVision camera? IPVM has tested many cameras and posted their relative merits, and for that I applaud you. Yet, when you pre-judge a company's cameras based on press releases and your own judgemental conclusion that UL's tests are worthless in your eyes, without showing those tests to be valid or invalid, my hands just can't come together.
Carl, you are remarkably dense when it comes to your chosen product line.
For the fifth time, I am not making judgements about IV's performance. I am making judgements about UL's structure.
Again: (1) this leaves bigger manufacturers having to pay to be judged worse than companies that they claim have inferior video quality (as Hikvision beat IV above in resolution). And (2) giving any camera a perfect score in sensitivity / low light means the ranking will be unable to properly differentiate other low light cameras. What happens if IV's next camera is even better in low light? The top score is 100, so it will get the same score.
Carl, I voluntarily added information that contradicted my own claim that IV would find these test results to be negative. If anything, by doing so, I have shown that I care about being fair and thorough.
Update: In the past 9 months, adoption has been weak. The number of manufacturers with any UL2802-listed cameras has gone from just 5 to 7 (adding OEM extraordinaire IC Real Time and China manufacturer Uniview). (See the UL Online Database form; just enter VCAM as the category code.)
Update: It looks like even UL has stopped mentioning it. At ASIS, they moved on to new promotions, and no new listings have been added to the UL2802 database.
It appears dead.
Axis praised it at the beginning of 2013, but it appears they, as well as all the big Western manufacturers, were not interested. Again, not surprising to us, given the risk that such standards would help reduce the brand advantages better-known manufacturers have over lesser known, less expensive ones.