Who Killed IRE Levels?

Not too long ago, IRE levels were commonly included in minimum illumination specifications, e.g., 0.1 lux, f/1.2, 50 IRE.

A few manufacturers still include the IRE level, but it's definitely a minority of IP camera manufacturers.

The goal of using IRE was to define a standard level of contrast, so there was uniformity in deciding when an image was no longer useful. Since images progressively get worse in low light, you need some metric to decide what lux level to declare as the minimum illumination.

IRE is a "unit used in the measurement of composite video signals."

The problem with IP cameras is that you no longer have a composite video out, you have frames / packets. So even if you wanted to use IRE, you cannot.

Can anything be done here? Anyone have any advice?
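One could argue for a digital stand-in: composite IRE corresponds roughly to luma amplitude, and BT.601 studio-range coding puts black (0 IRE) at Y=16 and reference white (100 IRE) at Y=235. So a decoded frame's luma can be mapped onto an IRE-like scale. A minimal sketch of that idea (the mapping choice and the sample frame are my assumptions, not any manufacturer's method):

```python
def luma_to_ire(y: float) -> float:
    """Map a BT.601 studio-range 8-bit luma value to an IRE-like scale.

    Assumes Y=16 corresponds to 0 IRE (black) and Y=235 to 100 IRE
    (reference white); values are clamped to the studio range.
    """
    y = min(max(y, 16), 235)
    return (y - 16) / (235 - 16) * 100.0


def frame_mean_ire(frame) -> float:
    """Average 'IRE' of an 8-bit grayscale frame (a list of rows)."""
    pixels = [p for row in frame for p in row]
    return luma_to_ire(sum(pixels) / len(pixels))


# A mid-grey studio-range frame sits at 50 IRE:
frame = [[126, 125], [125, 126]]
print(round(frame_mean_ire(frame), 1))  # → 50.0
```

Of course, this measures the frame after the camera's encoder and any image processing, which is arguably the point of contention in this thread.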


This is a very good question. I was always puzzled when seeing some IP camera specs where IRE numbers are included. How do we measure IRE in an IP camera?

Could the following two metrics be used to replace IRE?

1. SNR

2. Gain in AGC
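Of the two, SNR is at least directly measurable from captured frames: shoot a static grey target and compare the mean signal level to the temporal (frame-to-frame) noise. A rough sketch of one such measurement (the static-scene setup and the 20·log10 dB convention are my assumptions; this is not a standardized procedure):

```python
import math


def snr_db(frames) -> float:
    """Temporal SNR of a static scene, in dB.

    frames: list of same-size 8-bit grayscale frames (lists of rows).
    Signal = mean pixel level; noise = per-pixel standard deviation
    over time, averaged across the image. Assumes the scene, exposure,
    and gain are held constant while the frames are captured.
    """
    n = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    total_mean, total_var = 0.0, 0.0
    for r in range(h):
        for c in range(w):
            vals = [f[r][c] for f in frames]
            m = sum(vals) / n
            total_mean += m
            total_var += sum((v - m) ** 2 for v in vals) / n
    mean = total_mean / (h * w)
    noise = math.sqrt(total_var / (h * w))
    return 20 * math.log10(mean / noise) if noise else float("inf")
```

Even so, comparing the resulting numbers across manufacturers runs into the same calibration problem raised below for gain.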

John, good thoughts.

It's funny because IP camera manufacturers are increasingly omitting the SNR as well :)

As for gain, I am not sure if you can compare gain across manufacturers. Is a 35dB gain for Axis the same as 35dB for ACTi? I don't know, but I don't think so.

So even if you wanted to use IRE, you cannot.

Unless you really, really wanted to...

Camera IP output to a megapixel composite decoder, camera set to minimal compression (MJPEG?), 30fps. Probably not what you had in mind, but would it work at all?

Rukmini,

Which output of that decoder do you propose to use?

The unit supports two: BNC or HDMI. If we go BNC, then it's an analog NTSC/PAL signal, on which you can measure IRE, but you have lost the original resolution. If we go HDMI, it is still digital, right? If so, how do you measure IRE?

You better believe I am using the BNC; the hardest part was finding a decoder that supported both MP and composite video, since many only go to D1.

As for the resolution dilemma, it's true that the output is scaled down to D1, but of course it's not cropped or anything, so the first question to ask would be: is too much information lost in the downscaling to render the IRE measurement unrepresentative?

Instead of measuring IRE, imagine it's just you looking at various MP cameras through this decoder. Don't you think you could make a relative, subjective assessment of the cameras' low-light performance, even if you were just looking at a scaled-down D1 version?

Like in the same way that analytics perform well at quarter resolution, I don't think too much of the relative contrast info would be lost. But it's easily tested, too. And if it turns out unfair or unpredictable, then we could always wait on one of those HD-CVI-to-IP decoders (or try the HD-CVI output of a Dahua hybrid DVR) ;)

Though I think we should be looking at replacing the metric with one redesigned from the ground up.

Presumably in the past, camera manufacturers would simply connect the composite analog output of their cameras to a waveform monitor to check IRE levels? Assuming that to be true, I think there are a couple of options for checking IRE levels with IP and other non-composite cameras.

The first option is to use software. You could take a video clip recorded by an NVR and open it in a non-linear editing (NLE) application such as Final Cut Pro or Premiere Pro. Both of these applications have waveform monitoring scopes which can show luma in IRE units. However, I don't like this method, for a couple of reasons:

  • The first is that NLE operators have complained for years about the inaccuracy of the built-in scopes in NLE software. They are only useful for providing an indicative result.
  • The second is that this method doesn't help with live testing, because it reports the IRE levels of the video clips recorded from the IP cameras. If I were a camera manufacturer wanting to adjust IRE levels, I would want some way of doing this live.
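Indicative or not, the core of a software luma waveform scope is simple enough to sketch: for each image column, bin the luma values onto a 0-100 IRE axis and display the distribution. A toy version (the 16-235 studio-range mapping is an assumption; real scopes also handle chroma, interlacing, and full-range signals):

```python
def luma_waveform(frame, bins=10):
    """Columnwise luma histogram: the core of a waveform scope display.

    frame: 8-bit grayscale image as a list of rows. Returns, for each
    column, counts of pixels falling into `bins` equal IRE bands over
    0-100 IRE, assuming studio range (Y=16 -> 0 IRE, Y=235 -> 100 IRE).
    """
    width = len(frame[0])
    cols = [[0] * bins for _ in range(width)]
    for row in frame:
        for x, y in enumerate(row):
            ire = (min(max(y, 16), 235) - 16) / 219 * 100
            band = min(int(ire / (100 / bins)), bins - 1)
            cols[x][band] += 1
    return cols
```

Feeding it live decoded frames instead of recorded clips would address the second objection, at the cost of writing your own capture pipeline.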

The second option is to use a hardware waveform monitor, which was traditionally very expensive, with scopes costing over US$10,000. Forgive me if the following reads like an advertisement for Blackmagic Design, but I'm intimately familiar with their products. I previously worked at Blackmagic Design, where they created their UltraScope and SmartScope Duo products for US$495 to $995. These products include waveform monitoring of standard television signals in SD and HD.

Quoting from the UltraScope marketing material: "Composite video is synthesized from the SDI input displayed for a better view for aligning decks and other video equipment. Even though UltraScope is fully digital, the synthesized composite waveform makes it easier for editors who are familiar with composite waveform monitors."

Blackmagic Design also has their UltraScope Duo 4K product, which includes composite waveform monitoring of SD, HD and Ultra HD 4K television signals for US$995. The waveform monitor on all four products can use IRE units, and I think this method may have potential for the purpose you have described.

The only trick will be how to get the live video from the IP camera into the SDI port of the UltraScope or SmartScope Duo. Some IP cameras have an HDMI output, which could be adapted using a US$295 Blackmagic Mini Converter HDMI to SDI 4K. If the camera has no video output, one could set the desired IP camera to display full screen on the VMS workstation or NVR, then adapt the HDMI or DVI output of the graphics card to SDI using a converter such as the US$395 Blackmagic DVI Extender or the Mini Converter HDMI to SDI 4K. This would allow one to perform live waveform monitoring of the IP camera, using traditional IRE units, all the way from SD up to 4K.

Check the following document at p. 9.

Tariq, the problem with Stanislav / CCTVCad's analysis is that it is stuck in older analog technology and does not reflect changes in how IP cameras operate and the image processing that has become critical for low light differentiation. We actually had a detailed debate with him here.

John, I stopped disputing with you in that topic at your request: "Stanislav, please do not comment again about this." "please spare us the rest."

But it doesn't mean that I agree with you. Certainly, IP cameras use new, different technology, but the basic principles of how analog and "IP" cameras work, with a lens and an image sensor, are the same.

Moreover, without a good understanding of these basic principles, it is impossible to draw the right conclusions about more complicated cameras.
It is simple to stick on the label "stuck in older analog technology," but that is a fundamental mistake.

The differences and difficulties of testing IP cameras were mentioned in my article.

Stanislav, you had more than a dozen comments to make your case, and you couldn't.

In particular, you showed that you do not understand / appreciate the role that digital enhancement / processing plays in modern IP cameras.

You are confusing our readers with how stuck you are on just the lens and image sensor. Today, the role of the encoder / processor on board the camera in enhancing (or not) low-light images is a major factor in comparative low-light performance.

John, I will not repeat what I wrote in that discussion. Let's each remain with our own opinions.
By the way, I found your tests of fisheye cameras very informative, and your images are very close to my models. Thank you for these tests!

(Why) do you think that a manufacturer's specification such as 1 lux f/1.2 50 IRE is any more meaningful than 1 lux?

Since neither gain nor S/N is specified (nor is either easy to specify, as you point out), manufacturers can manufacture almost any number they wish. Perhaps the noise at 0 lux could be subtracted from the measurement at 1 lux to normalize the gain, but....
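That normalization idea is essentially dark-frame subtraction, and it is at least easy to state precisely. A sketch of what it could look like (the lens-capped dark frame and the simple mean-subtraction are my assumptions about the procedure, not an established spec):

```python
def dark_subtracted_level(lit_frame, dark_frame) -> float:
    """Noise-floor-normalized signal level.

    Subtracts the mean of a 0-lux (lens-capped) dark frame from the
    mean of the frame captured at the test illumination, so that
    AGC-amplified noise and black-level offset do not inflate the
    low-light reading. Frames are 8-bit grayscale lists of rows.
    """
    def mean(frame):
        pixels = [p for row in frame for p in row]
        return sum(pixels) / len(pixels)

    return max(mean(lit_frame) - mean(dark_frame), 0.0)
```

Whether that would survive each manufacturer's processing pipeline unchanged is exactly the open question.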

I understand how people 'want a number', but in this case at least, everyone would be better served by the manufacturer capturing a standard TVL chart at a standard distance at whatever lux they desire, and publishing the picture and the lux number, in the same spirit as you did here: Ranking IP Camera Low Light Performance. Agree?

"(Why) do you think that a manufacturer's specification such as 1 lux f/1.2 50 IRE is any more meaningful than 1 lux?"

I actually do not think it's more meaningful. Lux ratings are ridiculous, regardless of how you frame them, simply because everyone gets to do it themselves.

It's as if every lawyer got to decide on their own how good they were at litigation.

I only brought up the IRE aspect to see if anyone had any feedback or alternatives. For instance, Luke's comment is interesting and we are going to research more on that.

Lux ratings are ridiculous, regardless of how you frame them, simply because everyone gets to do it themselves.

I assume by "ridiculous... because everyone gets to do it themselves", you are saying that because manufacturers get to derive their own specious formulae/methods/criteria, the results are not trustworthy. But not just because they perform the test themselves, right? For instance we can usually trust their product dimension measurements because one can verify or reproduce such specifications easily.

And to the same end, namely independent verification, the addition of f-stop and IRE into the spec is shouting, "We're not making this up, you could do this test yourself!" And it might just be, if one could pin down some of the other degrees of freedom (gain, S/N) out there.

I asked because at first glance, it does seem to be an improvement over just a bare lux number, but, as it stands, it still has too much wiggle room...

Apparently someone didn't tell LT Security (Hik) about the demise of IRE for MP cameras:

Though they're probably just using their SD analog output, scaled down. Also interesting is the claim of support for an unspecified type of H.264 SVC.

Now, you are (optimistically) actually assuming they ran their own tests. I bet they copied it from the sensor manufacturer's specs!

Btw, there are some IP MP manufacturers that list IRE levels; it's just not many, and even those that do can't really be trusted.