IPVMU Certified | 02/12/15 07:48am
From "PPF Test - Getting High Quality Surveillance Video" come these guidelines:
- Difficult to detect person: < 8 pixels per foot
- Rough guess of person (age, gender): 5 - 12 pixels per foot
- Higher probability guess of person (hair, accessories, etc.): 15 - 24 pixels per foot
- Blurry face (could identify if you already knew the person): 25 - 50 pixels per foot
- Clear face (could identify a stranger): 50 - 80 pixels per foot
- Like TV quality (very sharp details of face and body): 80+ pixels per foot
These assume ideal lighting conditions, and of course poor light will degrade these estimates. But what about ideal IR lighting? Let's assume that we start with an even IR spread on the subject at a perfect intensity.
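For anyone wanting to place their own camera against the bands above, here is a minimal sketch of the usual PPF calculation. The function name and the 1080p/30 ft example are my own illustration, not from the linked test; it assumes the simple model of horizontal resolution divided by scene width at the subject.

```python
# Rough PPF (pixels per foot) calculator -- an illustrative sketch.
# Model: PPF = horizontal pixel count / horizontal field-of-view width in feet.

def pixels_per_foot(horizontal_pixels: int, scene_width_ft: float) -> float:
    """Horizontal pixel density across the field of view at the subject."""
    return horizontal_pixels / scene_width_ft

# Example: a 1080p camera (1920 px wide) covering a 30 ft wide scene
ppf = pixels_per_foot(1920, 30)
print(f"{ppf:.0f} ppf")  # 64 ppf -- in the "clear face" band above
```

Note this is density at the subject's distance, so the same camera drops out of the identification bands quickly as the scene widens.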
The first difference is that the image is monochrome: color information is reduced to gray-scale. How much does that affect the numbers above?
But I think it's worse than a standard monochrome picture from a b&w camera without IR, because IR wavelengths are longer than all the visible colors and therefore resolve less detail. I'm not sure how to quantify either of these degradations, though...
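The wavelength part can at least be bounded with basic optics. The check below is my own back-of-envelope estimate, not from the post: for a diffraction-limited lens, the Airy disk diameter on the sensor is roughly 2.44 × wavelength × f-number, so the smallest resolvable detail scales linearly with wavelength. Comparing a typical 850 nm IR illuminator against 550 nm green light:

```python
# Back-of-envelope diffraction comparison (illustrative assumption, not a
# measured result): Airy disk diameter ~= 2.44 * wavelength * f-number.
# Resolvable detail therefore scales linearly with wavelength.

def airy_disk_um(wavelength_nm: float, f_number: float) -> float:
    """Approximate Airy disk diameter in micrometers for a diffraction-limited lens."""
    return 2.44 * wavelength_nm * 1e-3 * f_number

vis = airy_disk_um(550, 2.0)  # green light through an f/2 lens
ir = airy_disk_um(850, 2.0)   # typical 850 nm IR illuminator, same lens
print(f"visible: {vis:.2f} um, IR: {ir:.2f} um, ratio: {ir/vis:.2f}x")
```

So on diffraction alone, 850 nm IR is about 1.5x worse than green light. Real lenses are usually aberration-limited rather than diffraction-limited, so treat this as a lower bound on the wavelength penalty, and it says nothing about the separate loss from discarding color.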
Does anyone know of any similar guidelines to those above, but for ideal IR conditions?