This seems like it would be useful for analytics:
We have developed a proprietary imaging technique that uses a combination of color filters and image processing to obtain both a color image and a high-precision depth map from a single monocular camera image. By attaching a proprietary color aperture filter, consisting of blue and yellow filters, to the lens aperture, a combination of blur and color shift is produced that depends on the distance to the object. The distance to the object is then detected for each pixel by analyzing the blur and color deviation within a single photographic image. Deterioration in the quality of the captured image is also suppressed, because the color filter transmits green light, which contributes the most to overall image brightness. In tests using a commercial camera, we confirmed that the distance accuracy obtained from a single image taken with a monocular camera is comparable to that of a stereo camera with a 35 cm baseline between its lenses. Because the method requires only a lens attachment and image processing, it can be used to build an inexpensive depth-sensing image sensor.
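The core idea is that the blue and yellow halves of the aperture blur differently depending on whether a point is in front of or behind the focal plane, so the per-channel blur difference encodes signed depth. The article gives no implementation details, so the following is only a toy sketch of that principle: `blur_proxy` (a plain Laplacian as a crude sharpness measure), the linear depth mapping, and the calibration constants `k` and `d_focus` are all my own hypothetical choices, not the proprietary algorithm.

```python
import numpy as np

def blur_proxy(channel):
    """Per-pixel absolute Laplacian: sharp edges give large values,
    blurred edges give small values (a crude inverse-blur proxy).
    np.roll wraps at the borders, which is acceptable for a toy."""
    lap = (-4.0 * channel
           + np.roll(channel, 1, axis=0) + np.roll(channel, -1, axis=0)
           + np.roll(channel, 1, axis=1) + np.roll(channel, -1, axis=1))
    return np.abs(lap)

def depth_from_color_aperture(img_blue, img_yellow, k=1.0, d_focus=2.0):
    """Toy depth map from a color-aperture image pair.

    The sign of the sharpness difference between the yellow and blue
    channels stands in for "in front of" vs. "behind" the focal plane;
    its magnitude stands in for distance from it.  The sign convention
    and the linear calibration (k, d_focus, in arbitrary units) are
    hypothetical placeholders for a real, measured calibration curve.
    """
    diff = blur_proxy(img_yellow) - blur_proxy(img_blue)
    return d_focus + k * diff
```

A real system would replace `blur_proxy` with a calibrated, windowed blur estimator and fit the blur-to-distance mapping per lens, but the sketch shows how a single image can yield a dense, signed depth estimate once the two color channels blur asymmetrically.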