The Key Challenge for Video Analytics

Published May 07, 2008 00:00 AM
PUBLIC - This article does not require an IPVM subscription. Feel free to share.

It's all about false positives. Catching the bad guy is pretty easy for today's perimeter-violation and abandoned-object detection systems. Not catching anything else -- the sun, the moon, the wind, the birds, the rain, the trees -- that's the challenge.

This should not be that much of a secret. A little bit of experience with any of the dozens of video analytics vendors in today's market demonstrates this.

So it shocked me that Bosch has decided to turn the easy part into the 'challenge'. Here's Dr. Bob explaining what Bosch did at ISC West [link no longer available]:

"For the IVA (Intelligent Video Analysis) challenge we had people lining up from 10am when the doors opened till the end – over 2,300 people took the challenge, which was to steal a Bosch power drill from under the nose of our IP camera without the alarm going off."

Dr. Bob exults: "Yes, I think we can safely say we got our message across – our products are smarter than the average bear, and bringing intelligent video analytics to the masses is a good example." 

This is cunning marketing but, at best, it is wildly misleading and, at worst, it is a scam.

Now, I am not judging the quality of Bosch's product, nor is Bosch the first or last company to frame the issue this way.

That being said, this challenge is worthless in judging the quality of an analytic because:

  1. Any decent prototype can generate correct detections in a controlled environment.
  2. Economically, minimizing false alerts is critical to establishing the business case.

Reducing False Positives

Ensuring the system triggers only on genuine violations is hard because so many factors besides a valid adversary can set off the analytic. To a computer, rain, dust and snow can all generate a shape similar to a human being. Quick changes in light or the movement of water (waves) can also generate such shapes. A camera that shakes because of the wind, or because of issues with the mounting or installation, can also trigger such alerts.

The hard part in such analytics is eliminating these alerts. This is a key metric in testing and differentiating between analytics.

The Economics of False Positives

False positives drive up the cost of systems. Many (most?) organizations do not have a centralized monitoring system in place to respond to and assess video analytic alerts. This places the burden on individual security managers to respond to alerts. Over the last 5 years, getting dozens (or hundreds) of false alerts per day (especially if you deploy numerous cameras with video analytics) has been common. This can become an emotional and operational dealbreaker (even if it passed the Bosch challenge).

While it is better operationally to centrally manage alerts, if the system generates dozens or hundreds of false alerts per day, the costs can become prohibitive. Let's say an 'intelligent' camera generates 5 false alerts a day at a cost of $1 per alert (the unit cost to pay a monitor to assess). That's $5 per day, over $1,800 per year and more than $9,000 over a projected 5-year lifecycle. If you have dozens or hundreds of cameras, this hidden operational cost can be in the millions.
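The arithmetic above can be sketched in a few lines. This is a minimal illustration using the article's example figures (5 alerts/day, $1 per alert); the function name and parameters are my own, not from any real costing tool:

```python
def false_alert_cost(alerts_per_day, cost_per_alert, cameras=1, years=5):
    """Estimate the yearly and lifecycle cost of assessing false alerts."""
    daily = alerts_per_day * cost_per_alert * cameras
    yearly = daily * 365
    return yearly, yearly * years

# One camera, 5 false alerts/day at $1 each
yearly, lifecycle = false_alert_cost(5, 1.00)
print(yearly, lifecycle)      # 1825.0 per year, 9125.0 over 5 years

# Scale to a 200-camera deployment
yearly_200, lifecycle_200 = false_alert_cost(5, 1.00, cameras=200)
print(lifecycle_200)          # 1825000.0 -- well into the millions
```

The point of the sketch is that per-camera costs look trivial until multiplied across a fleet and a multi-year lifecycle.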

And this is not theoretical. This is the feedback you will hear time and again from real-world deployments. It's widely accepted that this is improving, but it is still the major factor in assessing the quality of analytics.

Bosch may think that these tricks are making people more excited and more accepting of analytics. Most security managers have been excited about analytics for a long time. They see the obvious potential to mitigate risks and reduce losses.