Angry ********
*** ************ ********* ********:
** **** *** ********* is ** ****** **** expectations **** ********* ***** installation. ** *** * lot ** ******* **** online *********. **** ********* objects *** ********* *** metro. ** **** *********** it ***** *-* ***** alerts *** **** *** for ** ** *** not **** *** ******, but *** **** ******* it **5000 ******, so customer now complain and we are working hard couple of months trying to get better result, but still not sure if we can fulfill their expectations. Conditions in metro are really bad for this kind of analytics when people are staying and waiting without any movement. And this customer ** *** **** *****. [Emphasis Added IPVM]
** ** ****** **** hard ** ******* **** getting ***** ****** ***** ****** ***** cause ***** *********?
Bad *******
** ********, **** ** unfortunately * ***** *** consistent ***** **** ***** analytics *********.
** **** **** **** numerous *****, ****** **** *** year **** ******* ************* and ***********. *** ******** consensus ** **** ** we ***** **** '*******' customers ** '*****' ************ that ******* **** ******* would ********* ** ****. Wrong.
Needs ** ****
********* **** ** **** and **** ** *** key ****, **** **** to **** ** *** customer's ************. ***, *** real **** ********, *** expectation ** *** **** perfection.
***, ** *** ************, developer ** ********** ***** 5-6 ***** ****** *** hour ** *** ***? Sorry, **** ** **********.
**** ******* ** **** the ******** ******. *,*** alerts *** **** *** much? ****** **********.
******* *** '**** ************ from ********* ***** ************'? Sorry, **** *** ** your *****. *** *** properly ******* *** *** expectations ************* ** *****? Did *** ***, "** expect *,*** ***** ****** per ****. *** ** with ****?" * ***** that.
Applies ** *** ****-**** *********
** **** *** ****** if ** *** *********** perimeter *********, ********* ******, facial ************, ***. *** time *** *** ********* a ****** **** ****-**** analytic ***** *************, **** is ********.
Does an ADT burglar alarm false alert 5-6 times an hour? No. Do you think ADT would be out of business if it did? Yes.
Eliminate False Alerts or Fail
It really is this simple. Either you essentially eliminate all false alerts or you fail.
** *** **** ** know *** *** ***** analytics ****** *** ******** so *****, ******* ** years ** ******, **** is **.
**** ******** ***** ** 'customer ************', *** **** selling **** ****** *** until you *** ***** ******* false ***** **** *********.
Comments (88)
Luis Carmona
I would blame the manufacturer more than anything, but the integrator shares a portion too, albeit a smaller one. I say this because I have always seen manufacturers oversell analytics. So when manufacturers do this, isn't it logical the integrator will too, since they are relying on their manufacturer "partner" to give them reliable information? But the integrator shares blame for "trusting but not verifying" the claims and not doing their own R&D to test the product's limitations. Testing products costs time and money, but that is the nature of the beast; it just is what it is. We can blame the "CSI Effect" all day, but if we sell expectations based on sales literature rather than properly vetted testing, because a competitor is more willing to sell unrealistic performance without doing proper testing and therefore at a cheaper cost, that is on us (the integrator).
Undisclosed #1
Doesn't it depend on the application?
For instance, a cross-line detection in the metro might be annoying if falsely triggered once a day, especially without 24/7 live monitoring. On the other hand, once-a-day false alerts for cross-line detection in the Capitol building might be an acceptable trade-off considering the assets at stake and the budget available.
Which brings up another point: false positives can often be significantly reduced by lowering the sensitivity, which of course will increase the false negatives.
So do you mean "Either you essentially eliminate all false alerts without missing ANY real ones, or you fail?"
I think that slider is vastly different depending on the circumstance.
As an example, 1 out of 3 times I travel through the airport, the detector flags me through no apparent fault of my own. They can't be overjoyed with this result, but I'm guessing they could also turn down the sensitivity if they wanted fewer false positives.
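To make that sensitivity trade-off concrete, here is a minimal sketch with invented numbers (not measured from any real product): raising the alert threshold suppresses false alerts from benign background activity, but it also starts missing real events.

```python
# Hypothetical illustration of the sensitivity trade-off: each event gets a
# detector "score"; raising the alert threshold cuts false positives but
# also drops real detections. All numbers are invented for illustration.
import random

random.seed(1)
real_events = [random.gauss(0.7, 0.15) for _ in range(100)]     # true targets
background  = [random.gauss(0.4, 0.15) for _ in range(10_000)]  # benign activity

for threshold in (0.5, 0.6, 0.7, 0.8):
    detected     = sum(s >= threshold for s in real_events)
    false_alerts = sum(s >= threshold for s in background)
    print(f"threshold {threshold:.1f}: {detected}/100 real events detected, "
          f"{false_alerts} false alerts")
```

Where that slider should sit is exactly the judgment call that depends on the application.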
Kevin Nadai
Is the OP suggesting that people standing still are causing the nuisance alerts? If so, a size limitation on the analytic may reduce them: a target would only count as an object if it were smaller than a person. (Which opens up the problem of really small persons.)
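A size gate like that is trivial to sketch; the pixel limits below are hypothetical and would need per-camera calibration for distance and perspective.

```python
# Hypothetical size gate: only treat stationary blobs smaller than a
# person-sized bounding box as candidate "left behind" objects.
MAX_OBJECT_WIDTH_PX  = 60   # assumed, smaller than a typical person at this range
MAX_OBJECT_HEIGHT_PX = 90   # assumed

def is_candidate_object(width_px: int, height_px: int) -> bool:
    """True if a stationary blob is small enough to be an object rather than
    a standing person (which, as noted above, still misses very small persons)."""
    return width_px <= MAX_OBJECT_WIDTH_PX and height_px <= MAX_OBJECT_HEIGHT_PX

print(is_candidate_object(40, 50))   # suitcase-sized blob -> True
print(is_candidate_object(80, 180))  # person-sized blob -> False
```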
Even so, a train platform is not a good application of "object left behind." People naturally set things down while waiting. Neither is the baggage claim area at an airport, frequently cited in news reports regarding the "magic" of analytics.
Are the 5-6 alerts per hour truly "false" (mistakes by the analytic) -- or "nuisance" (technically correct but not what the customer wants to know)? As I have said in this space before regarding analytics -- often what the customer wants to know, they don't want to know. A screen shot of one of the "false" alerts would facilitate discussion.
A project can fail even with perfect analytic performance. This may be such an example. In addition to accuracy, I still contend that managing customer expectations is essential to success with video analytics. Based upon the OP information, IMO, the customer's expectation in this application should have been, "This ain't gonna work."
Scott Sheldrake
This article is bang on. Unless video analytics are perfect, you just cannot sell them.
Countless times we are asked to set up email notifications for Video Motion Detection, and in 100% of cases we have ended up turning it off for the customer due to the high number of false alarms. Forget line crossing, facial recognition, etc.; this is basic VMD. It absolutely does not matter which brand or product you use, or how much tweaking/adjusting you do, you will get false alarms, unless you turn it down so far that it detects nothing.
Sadly, video analytics are unlikely to improve while lawyered-up companies like Avigilon are hoarding patents, so we are stuck here for a while.
Adam Jaison
Is it wrong to mention the manufacturer referred to in this article?
Undisclosed #2
"It really is this simple. Either you essentially eliminate all false alerts or you fail."
So, if manufacturers/integrators cannot eliminate all false alerts, what do you/we do?
Murat Altuev
I'm the first to completely agree with all the statements. But this is not the manufacturer's fault! We always warn our integrators that real-time analytics is not perfect, but they want the project and that is all!
They even do pilot installations, but try to hide the real results. So integrators often know the real situation, but they prefer not to say "no" to the customer, because in that case the customer will find another one who will say "yes".
I don't know the right behavior for a manufacturer in this situation. We have these analytics because everybody has them and they are often requested. But we know that to manage effectively you need operators to confirm the frequent false alerts, and we always say this. Our other decision was to set a zero price for online analytics! We have every kind of online analytics and it is free! So we are not giving anyone a reason to complain that online analytics is not perfect. Yes, it is not perfect; that is why we are not taking money for it! Now we charge only for forensic search analytics, as this is a really helpful tool which increases the efficiency of investigations, saving operators time and money as a result!
Frank Jacovino
John, as an engineer who has been involved with "video analytics" for over two decades, I am amazed by the strides the industry has made in its algorithms but, at the same time, truly amazed by the market's perception of video analytics capabilities.
My analytics background comes from the semiconductor industry, where we had to commit to 100% identification of true defects with false alarm rates close to zero percent. This was not an easy task to achieve. It is important to understand that our task was easier than a surveillance application, since our scene was always the same and the lighting was always the same. With surveillance applications you have scenes which are constantly changing: people/cars moving, trees waving, day/night transitions, weather implications, etc.
I believe the problem lies with the manufacturers not really explaining what video analytics can do and under what conditions they can do it. Although I do not know the specific details, my initial reaction is that arming 1,000 cameras within a metropolitan area was irresponsible. The manufacturer must have known the application, the extent of the installation, and the risks associated with such an installation. Not to anticipate 5,000 false alerts an hour, and not to know that this would be unacceptable, is borderline criminal.
My view is that analytics, even in their current state, have a place in our industry. Applications in controlled environments, or applications where the identification of an alert is far more important than the false alarm rate, are examples where analytics can be deployed. For instance, if the manufacturer had told the metro customer to deploy the analytics in 4 or 5 critical areas and educated them on what false alert rate to expect, the customer would probably see analytics as a success.
Luis Carmona
Aspirin does not always take the pain away - faulty, pull it off the market?
Airbags don't always save lives - faulty, pull them off the market?
Speed limit signs don't always make people travel the speed limit - faulty, pull them off the market?
CAT scans don't always catch a blood clot - faulty, pull them off the market?
What are we arguing about? That analytics are not 100% accurate, so they are faulty and should never be used? That even if you explain the limitations to a customer, they will ignore what you said, or what is in writing, and expect 100% accurate results anyway?
Or whether or not the customer was told what to expect?
Frank Jacovino
You are ignoring this particular customer's perspective in this situation. The statement was made that there are 5,000 false alerts per hour (over 1,000 cameras). How can any municipality respond to 5,000 alerts an hour? There is so much "NOISE" that the "SIGNAL" (the true alert) is going to get lost. It is our job as manufacturers to provide meaningful information and to separate the signal from the noise.
Sean Nelson
Working with pixel-based analytics has to be much more difficult to master than the PIR-based motion sensors that ADT uses, so I'm not sure that's a fair comparison. I somewhat agree with the manufacturer that the customer's expectations are too high, but I do think this should have been explained more upfront, before the sale was made.
I'm a big fan of the cameras that have built-in PIR, though. It's much more of a true motion event for sure. I'd like to see more of them. The only issue is if the user wants to detect motion from several hundred feet away; that's where PIRs start to lose effectiveness. But I'd like to see more advancements in this area (cameras with built-in PIRs). If I were a camera manufacturer, I would do this. I'm not sure if PIRs could help with advanced analytics beyond typical motion detection, though.
Undisclosed Manufacturer #3
Story time!
New York, fresh off one disaster with analytics, asked a friend during a presentation how well his "object left behind" analytic would work. His answer: your "see something, say something" wall posters would work better.
The board agreed and disclosed they use that question as a "BS detector".
While this thread is debating all analytics, one in particular, "object left behind," was the focus of this issue.
What people want is "dangerous object arrived," and anything less than that is a problem for managing expectations. If the system accurately detects an article left behind, like the backpack from the Boston bomber, that's good, right?
How soon do you respond and with what? Should they have shut down the marathon because of a backpack? How many were set down that day for more than say.... 30 seconds?
If you want to assign blame, it's "all of the above" for not thinking through the whole scenario. It would be different in a sterile environment, back hallway door, secure facility....maybe, possibly, depending.
Paul Curran
Analytics need a yardstick measurement, or what happens is every man and his dog adds it to cameras, etc., and it fails. i-LIDS is a UK test; I'm interested to see what other standards exist for testing.
Undisclosed Manufacturer #4
A couple of thoughts:
1) Try before you buy.
2) TECHNICAL NOTE - Of the video analytics tasks, detecting abandoned objects in a busy place like a station is one of the very hardest. The level of machine vision required is up there with driverless cars navigating city streets.
To understand why, look at the numbers. Let’s say each station has 50 cameras and each camera 'sees' 20,000 people per day. Every one of these video 'events' needs to be analysed to see if anything has been left behind and whether it might be a threat. During some periods of the day the cameras can hardly see the ground at all, it’s just full of people. And the user wants less than one false alarm per day, which is one false alarm per 1 million events.
Video Analytics can deliver 95% accuracy or maybe even 99% accuracy. But looking for 'rare events' in large quantities of data as in this case requires 99.9999% accuracy. No Video Analytics system on the market comes anywhere near this yet.
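To make the arithmetic concrete, here is the same back-of-envelope calculation as a short script, using the assumed station numbers above:

```python
# Back-of-envelope rare-event math (numbers assumed, per station, as above).
cameras           = 50
people_per_camera = 20_000                         # passers-by per camera per day
events_per_day    = cameras * people_per_camera    # 1,000,000 "events" per day
target_false_alarms_per_day = 1

required_accuracy = 1 - target_false_alarms_per_day / events_per_day
print(f"events per day: {events_per_day:,}")
print(f"required per-event accuracy: {required_accuracy:.6%}")  # ~99.9999%

for accuracy in (0.95, 0.99, 0.999):
    expected_false = events_per_day * (1 - accuracy)
    print(f"at {accuracy:.1%} accuracy: ~{expected_false:,.0f} false alarms per day")
```

Even at 99.9% per-event accuracy, that is still on the order of 1,000 false alarms per station per day.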
Undisclosed Integrator #5
I think video analytics in its current form is totally wrong, until some artificial intelligence is developed that can make proper decisions about an event.
For a human or an AI, it's not a hard job to decide that someone REALLY left a backpack on the floor which he was carrying before, but an algorithm can only see pixel changes in a static image which it tries to analyze, and it cannot detect such a simple thing in, for example, a crowded hall where people are constantly moving in front of and behind each other.
Undisclosed Integrator #5
Instead of looking for left objects, they should employ well-trained dogs, which can scent explosives and catch the attacker before they enter the area. A dog can be a lot smarter than any camera with analytics...
Sarah Doyle
The last time I asked, only 3 companies were approved for the i-LIDS 'sterile zone' scenario and 1 for 'long range thermal imager.' I don't know of any company that passed the others ('left baggage', 'multiple camera tracking', etc.).
One of the issues is that people often don't want to invest in testing analytics prior to a sale (the salesman might not offer it due to cost).
It would be nice if there were a centralised, open source database of videos representing particular scenarios, and analytics companies could test themselves against them.
Dale White
Nice article, John. All the comments are very good. It's not as black and white as some demand, though. Everything depends on where and for what you are trying to use the analytics. We have struggled with these issues for a long time. Given the infinitely variable environmental conditions, I don't think near-perfect performance is a realistic expectation.
This is reminiscent of discussions I used to have with customers over volumetrics many years ago. Even now, 98% are still nuisance activations.
We hope that the intelligence in real-time video analytics will make it an effective tool for many applications. But, many on both sides try to turn their hope into expectations.
A minimum level of false activation gives a human an opportunity to discern what is true short of numbing the system to the point that you miss a real event. But too many falses desensitize reactions rendering the system ineffective. Same old dilemma.
I'll keep hoping, but I always temper my customer's expectations with reality.
Forensic analytics have a brighter future than their real-time cousin.
Robert Baxter
I would be interested in knowing exactly which application is alluded to in the "left behind" or metro analytics situation. It also appears that there is confusion about what is meant by a false positive. A false positive is not an event generated where there is no threat. A false positive is where something the analytics should ignore (e.g. birds, tree branches moving, animals) generates events. The analytics did what they were supposed to do if they detected people; the error was in not considering the frequency with which non-threat events would be generated by normal activity in the scene.
A common customer request is dealing with crime in their parking lot (e.g. golf course parking, apartment parking, metro parking, casino parking). These applications are not workable with analytics for many reasons that have nothing to do with the analytics themselves, while analytic applications involving car dealerships are. I run into these kinds of installations from time to time; all have been turned off. If your staff doesn't understand why they don't and can't work, you will spin your wheels misapplying video analytics.
Likely there are similar principles that make a left-behind or metro analytics situation workable or unworkable.
Undisclosed Manufacturer #6
I'd ask for a bit more clarification on the user's definition of "false alert".
For example, if the analytic is set to notify on an "abandoned object," the dwell-to-alert is 1 minute once an object is detected, and a passenger sets a suitcase down for 1 minute, is it a "false alert" if the analytic triggers? By the customer's definition it is, but is it really, given that the analytic has no way of distinguishing the difference?
That may be an oversimplification, but again, more details are needed.
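For illustration only, that dwell logic reduces to something like the sketch below; the 60-second setting is the assumed example above, not a real product default.

```python
# Hypothetical dwell-to-alert logic: the analytic alerts on ANY object that
# stays stationary past the configured dwell, whether or not the owner is
# standing right beside it -- which is why the "false alert" label is debatable.
DWELL_TO_ALERT_SECONDS = 60  # assumed 1-minute setting from the example above

def should_alert(stationary_seconds: float) -> bool:
    return stationary_seconds >= DWELL_TO_ALERT_SECONDS

# Passenger sets a suitcase down for a minute while buying a ticket:
print(should_alert(60))    # True: technically correct, but a nuisance to this customer
# Bag genuinely abandoned for ten minutes:
print(should_alert(600))   # True: the alert the customer actually wants
```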
I'm in agreement with those that stated it was a bad application of the technology for this scenario.
By the same token, we also need to examine the customer's motivation for wanting it to begin with. He has 1,000 cameras. How big is the staff that monitors 1,000 cameras? Nothing takes the place of actual human surveillance and decision making, but obviously there's no way the customer can afford staff to effectively monitor them, so he's turned to technology as a force multiplier.
I'm hard pressed to understand why an initial pilot of the analytic on a dozen or so cameras wasn't done first, so he could establish a baseline and determine whether he could live with the "false positives" (if that is indeed what they are).
John Marco
I would like to join this conversation from a sensor POV. We have been testing, with success, the combination of sensors working as the trigger for video analytics (internally we call this an IP video "double knock").
Testing the complementary features of video analytics and physical sensors in outdoor environments, we have set up a live system using both technologies: Optex sensors and video analytics. The footage gathered gives good insight into how the double knock can decrease false alarms and provide real-time forensic information:
https://www.youtube.com/watch?v=CYYrmWPOeb0&feature=youtu.be
Introducing physical sensors solely dedicated to the function of sensing, independent of the camera's view and angle, can help. Using different sensing technologies suited to different environments also helps lower false alarms. Sensors can also help cameras eliminate causes of false alarms in outdoor environments where lighting conditions will hurt video analytics.
How does it all apply?
Sensors can be used to trigger a number of devices, including non-video devices such as lighting, signaling systems, barriers, and fogging systems. Sensors can detect in areas where no camera can be installed (walls or network cable). Sensors can cover detection areas that are difficult to achieve with a camera. Sensors complement and validate video-based detection to create a "double knock" system.
All in all, there is not one technology that can do it "all." Why not use multiple technologies' strengths based on the environment, system configuration, and requirements? I hope this POV can help manage some of these missed expectations.
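As a rough sketch of the "double knock" correlation described above (event sources, names, and the 10-second window are assumptions for illustration, not Optex's actual interface):

```python
# Rough sketch of a "double knock": only raise an alarm when a physical-sensor
# event and a video-analytic event occur within a short correlation window.
# Event sources, names, and the window length are assumptions for illustration.
from datetime import datetime, timedelta

CORRELATION_WINDOW = timedelta(seconds=10)

def double_knock(sensor_events, analytic_events):
    """Return (sensor_time, analytic_time) pairs that corroborate each other."""
    confirmed = []
    for s in sensor_events:
        for a in analytic_events:
            if abs(a - s) <= CORRELATION_WINDOW:
                confirmed.append((s, a))
    return confirmed

t0 = datetime(2016, 6, 29, 20, 0, 0)
sensor_hits   = [t0, t0 + timedelta(minutes=5)]   # second hit has no video backup
analytic_hits = [t0 + timedelta(seconds=4)]
print(double_knock(sensor_hits, analytic_hits))   # one confirmed alarm, one suppressed
```

Either trigger alone does nothing; only the corroborated pair raises an alarm, which is where the false alarm reduction comes from.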
Undisclosed Manufacturer #7
There is certainly room for improvement, and here is an example: in automotive-grade analytics used for self-driving cars (ADAS), the failure rate requirements are extreme. That class of analytics deploys a lot of processing performance and multiple sensor types, and in general uses other classes of algorithms than what is typically used in security today. True, the use cases are different, but there is significant overlap in tracking and object classification.
A challenge in surveillance is the inability to charge (enough) for analytics in a smallish market, which means less R&D can be invested, resulting in lower-performing solutions, and so on. Not so in automotive analytics. That is a much bigger market, and good analytics are a must, with tough industry conformance testing, which drives specialized chipset and algorithm development. My guess is that such automotive technology will become more affordable and trickle down to surveillance in some form over the next few years, making security analytics more reliable and usable. Security, being a relatively small market, has a great tradition of adopting standards and technology from other, bigger markets. The era of seeing cameras and robotics has begun.
Sean Nicholas
For what it's worth, every implementation of abandoned object or removed object detection using VMD (video motion detection) is destined to fail.
I do, however, have first-hand experience with more than one implementation of abandoned object (or removed object) detection using NMD (non-motion detection) from a video analytics company out of Toronto, circa 2008. I won't plug their name, but you can google Non-Motion Detection, as it's a patented technology which allows them to operate extremely effectively in crowded environments.
With a standard VMD implementation, the detection time for an abandoned object is often limited to within seconds of the object being placed. This is because of the limitations of a solution built on VMD.
Simply ask any analytics company whether their object detection is based on VMD or NMD, and/or specifically inquire whether their solution can detect an object that is left behind for 5 minutes in a busy environment; I'm certain they would hesitate and be vague in their response. However, if you ask a video analytics provider that is capable of using NMD, I'm certain you'll get a much more confident answer.
A better question for the salespeople is whether they can detect a car parked for longer than 20 minutes. It's guaranteed that VMD will fail in this scenario, but this is where NMD would shine.
It's a pity that RFQs aren't more specific. Customers should require that a system can detect objects left behind for longer than 3 or 5 minutes. It would protect them.
FYI, another highly effective application of NMD is graffiti detection.
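Without speaking for the patented NMD implementation, the general idea of detecting objects that have stopped moving, rather than relying on frame-to-frame motion, is often sketched with two background models that adapt at different rates; a hypothetical outline:

```python
# Hypothetical long/short-term background modelling for stationary-object
# detection: pixels that differ from the slowly adapting background but match
# the fast one have been still for a while (a parked car, a bag left behind).
# This is a generic sketch, not the patented "NMD" implementation.
import numpy as np

ALPHA_FAST     = 0.20    # fast model adapts within seconds (assumed)
ALPHA_SLOW     = 0.002   # slow model adapts over many minutes (assumed)
DIFF_THRESHOLD = 25      # grey-level difference treated as "changed" (assumed)

def update(model, frame, alpha):
    return (1 - alpha) * model + alpha * frame

def stationary_mask(frame, fast_bg, slow_bg):
    """Pixels new relative to the slow background but already absorbed by the
    fast background have stopped moving."""
    changed_vs_slow = np.abs(frame - slow_bg) > DIFF_THRESHOLD
    settled_vs_fast = np.abs(frame - fast_bg) <= DIFF_THRESHOLD
    return changed_vs_slow & settled_vs_fast

# Toy 4x4 "frames": a bright object appears at [1, 1] and stays put.
empty = np.zeros((4, 4))
scene = empty.copy()
scene[1, 1] = 200
fast_bg, slow_bg = empty.copy(), empty.copy()
for _ in range(50):                    # object sits still for 50 frames
    fast_bg = update(fast_bg, scene, ALPHA_FAST)
    slow_bg = update(slow_bg, scene, ALPHA_SLOW)
print(stationary_mask(scene, fast_bg, slow_bg).astype(int))
```

In a real crowd the hard part is everything this sketch ignores: occlusion, lighting changes, and deciding which stationary blobs actually matter.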
Undisclosed Manufacturer #8
What did Einstein say again?
"We cannot solve our problems with the same level of thinking that created them."
Video analytics needs a drastically different approach. Maybe integrated multi-sensor negotiation (not video alone) would get us somewhere.
Sagy Amit
Wow...
I usually give up on reading comments after 8, maybe 9.
Admittedly this subject is close to my heart and I have played the devil’s advocate on many discussions such as this one in the past. So I’ll try to be objective here, as much as I can after 11 years in Video Analytics and 4 different manufacturers.
I will work with the main story and avoid response to individual comments as there are too many of them.
“Conditions in metro are really bad for this kind of analytics”
Well, no freak’n way, Einstein!! BUT YOU REALLY WANTED TO CLOSE THE SALE, DIDN’T YOU?
I could write a book about all those projects I have turned down in which VA was just NOT the right solution!
From detecting sharks off a Florida beachfront hotel to determining friend from foe (U.S. military...), VA is not meant for that!
Nevertheless, saying that video analytics is a worthless solution?? That is like saying that police radar detectors are useless because of false alarms...
It is true that very few (if any) VA companies made it to the finish line of profitability. We can blame many factors, such as timing, investors' patience (or lack thereof), and bad leadership.
Nevertheless, we can't take away some great success stories in which video analytics are used successfully and almost exclusively. One example is the remote video monitoring business. There are tens if not hundreds of small-time operations that cater to 100+ locations and provide them with a useful service.
“Does an ADT burglar alarm false alert 5-6 times an hour? No. Do you think ADT would be out of business if it did? Yes.”
First off, the burglar alarm industry, for the longest time prior to recent years' regulations, cost taxpayers $2B annually in false alarms. Statistics don't lie, and 90% of all police dispatches were false alarms! In fact, the reason VA is so useful in the RVM business is the recent requirement for video verification of alarm events.
In my past 11 years in the business, there has been very little improvement in VA performance. I'd argue that we have even gone backwards on innovation.
The reasons??
I don't know; maybe greed, made-up market value projections, and a descent into inexcusably dishonest marketing campaigns. I recently gave up my job due to exactly that.
We will never eliminate ALL false alerts. Heck, the radar detector in my car drives me nuts, but it saved my ass on countless occasions!
For fence-line intrusion detection, where the alternatives are "dumb" cameras, PIR triggers camera, or fence sensor triggers camera, Analytics is still the best option. With current thermal camera prices, thermal analytics is more cost effective than most solutions and provides the best combination of detection/verification.
Bottom line, for perimeter detection, what is a better alternative??
Sagy Amit
As to "object left behind," a requirement to detect potential bombs left to detonate in a crowded mall, subway station, etc.: from a pure optics standpoint, it will never be a reliable solution. Those environments are too crowded, and in most cases the object will be blocked from the camera's FOV. Where it will work - indoors, in a sterile environment - it defeats the purpose.
As for the guy with the new bombastic acronym: if you wait 5 minutes or more to provide a detection alert, the bomb has already gone off.
The ONLY current solution for suspected objects and suicide bombers is human vigilance.