Even perfect video analytics performance cannot overcome mismanaged customer expectations.
My very first VA project was a system protecting Air Force One when it came into Boeing for periodic maintenance. After landing, it had to taxi across a public-access roadway to reach the maintenance facility. The president is never on board at this time, but AF1 would still be a juicy target for mischief or worse. (None of this is classified information.) They can and do block the road during the transfer, but they were concerned that a vehicle, person, or device could be pre-positioned under the taxi bridge. Multiple cameras covered the area, with the live video feeds sent to the airport control tower. The controllers objected to the additional workload of watching the cameras 24/7.
Unbeknownst to us, the customer locked the system in a closet during the pilot. Two months later, they opened the closet and saw 300+ pending alerts waiting to be viewed. They packed it up and sent it back.
Let's do the math. Before, you were supposed to stare at 6-8 cameras all day. Now, this system will do it for you and generate 1 or 2 alerts during your eight-hour shift. That the alerts were valid (people walking under the bridge on their way to the beach, cars with flat tires, drivers stopping to watch other widebody aircraft cross overhead, etc.) was irrelevant; the "additional workload" of 150+ "false alarms" per month killed the deal. You might be wondering, "What did they expect?" Exactly. And the answer is -- nothing that could be fixed by VA "getting better."
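The closet math reconciles cleanly, by the way. Here is a minimal sketch of it; the three-shifts-per-day figure is my assumption (it follows from 24/7 coverage in eight-hour shifts), everything else comes from the numbers above:

```python
# Reconcile the alert volume: 1-2 valid alerts per 8-hour shift,
# left unreviewed for the two months the system sat in the closet.
alerts_per_shift = 2   # upper end of the "1 or 2 alerts" per shift
shifts_per_day = 3     # assumed: three 8-hour shifts cover 24/7
days_unattended = 60   # roughly two months locked in the closet

pending = alerts_per_shift * shifts_per_day * days_unattended
print(pending)         # 360 -- consistent with the "300+" backlog they found

# Expressed as a monthly rate, that backlog is the "150+ false alarms":
per_month = pending // 2
print(per_month)       # 180
```

A handful of valid alerts per shift looks perfectly reasonable; the same alerts, batched up unseen for two months, look like a broken product.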
Another example was a high-rise building in Houston. The security director wanted to detect objects left behind, in case they concealed explosives, as well as people lurking at ground-floor exits waiting to slip in behind someone coming out. The system, very accurately, generated thousands of alerts per day. They started when the morning newspapers were placed outside tenants' interior doors. They continued during the day as mail, packages, and office-supply deliveries did the same. Smokers congregated outside next to the exit doors. At the end of the day, people put their wastebaskets in the hall before locking up their offices and leaving. At night, cleaning crews left behind carts, equipment, and supplies as they entered individual offices. What the security director wanted to know -- he really did not want to know. And better accuracy would never have saved that sale either.
One last example was a system I deployed at a pharmaceutical campus that housed anthrax bacteria. They had a nice high fence around it, but a natural stream flowed through the campus. During the dry season, you could simply walk down the stream bed and right under the fence. Four cameras, two at each end, covered the area and, with great accuracy, generated alerts on anything bigger than a raccoon.
About a week later, the customer called to complain about a deer that frequented one side every morning and every night, generating alerts. "How cool is that?" I thought. "Free walk testing!" The customer found that unacceptable. Four alerts per day (two for each camera) times thirty days equals 120 "false alarms." I remember him saying almost word for word, "We would not tolerate that many false alarms from any of our other security systems either."
So unless Moore's law could drive VA innovation to the level of distinguishing a person from a deer, this would still not have been a successful sale. In each case, the salesperson had failed to gauge or manage customer expectations. Chief among those failures was presenting video analytics as an alarm system rather than an alert system.