Moore's Law Slowing - Impact On Video Surveillance?

Over the past few years, more and more news sites have commented on the slowing of Moore's Law. The topic appeared in today's NY Times, with a survey of the problem.

This can be readily seen in personal computers where both clock speed and number of cores are increasing relatively slowly (compared to the good ol' days of doubling every 18 months).
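To put numbers on the "good ol' days" pace: here is a minimal sketch (illustrative only, the function name and figures are my own) of the compounding implied by a given doubling period, comparing the classic 18-month cadence with a hypothetical slowed 3-year one.

```python
# Illustrative only: compound growth implied by a doubling period.
def growth_factor(years: float, doubling_period_years: float) -> float:
    """Return the performance multiplier after `years`,
    assuming capability doubles every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

# Classic Moore's Law pace: doubling every 18 months (1.5 years).
print(growth_factor(10, 1.5))   # roughly 100x over a decade

# A hypothetical slowed pace: doubling every 3 years.
print(growth_factor(10, 3.0))   # roughly 10x over a decade
```

The gap between ~100x and ~10x per decade is why a slowdown compounds into a very different world a few product generations out.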

What impact is this / will this have on video surveillance?

One area where this might already be having an impact is video analytics, which benefits from increased computational power. Agree/disagree?

UPDATE February 2016:

For all of you naysayers, more news / claims that Moore's Law is slowing / dying.

The chips are down for Moore’s law

Ars Technica: Moore’s law really is dead this time


Here are a few thoughts on the topic.

  1. “The reports of my death are greatly exaggerated” – Mark Twain

I’ve been hearing about the “death” of Moore’s Law for the past 25 years. So far, “They” have always been wrong.

  2. Video surveillance chipsets still have a ways to go. While clock speeds for mainstream (x86) CPUs have plateaued somewhat, there is still plenty of upside for VS chips (which historically are about 2 process generations behind the leading edge). As long as HiSilicon and Ambarella remain committed to servicing this marketplace, there will continue to be exciting new platforms every 2 years or so.
  3. Right now the most interesting developments in analytics and computer vision are happening with Convolutional Neural Networks (CNNs). By interesting I mean getting close to actually being able to recognize objects without a ton of false positives/negatives. This technique currently uses large-scale CPU/GPU server farms, and we will have to wait and see if someone is successful in putting that technology into an embedded system.
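For readers unfamiliar with CNNs, the compute-hungry core is nothing exotic: a sliding dot product (convolution) followed by a nonlinearity, repeated across many layers. Here is a minimal pure-Python sketch of one such building block (real systems run this on optimized GPU kernels, and the toy "frame" and kernel here are mine, not from any product):

```python
# Minimal sketch of one CNN building block:
# a 2D "valid" convolution followed by a ReLU nonlinearity.

def conv2d_valid(image, kernel):
    """Slide `kernel` over `image` (both lists of lists), no padding."""
    kh, kw = len(kernel), len(kernel[0])
    ih, iw = len(image), len(image[0])
    out = []
    for y in range(ih - kh + 1):
        row = []
        for x in range(iw - kw + 1):
            s = sum(image[y + dy][x + dx] * kernel[dy][dx]
                    for dy in range(kh) for dx in range(kw))
            row.append(s)
        out.append(row)
    return out

def relu(feature_map):
    """Zero out negative activations."""
    return [[max(0, v) for v in row] for row in feature_map]

# A vertical-edge detector applied to a tiny 4x4 "frame":
frame = [[0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9]]
edge_kernel = [[-1, 1],
               [-1, 1]]
print(relu(conv2d_valid(frame, edge_kernel)))
# The strong response in the middle column marks the vertical edge.
```

Scale this inner loop up to millions of learned kernels over megapixel video and the appetite for raw compute, and the relevance of Moore's Law, becomes obvious.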

I’ve been hearing about the “death” of Moore’s Law for the past 25 years. So far, “They” have always been wrong.

Strictly speaking, you are correct, since Moore's Law was initially stated in terms of the number of transistors that could be etched on a single chip. That growth has continued eerily on track.*

However, the reason for this is multi-core chips, something Moore was not envisioning back in the day. As the graph shows, relative performance has seen only incremental gains, not the exponential gains Moore's Law requires.

As far as the security impact, I don't see much, because other, more important cost-based metrics have not leveled off.

Metrics like GFLOPS/$ and TB/$, for instance, are delivering more processing power for the buck than ever before, even if the highest-end chips are advancing only slowly.
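The $/GFLOPS argument is worth making concrete. With round, purely hypothetical numbers (these are not real product figures): even if the fastest chip barely improves, a same-priced part delivering more throughput still multiplies the compute per dollar.

```python
# Hypothetical round numbers, purely to illustrate the cost-performance point.
def cost_per_gflops(price_usd: float, gflops: float) -> float:
    """Dollars spent per GFLOPS of throughput."""
    return price_usd / gflops

older = cost_per_gflops(500.0, 100.0)    # hypothetical older part: $5.00/GFLOPS
newer = cost_per_gflops(500.0, 1000.0)   # hypothetical newer part: $0.50/GFLOPS
print(older / newer)  # 10x more compute per dollar at the same price point
```

For a buyer specifying a VMS server or analytics appliance, that ratio matters far more than the peak clock of the flagship chip.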

*Yes, I know the graph only goes to 2010, but it's very nice and pretty, and in any event, all it shows is that the slowdown in performance started at least 10 years ago.

I can certainly believe that Moore's Law has a useful lifespan and may be approaching its end.

BTW, while Moore’s Law is specifically about transistors on a chip, I equate it to the geometric growth of not just CPU speeds but also storage and networking capability.

I don’t think slowing of Moore’s Law is the end of the world for video surveillance.

Surveillance has only begun to take advantage of the tremendous capacities provided by GPUs for encoding, decoding, analytics, etc. There’s still a lot of software and features that can take advantage of the momentum that’s already built up behind GPUs.

The dramatic increases possible in WAN bandwidths/bit rates could have significant impact on surveillance, but will take considerable time for infrastructure roll out, well behind any slowing of Moore’s Law.

Likewise there’s plenty to be done with our current spot on the Moore’s Law curve as it can be applied to sensor and display technology—where it makes sense—in video surveillance. We’re just behind the curve.

I disagree that a slowdown in the advancement of computational power will have an impact on video analytics. This is because lack of computational power is not what ails video analytics. VA can already do astonishing things, just not ones many people want to pay for.

At Cernium, we were working with all the same analytics available today, with performance equal to or superior to what is currently available. And we were doing so across sixteen video channels, simultaneously, on a Windows XP SP2 PC that doubled as a VMS workstation back in 2003. A mix of boneheaded management and lack of paying customers led to its demise.

The problem is that what people think they want to know -- they do not really want to know. Objects left behind in the baggage claim area of an airport? Good luck with that one. After the first few thousand alerts, that system ends up in a closet somewhere.

The main challenge to the adoption of video analytics is managing customer expectations, not lack of computational power.

Moore's Law always had a limit based upon the size of conductor and semiconductor molecules. Now that conductors on a chip are just a few molecules wide, something smaller needs to be used. Photons or quantum mechanics will be the next step.

"The main challenge to the adoption of video analytics is managing customer expectations, not lack of computational power."

Strongly disagree. This is at the root of the problem with video analytics.

Trying to convince customers to be less demanding is a futile proposition. The analytics must get better if you want more people to use them.

This is a key reason why Cernium Archerfish failed despite Cernium setting up unrealistic restrictions.

John, in the context of what Kevin actually said, it sounds like you're saying that if sufficient computational power existed, analytics would finally fulfill customers' desires. I agree that analytics could improve, and computational improvement probably helps. Analytics are often marketed in overly simplistic terms and lack the accuracy to satisfy customers' assumptions about their performance.

Whereas what I hear Kevin saying is that even when analytics work, they often solve the wrong problem. Object left behind is a great example. I agree with his point that customers often have unrealistic expectations—not just of the analytic performance, but of the analytic’s applicability and their preconceived satisfaction with the results even if the analytic was 100% accurate. I do think Kevin’s being too broad, as some analytics are more accurate than others, and some are more applicable than others in terms of the actual usefulness of their true positives.

Steve, thanks. Let's see what else Kevin has to add and then I'll share more thoughts.

Hello John,

Even perfect video analytics performance cannot overcome mismanaged customer expectations.

My very first VA project was a system protecting Air Force One when it came into Boeing for periodic maintenance. After landing, it had to taxi across a public access roadway to reach the maintenance facility. The president is never on board at this time, but AF1 would still be a juicy target for mischief or otherwise. (None of this is classified information.) They can and do block the road during the transfer, but they were concerned a vehicle, person, or device would be pre-positioned under the taxi bridge. Multiple cameras covered the area, with the live video feed sent to the airport control tower. The controllers were objecting to the additional workload of watching the cameras 24/7.

Unbeknownst to us, the customer locked the system in a closet during the pilot. Two months later, they opened the closet and saw 300+ pending alerts waiting to be viewed. They packed it up and sent it back.

Let's do the math. Before, you were supposed to stare at 6-8 cameras all day. Now, this system will do it for you and generate 1 or 2 alerts during your eight-hour shift. That the alerts were valid (people walking under that bridge on the way to the beach, flat tires, cars stopping to watch other widebody aircraft cross over, etc.) was irrelevant; the "additional workload" of 150+ "false alarms" killed the deal. You might be wondering, "What did they expect?" Exactly. And the answer is -- nothing that could be fixed by VA "getting better."

Another example was a highrise building in Houston. The security director wanted to detect objects left behind in case of explosives, as well as people lurking at ground floor exits waiting to slip in behind someone coming out. The system, very accurately, generated thousands of alerts per day. They started when the morning newspapers were placed outside the building tenants' interior doors. It continued during the day as mail, packages, and office supply deliveries did the same. Smokers congregated outside next to the exit doors. At the end of the day, people put their wastebaskets in the hall before locking up their office and leaving. At night, cleaning crews would leave behind carts, equipment, and supplies when entering individual offices. What the security director wanted to know -- he really did not want to know. And better accuracy would never have saved that sale either.

One last example was a system I deployed at a pharmaceutical campus that housed anthrax samples. They had a nice high fence around it, but a natural stream flowed through the campus. During the dry season, you could simply walk down the stream and right under the fence. Four cameras, two at each end, covered the area and, with great accuracy, generated alerts on anything bigger than a raccoon.

About a week later, the customer called to complain about a deer that frequented one side every morning and every night, generating alerts. "How cool is that?" I thought. "Free walk testing!" The customer found that unacceptable. Four alerts per day (two for each camera) times thirty days equals 120 "false alarms." I remember him saying almost word for word, "We would not tolerate that many false alarms from any of our other security systems either."

So unless Moore's law could drive VA innovation to the level of person vs. deer performance, this would still not be a successful sale. In each case, the salesperson had failed to gauge or manage customer expectations. Chief among them was presenting video analytics as an alarm system rather than an alert system.



"So unless Moore's law could drive VA innovation to the level of person vs. deer performance"

Yes, your 'solution' should know the difference between deer and humans. Ignore the state of today's technology. Think of it as a user with a job to be done.

S/he does not care that the state of computer vision in 2005 or 2015 can't handle that; they just want accurate alerts on human threats and none for deer.

If you can convince him otherwise, you are a crafty salesperson, but that does not mean you should expect the market to lower its standards to accept technology that does not do the job they want done.

Then what of the risk from a human donning a deer costume?

Is it not better to have the customer look at the pretty deer twice a day as confirmation the system is working? Mark its presence on a calendar in the command center. Haven't seen it in a few days? Then it is time to walk test the system ourselves.

After a mentally challenged man drowned sneaking into an amusement park at night, I installed a VA system at another park that had access to a public body of water. The video analytics system, using thermal cameras along the shoreline, would alert when anyone entered the park from the beach. And, I warned them, it would alert on large, warm-blooded animals as well. Sea lions, in particular, were common residents of the beach. The customer knew and accepted this before installation. Dummy me forgot to schedule the alerts for only at night.

A month later, the customer called to inform me of my error. I apologized and offered to fix it. But that was not why they called. They called to tell me how thrilled they were to be catching multiple people sneaking into the park during the day to avoid the cost of admission. They had no idea that had been happening.

So instead of fretting over differentiating between sea lions and people, I set their expectations ahead of time. The result was a successful sale, a happy customer, and a signature account. I won't mention the client's name because, for all I know, they may still be using the system.

I disagree that managed expectations equate to lowered standards. I would contend the customers in my first three examples lowered their standards when they went back to their notoriously ineffective Mark I Eyeball systems.

"Then what of the risk from a human donning a deer costume?

Is it not better to have the customer look at the pretty deer twice a day as confirmation the system is working?"

The practical risk of a human donning a deer costume is very low and something most users can accept.

The practical problem of getting 1000+ false alerts per year is a real operational issue.

Kevin, you are free to take the position that customers need to adjust (i.e., lower their expectations), but I am saying that video analytics will never go mainstream until they get smart enough to make basic human-like distinctions reliably.

I don't think managing means lowering. VA or plain CCTV, both require giving the customer a clear picture of reality when selling.
