"Analytics", to me, infers the ability for the system to make intelligent decisions with repeatable or predictable results. Intelligent decisions would mean ignoring general motion and only alerting on specific objects of interest. Based on the test results here, the Axis system failed to show significant intelligence, as it had several false alarms. There is nothing to indicate it detected a "person" crawling, just that it saw the crawling as the same kind of random motion it picked up elsewhere and was unable to classify or filter out.
Such a system is, to me, nowhere near "far better than not using analytics", as it just generates random alarms and can't be trusted.
I think it all depends on your application and your reason for using analytics rules. If you want to go through all activity the next day to see what went on, then these analytics could help you accomplish that faster. They would at least drastically reduce the number of events compared to relying on simple motion detection.
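To make the "fewer events to review" point concrete, here is a minimal sketch of the idea: raw motion detection logs every blob that moves, while an analytics layer keeps only classifications of interest. The event list and field names ("kind", "timestamp") are made up for illustration and are not from any vendor API.

```python
# Hypothetical overnight event log from a camera. Raw motion detection
# would flag every one of these; analytics keep only objects of interest.
RAW_EVENTS = [
    {"timestamp": "01:12", "kind": "foliage"},
    {"timestamp": "02:45", "kind": "shadow"},
    {"timestamp": "03:10", "kind": "person"},
    {"timestamp": "04:02", "kind": "headlights"},
    {"timestamp": "05:33", "kind": "animal"},
    {"timestamp": "06:20", "kind": "person"},
]

def analytics_filter(events, keep=("person", "vehicle")):
    """Keep only events whose classification is of interest."""
    return [e for e in events if e["kind"] in keep]

reviewed = analytics_filter(RAW_EVENTS)
print(f"{len(reviewed)} of {len(RAW_EVENTS)} events left to review")
# → 2 of 6 events left to review
```

Even with occasional misclassifications, cutting the review queue this way is what makes next-day review practical.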
I think that as one of the leading manufacturers of IP cameras, Axis needs to be more competitive in this aspect of their products. According to many people I have talked to who attended the Axis Summit in Jamaica, Axis is focusing on and investing in improving their analytics. They also mentioned that most partners were not even using analytics yet, so I guess some of us are just a little ahead in that sense.
Let's wait and see what this brings in the next couple of months!
Analytics are not, and will not be for a long time, 100% accurate. Users should expect false alarms regardless of the vendor. What noise would you receive without analytics? This is my point. If this does not make sense, please explain.
Where does it say that the analytics failed to classify the crawling person vs. 'random motion', as you say? The crawling person displayed a green box that turned red upon crossing the line. This suggests the person was detected and triggered the line. Again, not sure what I'm misunderstanding, but I'd genuinely like to know.
The shadows crossing the line showed the exact same response:
Given that it failed to properly suppress the shadows and false motion, there is no reason to believe the alarm on the crawling person was anything more than accidental. Going by the results shown here, the system did not demonstrate that it could detect crawling people without generating excessive false alarms.
The thing is, if you have used solid analytics like Avigilon's, although they are not 100%, they are VERY high up there. Avigilon has AI which, over time, increases the efficiency of the analytic rules and eliminates some detections that would have generated a false alarm at the time of install.
At our central monitoring station we monitor several hundred Avigilon cameras, and sometimes we can go many minutes without a single event. A lot has to do with the way these rules are configured by our integrators and their understanding of how they work.
Of all the analytic solutions I have tested or used, Avigilon is way ahead as the best, but what amazes me is that it is also by far the simplest to configure.
The main difference in configuration is that Motion Guard includes perspective setup, which VMD4 does not; this improves detection somewhat. We found that it was a bit better at rejecting some false alarms, especially animals.
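For readers unfamiliar with perspective setup, the general idea (sketched below, not vendor code) is that the installer marks how tall a person appears in pixels near the bottom and top of the scene; the analytic can then estimate a detection's real-world size from its image position and reject objects too small to be a person, such as animals. The calibration values and linear interpolation here are illustrative assumptions.

```python
# Assumed calibration: a ~1.8 m person appears ~220 px tall at image row
# y=700 (near the camera) and ~40 px tall at row y=150 (far away).
def person_pixels_at(y, y_near=700, h_near=220.0, y_far=150, h_far=40.0):
    """Expected pixel height of a 1.8 m person at image row y
    (linear interpolation between the two calibration rows)."""
    t = (y - y_far) / (y_near - y_far)
    return h_far + t * (h_near - h_far)

def estimated_height_m(box_h_px, y):
    """Convert a detection's pixel height to approximate metres."""
    return 1.8 * box_h_px / person_pixels_at(y)

def is_plausible_person(box_h_px, y, min_m=1.0):
    """Reject detections whose estimated real-world height is too small."""
    return estimated_height_m(box_h_px, y) >= min_m

# A cat-sized blob near the camera is rejected; a distant person is kept.
print(is_plausible_person(box_h_px=45, y=680))   # → False (~0.38 m tall)
print(is_plausible_person(box_h_px=42, y=160))   # → True  (~1.75 m tall)
```

Without the perspective information, both detections above have nearly the same pixel size, which is why a non-calibrated detector treats them alike.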
Fence Guard is similar but allows only tripwires/lines, not intrusion zones, and Loitering Guard is similar to Motion Guard except with a longer configurable dwell time.
My general recommendation is that you use Guard Suite instead of VMD4 if it's an option.