Axis Perimeter Defender Video Analytics Tested

By IPVM Team, Published Jul 12, 2018, 10:00am EDT

Axis' 'high security' video analytics offering is Perimeter Defender, OEMed from / developed with Digital Barriers.

But how good is Perimeter Defender? And how does it compare to Axis VMD4 and Guard Suite analytics?

We tested Perimeter Defender in multiple scenes, indoors and outdoors, using two cameras (listed in Versions Used below), examining:

  • Configuration and calibration
  • False alert performance
  • Detection range
  • Detection scenarios
  • Comparison to Axis VMD4 and Guard Suite analytics
  • Camera compatibility
  • VMS integration

Readers should also see our other updated camera analytics tests, including Axis Guard Suite, Axis VMD4, Bosch IVA Video Analytics And Motion+ VMD, Hanwha Wisenet X Analytics, Hikvision Intrusion Analytics, and Dahua Intrusion Analytics.

Summary

Axis Perimeter Defender test results were problematic for 'high security'. While it performed better than the free VMD4 and the ~$60 Guard Suite, it still had significant false alert problems. ***** *** premium ***** *** **** 'high ********' ****'* ************ for ****-**** ***** ******, Perimeter ******** ***** ***** problematic.

****, *************, ** *** upcoming ******** **** ****, there *** * ***** vendors **** ******* ****** performing ********* *** ***** that **** **** ********, low ***** ***** *********.

** *** *** *********:

  • ******* ******* ***** ******:********* ******** *** *** trigger ** **** ****** sources ** ***** ****** outdoors, **** ** ***** animals, ******* ******, ** shadows.
  • **** ********* *****:********* ********'* ********* ***** was ****** **** **** competitive ********* ******, **** detection ** **** ~*-* PPF, ***** **** ********* required **+ *** ******** detection.
  • ****/******** ***********:********* *********** ****** ******* and ********** ** *** tests, **** *****-***** ******* within * *** ****** of ****** ****, *** ground ***** ******** *******, and *** ******* ********** estimated.

*******, ***** **** *** negatives:

  • ********** ***** ****** ***********: **** ********* ***** ** moving ***** ** **** screen *****, ********* ******** frequently ********* ***** ******. Alerts **** ******* ** less **** *** *** day **** ******** ***** areas **** *** ********* zone.
  • **** ***** ******:******* ** **** *********** camera *********, ********* ******** triggered ********** ** **** dripping **** *** ******'* dome, **** ******** ****** sent ****** ******** ****** over *** ****** ** testing (**** *** *****).
  • ******* ***** **** **** VMSes: **** ******* *** ********* require ** ********** ******** component ** ********** ** order ** ********* ******** to *** ***, ** additional **** *** ******** when ***** ***** ** Nx *******, *** ******** by **** ***** ****** analytics ****** (***** ***, VMD4, ******, ***.). 

****: ** **** * future **** ** *** Axis ********* ** ** Axis ******* ******, ****** the ************* ****** ***** point ** ******* **** limit *************, ********** ** results.

Vs. Guard Suite/VMD4

********* ******** *********** ****' other ********* (****/****** *****) in **** ***** ***** resistance *** ********* *****. VMD4 *** ***** ***** both **** *********** ** blowing *****, *******, *** especially ***** *******. ********, the ****** *** ******** at **-** ***, ***** Perimeter ******** ****** **** 5-6 *** *** ******** detection.

Camera Compatibility

********* ******** ** ************** **** ** ****' current * *** * series ****** *** ****** M ****** ******. **** that ****' ********** ******* listing ** *** ** to ****, *** ** verified ************* **** **** product ********** ****** *******. Additionally, **** **** **** may ****** *********** ** other ****** ***** ** customer ******* *** ** such ** ********* ***** contact **** ** *** desired ***** ** *** listed.

**** ****** ******* ** not ***** ** ***** as ***** ***** ***** or **** ************, ***** support ****** *** ******* models, *** ****** **** mid ** **** *** models ***** *** ****** to ** **** ** these ************.

Pricing

********* ******** ******** **** for ~$*** *** *** camera. ****** ******** *** ten ***** *** *********. This ******* ** ****** than ****' *** ***** Suite (~$** *** ***********). Many *********** ******* ***** analytics ** ** ********** charge **** *** ******** of ***** *******.

Separate Application For Configuration

********* ******** ** ********** using * ******** ***********, not ** *** *******' web *********, ******* ** Bosch *** *** ********'* H4 *********. **** *********** does *** **** ** run **** **********, **** events *** ******** **** directly **** *** ******* to ********** ***.

Automatic Calibration

********* ******** **** ** automatic *********** ****** ***** analyzes ***** ** * subject ******* ********** *** field ** **** ** interpret *** *****. ***** calibration, ***** *** ********* with ** ***** *** verification, ******* *** ******* at ******* ******, *** marking *** ******* (***** in ****), ***** **** a "*********" ****** ***** indicates *** ******'* ********** in **** ***********. ***** calibration, ***** *** **** feed *** ******** **** used *** *********** **** the ******** ** **** detection ** ***** ******.

** *** *****, ********* calibration ****** ******** *** accurately ***** **** ** was ********* *******, **** subject **** **** ***** at *** ******. ** smaller ****** ******, ****** calibration *** ********, ** the ******** ***** *** properly interpret *** *****.

** ****** *** *********** process ** **** *****:

Improved False Alert Resistance

****** ****' ***** *********, ********* Defender *** *** ****** from ***** ****** ** many ** *** ****** outdoor *******. *** *******, cameras *** *** ***** on ***** ******* ** the **** ***** ** view, ****** ** **** and ***** *****.

****** ******* *** *** trigger *** ****** ** our *****, ******, ***** below.

*******, ********* ******** *** significant ****** **** ***** alerts ****** ** ******* brush, **** ******** ****** on **** ******* ***** days (***** *****). **** when ******** ***** ***** from *** ********* ****, some ****** **** ***** triggered (~* *** ***) due ** ***** ******.

************, **** ***** ********* caused ***** ******, **** as *** ******* ****** down *** **** *****, with ******* ***** ****** during ******** ****** ** the ****** ** *******.

Solid Indoor Performance

******** ****** ******** ** larger ******* *****, ********* Defender *** **** ** used ******* (**** ****** calibration).

*******, ********* ******** *** not ***** ** *** typical ***** ***** *******, such ** ****** ******* on *** ***, ***** below:

 

************, ** ****** **** triggered **** ******** ****** on/off, * ****** ****** false ***** ****** ** other *****:

*** **** ****** ** ******** rooms ******* *******/***********:

Detection Distances

********* ******** ************ ******* on *** ******* ** ~5-6 *** ****** *** and ***** ***** ******** in *** ********* ***** of **** ******/*********. ***** ranges **** **** ** the ******* **** ** our *****, ****** **** most ***** *********, **** the ********* ** ***** (~4-5 *** ** **** scenes). ******* ****** ********* ranged **** ~** *** to **** **. 

  • *****-*** (***/*:*): ~***' ****/~***' ********
  • *****-*** ** ** (***/**:*): ~***' HFOV/~200' ********

Detection Scenarios

********* ******** ******** **** ********* *********:

  • *********: ******** **** ** object ****** * ****
  • **** ********: ******** **** when ******* ***** **** one ******** **** **** another
  • ***********: ******** **** ******* enter * **** ******* first ******** *******
  • *********: ******** ***** * specific ****** ** ****

** ****** ******** ******** in **** *****:

** *** *****, ***** more ***** ***** ****** without *****, **** ****** and ***** ***** *********** similar ** *********. *** example, ** *** **** crossing ***** *****, *** camera **** ****** **** the ******* ***** **** the ********* **** ** the **** ****** ** the ********. 

VMS Integration

********* ******** *********** ****** ********* on *** ***, **** Genetec *** ********* ********** both ****** *** ******** boxes, ***** ***** *** Network ***** ******* **** events, **** ** ******** boxes *********. ******** ******* Center **** *** ******* either.

Milestone/Genetec Integration

** ********* *** *******, cameras *** ******** **** zone *** ******** *** information, ** **** ** a *****/*** ***** ********* in *** *** ***** corner, ***** *** **** an ********'* ********* ****** than ******* ******** ***** in *****-****** *****.

****** *** **** ** used ** ****** ******, shown ****:

***** ************ ******* ********** software ********* ** *** VMS ******, ****** *** Perimeter ******** ******, ***** translate ******** **** *** proper ****** *** **** VMS. **** **** **** is * *********** ***********, only ******** ********, *** is *** ********* *********.

** ******** ** *** bridge, ******* ***** **** have *** ** **** SDK ******** ********* ** their ********* ******* ** order ** ******* ********* Defender ******, * ~$*** MSRP *******.

Exacq/Network Optix Integration

** ***** *** ******* Optix, ********* ******** ** only ********** ***** ******, with ** ******** ***** displayed. ***** ************ ** not ******* ********** ******** and **** *** ** the ***. ** **** VMSes, ***** **** **** select ********* ******** ** an ***** ******* ** start *********, **** *************, trigger ******, ***., ***** in ***** ****:

Versions Used

*** ********* ******** *** software ******** **** **** during *******:

  • **** *****-***: *.**.*
  • **** *****-*** ****: *.**.*
  • ********* ********: *.*.*.*****

 

Comments (15)

We use this (well, what it really is: SafeZone-edge from Digital Barriers ;)). It's usually spot on. The bush detection shouldn't have been an issue if the zone was set correctly and drawn to avoid the area, IMHO.

You can easily do a "lite" install to most VMSes by sending TCP notifications into the VMS. The more complicated full version is more work but also quite useful on larger installations.
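For illustration, here is a minimal sketch of the receiving side of such a "lite" TCP integration, assuming Perimeter Defender is configured with a TCP event action that sends one plain-text line per alarm. The port and payload format below are illustrative assumptions, not Axis' documented format:

```python
# Hypothetical listener for camera TCP event notifications. A VMS-side
# script like this would map payloads to events/alarms; we just log them.
import socketserver

class AlarmHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # One line per notification, e.g. "INTRUSION zone=North cam=axis-01"
        payload = self.rfile.readline().decode("utf-8", "replace").strip()
        if payload:
            print(f"alarm from {self.client_address[0]}: {payload}")

if __name__ == "__main__":
    # Listen on whatever port the camera's TCP event action targets.
    with socketserver.TCPServer(("0.0.0.0", 9000), AlarmHandler) as server:
        server.serve_forever()
```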

The only issue we get is reflections: if you have a large vehicle near a road, the analytics will see a mirror image of it and trigger, and the same for people walking past. That is something any analytic will suffer from with poor positioning.

The bush detection shouldn't have been an issue if the zone was set correctly and drawn to avoid the area, IMHO.

I'll let Ethan / Rob talk about the specific test results with 'bush detection' but I want to emphasize a fundamental point.

If an analytic system cannot tell the difference between leaves and a human, that's a major constraint. There are simply too many areas and applications where leaves and trees will be an issue. It also shows the analytics 'intelligence' is quite rudimentary.

bush or human - that is a fair point! lol :)

We removed the brush from the zone during testing and it improved, but did not eliminate false alerts. We still ended up with ~1 per day because the bush encroached too close to the zone, which was drawn strictly on the ground.

Yes, it's true that you can draw it further still from the brush, but in real-world scenes, this is not always possible, as it may narrow a zone too much and risk missing detections, or brush may simply be in the middle of a zone.

Moreover, others such as Bosch IVA and Avigilon (test coming) did not trigger on these same areas.
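For context on the zone-drawing debate: perimeter analytics generally test whether a tracked object's point (often its ground contact) falls inside the user-drawn polygon, which is why brush swaying over a zone boundary can still alarm even when the zone is drawn around it on the ground. A minimal point-in-polygon sketch using ray casting; the zone coordinates are made up:

```python
# Is the tracked object's ground point inside the user-drawn zone?
# Standard ray-casting test: count polygon edges a horizontal ray crosses.
def in_zone(point, polygon):
    x, y = point
    inside = False
    for i in range(len(polygon)):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % len(polygon)]
        if (y1 > y) != (y2 > y):  # edge spans the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

zone = [(100, 400), (500, 380), (520, 600), (80, 620)]  # drawn to skirt the brush
print(in_zone((300, 500), zone))  # True: person inside the zone
print(in_zone((60, 500), zone))   # False: brush just outside it
```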

Hi Ethan,

I have a question regarding the methodology of the testing. In full disclosure, I am a field guy with one of the manufacturers tested.

In all of the tests so far, the cameras have been set to a wide field of view. I suspect that towards the back of the field of view, the stated capabilities of some or all of the analytics are exceeded. My guess would be that you are attempting to gauge the ultimate detection performance (PPF) on a level playing field? I'm wondering how you weigh false detections outside of the design envelope of the analytics? I don't know if there were any occurrences with any of the manufacturers, but I was wondering if it was a consideration.

Thank you

The false alerts we have seen in this as well as other tests on brush have almost all been triggered by brush in the near field of view, not at long ranges. Additionally, we have tested all the analytics in this series in both wide and narrow fields of view to check differences, and performance has been fairly consistent in both wide and narrow AOVs in terms of false alerts. Meaning, if a camera had issues with brush in wide fields of view, it also had issues in narrower fields of view.

Also worth noting, we have discussed our findings with manufacturers as we've been testing these analytics, with both tech support and product management where possible. We have adjusted settings and retested based on that feedback as well, so manufacturers have had the opportunity to point out whether we were testing outside of their capabilities.
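For readers less familiar with the PPF metric discussed in this thread: pixels per foot at a given distance follows directly from a camera's horizontal resolution and angle of view. A minimal sketch, with illustrative camera numbers rather than the tested models':

```python
# PPF at distance d = horizontal pixels / scene width at d. The scene
# width spans two right triangles with half-angle HFOV/2.
import math

def ppf(h_pixels, hfov_deg, distance_ft):
    scene_width_ft = 2 * distance_ft * math.tan(math.radians(hfov_deg) / 2)
    return h_pixels / scene_width_ft

# e.g. a 1080p stream with a 90 degree HFOV:
for d in (25, 50, 100, 200):
    print(f"{d:>4} ft: {ppf(1920, 90.0, d):.1f} PPF")  # 38.4, 19.2, 9.6, 4.8
```

By this math, such a camera would drop below the ~5-6 PPF detection level the report cites for Perimeter Defender at roughly 160-190 feet.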

"Axis Perimeter Defender test results were problematic for 'high security'"... I think any IVA will be problematic for outdoor applications if were used visual cameras, because there are smoke, dust, illumination issues, rain, etc.. a lot of potential troubles that will inevitably produce false alerts. Besides that, if a simple bug lend at the lenses, your detection system will fail... In my oppinion, a system with these fragility can not be used in an 'high security' application.

In my projects, I avoid using IVA outdoors and usually use auxiliary systems to detect accurately.

Good point Rafael. In addition, we have seen Axis AVMD 4 completely miss a person walking across a parking lot on a few occasions. We tested it side by side in a real environment (office parking, docks) and it missed too many people to use effectively, although it did remove a ton of non-critical alerts. Missing the few real people is a big deal, especially if you're not doing secondary full-time recording.

Let me clarify my comment. I was not criticizing Axis specifically, but the technology of video analysis (no matter the manufacturer). I don't think it is a good technology for outdoor applications in 'high security' systems using visual cameras. There are several non-visual systems much more reliable for this.

 

 

I don't think it is a good technology for outdoor applications in 'high security' systems using visual cameras.

I agree entirely. The problem is that software creators, vendors, and even trade magazines are all over how wonderful video analytics are. This means that when I tell our director and GSOC leadership that I do not recommend analytics because they are not sufficient to replace live monitoring and onsite security for critical locations, I get a very agitated "so I guess xyz vendor, videos online, and the news articles I read are all lying about its effectiveness?"

As I wrote in an earlier comment on this article, the most embarrassing situation is when I need to disagree with a customer's wrong perception of video analytics' effectiveness. The exaggerated marketing (sometimes almost unreal) creates in the customer's mind an unattainable level of accuracy, and the installer needs to achieve it... so it is an industry of frustrated customers.
I'm sure that IVA technology has huge potential for video surveillance applications, and there are some great applications right now, mainly in retail (which can tolerate a bit less accuracy) and controlled indoor environments. But I don't feel comfortable using it for outdoor security applications yet.
I think when deep learning video analysis, like Camio and BriefCam, which can really 'understand' what is happening rather than just seeing a mass of pixels crossing a line or entering a forbidden area, becomes viable for real-time applications, we will experience a revolution in the video surveillance market. Until that happens, I will keep using non-visual systems for outdoor detection.
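As an aside, the "mass of pixels crossing a line" logic described above really is that simple at its core: a tripwire alarm is a segment-intersection test between an object's frame-to-frame movement and the drawn line. A minimal sketch with made-up coordinates:

```python
# Classic tripwire: alarm when the tracked object's movement between two
# frames properly crosses the drawn line (orientation-based segment test).
def _orient(a, b, c):
    # >0 counter-clockwise, <0 clockwise, 0 collinear
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def crosses(prev_pos, cur_pos, line_a, line_b):
    d1 = _orient(line_a, line_b, prev_pos)
    d2 = _orient(line_a, line_b, cur_pos)
    d3 = _orient(prev_pos, cur_pos, line_a)
    d4 = _orient(prev_pos, cur_pos, line_b)
    return d1 * d2 < 0 and d3 * d4 < 0  # endpoints straddle each other

tripwire = ((0, 300), (640, 300))
print(crosses((320, 350), (320, 250), *tripwire))  # True: crossed the wire
print(crosses((320, 350), (330, 340), *tripwire))  # False: same side
```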

I have a 50-acre campus I manage. We're currently thinking of using Perimeter Defender. I was wondering how well it performs today, a year later?

How about the other available perimeter VCA options on Axis ACAP? There are quite a few of them besides Perimeter Defender, for example Agent Vi. Is there any experience with these other options, and how well they perform against some of the criteria in these tests?

UM4, we have not tested Agent Vi yet but I will queue it up internally for testing.

The only other options we have tested are VMD4 and Guard Suite, which did not perform as well as Perimeter Defender, as we noted in the report.

That sounds great, thanks Rob!

If I may, here's a wish list:

AgentVi, savVi (listed on the AgentVi website, but no longer in the ACAP portal; wonder why)

Digital Barriers, SafeZone-edge (what Perimeter Defender is based on; is it better in its original form?)

Jemez Technology, Eagle-i-Edge (I heard Axis had this on display for testing in one of their experience centers a while back)

Two other criteria I might submit are

1. VCA + Thermal camera performance (a common combination at perimeters)

2. The option for "burning in" the metadata bounding boxes into the image itself before it leaves the camera. This is bad practice in some circles, but it does ensure that the alarm and bounding boxes show up at the monitoring station (regardless of PSIM) and that they are in sync; this is still limited.
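On item 2, the burn-in step itself is simple once per-frame metadata is available; the hard part is getting it done on-camera before encoding. A minimal off-camera sketch, assuming detections arrive as normalized (x, y, w, h) boxes; the RTSP URL and box values are placeholders:

```python
# Draw analytic bounding boxes directly into the pixels, so any
# downstream VMS/PSIM shows them in sync with the video.
import cv2

def burn_in(frame, boxes, label="intrusion"):
    h, w = frame.shape[:2]
    for nx, ny, nw, nh in boxes:  # normalized box -> pixel corners
        x1, y1 = int(nx * w), int(ny * h)
        x2, y2 = int((nx + nw) * w), int((ny + nh) * h)
        cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 0, 255), 2)
        cv2.putText(frame, label, (x1, max(y1 - 6, 12)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 255), 1)
    return frame

cap = cv2.VideoCapture("rtsp://camera/stream")  # placeholder URL
ok, frame = cap.read()
if ok:
    burn_in(frame, [(0.4, 0.5, 0.1, 0.3)])  # one made-up detection
    cv2.imwrite("burned_in.jpg", frame)
```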
