We use this (well, what it really is: SafeZone-edge from Digital Barriers ;) ). It's usually spot on. The bush detection shouldn't have been an issue if the zone was set correctly and drawn to avoid the area, IMHO.
You can easily do a "lite" install to most VMSes by sending TCP notifications into the VMS. The more complicated full version is more work, but also quite useful on larger installations.
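As a rough illustration of what such a "lite" TCP integration can look like: many VMSes expose a generic TCP event listener that accepts plain-text alarm lines. The host, port, and message format below are hypothetical placeholders; each VMS defines its own input syntax.

```python
import socket

def send_alarm(host: str, port: int, message: str) -> None:
    """Open a TCP connection and send a single plain-text alarm line."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall((message + "\r\n").encode("ascii"))

# Hypothetical example: the analytic fires and notifies the VMS's
# generic TCP event listener (address and message format are placeholders).
# send_alarm("192.0.2.10", 5000, "EVENT:PERIMETER_BREACH;ZONE:1")
```

The appeal of this approach is that it needs no VMS-side plugin at all, just a configured TCP event source, which is why it works across so many platforms.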
The only issue we get is reflections: if you have a large vehicle near a road, the analytics will see a mirror image of it and trigger, and the same goes for people walking past. That is something any analytic will suffer from with poor positioning.
The bush detection shouldn't have been an issue if the zone was set correctly and drawn to avoid the area, IMHO.
I'll let Ethan / Rob talk about the specific test results with 'bush detection', but I want to emphasize a fundamental point.
If an analytic system cannot tell the difference between leaves and a human, that's a major constraint. There are simply too many areas and applications where leaves and trees will be an issue. It also shows the analytics 'intelligence' is quite rudimentary.
We removed the brush from the zone during testing, which improved but did not eliminate false alerts. We still ended up with ~1 per day, as the bush encroached too near a zone drawn strictly on the ground.
Yes, it's true that you can draw the zone further still from the brush, but in real-world scenes this is not always possible: it may narrow a zone too much and risk missed detections, or brush may simply sit in the middle of a zone.
Moreover, others such as Bosch IVA and Avigilon (test coming) did not trigger on these same areas.
I have a question regarding the methodology of the testing. In full disclosure, I am a field guy with one of the manufacturers tested.
In all of the tests so far, the cameras have been set to a wide field of view. I suspect that towards the back of the field of view the stated capabilities of some or all of the analytics are exceeded. My guess would be that you are attempting to gauge the ultimate detection performance (ppf) on a level playing field? I’m wondering how you weigh false detections outside of the design envelope of the analytics? I don’t know if there were any occurrences with any of the manufacturers, but I was wondering if it was a consideration.
The false alerts we have seen in this as well as other tests on brush have almost all been triggered by brush in the near field of view, not at long ranges. Additionally, we have tested all the analytics in this series in both wide and narrow fields of view to check differences, and performance has been fairly consistent in both wide and narrow AOVs in terms of false alerts. Meaning, if a camera has issues with brush using wide fields of view, it also had issues in narrower fields of view.
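For reference, the ppf (pixels per foot) mentioned above can be estimated from a camera's horizontal resolution, angle of view, and target distance with basic trigonometry. A minimal sketch; the formula is standard, but the specific numbers below are only illustrative:

```python
import math

def ppf(horizontal_pixels: int, hfov_degrees: float, distance_ft: float) -> float:
    """Pixels per foot available on a target at `distance_ft`, given the
    camera's horizontal pixel count and horizontal field of view."""
    # Width of the scene covered at that distance, from the HFOV triangle.
    scene_width_ft = 2 * distance_ft * math.tan(math.radians(hfov_degrees) / 2)
    return horizontal_pixels / scene_width_ft

# Example: a 1920-pixel-wide camera with a 90 degree HFOV at 100 ft
# covers a ~200 ft wide scene, i.e. roughly 10 ppf. Widening the view
# or pushing targets deeper into the scene quickly drops ppf below
# the minimums many analytics specify.
```

This is why the back of a wide field of view can exceed an analytic's stated capabilities, as the commenter suggests.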
Also worth noting, we have discussed our findings with manufacturers as we've been testing these analytics, both tech support and product management where possible. We have adjusted settings and retested based on that feedback as well. So manufacturers have had the opportunity to point out whether we were testing outside of their capabilities.
"Axis Perimeter Defender test results were problematic for 'high security'"... I think any IVA will be problematic for outdoor applications if visual cameras are used, because there are smoke, dust, illumination issues, rain, etc., a lot of potential trouble that will inevitably produce false alerts. Besides that, if a simple bug lands on the lens, your detection system will fail... In my opinion, a system with this fragility cannot be used in a 'high security' application.
In my projects, I avoid using IVA outdoors and usually use auxiliary systems to detect accurately.
Good point Rafael. In addition, we have seen Axis AVMD 4 completely miss a person walking across a parking lot on a few occasions. We tested it side by side in a real environment (office parking, docks), and it missed too many people to use effectively, although it did remove a ton of non-critical alerts. Missing the few real people is a big deal, especially if you're not doing secondary full-time recording.
Let me make my comment clear. I was not criticizing Axis specifically, but the technology of video analysis (regardless of manufacturer). I don't think it is a good technology for outdoor applications in 'high security' systems using visual cameras. There are several non-visual systems that are much more reliable for this.
I don't think it is a good technology for outdoor applications in 'high security' systems using visual cameras.
I agree entirely. The problem is that software creators, vendors, and even trade magazines are all over how wonderful video analytics are. This means that when I tell our director and GSOC leadership that I do not recommend analytics, because they are not sufficient to replace live monitoring and onsite security for critical locations, I get a very agitated "so I guess xyz vendor, videos online, and the news articles I read are all lying about its effectiveness?"
As I wrote in a comment on this article, the most embarrassing situation is when I need to disagree with a customer's mistaken perception of video analytics' effectiveness. The exaggerated marketing (sometimes almost unreal) creates in the customer's mind an unattainable level of accuracy, and the installer needs to achieve it... so it is an industry of frustrated customers.

I'm sure that IVA technology has huge potential for video surveillance applications, and there are some great applications right now, mainly in retail (which can tolerate a bit less accuracy) and in indoor controlled environments. But I don't feel comfortable using it for outdoor security applications yet. I think when deep learning video analysis, like Camio and BriefCam, which can really 'understand' what is happening rather than just see a mass of pixels crossing a line or entering a forbidden area, becomes viable for real-time applications, we will experience a revolution in the video surveillance market. But until that happens, I keep using non-visual systems for outdoor detection.
How about the other perimeter VCA options available on Axis ACAP? There are quite a few of them besides Perimeter Defender, for example Agent Vi. Is there any experience with these other options, and how well they perform against some of the criteria in these tests?
AgentVi, savVi (listed on the AgentVi website, but no longer in the ACAP portal, wonder why)
Digital Barriers, SafeZone-edge (what Perimeter Defender is based on, is it better in original form?)
Jemez Technology, Eagle-i-Edge (I heard Axis was testing this on display in one of their experience centers a while back)
Two other criteria I might submit are:
1. VCA + Thermal camera performance (a common combination at perimeters)
2. The option for “burning in” the metadata bounding boxes into the image itself before leaving the camera. This is considered bad practice in some circles, but it does ensure that the alarm and bounding boxes show up at the monitoring station (regardless of PSIM) and that they are in sync; this is still limited.
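As a toy illustration of the "burn-in" idea, drawing the detection box into the pixel data itself so it survives any downstream encoding or PSIM handoff. This is a pure-Python sketch on a 2D grayscale frame; a real camera would do this in its video pipeline, and the coordinates here are made up.

```python
def burn_in_box(frame, x, y, w, h, value=255):
    """Draw a 1-pixel rectangle outline directly into a 2D grayscale
    frame (list of row lists), modifying the image pixels themselves."""
    for col in range(x, x + w):
        frame[y][col] = value          # top edge
        frame[y + h - 1][col] = value  # bottom edge
    for row in range(y, y + h):
        frame[row][x] = value          # left edge
        frame[row][x + w - 1] = value  # right edge
    return frame

# Hypothetical example: mark a detection at (2, 1), 4 px wide, 3 px tall,
# in a tiny 8x8 black frame before it leaves the "camera".
frame = [[0] * 8 for _ in range(8)]
burn_in_box(frame, 2, 1, 4, 3)
```

The trade-off the comment alludes to: once burned in, the box cannot be toggled off or restyled at the monitoring station, which is exactly why some circles call it bad practice.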