Camera Analytics Tested: Axis vs. Bosch vs. Sony

Author: Ethan Ace, Published on Sep 17, 2013

With the move to IP and HD now commonplace, companies are looking for new ways to compete. While analytics have long disappointed, many still hold out hope that they will be the next big thing.

Many major camera manufacturers offer analytics built-in or for a minimal additional charge. However, skepticism of their performance is commonplace.

The Test

In this report, we tested 3 camera manufacturers' analytics - Axis' cross line detection, Bosch's Intelligent Video Analytics (IVA) and Sony's Video Motion Filter (VMF).

The test focuses on tripwire / perimeter violation analytics, as this is the most common type used in security to alert against intruders entering a restricted area.
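
For readers unfamiliar with how these rules work, a tripwire check generally reduces to testing whether a tracked object's movement between two frames crosses a user-drawn line, optionally filtered by direction. The sketch below is a generic illustration of that idea only; it is not Axis', Bosch's or Sony's actual implementation, and the centroid-based tracking and function names are our own assumptions.

```python
# Generic sketch of a tripwire rule: alarm when the segment from an object's
# previous centroid to its current centroid crosses a user-drawn line.
# Illustration only - not any manufacturer's actual algorithm.

def _cross(o, a, b):
    """2D cross product of vectors OA and OB (positive = counter-clockwise turn)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def segments_intersect(p1, p2, q1, q2):
    """True if segment p1-p2 properly crosses segment q1-q2 (collinear edge cases ignored)."""
    d1 = _cross(q1, q2, p1)
    d2 = _cross(q1, q2, p2)
    d3 = _cross(p1, p2, q1)
    d4 = _cross(p1, p2, q2)
    return d1 * d2 < 0 and d3 * d4 < 0

def tripwire_violation(prev_pos, curr_pos, line_start, line_end):
    """Alarm if the object's movement this frame crosses the tripwire.
    A direction filter ('left-to-right only') could additionally check the
    sign of _cross(line_start, line_end, curr_pos)."""
    return segments_intersect(prev_pos, curr_pos, line_start, line_end)

# Example: an object moving left to right across a vertical tripwire at x = 5.
print(tripwire_violation((3, 4), (7, 4), (5, 0), (5, 10)))  # True
```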

We tested in 3 different scenes.

To start, we did a simple indoor tripwire at an entrance:

Then we progressed to an outdoor scene of moderate complexity and distance:

Finally, we tried a very long area with vegetation and shadows to maximize the challenge:

Comments (19)

Do NOT ask about testing other analytics here. All recommendations and requests should go into this separate dedicated thread.

We do plan to test many other analytics - both from 'regular' camera manufacturers as well as video analytics specialists (VideoIQ, AgentVI, etc.). In addition, we plan to test other aspects of Bosch's IVA (like color detection) in a separate report.

What VMS was used to test these camera-based triggers?

Your summary of competitive performance sums up one of the common challenges of video analytics: striking the right balance between probability of detection and low false alarm rate.

While the analysis is great, I think you could improve it by having some quantitative measures of comparative performance that factors in both probability of detection and the rate of false alarms.

The UK government i-LIDS scheme uses a so-called 'F1' metric that gives a single result factoring in both of these issues. Full details on p32-33 of this doc.

Also important is ensuring you are using a repeatable scene so the comparison between different manufacturers is meaningful. This can be awkward when using cameras rather than encoders, but we've had some luck pointing systems at a monitor under a black cloth.

My personal opinion is that systems need to have an F1 score of > .95 for them to be genuinely useful in an event detection scenario.
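
For readers who have not run across it, the F1 score mentioned above is just the harmonic mean of precision (the share of alarms that were real events) and recall (the share of real events that raised an alarm). A minimal sketch of the calculation, using made-up counts rather than numbers from this test:

```python
# Minimal sketch of the F1 calculation referenced above.
# The counts in the example are invented for illustration.

def f1_score(true_positives, false_positives, false_negatives):
    precision = true_positives / (true_positives + false_positives)  # alarms that were real
    recall = true_positives / (true_positives + false_negatives)     # real events caught
    return 2 * precision * recall / (precision + recall)

# e.g. 48 intrusions detected, 2 missed, 3 false alarms:
print(f1_score(48, 3, 2))  # ~0.95, right at the suggested threshold
```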

Ben, thanks for the feedback on quantitative metrics and repeatable scenes.

As a general rule, IPVM avoids quantitative metrics as they are too reductionistic and prone to vendor misuse. If I say Camera A scored a .96, Camera B scored a .95 and Camera C scored a .92, what does that really mean? Hopefully, it means Camera A is the best, but in all situations? And by how much? Also, if we did quantitative metrics/rankings, the temptation for manufacturers to misuse this in promotions would be through the roof.

As for repeatability, we test all cameras at the exact same time and place in real world scenes. The Axis, Bosch and Sony cameras are set inches from each other on a tripod and taken to each location. Because of that, we know that the cameras are all looking at and dealing with the exact same conditions.

John, I think IPVM’s tests lack ‘good scientific method’. Firstly, IPVM’s tests are not reproducible and so you can never do the same test again and you can’t compare the results of a new camera with those already tested. Secondly, the range of scenarios tested is very small. Analytics has to work 24/7, 52 weeks a year whereas IPVM tests with whatever weather is happening on the day of the test. I have seen many examples of analytics that worked well for a while only to develop a serious fault later on due to an environmental change. IPVM should be looking for these sorts of problems. The solution is for IPVM to build up a library of test videos that are known to present problems and test the cameras in a lab environment using a wide range of test recordings. The i-LIDS test library that Ben White mentioned has over 100 test clips for just one scenario and it includes many examples of things that could cause false alarms – here are just a few of them: wind shaking the camera, small animals in the detection area, a defective fluorescent bulb flickering on and off at night, heavy rain, rapid sunlight change due to passing cloud. IPVM provides a valuable service and has rightly gained a lot of influence and respect, but with this comes a responsibility to do the tests well.

How are we going to test IP cameras with recorded video clips - point the cameras at a monitor? I await your solution to this that does not introduce its own major flaws / problems.

A major part of any 'smart' IP camera's performance is how it captures video - low light, WDR, bright lights, etc. Using recorded video eliminates key real world differentiators of IP cameras.

Today's test is of free / low cost add-ons to 'regular' IP cameras. It's the beginning of a new series of test coverage on analytics. We will test more conditions, but we start with simpler ones and build up to see how far an analytic can go.

Great report, folks, and one I've been looking forward to for a while.

Hi John

That's the current big limit of embedded analytics based on ... pixel motion in 2D.

Nobody in real life is using it in outdoor environments on a large FOV. This should be kept for "easy" indoor short range detection. Same for most "pixel" autotracking ... when rain, trees, shadows, bugs, sunbeams and reflections generate too many false positives day/night and eventually a real event gets missed, it's time to move to real analytics.

By the way, some camera systems can now be set with a day or night detection area to get better results (like shutter/WDR/auto iris specific profiles at night when you want to increase the LPR success rate).

So a thinner detection zone, due to the narrower IR angle at night, or less sensitive settings due to gain and noise at night ...
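
As a rough illustration of that last point, the idea is simply to swap detection parameters (zone size, sensitivity) on a day/night schedule or state. The field names and values below are invented for the sketch and do not correspond to any specific camera's configuration API:

```python
# Rough sketch of day/night detection profiles (invented values and field
# names - not any camera's actual configuration interface).
from datetime import time

PROFILES = {
    # Wider zone, normal sensitivity in daylight.
    "day":   {"zone_width_pct": 100, "sensitivity": 0.7},
    # Narrower zone (matching the tighter IR angle) and lower sensitivity
    # to ride out gain/noise at night.
    "night": {"zone_width_pct": 60, "sensitivity": 0.4},
}

def active_profile(now, sunrise=time(6, 30), sunset=time(19, 0)):
    """Pick the detection profile by time of day. A real system might key off
    the camera's own day/night switch instead of a fixed schedule."""
    return PROFILES["day" if sunrise <= now < sunset else "night"]

print(active_profile(time(23, 15)))  # -> the night profile
```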

"Nobody in real life is using it in outdoor environement on large FOV."

We (VideoIQ) have embedded edge analytics, and frequently cover outdoor areas of 1/2 acre or more with a single camera. Yes, it's far more advanced than what you get for free (or $30), but the technology *is* there.

Good Morning John, Insightful article as always. It appears that all of these cameras were tested in excellent Hawaiian weather. Rain, as we experience in Florida, will impact these results. It would be great to see a comparison of these manufacturers in bad weather.

Jeff,

I did not author this report. You will see that Ethan's name is under the byline, not mine.

This report was done in Pennsylvania by Ethan, Figen and Derek. We no longer do any testing in Hawaii but the folks in Pennsylvania will be happy to know that their state looks like Hawaii :)

Rain is a good consideration (snow as well). We will do more environmental tests in the fall (rain) and then winter (snow).

Thanks for the feedback.

John

John,

Yes, that was pointed out to me earlier by an astute client and friend of mine. Having said that, based upon our experience in wet weather conditions, only two products have worked favorably: VideoIQ and IOI Image.

Those are 'real' / specialist analytics. As we continue our shootouts, we'll move 'up' and test Axis and Bosch against VideoIQ, Agent VI, etc.

Agreed.

You might then add a "type of alarm and metadata" part:

What type of trigger can I generate (general or by rule), can my VMS manage it, and can I use advanced search in my VMS/NVR thanks to the collected metadata?

(Metadata standardization isn't done yet ...)

Sea (see) Sun and ....Video, no test during hurricane :-)

Brian, you misunderstood me. I'm talking about motion detection. Some people call it analytics... but for me it's improved motion detection, and this does not work outdoors, especially at night.

When you can set up a depth of view (so a 3D FOV), detect a shape, a speed, eventually an object category, and filter snow / rain / shadows and tree movement out of the scene, it's obviously more of an analytic in my mind.

Sure, there are some better systems on the market, but we are talking about the three tested here.

"Analytics" sounds great but is applied to too many low end, cheap engines. (I'm doing analytics training here and teach people how to test outdoor conditions in the lab ...)

Marc -

Ah, I see what you mean. Yes, I agree that these things tested here are not what many people would consider "video analytics" in the current state of the industry.

In my training classes, I often point to "cross line detection" as a very poor approach, especially outdoors, though it seems to be the most common example.

When you really dig into it, "cross line detection" seems to be a way to try and filter out the majority of false alarms. Since these systems cannot ever truly classify an object, they try to filter the motion in the scene down to the point that they only look for "large" size objects, moving across a defined boundary line. That essentially distills the motion occurring on hundreds of thousands, or millions, of pixels down to a very small group. Then they look just at pixels moving in a given direction. The problem (IMO) is that you also have a VERY narrow part of the FOV where you are looking for activity. If anything in the scene prevents the cameras from "seeing" an object as it crosses the line (poor lighting, poor contrast, light rain or fog, another object (parked car, etc.) blocking part of the perimeter), you have a high risk that you're going to miss valid events.

We usually recommend a Region of Interest (ROI) that covers the *entire* secured area, like a large parking lot. Then we look for any person or vehicle activity in that area when the system is armed. In an ideal case, this works exactly like the line cross analytics: as soon as a person steps over the line onto the secure property, an alarm is triggered. But for the cases where the person entering the property could not be seen and classified immediately (as described above), you might get an alarm 10 seconds "late", when they're already 50ft into the secure area. This is far better than no alarm at all, because they managed to miss being detected while going over the very narrow area of the cross-line detection pixels.
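
To make the contrast concrete, an ROI rule only has to ask whether a classified object is anywhere inside the secured polygon, not whether it was visible at the exact moment it crossed a line. Below is a generic point-in-polygon sketch, an illustration of the concept rather than VideoIQ's actual implementation:

```python
# Generic sketch of a region-of-interest check: alarm when a tracked object's
# position falls anywhere inside the secured polygon (ray-casting test).
# Illustration only - not any vendor's actual implementation.

def point_in_polygon(point, polygon):
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray extending to the right of the point.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A rectangular parking lot; the intruder is picked up "late" but still inside.
lot = [(0, 0), (100, 0), (100, 60), (0, 60)]
print(point_in_polygon((50, 30), lot))   # True -> alarm
print(point_in_polygon((120, 30), lot))  # False -> outside the secured area
```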

So, good, we are talking about the same things!
