Camera Analytics Tested: Axis vs. Bosch vs. Sony

Published Sep 17, 2013 04:00 AM

**** *** **** ** ** *** HD *** ***********, ********* *** ******* for *** **** ** *******. ***** analytics *** **** ************, **** ***** hold *** **** *** ** ***** the**** *** *****.

**** ***** ****** ************* ***** ********* built-in ** *** * ******* ********** charge. *******, ********** ** ***** *********** is ***********.

The ****

** **** ******, ** ****** * camera ************* ********* -**** ***** **** *********, *****'* *********** ***** ********* [**** no ****** *********] *** ****'* ***** Motion ****** (***).

*** **** ******* ** ******** / perimeter ********* ********* ** **** ** most ****** **** **** ** ******** to ***** ******* ********* ******** **** a ********** ****.

** ****** ** * ********* ******.

** *****, ** *** * ****** indoor ******** ** ** ********:

IPVM Image

**** ** ********** ** ** ******* scene ** ******** ********** *** ********:

IPVM Image

*******, ** ***** * **** **** area **** ********** *** ******* ** maximize *** *********:

IPVM Image

Key ********

**** *** *** *** ******** **** these *****:

******* **************

  • ***** **** ********* ******** **** **** the ***** ** ********* ** ** high ** ********. **** ******* ***** or ***** *********** *** ** ******* or ******* ***********.
  • ***** ***** ******* **** **** ********** for *******-** ********, *** ******-***, **** the ****** ****** ****** *** ******.
  • ***** ****** **** ********** ******* **** as ****** ******* *** ********** **** possible, *** *** ******. ***** ****** be ***** ** ***** **** ***** where ********.
  • *** ** *** ***** ******** **** that ******* ****** ********* *** *** always *********** *********** *** ***** **** *********.

*********** ***********

  • **** ***** ***** **** ********* ******** objects ** **** ****** ** **** scenes ***** ******* ******** **** ****** Bosch ***-**** ** **** ***-*****. *******, it ******** **** ***** ***********, ****** by ******* ** ******* ***** ** incidence, *** ***** ********* ******** *****.
  • ***** ***-****'* *** ********* ******** ** well *** ******, *** ****** **** either **** ***-***** ** **** ***** , **** ** ***** ** ******** activations. *******, ** *** ***** ********* range *** *********** *******, *** ****** activations *********.
  • **** ***-*****'* *****-** *** ****** ** our *****, **** **** ** ******** detect ****** ** **** ***** *****. Vehicles **** ********** ** ****** *********, though ***** ** ***** *** ****. This ** *** ** ***** ** part ** ****** **** ******** ** Sony ***-*****'* ********* *** ******** ** other *******, ***** ****** ** ****** objects ** *** ** *****, *** detected ****** ******* ** ******** ** set *** *****, ******* **********.

*****

**** ** *** ***** ** ********* in **** **** **** ** ************:

  • **** ***** ***** **** *********:~$** ***, **** ********* ** **-***** ** licenses.
  • ***** ***-**** ***: *** ****** ******* ship **** *** ********* ** ** additional ******. *** *** *** ****** models *** ** ******** ** ** additional **** ** ~$***.
  • **** ***-***** ***: **** ***-*****'* *****-** analytics *** ********* ** ** ********** cost ** *** ********** *******. ***** generation ******* ********** ******* **** **** analytics.

***************

***** **** ********* *** ***** **** useful **********, *** **** ** ******* carefully, ** ******** ********* *** *** of ******* ******* *** *** ****** be *****. ***** ****** ********* ******** these ******* *** **** ********* ** place ****** ******* **** **** ** the *****.

** **** ******, **** *** ********* of **** *** *****, ***** ***-****'* IVA *** **** ********, **** ****** no ***** ** ****** ***********. ***** should ** ***** ** *** *** light *********** ****** *********, *******.

**** ***** ***** **** ********* *** reliable ** **** ******, *** ********* better ** **** *** ***** **** Bosch ***-****. *******, ***** ****** ** aware ** *** ***** *********** *** to **********, *******, *** ****** *******, before *********.

** ** *** ********* **** ***-*****'* VMF ********* ** ****** ***** **** well *** ****** ***** *** ** its ****** ********* *******.

Configuration

** ***** ***********, ** ****** ************* for **** ** *** ***** ************* tested.

*****, **** ***** ***** **** *********. Configuration ** **** ******, **** ********* the **** (* ** * ********) and********** *********, **** ** ***** *****.

****, ***** ***-**** ***. *** ***-**** offered *** **** ************* *******, ********* multiple *****, **** ****** *******, ****, ********, *** ***** **********. It **** ********* *********** ********* ******* detected *** ***** **** ******* *** scene, ****** *************** ****** ******** ** others ***** *** *** ******* ****.

*******, **** ***-*****'* *** ***** ******** adjustment *** ****** **** *** *****, as **** ** ******** ******** **** as **** ****** *** *********. ** to ***** ***** *** ** *** per ******, *** ******** **** ******. Users ****** ** ******* **** ************* object ****, ** ** *** *********** affect ********* ***********.

Outdoor ************

** ****** *** ***** **** ********* in ******* ****** ******* *********.

********/******* ***

*****, ** ***** **** * ******* lot/driveway ***** ** ****** ** ** 250', ***** ** **** ******** ****:

IPVM Image

****** *** ***, ********* ** **', all ******* *** ********.

IPVM Image

****** ***** ******* **** *** ******* lot ** ~***', **** ***** *** Bosch ***-**** **** ***** ******* ******** on *** *******. ** **** ******** and ******, **** ***** ************************** ** **** ** ***** **********, as *** ******* ********** *** ******* the ****. ***** ***-**** ********* **** once. *** ***** *** ****** ** reliably ****** ******* ** **** *****, so *** *** *******.

IPVM Image

*******, **** *** **** ******* *** the *** ** *** ******** ** the *** **** *** ****, **** Axis ***** *** ***** ***-**** **** still ********, ****** **** ***** ***** activated ******** ***** ** ** *** at ***'.

IPVM Image

** *****, ** ***** *****, **** P3354 *** ***** ***-**** **** *******, while **** ***-***** **** ***. ***** NBN-733V ********* ****** **** *** ******* was ****** **** **** *** ******, not ******, ** ** ******** *** object **** **** ****** ** *** foreground.

IPVM Image

** ~***' *** ******, **** **** P3354 ********. **** *********** **** ******** early *** ** *** ********** ** headlights ****** ****** *** ******.

IPVM Image

** **** ********, ~***', **** ***** still ********. *******, ********** ***** ****** were ****** ** **** ******* ** the ******* ****** *** ******* ***. Moving *** **** **** **** *** road ******* ******** ****.

IPVM Image

**** ***** ********

** **** ****** *** ******* ** the **** ** *** ******, ******* a ******* ******** ****** *** ********** at *** *********, ******* ****** *********** for ***** **** *********.

IPVM Image

** ***', **** **** ***** *** Bosch ***-**** *********, ***** **** ***-***** did *** (*** *** ** ** any ****** ******):

IPVM Image

** ***', ***** ***-**** ********* ** track *** ***** ** *** *******, while **** ***** *** ****** **:

IPVM Image

** ***', **** ** *** ******* triggered, **** *** ******* ****** *** small ** ********:

IPVM Image

** *****, **** *** ******* ******** on, ************* *-* ***, **** **** P3354 *** ***** ***-**** ********* ******** at **' *****.

IPVM Image

** **', ***** ***-**** ***** *********, while **** ***** *** ***. ****** this, ** ******* **** **** ** detect *** *******.

IPVM Image

********* *** *** ******* ******** ***** light ****** ***** * ***. ********* range ** ***** ***-**** ***** ** only **', ***** ****.

IPVM Image

**** ***** *** **** ** ****** our ******* ******** ** ** **', however.

IPVM Image

Indoor ************

*** *** *********** ***** ***** ********************* ** ********* ** ******** ******** a *******. ** *** ** *** cameras ** ***** ** *** ****** doors ** ******** ****, ******* *** tripwire ** *** ****** ** *** door, **** *** ******* ***** **' away. ** **** *****, **** ***** and **** ***-***** ****** ********, *** Bosch ***-**** *** *** *******:

IPVM Image

**** ** *** ** *** *** Bosch ***-**** ****** *******, **** *** object **** ***** ******, ******* *** subject's ***** *** *****. ** *** above *****, *** *******'* **** ****** never ******* *** ****. ****** ** partway ** *** *******, ***** ***-**** triggers ********, ** **** **** ***** . *******, ********* ** *** ***** installation ********, **** **** ******** *** cause ***** ******** ** ****** ****** outside *** ****. ***** ****** ** careful ** ***** ****.

IPVM Image

********** ****

******* *** * ****** *** *** cross **** *********, ** ********* **** tests ** *** ********** ********** **** in ***** ** *********** **** ****** common ********** *** *****.

*****, **** ***** ***** **** ***** Cross **** *********'* *********** ** *******. In **** ****, ****** *** **** was *** ** ******* **** **** the ******* ***** ****** *** ******, the ****** ** *** ****** ** the ***** ****** ***** *********** ** he ***** ****** *** **** ** the ******** *********:

IPVM Image

**** ******** ** ** ***** ****** the ****, ** ****.

IPVM Image

** **** ****, *** **** *** drawn ** ******* ** *** ******* moved *** ** *** **** ** the ***** *** ** *** ****. However, *** ***** ******* ****** ** activate ** ** ***** ******** *** of *** **** ************* ** *** cameras.

IPVM Image

********* *** **** ******** ** ** moved ** **** ** ** ***** or ****** *** ******.

IPVM Image

*******, ***** *** ****** ****** ***** a ****** ******** **** **** ** the ****, ** *** ******* ******* no ****** **** ***** **** **** in *** ****.

Test **********

** **** *** ********* ******* *** firmware ******** *** **** ****:

  • **** *****, ******** *.**.**
  • ***** ***-****, ******** *.** ***** **
  • **** ***-*****, ******** *.*.*

*** ******* **** ******* ********, **** exposures ************ ** */*** *******.

Comments (19)
JH
John Honovich
Sep 17, 2013
IPVM

Do NOT ask about testing other analytics here. All recommendations and requests should go into this separate dedicated thread.

We do plan to test many other analytics - both from 'regular' camera manufacturers as well as from video analytics specialists (VideoIQ, AgentVI, etc.). In addition, we plan to test other aspects of Bosch's IVA (like color detection) in a separate report.

MM
Mike McCann
Sep 17, 2013

What VMS was used to test these camera-based triggers?

BW
Ben White
Sep 17, 2013

Your summary of competitive performance sums up one of the common challenges of video analytics: striking the right balance between probability of detection and low false alarm rate.

While the analysis is great, I think you could improve it by having some quantitative measures of comparative performance that factors in both probability of detection and the rate of false alarms.

The UK government i-LIDS scheme uses a so-called 'F1' metric that gives a single result factoring in both of these issues. Full details on p32-33 of this doc.

Also, important is to ensure you are using a repeatable scene so the comparison between different manufacturers is meaningful. This can be awkward when using cameras and not encoders but we've had some luck by pointing systems at a monitor under a black cloth.

My personal opinion is that systems need to have an F1 score of > .95 for them to be genuinely useful in an event detection scenario.
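For readers unfamiliar with the metric Ben mentions, here is a minimal sketch of the standard F1 computation (the i-LIDS scheme may weight its components somewhat differently; see the document Ben references):

```python
def f1_score(true_positives, false_positives, false_negatives):
    """Single-number balance of detection rate vs. false alarm rate.

    precision: fraction of alarms that were real events
    recall:    fraction of real events that raised an alarm
    """
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return 2 * precision * recall / (precision + recall)

# Illustrative numbers: 95 of 100 real events detected, 3 false alarms
print(round(f1_score(95, 3, 5), 3))  # → 0.96
```

By Ben's .95 threshold, a system would need both a high detection rate and very few false alarms, since a weakness in either term drags the harmonic mean down quickly.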

JH
John Honovich
Sep 17, 2013
IPVM

Ben, thanks for the feedback on quantitative metrics and repeatable scenes.

As a general rule, IPVM avoids quantitative metrics as they are too reductionist and prone to vendor misuse. If I say Camera A scored a .96, Camera B a .95, and Camera C a .92, what does that really mean? Hopefully it means Camera A is the best, but in all situations? And by how much? Also, if we did quantitative metrics/rankings, the temptation for manufacturers to misuse them in promotions would be through the roof.

As for repeatability, we test all cameras at the exact same time and place in real world scenes. The Axis, Bosch and Sony cameras are set inches from each other on a tripod and taken to each location. Because of that, we know that the cameras are all looking at and dealing with the exact same conditions.

GT
Geoff Thiel
Sep 17, 2013

John, I think IPVM’s tests lack ‘good scientific method’. First, IPVM’s tests are not reproducible, so you can never run the same test again, and you can’t compare the results of a new camera with those already tested. Second, the range of scenarios tested is very small. Analytics has to work 24/7, 52 weeks a year, whereas IPVM tests with whatever weather is happening on the day of the test. I have seen many examples of analytics that worked well for a while only to develop a serious fault later due to an environmental change. IPVM should be looking for these sorts of problems.

The solution is for IPVM to build up a library of test videos that are known to present problems and test the cameras in a lab environment using a wide range of test recordings. The i-LIDS test library that Ben White mentioned has over 100 test clips for just one scenario, and it includes many examples of things that could cause false alarms. Here are just a few of them: wind shaking the camera, small animals in the detection area, a defective fluorescent bulb flickering on and off at night, heavy rain, rapid sunlight change due to passing cloud.

IPVM provides a valuable service and has rightly gained a lot of influence and respect, but with this comes a responsibility to do the tests well.

JH
John Honovich
Sep 17, 2013
IPVM

How are we going to test IP cameras with recorded video clips - point the cameras at a monitor? I await your solution to this that does not introduce its own major flaws / problems.

A major part of any 'smart' IP camera's performance is how it captures video - low light, WDR, bright lights, etc. Using recorded video eliminates key real-world differentiators between IP cameras.

Today's test is of free / low-cost add-ons to 'regular' IP cameras. It's the beginning of a new series of test coverage on analytics. We will test more conditions, but we start with simpler ones and build up to see how far an analytic can go.

Avatar
Luis Carmona
Sep 17, 2013
Geutebruck USA • IPVMU Certified

Great report, folks, and one I've been looking forward to for a while.

Avatar
Marc Pichaud
Sep 17, 2013

Hi John

That's the current big limit of embedded analytics based on ... pixel motion in 2D.

Nobody in real life is using it in an outdoor environment on a large FOV. This should be kept for "easy" indoor short-range detection. Same for most "pixel" autotracking ... when rain, trees, shadows, bugs, sunbeams, and reflections generate too many false positives day/night, and it eventually misses a real event, it's time to move to real analytics.

By the way, some camera systems can now be set with day or night detection areas to get better results (like shutter/WDR/auto-iris specific profiles at night when you want to increase LPR success rates).

So a thinner detection zone, due to the thinner IR angle at night, or less sensitive settings due to gain and noise at night ...

Avatar
Brian Karas
Sep 17, 2013
Pelican Zero

"Nobody in real life is using it in outdoor environement on large FOV."

We (VideoIQ) have embedded edge analytics, and frequently cover outdoor areas of 1/2 acre or more with a single camera. Yes, it's far more advanced than what you get for free (or $30), but the technology *is* there.

Avatar
Jeffrey Nunberg
Sep 17, 2013

Good morning, John. Insightful article as always. It appears that all of these cameras were tested in excellent Hawaiian weather. Rain, as we experience in Florida, will impact these results. It would be great to see a comparison of these manufacturers in bad weather.

JH
John Honovich
Sep 17, 2013
IPVM

Jeff,

I did not author this report. You will see that Ethan's name is under the byline, not mine.

This report was done in Pennsylvania by Ethan, Figen and Derek. We no longer do any testing in Hawaii but the folks in Pennsylvania will be happy to know that their state looks like Hawaii :)

Rain is a good consideration (snow as well). We will do more environmental tests in the fall (rain) and then winter (snow).

Thanks for the feedback.

John

Avatar
Jeffrey Nunberg
Sep 17, 2013

John,

Yes, that was pointed out to me earlier by an astute client and friend of mine. Having said that, based upon our experience in wet weather conditions, only two products have worked favorably: VideoIQ and IOI Image.

JH
John Honovich
Sep 17, 2013
IPVM

Those are 'real' / specialist analytics. As we continue our shootouts, we'll move 'up' and test Axis and Bosch against VideoIQ, Agent VI, etc.

Avatar
Marc Pichaud
Sep 17, 2013

You might then add a "type of alarm and metadata" part:

What type of trigger can I generate (general or per rule), can my VMS manage it, and can I use advanced search in my VMS/NVR thanks to the collected metadata?

(Metadata standardization isn't done yet ...)

Avatar
Marc Pichaud
Sep 17, 2013

Sea (see), sun and ... video - no test during a hurricane :-)

Avatar
Marc Pichaud
Sep 17, 2013

Brian, you misunderstood me. I'm talking about motion detection. Some people call it analytics... but for me it's improved motion detection, and this does not work outdoors, especially at night.

When you can set up a depth of view (so a 3D FOV), detect a shape, a speed, eventually an object category, and filter snow / rain / shadows and tree movement out of the scene, it's obviously more analytic in my mind.

Sure, there are some better systems on the market, but we are talking about the three tested here.

Analytics sounds great but is applied to too many low-end, cheap engines. (I'm doing analytics training here and teach people how to test outdoor conditions in the lab ...)

Avatar
Brian Karas
Sep 17, 2013
Pelican Zero

Marc -

Ah, I see what you mean. Yes, I agree that these things tested here are not what many people would consider "video analytics" in the current state of the industry.

In my training classes, I often point to "cross line detection" as a very poor approach, especially outdoors, though it seems to be the most common example.

When you really dig into it, "cross line detection" seems to be a way to try to filter out the majority of false alarms. Since these systems cannot ever truly classify an object, they filter the motion in the scene down to the point that they only look for "large" objects moving across a defined boundary line. That essentially distills the motion occurring across hundreds of thousands, or millions, of pixels down to a very small group. Then they look just at pixels moving in a given direction. The problem (IMO) is that you also have a VERY narrow part of the FOV where you are looking for activity. If anything in the scene prevents the camera from "seeing" an object as it crosses the line (poor lighting, poor contrast, light rain or fog, another object such as a parked car blocking part of the perimeter), you have a high risk of missing valid events.

We usually recommend a Region of Interest (ROI) that covers the *entire* secured area, like a large parking lot. Then we look for any person or vehicle activity in that area when the system is armed. In an ideal case, this works exactly like cross-line analytics: as soon as a person steps over the line onto the secure property, an alarm is triggered. But in cases where the person entering the property could not be seen and classified immediately (as described above), you might get an alarm 10 seconds "late", when they're already 50 ft into the secure area. This is far better than no alarm at all, because they managed to avoid being detected while crossing the very narrow band of cross-line detection pixels.
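The narrow-boundary behavior Brian describes can be illustrated with a toy geometric sketch of cross-line triggering. This is purely illustrative, not any vendor's implementation; real systems operate on noisy pixel blobs rather than clean tracked centroids, which is exactly why the narrow trigger zone is fragile:

```python
# Toy sketch: alarm only when a tracked object's centroid moves across a
# defined line in the configured direction. All names and values here are
# illustrative assumptions, not any tested camera's API.

def side_of_line(point, line):
    """Return +1/-1/0 for which side of line ((x1,y1),(x2,y2)) a point is on."""
    (x1, y1), (x2, y2) = line
    px, py = point
    cross = (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1)
    return 1 if cross > 0 else -1 if cross < 0 else 0

def crossed(prev_centroid, centroid, line, direction=1):
    """True when the object moved from the -direction side to the +direction side."""
    return (side_of_line(prev_centroid, line) == -direction
            and side_of_line(centroid, line) == direction)

# A blob moving left-to-right across a vertical line at x = 100
line = ((100, 240), (100, 0))
print(crossed((90, 120), (110, 120), line, direction=1))  # → True
```

Note the fragility: the alarm depends entirely on the object being detected on *both* sides of the line in consecutive observations. If detection drops out for even a moment at the boundary (occlusion, low contrast), no event fires, which is the failure mode the ROI approach avoids.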

Avatar
Marc Pichaud
Sep 17, 2013

So, good, we are talking about the same thing!