Camera Analytics Tested: Axis vs. Bosch vs. Sony

Author: Ethan Ace, Published on Sep 17, 2013

With the move to IP and HD now commonplace, companies are looking for new ways to compete. While analytics has long disappointed, many still hold out hope that it will be the next big thing.

Many major camera manufacturers offer analytics built-in or for a minimal additional charge. However, skepticism of their performance is commonplace.

The Test

In this report, we tested three camera manufacturers' analytics: Axis' Cross Line Detection, Bosch's Intelligent Video Analytics (IVA) and Sony's Video Motion Filter (VMF).

The test focuses on tripwire / perimeter violation analytics, as this is the most common type used in security to alert on intruders entering a restricted area.
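As context for how a tripwire rule works: most implementations track an object's centroid frame to frame and alarm when the movement segment crosses the configured line, optionally filtered by crossing direction. A minimal geometric sketch in Python (the function names and sign convention are illustrative, not any vendor's API):

```python
def ccw(a, b, c):
    """Cross product sign: positive if a->b->c turns counter-clockwise."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_intersect(p1, p2, q1, q2):
    """True if open segment p1-p2 strictly crosses segment q1-q2."""
    return (ccw(p1, p2, q1) * ccw(p1, p2, q2) < 0 and
            ccw(q1, q2, p1) * ccw(q1, q2, p2) < 0)

def tripwire_alarm(prev_pos, cur_pos, line_a, line_b, direction=None):
    """Alarm if an object's movement crosses the tripwire.

    direction: None alarms on crossings in either direction; +1 or -1
    alarms only when the object ends up on that side of the line
    (the side is the sign of ccw(line_a, line_b, cur_pos))."""
    if not segments_intersect(prev_pos, cur_pos, line_a, line_b):
        return False
    if direction is None:
        return True
    return ccw(line_a, line_b, cur_pos) * direction > 0
```

With a vertical line from (0, 0) to (0, 10), an object moving from (-1, 5) to (1, 5) triggers an alarm; restricting `direction` makes the rule one-way, matching the "inbound only" configurations discussed in this test.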

We tested three different scenes.

To start, we did a simple indoor tripwire at an entrance:

Then we progressed to an outdoor scene of moderate complexity and distance:

Finally, we tried a very long area with vegetation and shadows to maximize the challenge:


[The members-only body of the report is masked in this copy. It covered key findings, pricing, recommendations, configuration notes, outdoor and indoor test results, vegetation/shadow tests, and test parameters (camera models and firmware versions).]

Comments (19)

Do NOT ask about testing other analytics here. All recommendations and requests should go into this separate dedicated thread.

We do plan to test many other analytics - both from 'regular' camera manufacturers as well as video analytics specialists (VideoIQ, AgentVI, etc.). In addition, we plan to test other aspects of Bosch's IVA (like color detection) in a separate report.

What VMS was used to test these camera-based triggers?

Your summary of competitive performance sums up one of the common challenges of video analytics: striking the right balance between probability of detection and low false alarm rate.

While the analysis is great, I think you could improve it by having some quantitative measures of comparative performance that factor in both probability of detection and the rate of false alarms.

The UK government i-LIDS scheme uses a so-called 'F1' metric that gives a single result factoring in both of these issues. Full details on p32-33 of this doc.

Also important is ensuring you use a repeatable scene so the comparison between different manufacturers is meaningful. This can be awkward when using cameras rather than encoders, but we've had some luck by pointing systems at a monitor under a black cloth.

My personal opinion is that systems need to have an F1 score of > .95 for them to be genuinely useful in an event detection scenario.
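For readers unfamiliar with it, the F1 metric referenced here is the harmonic mean of precision (what fraction of alarms were real events) and recall (what fraction of real events produced an alarm). A minimal computation from counts of true detections, false alarms and missed events:

```python
def f1_score(true_positives, false_positives, false_negatives):
    """F1 = harmonic mean of precision and recall.

    true_positives: real events that were correctly alarmed
    false_positives: false alarms
    false_negatives: real events that were missed"""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return 2 * precision * recall / (precision + recall)
```

By this measure, catching 95 of 100 intrusions while raising 5 false alarms gives F1 = 0.95, the threshold suggested above for genuine usefulness.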

Ben, thanks for the feedback on quantitative metrics and repeatable scenes.

As a general rule, IPVM avoids quantitative metrics as they are too reductionistic and prone to vendor misuse. If I say Camera A scored a .96, Camera B scored a .95 and Camera C scored a .92, what does that really mean? Hopefully, it means Camera A is the best but in all situations? and by how much? Also, if we did quantitative metrics/rankings the temptation for manufacturers to misuse this in promotions would be through the roof.

As for repeatability, we test all cameras at the exact same time and place in real world scenes. The Axis, Bosch and Sony cameras are set inches from each other on a tripod and taken to each location. Because of that, we know that the cameras are all looking at and dealing with the exact same conditions.

John, I think IPVM’s tests lack good scientific method. Firstly, IPVM’s tests are not reproducible, so you can never run the same test again and you can’t compare the results of a new camera with those already tested. Secondly, the range of scenarios tested is very small. Analytics has to work 24/7, 52 weeks a year, whereas IPVM tests with whatever weather is happening on the day of the test. I have seen many examples of analytics that worked well for a while only to develop a serious fault later on due to an environmental change. IPVM should be looking for these sorts of problems.

The solution is for IPVM to build up a library of test videos that are known to present problems and test the cameras in a lab environment using a wide range of test recordings. The i-LIDS test library that Ben White mentioned has over 100 test clips for just one scenario, and it includes many examples of things that could cause false alarms. Here are just a few of them: wind shaking the camera, small animals in the detection area, a defective fluorescent bulb flickering on and off at night, heavy rain, and rapid sunlight change due to passing cloud.

IPVM provides a valuable service and has rightly gained a lot of influence and respect, but with this comes a responsibility to do the tests well.

How are we going to test IP cameras with recorded video clips - point the cameras at a monitor? I await your solution to this that does not introduce its own major flaws / problems.

A major factor in any 'smart' IP camera's performance is how it captures video: low light, WDR, bright lights, etc. Using recorded video eliminates key real-world differentiators between IP cameras.

Today's test is of free / low cost add-ons to 'regular' IP cameras. It's the beginning of a new series of test coverage on analytics. We will test more conditions, but we start with simpler ones and build up to see how far an analytic can go.

Great report, folks, and one I've been looking forward to for a while.

Hi John

That's the current big limit of embedded analytic based on ... pixel motion in 2D.

Nobody in real life is using it in an outdoor environment on a large FOV. This should be kept for "easy" indoor short range detection. Same for most "pixel" autotracking. When rain, trees, shadows, bugs, sunbeams and reflections generate too many false positives day and night, and the system eventually misses a real event, it's time to move to true analytics.

By the way, some camera systems can now be set with separate day and night detection areas to get better results (like shutter/WDR/auto-iris specific profiles at night when you want to increase LPR success rate).

So a thinner detection zone due to the thinner IR angle at night, or less sensitive settings due to gain and noise at night ...

"Nobody in real life is using it in outdoor environement on large FOV."

We (VideoIQ) have embedded edge analytics, and frequently cover outdoor areas of 1/2 acre or more with a single camera. Yes, it's far more advanced than what you get for free (or $30), but the technology *is* there.

Good morning John, insightful article as always. It appears that all of these cameras were tested in excellent Hawaiian weather. Rain, as we experience in Florida, will impact these results. It would be great to see a comparison of these manufacturers in bad weather.

Jeff,

I did not author this report. You will see that Ethan's name is under the byline, not mine.

This report was done in Pennsylvania by Ethan, Figen and Derek. We no longer do any testing in Hawaii but the folks in Pennsylvania will be happy to know that their state looks like Hawaii :)

Rain is a good consideration (snow as well). We will do more environmental tests in the fall (rain) and then winter (snow).

Thanks for the feedback.

John

John,

Yes, that was pointed out to me earlier by an astute client and friend of mine. Having said that, based upon our experience in wet weather conditions, only two products have worked favorably: VideoIQ and IOI Image.

Those are 'real' / specialist analytics. As we continue our shootouts, we'll move 'up' and test Axis and Bosch against VideoIQ, Agent VI, etc.

Agreed.

You might then add a "type of alarm and metadata" part.

What type of trigger can I generate (general or by rule)? Can my VMS manage it, and can I use advanced search in my VMS/NVR thanks to the collected metadata?

(Metadata standardization isn't done yet ...)

Sea (see) Sun and ....Video, no test during hurricane :-)

Brian, you misunderstood me. I'm talking about motion detection. Some people call it analytics, but for me it's improved motion detection, and this does not work outdoors, especially at night.

When you can set up a depth of view (a 3D FOV), detect a shape, a speed, eventually an object category, and filter snow, rain, shadows and tree movement from the scene, it's obviously more of an analytic in my mind.

Sure, there are better systems on the market, but we are talking about the three tested here.

Analytics sounds great but is applied to too many low-end, cheap engines. (I do analytics training here and teach people how to test in lab and outdoor conditions ...)
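The rule-based filtering Marc describes - gating each detection on estimated size, speed and object class before alarming - can be sketched as follows. The field names and thresholds here are hypothetical, not from any of the tested cameras:

```python
def passes_rules(detection,
                 min_height_m=1.0, max_height_m=2.5,
                 max_speed_mps=10.0,
                 allowed_classes=("person",)):
    """Return True only if a detection matches a 'person intruder'
    profile: plausible real-world height, plausible walking/running
    speed, and a matching classifier label. Filters out small animals,
    shadows and wind-blown vegetation that fail the size/class gates."""
    return (min_height_m <= detection["height_m"] <= max_height_m and
            detection["speed_mps"] <= max_speed_mps and
            detection["class"] in allowed_classes)
```

A pixel-motion tripwire alarms on anything crossing the line; adding gates like these is what separates "improved motion detection" from the shape/speed/category analytics Marc is distinguishing.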

Marc -

Ah, I see what you mean. Yes, I agree that these things tested here are not what many people would consider "video analytics" in the current state of the industry.

In my training classes, I often point to "cross line detection" as a very poor approach, especially outdoors, though it seems to be the most common example.

When you really dig into it, "cross line detection" seems to be a way to try to filter out the majority of false alarms. Since these systems cannot ever truly classify an object, they filter the motion in the scene down to the point that they only look for "large" objects moving across a defined boundary line. That essentially distills the motion occurring across hundreds of thousands, or millions, of pixels down to a very small group. Then they look just at pixels moving in a given direction. The problem (IMO) is that you also have a VERY narrow part of the FOV where you are looking for activity. If anything in the scene prevents the camera from "seeing" an object as it crosses the line (poor lighting, poor contrast, light rain or fog, or another object such as a parked car blocking part of the perimeter), you have a high risk of missing valid events.

We usually recommend a Region of Interest (ROI) that covers the *entire* secured area, like a large parking lot. Then we look for any person or vehicle activity in that area when the system is armed. In the ideal case, this works exactly like line cross analytics: as soon as a person steps over the line onto the secure property, an alarm is triggered. But in cases where the person entering the property could not be seen and classified immediately (as described above), you might get an alarm 10 seconds "late", when they're already 50 ft into the secure area. This is far better than no alarm at all when they managed to avoid detection while passing through the very narrow band of cross-line detection pixels.
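The ROI approach Brian describes reduces, at its core, to a point-in-polygon test on each classified detection while the system is armed. A minimal ray-casting sketch (the function names are illustrative):

```python
def point_in_roi(pt, polygon):
    """Ray-casting test: is pt inside the ROI polygon?
    polygon is a list of (x, y) vertices in order."""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Edge straddles the horizontal ray from pt?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def roi_alarm(detections, polygon):
    """Alarm on any detection centroid inside the secured area."""
    return any(point_in_roi(d, polygon) for d in detections)
```

Unlike a tripwire, this fires whenever a classified object is anywhere inside the polygon, so an intruder missed at the boundary is still caught once detected deeper in the scene.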

So, good, we are talking about the same things!

