Analytics vs Rain: Axis, Bosch & Sony

Author: Ben Wood, Published on Oct 21, 2013

Rain. This was a key objection to our Camera Analytics Test: Axis vs. Bosch vs. Sony. Members rightfully noted it is one thing for an analytic to do well indoors and outdoors at night, but what about in the rain?

In this report, we share our findings from testing the same camera analytics in moderate to heavy rain. Here's what the day looked like:

We set up crosslines / tripwires from 20' to 200' away from our cameras.

Additionally, we tested a crossline perpendicular to the camera from 20' to 200' away.

Inside, we share our test results of who performed best and worst, plus how much the rain impacted performance.


Key Findings

**** *** *** *** ******** **** **** ****:

  • ** ***** ********* **** ********* *** ** ****** **** (********-**-*****)** ******** ******* **** ****** ********.
  • ***** **** *********** *** ******** ******* ** **** ****** (** and **** ~***') ** ****, **** ******** ** *** ******* from *** ****** ***** ********* ********
  • **** * ***** *******, *** ***** ***-****'* *** *** **** Crossline ********* ******* ** **** ****** *********** ** ***' *** ********** missed *********** ** ***'. **** ** **** **********, *** **** alert **** *** ***% ***** ***' *** ***' *** **** ********* *** ***** IVA, ************. ********* ******* ***** ** **** ***-*****'* *** ********* to ** *** **** ** ***** ****** (**% ** **', 40% ** **' *** *% ** ***') ********** **** *** no-rain *******. 
  • **** *** ****** *** * *******, ** ***** *********** ** the ********* *********** ** **** ****** **** ********, ******** ** the **** ** **-**** *********.  

Recommendations

** ******** ** *** ******* *************** ** *** ******** ******, these ***** **** **** **** *** ***** ******* ********** **** also ** ** ********* ****** ** ******** **** ********* * reliable *****-**** ********* ******. *** ******* ********* ******** **** ** reduced ** ******* ** ****** ** ***** **** *******, ******** **** ***** factors, **** ** ******** ********, *** *** ******** **********. ********** attention ** ****** **** *** ****** ****** ** ******** ** imaginary ******** ** **** ********* **** ***-***' ****.

*******, ***** ******** / ******, ***** ************** ******, *** *** demonstrate ********* ******** *** ** *** ****.

****

** ****** *** ***** **** ********* ** *** ****** ******* scenarios:

********

*****, ** ***** **** ******* * ******* ******** ***** *** parking ***/********, ** ***** ** *** ******** **** *****. 

*** ******* ******* *** ******** ** ********** ***** ** ******* distances **** **' ** ***' **** **** *** ******.

** **', **** **** ***** *** ***** ***-**** ******** **** full ********, ***** **** ***-*****'* ******* **** *** */**.

** ********* ******* **' *** **', ******* **** ** ***** decreased ************ ** ****** *** *********** ** ***'. ***** *** ***-**** continued ** ******* ******** ** **** *****.

** ***', **** *** ***** *** ***-**** ******* ** **** **** activations. **** ** ***** *********** **** **** **** ******** ** their ******** ** **** ***** ** *** ******** ** **** test.

** ***', **** ** *** ******* ********* ** ***. ** clear **********, *** ***-**** *** ***** ** ** ***% ******** 200'.

********* ******* ***** ** **** ****** **** *** ******* ******** tests **** **** ********* *****. ***** ***** **** ***% ******* rate, ******** ******* **** ******* ** ******. ***** ** **** *** observed *** ***** ****** ** **** ****, ******* **** ** based ** **** ****** ** *** ******* *****.

Vehicle ** ********

** **** ****** *** ******** ** ************ ********* ** ***** *******, ********* ******* ********* ** ***** *********; 25', ***' *** ***'. **** ***** *** ******* ** *** tripwire ****, *** ***** **** *** ** ** *** ********, across *** ***.

******* *** **** ***** ** ***** ***-**** ****** *** ** the ******* *********** ** *** ** *** ***** *********. *** Sony *****, ** *** ***** ****, *** ****** *********** **** at *** ******* **** ***** ** **', *** ********* **** at ****** *********. ***** ******* **** ******* ** **** ** saw ** *** ** **** ****. *********, ** **** *** observed *** ****** ** *** **** ** *** ********* ***** of ****** *******, **** ** ********, ***** ***'.


** ~***' ****'* ******** ***** ** ~**%.

*******, ** ***', *** ***-***** ** ****** ** ****** *** vehicle ** ***.


*** ***** ***** ********** ******* *********** ** ******** ** *** rain:

 

Comments (7)

Figen:

Thanks for the report. Over how many hours was the data collected? I have found that over extended periods of rain, >12 hours (it rains a lot in Vancouver), you should expect some false positives. When calculating detection accuracy on true events, how many true events were run per test?

Also, all of the testing used crossing line detection, which I never use. I believe crossing line detection is not as good as "appearing in ROI". Early on I found that trespassing people would be missed because fencing or hoarding often had storage containers or other obstructions beside it, so the entry point was obscured and crossing line detection would miss them, while appearing in ROI would detect them. Any thoughts on this?
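For illustration, a minimal sketch of the difference between the two rule types, assuming simple centroid tracks and using shapely for the geometry. The tripwire, zone, and coordinates below are hypothetical examples, not any vendor's implementation:

```python
# Minimal sketch of "crossing line" vs. "appear in ROI" rules applied to a
# tracked object's centroid. Geometry and coordinates are hypothetical.
from shapely.geometry import LineString, Point, Polygon

tripwire = LineString([(0, 50), (100, 50)])                  # virtual crossline
roi = Polygon([(0, 50), (100, 50), (100, 120), (0, 120)])    # region of interest

def crossed_line(prev_xy, curr_xy, line=tripwire):
    """Fires only if the object's motion segment intersects the line."""
    return LineString([prev_xy, curr_xy]).intersects(line)

def appeared_in_roi(curr_xy, zone=roi):
    """Fires as soon as the object is seen inside the zone,
    even if its entry point (the actual crossing) was obscured."""
    return zone.contains(Point(curr_xy))

# An object first detected at (40, 80) -- e.g. someone stepping out from
# behind a storage container -- never produces a visible crossing segment:
print(crossed_line((40, 80), (42, 85)))   # False: no track segment crosses the wire
print(appeared_in_roi((42, 85)))          # True: already inside the zone
```

The practical difference is exactly the one raised above: the crossline rule needs the crossing itself to be visible, while the ROI rule only needs the object to be seen anywhere inside the zone.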

Hi Robert, we tested over about 5 hours. Results were consistent all the way through, so we felt it was long enough to see patterns. However, we do believe there could be performance differences depending on the type of rain: how heavy it is, how windy it is, the potential for hail, etc.

We typically ran 10 attempts with a human subject at each distance. We ran fewer with vehicles, because performance was predictable from the first 2-3 attempts due to the much larger object size.

This testing was limited to cross line detection for two reasons. First and foremost, because it is the most common analytic in use in the industry. Second, this is a continuation of our previous camera analytics test, which was limited to crossline.

We will be testing other analytics in the future (starting today actually), and will be using other rules in various conditions.

I did a major analytics deployment some years ago, and we had significant issues with directional detection in the rain, especially in the evening and after dark. Headlight reflections on wet pavement on the opposite side of the road on a curve consistently made the analytics think there was movement in the wrong direction. On an interstate, there were hundreds of alarms each morning.

Doing tripline for our first effort was a good idea, but I'll bet other functions might be more adversely affected by rain and other weather conditions. I'm sure you will do follow-on studies of more advanced features.
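To illustrate the failure mode described in the comment above, here is a minimal sketch of how a directional tripwire typically decides crossing direction: the sign of a 2D cross product flips when a track crosses the line, and a headlight-reflection blob drifting across the wire flips it exactly the same way a real vehicle does. The geometry and labels are hypothetical, not the deployment in question:

```python
# Hypothetical sketch of directional crossline logic: which side of the line a
# point is on is the sign of a 2D cross product; a crossing is a sign change,
# and the starting sign gives the direction. Any moving blob that changes
# sides -- including a reflection on wet pavement -- triggers it.

def crossing_direction(line_a, line_b, prev_xy, curr_xy):
    """Return 'A-side to B-side', 'B-side to A-side', or None (no crossing)
    for an object moving from prev_xy to curr_xy, relative to the directed
    line line_a -> line_b. 'A-side' is the positive cross-product side."""
    lx, ly = line_b[0] - line_a[0], line_b[1] - line_a[1]

    def side(p):
        # 2D cross product of the line vector and the vector line_a -> p
        return lx * (p[1] - line_a[1]) - ly * (p[0] - line_a[0])

    s_prev, s_curr = side(prev_xy), side(curr_xy)
    if s_prev == 0 or s_curr == 0 or (s_prev > 0) == (s_curr > 0):
        return None                                  # stayed on one side
    return "A-side to B-side" if s_prev > 0 else "B-side to A-side"

# A reflection blob sliding across the wire is indistinguishable from a
# vehicle as far as this check is concerned:
print(crossing_direction((0, 0), (100, 0), (50, 10), (50, -10)))  # A-side to B-side
print(crossing_direction((0, 0), (100, 0), (50, -10), (50, 10)))  # B-side to A-side
```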

Too bad you didn't test that other analytic that would have "learned" that rain is normal...

Hi Marc, we tested virtual tripwires as a continuation of our Camera Analytics Tested: Axis vs. Bosch vs. Sony report. We will be testing AgentVi, VideoIQ, and Bosch IVA in upcoming reports, so we'll be looking at other rules in varying conditions then. It is, unfortunately, very difficult to plan rain testing in Pennsylvania, as we've had one day of steady rain in the past month. But we're getting there!

What, you don't just look at your iPhone or Android app for weather until it says "rain?"

Sounds good. I would be very interested to see performance comparisons of integrated vs. add-on analytics, edge vs. server based, and the like. I've always thought that edge should be better, but of course the algorithms are not identical either.

I had some good success with AgentVI back when they were still Aspectus. I hope you will also include Sightlogix and Puretech, both of whom include geolocation which sometimes is a good adjunct and sometimes a good alternate for radar.

My initial thought was that rain droplets on the lens would be the issue, but according to this test the reduction in analytics performance looked more like the rain itself reducing visibility, which then reduced analytics performance. This tells me that fog would have the same effect.

Which brings up another question ... what is the effect on the analytics with snow? Would slow falling snow flakes near the camera be detected?

I would think the snow flake question depends on the size and speed of the flake. We'll be testing this when winter comes, assuming we actually get snow this year.

I believe that snow will have more of an obscuration effect than rain, similar to fog, which will have more of an impact on performance than the chance of false alarms from falling snow.
