Motion Detection Performance Tested

By: Benros Emata, Published on Sep 05, 2011

Motion detection is an important element of many, if not most, surveillance systems. It plays a central role in both storage reduction and search time reduction. Storage is routinely reduced by 30% - 80% by using motion-based rather than continuous recording. Likewise, an investigator can often find a relevant event much faster by simply scanning through areas of motion rather than watching through all of the video.
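
To make the storage impact concrete, here is a minimal back-of-the-envelope sketch in Python. The 4 Mbps bitrate and 20% motion duty cycle are illustrative assumptions, not figures from this test:

```python
# Back-of-the-envelope storage comparison: continuous vs. motion-based recording.
# The 4 Mbps bitrate and 20% motion duty cycle are assumed example values.

def storage_gb_per_day(bitrate_mbps, recording_fraction):
    """Storage consumed in GB for one day at a given average bitrate."""
    seconds_per_day = 24 * 60 * 60
    bits = bitrate_mbps * 1_000_000 * seconds_per_day * recording_fraction
    return bits / 8 / 1_000_000_000  # bits -> bytes -> gigabytes

continuous = storage_gb_per_day(4.0, 1.0)   # record 100% of the day
motion     = storage_gb_per_day(4.0, 0.20)  # record only the ~20% with motion

print(f"Continuous recording: {continuous:.1f} GB/day")        # ~43.2 GB/day
print(f"Motion-based (20%):   {motion:.1f} GB/day")            # ~8.6 GB/day
print(f"Storage reduction:    {1 - motion / continuous:.0%}")  # 80%
```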

At the same time there are a number of challenges associated with using motion detection:

  • Scene Conditions: The accuracy of motion detection and the number of times motion is detected can vary depending on what is in the scene - people, cars, trees, leaves, etc. - and the time of day - nighttime with lots of noise, sunrise and sunset with direct sunlight into a camera, etc.
  • Performance of Detector: Motion detection is built into many surveillance products - from DVRs to VMS systems and now IP cameras. How well each one works can vary significantly.

Our Test

In this report, we share our results from a series of tests we performed to better understand motion detection performance. 

We did a series of tests in different locations:

  • Indoor well-lit scene to simulate the simplest scene possible
  • Indoor dark scene (<1 lux) to examine what problems low light caused
  • Outdoor parking lot to see how a complex scene with trees, cars and people would perform
  • Roadway to see how a moderately complex scene with periodic cars would perform

Three IP cameras (detailed in the Methodology section below) were used with their motion detection enabled to see differences in performance.

With these tests, we answered the following questions:

  • How can one estimate motion percentage accurately?
  • Does motion estimation vary significantly by scene?
  • How accurate was motion detection in each scene?
  • Did certain cameras exhibit greater false motion detection than others? What scenes or conditions drove those problems?
Here are the key findings:

  • In simple scenes (e.g. good lighting and narrow FoV), estimating the percentage of motion is relatively straightforward and has a low margin of error.
  • In complex scenes (e.g. outdoor, wide FoV, difficult lighting, and vegetation), estimating the percentage of motion is difficult and can vary greatly.
  • Variance in performance across cameras increases significantly as the complexity of the scene increases.
  • All three cameras performed ideally (no false positives/negatives) in the very simple daytime indoor test.
  • In the parking lot test, all three cameras tended to record on motion in near continuous fashion, due to vegetation and lighting effects.
  • The Arecont Vision camera had significant false positives in low-light/nighttime scenes, likely due to noise/gain, as well as when the sun set into the camera's FoV. This resulted in near continuous recording.

In light of these findings, the following is recommended:

  • Unless the scene is very simple, motion detection settings (e.g. masking, sensitivity, and object size) will likely need to be optimized in order to achieve worthwhile benefits.
  • For low-light scenes, consider reducing maximum gain. Also, experiment with different camera makes/models, as performance can vary considerably in low-light/nighttime scenarios.
  • Be careful about estimating motion in complex scenes, as one's estimates can be way off; a simple duty cycle check like the sketch after this list can help sanity-check an estimate.
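
One simple way to make that check is to sum the motion-recorded clip intervals over a test window and compare against the window length. The clip data and timestamps below are hypothetical placeholders, not exports from the VMS used in this test:

```python
from datetime import datetime, timedelta

# Sum motion-recorded clip intervals over a test window to get the actual
# motion percentage (duty cycle). The clips below are made-up examples; real
# data would come from the VMS timeline or an export of recorded clips.

def motion_percentage(clips, window_start, window_end):
    """Fraction of the window covered by motion-recorded video."""
    window_seconds = (window_end - window_start).total_seconds()
    recorded = sum(
        (min(end, window_end) - max(start, window_start)).total_seconds()
        for start, end in clips
        if end > window_start and start < window_end
    )
    return recorded / window_seconds

window_start = datetime(2011, 9, 5, 15, 30)
window_end = window_start + timedelta(hours=2)
clips = [
    (datetime(2011, 9, 5, 15, 42), datetime(2011, 9, 5, 15, 44)),  # 2 minutes
    (datetime(2011, 9, 5, 16, 10), datetime(2011, 9, 5, 16, 55)),  # 45 minutes
]
print(f"Motion recorded: {motion_percentage(clips, window_start, window_end):.0%}")  # ~39%
```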

Daytime Indoor (Artificial Motion)

In this scenario, motion detection is configured via the VMS interface on all cameras. The cameras are left to motion record for a period of ten minutes. A subject enters the scene to introduce motion at three evenly spaced intervals during the ten minute span. A snapshot from the motion recorded video and playback timeline is featured below. Note that on the timeline 'blue' represents motion recorded video and 'gray' indicates no video recording.

[link no longer available]

Observing the timeline reveals that all cameras reliably detected motion during each of the three subject entries into the scene. Also, just as importantly, no false positives were triggered during times when the subject was out of the scene. The findings are not surprising given the simplicity of the environment (narrow FoV, strong and consistent lighting, no vegetation, and large subject/object).
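
A result like this can be scored mechanically by checking that every staged entry overlaps at least one recorded clip (otherwise a false negative) and that every recorded clip overlaps at least one staged entry (otherwise a false positive). The sketch below uses made-up interval data expressed as minutes into the ten minute test; it is not the tooling actually used here:

```python
# Score a staged motion test: every ground-truth entry should overlap at least
# one recorded clip (else a false negative), and every recorded clip should
# overlap at least one ground-truth entry (else a false positive).
# Times are minutes into the ten minute test; all values are illustrative.

def overlaps(a, b):
    return a[0] < b[1] and b[0] < a[1]

def score(ground_truth, recorded):
    false_negatives = [g for g in ground_truth
                       if not any(overlaps(g, r) for r in recorded)]
    false_positives = [r for r in recorded
                       if not any(overlaps(r, g) for g in ground_truth)]
    return false_negatives, false_positives

subject_in_scene = [(2.0, 2.5), (5.0, 5.5), (8.0, 8.5)]  # three staged entries
camera_recorded  = [(2.0, 2.7), (5.0, 5.7), (8.0, 8.7)]  # an ideal result

fn, fp = score(subject_in_scene, camera_recorded)
print(f"False negatives: {len(fn)}, false positives: {len(fp)}")  # 0, 0
```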

Nighttime Indoor (Artificial Motion)

In this scenario, motion detection is configured via the VMS interface on all cameras. The cameras are left to motion record for a period of ten minutes. A subject enters the scene to introduce motion at three evenly spaced intervals during the ten minute span. A snapshot from the motion recorded video and playback timeline is featured below. Note that on the timeline 'blue' represents motion recorded video and 'gray' indicates no video recording.

[link no longer available]

In this low-light environment, motion detection/record behavior now varies distinctly across the three cameras. The Arecont, due to noise and lighting effects, continually 'believes' motion is occurring and as such effectively records in continuous fashion (high false positive rate). In direct contrast, the Panasonic never 'believes' there is motion present throughout the ten minute scenario and as a result has produced no recording at all (high false negative rate). The Axis performs the same as it had in the daytime indoor situation, sensing motion/non-motion appropriately.

Daytime Outdoor (Natural Motion)

In this scenario, motion detection is configured via the VMS interface on all cameras. The cameras are left to motion record for a period of 30 minutes. All motion is organic and representative of a real world surveillance scenario. A snapshot from the motion recorded video and playback timeline is featured below. Note that on the timeline 'blue' represents motion recorded video and 'gray' indicates no video recording.

[link no longer available]

The frequency of motion detection/recording as seen on the timeline is inordinately high for all three cameras (~80 - 90% motion). The more complex environment (e.g., trees, shadows and other lighting effects) causes quite a number of false positives across the board. Note that the Panasonic nearly continuously recorded throughout the 30 minute test, and that the Axis was just slightly less hypersensitive. The Arecont was the least sensitive among the three but still considerably oversensitive when considering just conventionally relevant or important motion (e.g. human and vehicle subjects).

24 Hour Roadway (Natural Motion)

The following is a sequence of four timeline snapshots of two cameras (AV1315DN and P1344) configured to motion record throughout a 24 hour period. The cameras view the same roadway scene with the same FoV/lens angle. Their comparison provides some insights into differences in motion detection/recording sensitivities across different camera makes/models. Note that the VMS is used to initialize the motion detection settings from the camera, which are left at the VMS prescribed defaults.
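
One straightforward way to compare the two cameras' sensitivity over a run like this is to bucket each camera's motion-recorded time by hour of day and compare the per-hour percentages. The clip intervals below are placeholders for illustration, not the actual exported timeline data:

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Bucket each camera's motion-recorded seconds by hour of day so per-hour
# recording percentages can be compared side by side. Clip intervals here are
# placeholders; a real comparison would use intervals exported from the VMS.

def hourly_recording_fraction(clips):
    seconds = defaultdict(float)
    for start, end in clips:
        t = start
        while t < end:
            next_hour = t.replace(minute=0, second=0, microsecond=0) + timedelta(hours=1)
            chunk_end = min(end, next_hour)
            seconds[t.hour] += (chunk_end - t).total_seconds()
            t = chunk_end
    return {hour: secs / 3600 for hour, secs in sorted(seconds.items())}

cameras = {
    "AV1315DN": [(datetime(2011, 9, 5, 19, 15), datetime(2011, 9, 5, 22, 0))],
    "P1344":    [(datetime(2011, 9, 5, 19, 20), datetime(2011, 9, 5, 19, 22))],
}
for name, clips in cameras.items():
    fractions = hourly_recording_fraction(clips)
    print(name, {hour: f"{frac:.0%}" for hour, frac in fractions.items()})
```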

Late Afternoon (3:30pm - 5:30pm)

[link no longer available]

During the ~2 hours of daylight depicted on the timeline, the Arecont camera exhibits considerably greater sensitivity than the Axis camera despite near identical FoVs and default (VMS prescribed) motion settings. As a result, the Arecont produces considerable false positives, whereas the Axis is characterized by quite a number of false negatives.

Evening (6:00pm - 8:00pm)

[link no longer available]

The playback is cued up on an Axis false negative, as evidenced by the presence of a vehicle entering the FoV captured by the overly sensitive Arecont camera. Also note how the Arecont begins to nearly continuously record after sunset around 7:15pm (red arrow), while the Axis tends to maintain the same frequency of motion detection/recording.

The high false positive rate (near continuous recording) exhibited by the Arecont after sunset is somewhat predictable given that the Arecont produced similar results in the Nighttime Indoor test.

Nighttime (8:00pm - 10:00pm)

[link no longer available]

Here we see the trend of the Arecont's hypersensitivity and near continuous recording continue during the nighttime/low-light hours. The playback is cued up on an incident where both the Arecont and Axis detected motion.

Early Morning (5:00am - 7:00am)

[link no longer available]

In this timeline snapshot, the Arecont begins to revert to a less sensitive detection behavior at sunrise (red line). However, it is still clearly oversensitive and still produces considerable false positives and a fairly heavy duty cycle of recording to non-recording (~1:1 or 50%). In contrast, the Axis continues to motion record at roughly the same frequency as in previous time periods.

Other Natural Motion Scenarios

The following timeline snapshot depicts the motion detection/recording behavior of two scenes: (1) small parking lot, and (2) indoor office. Motion was organic/natural to get a sense of the expected recording percentages of some typical surveillance scenes.

Small Parking Lot & Indoor Living Room/Office Space (3:30pm - 5:30pm)

[link no longer available]

The parking lot motion detection/recording frequency as depicted on the timeline was generally reliable and performed as expected. Most incidents were of legitimate human and vehicle traffic. Note that the FoV is of only moderate width and is well constrained to the overall area of interest.

In the indoor office scene, the majority of the incidents were due to legitimate human activity/motion.

Methodology

Here are the three cameras used in the 'Motion Based Recording' study:

  • Arecont AV1315DN (online $460) - 1.3MP D/N; 1/2.7" CMOS; MPL4-10 lens; 0.1/0 lux (Color/BW)
  • Axis P1344 (online $759) - 720p D/N; 1/4" CMOS; F1.2 Computar lens; 0.05 lux (BW)
  • Panasonic WV-SP306 (online $550) - 1.3MP D/N; 1/3" MOS; WV-LZA62/2 lens; 0.3/0.05 lux (Color/BW)

Here are the firmware versions for each of the cameras:

  • Arecont AV1315DN - fw 65197
  • Axis P1344 - fw 5.22.2
  • Panasonic WV-SP306 - fw 1.30