Avigilon Analytic Cameras Tested 2014

Published Nov 12, 2014 05:00 AM

********* ******* *** '**** *** *****'

*** ****** ** **** *******, **** to *** ********* ****** ** ***** supply.

******* *** **** *********** ****** ** *********** ********. ********** *** ******** ** ****************** ******* **** *** ** *****.

***, ******** ** *******, ******** *** released ********** ******* **** ***** ** *********, *** *** ****** *** **** series.

** ****** ***** *** ******** ******* and ****** ** *** ****** **** range ******* ****** ** **** *** how **** ******** ********* ****.

** **** ****** ** ****** *** following *** *********:

  • *** ** ***** ********* ***** ** against ***-******** ******** ******* *** ***** party *******, **** *********, ********* ** the ****** *********?
  • ** **** **** *******'* ******** ****-** models?
  • *** **** ** ***** ********* ********* with ******** ******* ******?
  • *** **** ** ***** ********* ********* with ***** *****, **** ***** *** Milestone?

Key ********

**** *** *** *** ******** **** this ****:

  • ***** *** ******* ********* ********* *** scene ***** *** *** *** ******** analytic ****** **** ******* **** ***** cameras ****** (***** ******* ****, *** non ******** ******** ****** *** *** Hikvision ****** ******** ** ****** **) by ** ** **% ********* ** scene.
  • ********* ****** ***** ** ******** *** generations (**** *** ******) **** ** greatly ********* ***** **** ***** ******** approach *** ****** *** **********/********** ******* high ****** ********* ***** *** ***** present ** *** ******** ******** *******.
  • ******** ******** ****** ********** **** **** Avigilon ******* ******, ****** ******** ******* models. ******** ****** ********* *** ** integrated **** ***** ***** ******* *** ONVIF (******* *, ******* *.*).
  • *******, ** ******** ****** **** ******** in *** ******* ** ***** ************ and ********* ******** ********** *** *****. We **** **** ** *******, ******** and ********* *** ***** ********.
  • ******** ******* ****** *.* ***** ******** setup ********* ** ***, *********** ***** of *******. ************* ********** ******** ********/******* View.

******/*** *********** *******

  • ****** ************* ******* ******* ********** ** ACC *.*, **** ***** *** ********* via *** *** ******, *** ******* associated **** ****** ********, *** ***** separately ******* ****.
  • **** ******* ***** ********** ********** ** the ****** ** *** *** *********** when ********** ******* *** ******. **** Avigilon *** ********* ******* ***** ******* issue ** *** *****.
  • ******* ***** ** *** ****** *** Avigilon/VideoIQ **** *** *** ************* ********* in ***, ******* ******** ****** *********** must ** ********* *** ************ ***** the *** ******.

*******

********* ****** ***** ** *** ******** 3.0W-H3A-BO1 ** ~$*,*** ***, *** ***** than *** ******** ********** ******* ****-**, which *** ** ********* ****** ***** of ***** $*,***. **** **** *** H3A ****** **** *** ******* *** on ***** ******* ** *** ****-**, and ** **** ****** ******** *****-** IR.

**** ******* ** **** **** ***** ~$100-$200 **** **** * ***-******** ******* of *** *** ****** (*.**-**-***, $*,*** ****).

*******, ********* ****** *****/*** ** ******* ******* ~$700 ***. *** *.**-***-*** ** ******* **** expensive **** ****, *** *****-** ******** analytics *** ********* **** *** *****, otherwise, *******, ********** ***** ***** ******** software, ***.

***************

********'* *** ******** ******* ******* *********** compared ** **** ******* *********** *** third ***** ******* ******** ** *** Rialto, ** * ***** ***** *****, and **** ********* ***** ** *** 5.4, ****** **** ********** ** ***** seeking ***** *********.

*******, ****** **** ******, ***** *******' on ***** ********* *** *** ********** with ***** ***** *** *******, ****** them ****** **** ** ***-**-*** ******** systems. ***** ******* * *********** *** VideoIQ ** ***** **** ******** *** systems **** **** ** ***** *******. Given **** ******** ** ********** ***** Profile * ********* ******** ******, ** VMSes ****** ** *** ******* *** this ******** ***** *******, *********** ** potentially ********.

******

*** ******** *********** *** ***** ***** of *** *** ******** ******** ****** over *** ******** '********' ******* *********, *** ***** ***** integration, ********* ********'* ******* *********** ***********.

**, *** ****, *** ***** *****, add ******* *** ******** ******** *******, it **** ****** ** **** ****** choice **** ******* *** *********, ********** in * **** ******* ******** ******. However, *** **** *** *** **** that *********** ****** ******* ** ** seen. *** *******, ***** ***** ***** *** ******* *** Avigilon *******, ** **** ***** ****** *** Exacq ****** ******** ******** ******* *** Avigilon.

Compared ** ******* **** **

** **** *********, *** ******** *.**-***-*** performed ********** ****** **** *** ******* iCVR-HD***** ** ********** ****** ** *** same ******. ********* *********** ********* *** ** least ** **** *** ** *** H3A ******'* ********** **, ** *** light *********** ** *** ****-** *** poor.

*** ****-** ************ *** *** ****** in **** *** ****: ******* ********* of * ***** ******* ** *** parking ***. ** *** ****** *** detection ***** *** *****.

Analytic *************

****** **** ******* ******** *******, *** Avigilon *** ****'* ******** ***** *** configured ********** ******* ******** ******* ****** Client (******** ** ******* *.*). **** setup ** **** **********, **** **** the **** ****** ******* ***** ** default (*********, *********, *** **** ********). Other *****, ********, *** ******* *********** are ***** ** *** ******** ****.

Range ***********

** ****** *** ******* *** ****** in *** **** ****** ************** ** common ************, ** **** ***** **** vegetation ** ******** *****, ~***' ****, and * ******* *** *** ******* entrance, ~***' ****.

*** ******* **** *** ** *** same ********** *** ** **°.

**** ***** *****

**** ******** **** ***** *********** ****** from ****** ** ****** ** *** field.

** ***** ****** ** ******** **** some ******, ** **** ** **** sample *****.

** **** *****, *** ******** *** analytic ****** ************ **** ******* ******** to *** ****** **** **** *** subject *** ******** *** *****, *** as ** ********** *** *******. *** analytic ****** *** ** *** *** best *********** *********** ** ******* ******.

**** **** **** ******** ***** *** equal ** *** *** ******* ** our ******** ****** ****, *** ***** NBN-932V, *** ******** ******* **** *** old ********** ******* **** ** (~***' range) ***** **** ****** ** ************* the **** ***.

** *****, *********** ******* ************* *** to *** *** ***** ****** ** the *****, ~* *** ** *** near ***** ** ****, **** ** about *.** *** ** *** *** of *** *****.

** ************ ** *** ******* *********** somewhat ** ***** ** ****** ******, but *** ****** ** ********** *** enough ** ******** ******* ******. *** H3A ****** ***** ************ ***** ******* in **** *****, ****** ******** ***** was **** **** **% ** *** daytime ***********.

******* ***

****, ** ***** ** *** ******* lot/entrance *****. *********** ********* *** ***** in **** ******** ****:

**** ***** ** **** ***** *** scene, **** ***** *** ******* ******** in *** ******* ***, *** ******** passing ** ** ******** ******** ** the ********.

*****, *** ******** ******** ****** ************ other ******* ** *** *****, ******** detecting ******** ****** *** ** ***** 315', *** *********** ** ***** ***'. The *** ***** **** ******** ********* on ******** ** *** ******* ****** the *****, ***** ***' *****. *********** vehicle ******** *** ** ***** ***', though ** ***** *** **** ****** this *** ** *** ******* **** area ****.

*******, ** ****** ** **** ***** at *****. ***** ****** **** ***** 1-5 *** *** ** *** ********'* outdoor ********.

*****, ******** ***** ********* ***** ********* by **% ** **** ** **** scene. *** **** *** **** ** crossing ********. *********** ***** ***** ** all ******* *** ************* *******.

******** ****** ****** *** **** **** the ******* **** **** ********* ** detect *** ** *** ****** ******** effects ** ********** *** **********, ** issue ** ***** ******* ** *** test ** *** ******. ** ****** reliably ******** ******** ** *** ***** when ****** ****** ** ****, ***** as * **** ** *** ***** below.

Physical ********

*** ******** ************ ** ******** ****** cameras **** ** ***** ********* ** the **** ** ***-******** ********. *******, note **** ** ******* ****** ** all **** *******, ** ** **** slot *** **** ***** **** ** the ******* **** (**** *****), ***** was *** ***** ** *** *.**-**-** models ** ********** ******.

**** **** **** ******* **** *** integrate **** ***, ****** *** ** set ** ** ****** **** ****, on *****, ** **** *** ****** loses ********** ** *** ******. ********** must ** ********* ******* *** *** interface ** ** ******** *** ** card.

Test **********

*** ******* **** ****** ** ******* settings ****** *** ******* ***** ***** was ************ ** */***.

*** ******* ****** **** *** **** horizontal *** ** **°, **** **** the ***** ***** *** ****** *** same.

*** ******* **** ****** ***** ******* firmware:

  • ******** *.**-***-***: *.*.*.**
  • ******** *.**-**-***: *.*.*.**
  • ******** ****** **: *.*.*.*****
  • ********* **-**********-*: *.*.* ***** ******

******** ******* ****** *.*.*.** *** **** for *******.

Comments (25)
Luis Carmona
Nov 12, 2014
Geutebruck USA • IPVMU Certified

What about I/O ports? Alarm signals over IP wouldn't be so important if there are at least I/O ports on the cameras.

Ethan Ace
Nov 12, 2014

Both dome and bullet models have 1 alarm in/1 alarm out. However, there's no event setup on the camera to allow you to close that output on an analytic event. It has to be set up in ACC.

Luis Carmona
Nov 12, 2014
Geutebruck USA • IPVMU Certified

" It has to be set up in ACC."

Tricky tricky tricky...

Roshan Blessen Mathew
Nov 13, 2014

Manufacturer to look out for

Undisclosed Integrator #1
Nov 13, 2014

I hope your comment is meant to be positive, for example, watch for good things. Otherwise, off-the-wall comments like this diminish the value of the hard work the team puts into evals.

Andrew Bowman
Nov 13, 2014

Thanks for the report, gents. I guess we need to start looking at other options, because I have no interest in becoming an Avigilon dealer. I wish Sightlogix was more cost competitive...

J. Michael Rozmus
Nov 13, 2014

This is a useful comparison of sensitivity between successive generations of the VideoIQ technology. But it is important to remember that detection range from a small number of trials is a risky basis for designing a real-world solution. We need to know the range at which the probability of correct detection (POD) is high enough and the false alarm rate (FAR) is low enough to be acceptable. (See 1 and -- more technical -- 2)
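
To make the small-sample point concrete, here is a minimal sketch (in Python, assuming SciPy is available; the trial counts are illustrative, not taken from IPVM's test) of how weak a POD estimate from a handful of walk tests really is, using an exact one-sided Clopper-Pearson lower bound:

    # Minimal sketch: how little a few successful walk tests say about POD.
    # Uses an exact one-sided Clopper-Pearson lower confidence bound.
    from scipy.stats import beta

    def pod_lower_bound(detections: int, trials: int, confidence: float = 0.95) -> float:
        """Lower confidence bound on the probability of detection (POD)."""
        if detections == 0:
            return 0.0
        alpha = 1.0 - confidence
        # Clopper-Pearson lower bound: the alpha quantile of Beta(k, n - k + 1)
        return beta.ppf(alpha, detections, trials - detections + 1)

    for n in (5, 20, 100):
        print(f"{n}/{n} detections -> POD >= {pod_lower_bound(n, n):.2f} at 95% confidence")

Even a perfect 5-for-5 result is consistent, at 95% confidence, with a true POD as low as roughly 0.55; it takes on the order of 100 clean trials before the lower bound approaches 0.97.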

POD vs. FAR is not easy to measure. Recordings of intrusion events -- and perhaps some events that tend to result in false alarms -- need to be repeatedly fed to the video analytics processing. This is probably impractical for the Avigilon cameras, but it should not be too difficult to feed recorded video to the Rialto.

Mike Rozmus

CEO, AgilityVideo LLC

John Honovich
Nov 13, 2014
IPVM

Mike, the fundamental differentiator we have found in testing various analytics (and motion detection) is false alarm factor / rate.

What this test shows is that the new Avigilon analytic camera, like the old VideoIQ one, continues to be quite strong at minimizing false alarms while providing fairly long / wide range. At the same time, it does this with no calibration. This is quite uncommon from what we have seen.

Anyone deploying video analytics should test it at their own facilities to determine how well it works at their site.

I strongly recommend not testing any analytics by running canned video clips through them, as it is unrealistic.

Geoff Thiel
Nov 14, 2014

John, I agree with Michael Rozmus: "I don't see how [IPVM's] test tells us much about False Alarm Rate."

IPVM's tests are perfectly valid for measuring range and sensitivity in favourable conditions such as an empty car park and an empty field in fine weather. However, such conditions are 'virtually sterile', with nothing else going on that might influence the VA performance. As a result the tests are seriously incomplete - and give no indication of what will happen as the weather and other conditions change. Here are some of the things that IPVM should have tested:

Busier conditions - all the tests have been carried out with just one or two targets. This is OK for certain perimeter protection scenarios. But how does the camera perform in a city street or shopping mall environment - does it get confused if people pass each other or stand close to each other?

Tree foliage movement due to wind. This is a very common source of false alarms in landscaped properties with trees and other vegetation.

Rapid lighting change due to the sun coming out from behind a cloud. This is a common source of false alarms and is also very likely to cause a temporary loss of sensitivity until the new background scene is learnt.

Camera sway caused by wind. This is likely to be devastating to VA performance unless the input video stream to the analytics is stabilised. Any externally mounted camera, whether it is on a bracket, pole or gantry is likely to move in windy conditions and many VA systems will be blinded in such circumstances.

And the list goes on - sensitivity to birds and wild animals, sensitivity to rippling water, sunlight glint on glass surfaces, people and vehicle shadows, moving car headlight beams, flags and balloons moving in the air.

IPVM is dismissive of testing video analytics with video clips. This puts you at odds with AgilityVideo LLC, VCA Technology Ltd and I would say virtually every video analytics developer on the planet whether they are working for a company, government or university. It is simply not possible to test video analytics thoroughly unless you use a video clip library to test over a range of conditions. Furthermore, if you are comparing different products against each other at different times, it is the only way of doing it in a scientifically rigorous and repeatable way.

Geoff Thiel, CEO, VCA Technology

John Honovich
Nov 14, 2014
IPVM

"IPVM is dismissive of testing video analytics with video clips."

Yes, we are dismissive of video clips, as pointing IP cameras at a monitor is unrealistic.

That said, testing live video in busy environments, with shaking poles, etc., is valid, and these are all things we will cover in upcoming analytic tests.

Geoff Thiel
Nov 17, 2014

John, you are correct in one respect - using video clips for testing does have some limitations, but it is easy enough to add additional tests that cover the missing areas.

The 'elephant in the room' is the wide range of 'Missing Tests' in IPVM's current reviews of analytic devices. I listed a dozen of the more important missing False Alarm tests in my previous response.

One of the key differences between a true VIDEO ANALYTIC system and VIDEO MOTION DETECTION system is in this critical area of false alarm performance. On a suitable day with no wind, no lighting changes and no precipitation a VMD may work just as well as an analytics system. But as we all know, after a few days of use, the difference will become very clear with the VMD generating many more false alarms.

You suggested that in future IPVM might introduce shaking pole tests and busier environments. That's good, as far as it goes, because you could improve the test coverage a bit without relying on video clips. But it would still leave two major problems: a) reproducibility – every test is a ‘one-off’ and so it is impossible for IPVM to exactly repeat the tests on another camera at a later time. b) Even if you add these two extra tests, the false alarm test coverage will still be well under 50%. For example, you would still not be covering some really common problems such as lighting changes, foliage movement, and precipitation. Ideally IPVM should be testing at least a dozen different scenarios to get the FAR test coverage nearer to 100%. Clearly the only practical solution to this is to build up a library of suitable video clips and use them for testing.

I would really like to see IPVM get the science of its analytics testing right – you provide the security industry with a valuable service. Certainly, live testing has its place as it is the only way of measuring certain range and illumination performance issues. However, testing with video clips is equally vital, because it is the only way of subjecting a camera to the effects of wind, rain, sunshine, different shadows, different lighting, etc. in a manageably short period of time.

I therefore suggest that IPVM should combine live testing and video clip testing to get the full picture instead of just part of it.

  1. Live testing – for range and sensitivity during day, night and IR illuminator conditions. In other words, something very similar to what IPVM did with the Avigilon H3.

  2. Testing with a library of recordings. These can cover all the remaining areas where lighting and resolution are not a significant issue. You need a suitable test room that has good quality blackouts and replay monitor facilities. It should be configured so that several cameras can be tested at the same time. Experience has shown that such tests can give an accurate and reproducible measure of POD and FAR performance. Furthermore, if you include a reference camera (one that you have tested before) in a side-by-side test with a new camera you can compare the two sets of reference camera results to double check that the test conditions have remained constant.

The alternative is for IPVM to extend its live testing with pole sway and crowds etc. This would be time consuming and you will struggle to get the FAR coverage up to even 50%, laying IPVM’s test methodology open to further criticism from industry. Furthermore, you will never be able to re-run exactly the same tests in the future, making any sort of meaningful product comparison over time impossible.

One final point: video analytics are also available as hardware encoders (Rialto I4 in Avigilon’s case) or as PC server software in other cases. These could be tested 100% from video clips and would provide a pure comparison of the video analytic engines on offer, without taking the camera performance into account. As one of the providers of video analytics software, VCA Technology would be interested in participating in such trials.

John Honovich
Nov 17, 2014
IPVM

Again, the problem with video clips is that they cannot meaningfully be used with IP cameras. We can't test the Avigilon analytic camera by using the Rialto as a proxy. It's crazy to point analytic IP cameras at monitors playing back video.

Also, many analytic systems claim they need time (hours, etc.) to learn a scene, making it inevitable that they would criticize us for showing random 30-second clips one after the other to make judgments on performance.

As for VCA, there's just not enough industry interest to warrant participation.

Jeroen Vendrig
Jan 06, 2015

Geoff, I agree that testing from video clips is the better way to measure progress. But I don't understand why it would be IPVM's job.

Firstly, there is an industry responsibility for self-assessment on a reference benchmark. The videos need to be fed to the algorithms directly (I hope the "pointing camera at a monitor" approach isn't used in any real tests!), which for integrated systems can be done by the manufacturers only. Even if everybody plays fair, this is not ideal of course, as it would test only a component of an integrated system.

Secondly, governments have stepped in, most notably the UK Home Office/CAST/CPNI ( [Withdrawn] Imagery Library for Intelligent Detection Systems - GOV.UK ). Their approach can be criticised because perfect evaluation is near impossible. But in return the industry should criticise itself for not making many successful submissions for certification. I understand there were 0 successful submissions in most CPNI categories. So there's a great opportunity to participate in an independent trial already.

Once the industry has shown it has reached a higher level of maturity, it is time to come back to this thread and ask IPVM to do more.

John Honovich
Jan 06, 2015
IPVM

Jeroen,

It does not matter what others in the industry do, we are not going to test using video clips. There are real problems to doing so.

Again:

  • Video clips will deliver unrealistic results for any system that claims to learn or calibrate, because you are switching to new scenes every 10 or 30 seconds.
  • Video clips that are known to manufacturers will allow them to game the system, which multiple analytic vendors have admitted to doing for other test processes that use clips.

Jeroen Vendrig
Jan 07, 2015

John,

As I said, I don't believe it's IPVM's job to do such evaluation.

However, may I recommend your team does an interview with the people from the UK Home Office (who originally set up the i-Lids challenge) on how they do evaluation and certification? If nothing else, I think it will make an interesting read for IPVM's audience. They can shine some light on the assumptions made in this discussion thread. E.g. their videos are much longer than just 30 seconds, and the first part of a video is reserved for learning/calibration. The videos are selected to cover a wide variety of circumstances, e.g. sun/rain/snow. I recall each of their scenarios has three data sets: one to train your analysis models, one to do self-assessment, and a secret data set used for independent verification. There is no capturing from a monitor involved.

To be sure, there are flaws in their setup, and their focus is on the needs of the UK (with a strong bias to analog cameras), especially protection of critical infrastructure rather than security of corner shops. However, if we are waiting for the perfect evaluation setup, we will probably not move forward at all.

In the example of i-Lids, the vendor will have to explain why its positive self-assessment on the public data set didn't lead to certification on the private data set, which should be a deterrent for cheaters. In addition, gaming a large public data set, using a real product, is not as easy as it may sound. I'm not familiar with the small data sets you refer to.

[Disclosure: I have not participated in the Home Office's certification process, but I do really appreciate their efforts to tackle this huge problem. So I'm a fan.]

John Honovich
Jan 08, 2015
IPVM

Jeroen,

I contacted them a while back. They declined to go into any details not publicly disclosed and they also declined to give any details about how well manufacturers did. They said I would have to contact manufacturers individually to find out their scores. I was not sure what else to do at that point.

Jeroen Vendrig
Jan 08, 2015

That's too bad, but thanks for trying!

- Jeroen

Geoff Thiel
Jan 13, 2015

Jeroen

Re ILIDS, we put our product through the ILIDS 'Sterile Zone' certification process (for intruder detection performance) and so I can shed some light on the tests. The training set, which is the only video that you get to see, consists of about 200 clips of about 5 minutes each. Most of the 5 minutes is 'lead-in' time to give the camera time to settle down and learn the scene, with the ‘intrusion event’ coming at the end and typically lasting 10-30 seconds. About half of the clips are 'true alarm' tests where there is an intruder in the scene and the other half are 'false alarm' tests where there are no intruders but there are lighting changes, rain, snow, wild animals/birds, camera sway, flickering flood-lights, video interference or something else that may cause a false alarm. There are only two different camera views (called stages) in the Sterile Zone test, and the manufacturer is invited to provide a configuration file appropriate for each stage with all the detection zones and other settings optimised for each view. The actual test is done with a different set of video clips taken from the same two camera views. The manufacturer is not allowed to be present during the testing and you never see the video test clips used. I suspect the Home Office also randomised the test sequences as an additional safeguard against cheating.

Bearing in mind that the UK government spent several million pounds of public money on the ILIDS program, the UK Home Office is incredibly cautious about who they will talk to and what they will say. If you are a bona-fide manufacturer, or an accredited research establishment, then there is no problem with getting the data set whether you are based in the UK or overseas. However if you are anyone else such as an integrator, user or journalist they won’t tell you anything. And as John Honovich discovered, they don’t even publish a list of certified companies. The justification for the secrecy is the UK’s data protection laws. The ILIDS imagery includes a lot of video from city centre cameras where people’s faces and vehicle license plates are identifiable – consequently the data must be kept private. Personally I think the Home Office has been far too cautious and could have given the program more publicity while still protecting the confidentiality of any sensitive data.

When we did the test, the Home Office only ran one Sterile Zone test session per year and I believe we were one of about 30 companies in that test session. From that, I guess that well over 100 products have been tested over the past 6 years, although some of these would have been re-tests for products that had previously failed. The test is pretty tough, and if our experience is anything to go by, getting the false alarm rate down to an acceptable level was definitely the most difficult thing to get right. As far as I can tell by trawling the internet, only about 10 companies, including VCA Technology, claim to be ILIDS certified. Geoff Thiel, CEO, VCA Technology.
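
For anyone who wants to score a clip library in this way, here is a minimal sketch (in Python; the clip IDs, labels and result format are hypothetical illustrations, not the Home Office's actual scoring tooling) of computing POD and FAR from per-clip results on a library split into 'true alarm' and 'false alarm' clips, as described above:

    # Minimal sketch of i-LIDS-style scoring over a labeled clip library.
    # Each clip is labeled as a true-alarm (intruder present) or false-alarm
    # (no intruder, only distractors) test, and we record whether the
    # analytics raised an alarm. Data structures are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class ClipResult:
        clip_id: str
        is_intrusion: bool  # ground truth: does the clip contain an intruder?
        alarmed: bool       # did the analytics raise an alarm on this clip?

    def score(results: list[ClipResult]) -> tuple[float, float]:
        """Return (POD, FAR): detections over intrusion clips, false alarms
        over benign clips."""
        intrusions = [r for r in results if r.is_intrusion]
        benign = [r for r in results if not r.is_intrusion]
        pod = sum(r.alarmed for r in intrusions) / len(intrusions)
        far = sum(r.alarmed for r in benign) / len(benign)
        return pod, far

    results = [
        ClipResult("stage1_clip001", True, True),
        ClipResult("stage1_clip002", True, False),   # missed intruder
        ClipResult("stage1_clip101", False, False),
        ClipResult("stage1_clip102", False, True),   # foliage false alarm
    ]
    pod, far = score(results)
    print(f"POD = {pod:.2f}, FAR = {far:.2f}")  # POD = 0.50, FAR = 0.50

Scoring a previously tested reference camera with the same function in each session, as suggested earlier in the thread, is a cheap way to verify that test conditions have stayed constant between runs.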

J. Michael Rozmus
Nov 13, 2014

We agree that false alarm rate (FAR) is the biggest differentiator among seemingly similar video analytics. And I have heard good things about the VideoIQ technology in this regard. But I don't see how this test tells us much about FAR. Most false alarms are caused by challenging events such as:

-- animals moving near the far range of human detection, especially directly at the camera

-- trees or other vegetation moving in a strong wind

-- a blowing piece of trash that is approximately the size of a human being

-- headlights reflecting from wet pavement, or any light reflecting from a puddle

-- a spider web reflecting NIR illumination at night

I also agree with you that a trial at the actual user site is an essential proof of performance. But recorded video can be very useful for comparison testing. For example, recordings of the false alarm stimuli listed above might enable a revealing comparison of different video analytics. We use a lot of recorded video sources at AgilityVideo in testing and improving our SmartCatch video analytics software.

Ethan Ace
Nov 13, 2014

In my experience, and as seen in testing, VideoIQ's false alarm rate is very low. We made mention of this in the test of the Rialto.

Neither the Rialto nor the new Avigilon H3A triggered on reflected lights or foliage in our test scenes. Blowing leaves did not trigger them (and those are plentiful right now).

Obviously, we do not test analytics in every possible scenario, so we can't account for all possibilities, as John said above. Without spending substantially more time in these tests to account for more environmental factors, spiders spinning webs, etc., we can only point out major issues. Here, we pointed out none because there were none in these scenes. That is not to say there will never be any, because someone on some site will always eventually find something.

Andrew Thomas
Nov 14, 2014

Analytics is going to remain subjective for the foreseeable future. Any test video that could become a standard could easily be exploited with an analytics engine that is optimized for the test video. Ultimately, the customer experience in their environment is going to be the deciding factor. All of the technical specifications and standards mean nothing to the person who uses the system and benefits from its capabilities. For example: when was the last time you bought a stereo system by looking at the spec sheets only? Listening to the system is the final decision-maker.

Andrew Thomas
Nov 17, 2014

Using clips or RTSP streams assumes the analytics are standalone and accept such input. As more and more analytics are integrated at the sensor level and support larger sensor sizes, real-world testing will be the only option.

Benjamin Davis
Jan 08, 2015

I'm just going to throw this out there with the following caveat: I'm just a DIY guy, not an integrator, and thus have a much more limited range of experience. I run a 5-camera system at my home using Hikvision cameras and Milestone.

So speaking as an end user, I'll just say that this looks extremely interesting to me. The biggest issue with my (any?) system, as some of you mentioned, is the false alarm rate. Specifically, the alarms that are tied to email notifications. I don't really care if the cameras are recording all the time due to squirrels and wind... hard drives are cheap.

What I want to know is if PEOPLE are in the back yard. Or, even cooler, if a person is loitering by the front window or rear alley gate. Any system that does even a B- job at providing these alerts is incrementally better than what I have now.

The ease of setup, since I am my own support group, is also very appealing.
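
For what it is worth, the alert policy described above (record everything, but only notify on person detections) is straightforward to express in software once the analytics expose a classified event. Here is a minimal sketch in Python; the event format and SMTP details are hypothetical glue, not a real Hikvision or Milestone API:

    # Minimal sketch: email only when a detection is classified as a person.
    # Everything else is still recorded, just never emailed. The event dict
    # and SMTP settings are hypothetical, not from any real VMS integration.
    import smtplib
    from email.message import EmailMessage

    ALERT_CLASSES = {"person"}  # ignore squirrels, wind-blown foliage, etc.

    def notify(event: dict) -> None:
        if event["object_class"] not in ALERT_CLASSES:
            return  # no email for non-person events
        msg = EmailMessage()
        msg["Subject"] = f"Person detected: {event['camera']}"
        msg["From"] = "cameras@example.com"
        msg["To"] = "owner@example.com"
        msg.set_content(f"{event['object_class']} at {event['camera']}, {event['time']}")
        with smtplib.SMTP("localhost") as smtp:
            smtp.send_message(msg)

    notify({"object_class": "person", "camera": "back yard", "time": "2015-01-08 21:14"})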

Paul Grefenstette
Feb 07, 2015

Avigilon 2MP analytics video from our Griffon Systems Office

John, here is a link to some video we pulled last year testing the Avigilon 2MP bullet camera with analytics. It continues to get better in detection, but it's not foolproof yet. It works better covering a wider shot, without cars or people walking directly toward the camera. I will post some more video in different conditions. Feel free to recommend different tests.

matthew Harris
Feb 08, 2015
IPVMU Certified

You did not mention the focal length or horizontal field of view on target as related to your tests. It's my understanding that this is critical, as performance falls off rapidly below 7 PPF. Would you please elaborate on this? Thanks.

matt Harris
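
On the PPF question above: pixels per foot follows directly from a camera's horizontal resolution, its horizontal field of view (HFOV) and the distance to the target. A minimal sketch in Python; the resolution and HFOV below are illustrative, not taken from this test:

    # Minimal sketch of pixels-per-foot (PPF) on target. The scene width at a
    # given distance follows from the HFOV; PPF is the horizontal resolution
    # divided by that width. Example numbers are illustrative only.
    import math

    def ppf(h_resolution_px: int, hfov_degrees: float, distance_ft: float) -> float:
        scene_width_ft = 2 * distance_ft * math.tan(math.radians(hfov_degrees) / 2)
        return h_resolution_px / scene_width_ft

    # A hypothetical 1080p camera with a 60 degree HFOV:
    for d in (50, 100, 200, 300):
        print(f"{d} ft -> {ppf(1920, 60.0, d):.1f} PPF")

For this hypothetical camera, PPF drops below the 7 PPF threshold mentioned above somewhere between 200' (about 8.3 PPF) and 300' (about 5.5 PPF).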