Analytics Use Questions

Now that analytics have been on the market for more than a decade, my questions for integrators and end users are:

1) What analytics (trip wire, object left behind, etc.) have you successfully implemented, if any?

2) Are these analytics being used for proactive video monitoring or forensic review?

3) If the analytics are generating a video alarm, how is it being displayed to the guard staff (pop-up video, etc.)?

4) What percentage of false positives are you seeing?


Summing up what we are seeing:

1) Tripwire, aka cross-line detection, is what we have seen most successfully implemented. I bet successful object-left-behind implementations exist, but they are rare. If you set a short alert period on objects, you'll get lots of nuisance alerts; if you set a long period, it's too late for the exceedingly rare real threat.

2) Analytics are typically being used for proactive monitoring because that is where the big value is (i.e., in stopping things right away) rather than documenting disasters. There are a handful of companies specializing in analytics for forensics (like 3VR and BriefCam).

3) Display is driven by what the operator set up. If there is a command center with a dedicated operator, it's probably a pop-up or a new entry in a list of alarms.

4) False positives are all over the place. Systems that are acceptable / tolerated typically do not have more than 1 false alarm per camera per day. Bad systems can have multiple per hour, depending on the time of day, weather, system setup, etc. (just ask the White House).
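For illustration, the cross-line detection mentioned in point 1 boils down to a segment-intersection test: an alarm fires when a tracked object's movement between two frames crosses the configured wire. This is a generic sketch of that idea, not any vendor's implementation; the function names, coordinates, and rule shape are all invented for the example.

```python
def _ccw(a, b, c):
    """True if points a, b, c are in counter-clockwise order."""
    return (c[1] - a[1]) * (b[0] - a[0]) > (b[1] - a[1]) * (c[0] - a[0])

def segments_intersect(p1, p2, q1, q2):
    """Standard orientation test: does segment p1-p2 cross segment q1-q2?"""
    return (_ccw(p1, q1, q2) != _ccw(p2, q1, q2) and
            _ccw(p1, p2, q1) != _ccw(p1, p2, q2))

def crosses_tripwire(prev_pos, curr_pos, wire_start, wire_end):
    """An object trips the wire if its frame-to-frame path intersects it."""
    return segments_intersect(prev_pos, curr_pos, wire_start, wire_end)

# A horizontal tripwire across the scene, then one track step that crosses
# it and one that stays on the same side.
wire = ((0, 5), (10, 5))
print(crosses_tripwire((3, 2), (3, 8), *wire))   # crossing path -> True
print(crosses_tripwire((3, 2), (7, 4), *wire))   # same side -> False
```

Real products layer direction filters, object-size filters, and debounce timers on top of this core test, which is where most of the false-alarm tuning discussed above happens.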

Related: Market Guide and Favorite Video Analytics 2014 has survey results / statistics on usage.

I'll answer number 4: analytics still have problems with flashes of light, whether from street lamps or lightning, reflections off rain, and bugs that fly in front of cameras during certain seasons (they're drawn to white or IR light, so try to keep illuminators at a distance from the camera itself). If the scene is steady and the weather clear, you get what I call 80% accuracy. We use VMS-native analytics, as 3rd-party solutions still haven't made integrations that easy. Avigilon's Rialto seems to have the best idea, but we haven't tested it yet.

We have done a variety of analytics indoors and outside with thermal and optical cameras. This has all been done for live viewing to assist operators in recognizing certain situations. In essence, a viewing aid. The best advice is: don't commit to any manufacturer/technology without a live demo in the situation you are trying to use it in. Everybody has an opinion on what is acceptable, but only you, the customer, can make that determination. It's virtually impossible not to have some sort of false trip at some time while still detecting what you need; it's a balance.

We have used a variety of technologies and manufacturers for analytics to give the customer whatever solution they are looking for. Pop-ups, flashing lights, alarm messages, emails, texts, integration into lighting systems: it's all been done. This may not be the answer you're looking for, but I feel it's honest given the question. With an unlimited budget and time, anything is possible.

The primary challenge with analytics is the deployment and configuration of the rules-based systems out there. As an integrator, we have deployed and watched several systems fall flat on their faces in terms of functionality and ease of use. A year back we tested a fully automated product. It was literally a plug-and-play type system with minimal configuration, and it came already integrated into the Milestone interface. The plus side: it would pick up anything it considered to be an anomaly from the norm of activities in front of that camera. The downside: you couldn't write specific rules for certain scenes. We installed a unit and had it running across 100 cameras, and the false alarm rates were very low. The surprising thing was that over 98% of the events that popped up were in fact relevant to the operators. Name of the product: iCetana, from Australia.

D, for iCetana, can you elaborate on what type of events were part of the 98% relevant ones? How did the system know what was 'relevant'?

I'm in for more details as well. Unfortunately their website doesn't really elaborate on whether or not you have the option of setting rules, regions of interest, etc. Seems like an interesting take on analytics in general though.

John, a few examples: in a hotel environment, the system picked up a lost and unaccompanied child walking around in circles in the lift lobby area (before the kid was reported missing); jet skis passing through a normally untrafficked water area; some drunk people picking up a vase and running across the hallways; a couple loitering in the corridors in the late hours where the man lifts her skirt up; driving against traffic in the parking lot; and a number of other examples.

The difference we see in this product is frankly the fact that it automates intelligence based on what in theory is a pretty basic concept. The system gets to learn what a typical scene looks like and keeps a memory of that area, so when something out of the ordinary occurs, it identifies the difference in scenery. As per our technical understanding, that does not mean the system is able to distinguish what is actually happening in the scene, but rather the difference occurring in the scene.

For example, in a metro station, a camera looking at the waiting platform would "learn" that a typical crowd scene is people standing within a certain area waiting for the trains to pull up. If someone suddenly jumps onto the tracks, the system is able to recognize that the tracks don't typically have people walking on them, so it identifies that as an anomaly and alerts the control room with a brief 45-second pop-up. The operator can choose to acknowledge, react, and add a comment to that event, or else it just disappears and is logged as an event.

Take the example of the unaccompanied child above. The system is used to seeing adults, or even kids, standing around waiting for the lifts. When it "saw" a child walking around in circles, and "identified" the height of the child and the fact they were unaccompanied, it took these factors into its assessment and concluded: this lobby does not typically have 4-year-olds walking around alone in circles unaccompanied by an adult.
That said, you cannot write rules for the system to look out for. So in the case of one of our clients, they have a helicopter landing and taking off on a regular basis from the helipad. Because this is a regular event, the system started ignoring it as an anomaly, whereas the client wanted a logged event every time the chopper landed. So there are pros and cons to the system, but given that we are in the Middle East, the capability of operators in the control room is fairly limited, and the volume of cameras installed at particular sites runs into the hundreds, even thousands, we would say iCetana is a good choice due to the non-human-interference factor. Ideal would be to deploy a combination of both rules-based and automated analytics technologies on sites, but to date analytics is seen as a "nice to have," not an essential part of the security infrastructure yet.
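The "learn what's normal, flag the difference" behaviour described above, including the helicopter being gradually absorbed into the baseline, can be sketched with a simple per-cell statistical model. This is purely illustrative and is not iCetana's actual algorithm; the class name, grid size, learning rate, and threshold are all assumptions made for the example.

```python
import numpy as np

class SceneBaseline:
    def __init__(self, grid=(8, 8), alpha=0.01, threshold=4.0):
        self.mean = np.zeros(grid)   # learned typical motion per grid cell
        self.var = np.ones(grid)     # learned variance per grid cell
        self.alpha = alpha           # learning rate (slow adaptation)
        self.threshold = threshold   # anomaly threshold in std deviations

    def update(self, motion):
        """Feed one frame's per-cell motion energy; return anomalous cells."""
        motion = np.asarray(motion, dtype=float)
        z = np.abs(motion - self.mean) / np.sqrt(self.var)
        anomalies = np.argwhere(z > self.threshold)
        # Exponential moving average: recurring events (e.g. a helicopter
        # landing daily) are gradually absorbed into the baseline, which is
        # exactly the pro and the con discussed above.
        self.mean += self.alpha * (motion - self.mean)
        self.var += self.alpha * ((motion - self.mean) ** 2 - self.var)
        return [tuple(c) for c in anomalies]

# Train on quiet frames, then show one frame with a motion burst in a cell
# that is normally empty (the "person on the tracks" case).
model = SceneBaseline(grid=(4, 4))
rng = np.random.default_rng(0)
for _ in range(500):
    model.update(rng.normal(0.0, 0.1, size=(4, 4)))  # typical scene noise
frame = np.zeros((4, 4))
frame[2, 3] = 5.0                                    # sudden burst of motion
alerts = model.update(frame)
print(alerts)                                        # the burst cell is flagged
```

Note that nothing here knows what a person or a helicopter is; it only scores how far current activity sits from the learned norm, which matches the "difference in the scene, not understanding of the scene" caveat above.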

So iCetana is doing what BRS Labs in the US claims to do, yes?

I am not criticizing. I am just amazed to see anyone else trying or doing that.

I'm not familiar with the capabilities of BRS, so unfortunately I cannot comment. But as an analytics sceptic since 2007, and having toyed with different systems including iOmniscient, Delopt, AllGoVision, and ISS, we have settled on proposing only Ipsotek (UK) and iCetana (Australia) to our customers, sometimes in combination, mostly because Ipsotek is rules-based and iCetana automated.

Frankly, to date we still treat analytics as a value-add to the overall security infrastructures we deploy. If you are interested in knowing more about iCetana, I can put you directly in touch with their Australian sales rep. I'm sure they would be happy to provide you with a demo unit for trial purposes. We have deployed their server on 3 different sites to date, and on all 3 sites (2 hotels and a car dealership) it has performed.

The real question, which I suspect is impossible to answer, is: "What is the system missing whilst analyzing the video streams?" I can only pass judgement on the alerts received, most of which have some relevance to security and operations.

One last thought: in terms of use in the control room, I find it very effective from an attention-alerting perspective. The fact that the screen remains blacked out, and only on alerts does a video pop up in a cube, means the operator doesn't become overwhelmed with alerts and eventually disable the system or completely ignore it. The video disappearing after 45 seconds, yet still being logged in the events list, makes it a useful tool for even the laziest of control room operators.

And D, based on your experience, could you give a rough estimate of the accuracy rate of Ipsotek and iCetana? 80%? 90%? Do they tend to false detect more than miss detections, or the other way around? Thanks.

Luis, it is impossible for anyone to really give you a statistic on what these systems are missing, unless you sit in front of the recorded video, do a manual viewing of hours of footage, and note what you deem important versus what the system was programmed to pick up. On the flip side, today we had an interesting meeting with the guys from Ganetec. They basically have a different concept to anyone we seem to have come across. They are in essence an analytics platform that is fully integrated with Milestone. On this platform they take modules from various analytics providers based on the particular feature strength each provider is known for. So they aren't an analytics provider but rather a platform. Has anyone used them or have any comments regarding this approach?
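The manual-review comparison described above, matching hand-labelled ground-truth events against the system's alerts to count hits, misses, and false alarms, can be sketched in a few lines. The event representation (timestamps in seconds) and the 30-second matching tolerance are assumptions for the example, not anything any of these products expose.

```python
def score(ground_truth, alerts, tolerance=30):
    """Event times in seconds; an alert within `tolerance` of a
    ground-truth event counts as a hit for that event."""
    hits = sum(any(abs(t - a) <= tolerance for a in alerts)
               for t in ground_truth)
    misses = len(ground_truth) - hits
    false_alarms = sum(all(abs(a - t) > tolerance for t in ground_truth)
                       for a in alerts)
    return {"hits": hits, "misses": misses, "false_alarms": false_alarms}

# Three labelled incidents; the system alerted near two of them and
# once on nothing.
print(score(ground_truth=[100, 400, 900], alerts=[95, 410, 1300]))
# -> {'hits': 2, 'misses': 1, 'false_alarms': 1}
```

The expensive part is, of course, producing the ground-truth list in the first place, which is exactly why the miss rate usually goes unmeasured.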