Member Discussion

Standardized Testing For Video Analytics Accuracy?

Is there industry-wide accepted standardized testing for determining video analytics accuracy? We are doing our own testing, but it would be better to conform to a standard.

Hello Undisclosed Manufacturer,

i-LIDS is the only one I am aware of.



In short, no.

And I'm not sure it would be of much benefit. The best test is probably an evaluation at your own site.

There are things like i-LIDS, but IMO it's not a very realistic test scenario. It may or may not include scenarios that match your environment, anticipated camera coverage area, or other factors.

The bigger question is: what do you mean by "video analytics"? This could range from detecting people indoors with a cheap camera to detecting the wrong objects on a moving conveyor belt.

I think i-LIDS is helpful in the UK / Commonwealth but has limited acceptance / impact elsewhere.

A key question is what do you want to accomplish with your testing:

  • Marketing benefit - use it to convince prospects to buy?
  • Technical benefit - use it to ensure the product works well in real use?

For the former, i-LIDS is the 'best' of a very limited number of options, and even then the marketing impact is not that big.

For the latter, there's nothing 'accepted', but my advice is not to sandbag it. Manufacturers frequently choose easy scenarios and then fool themselves into thinking the product works in general. Pick a few harder customer sites and send a senior engineering / product person there for a few days to test performance and hear the customer's concerns / feedback firsthand.

Hi, what types of video analytics do you want to test? We have some datasets we could provide (and we're working on making those datasets free to the community, but it will take some time).

If you need test video data, we have a database of public live-streaming cameras in 120+ countries, accessible via API (and RTSP/RTMP/...).


i-LIDS is pretty good for checking how analytics perform against outdoor disturbances such as rain, fog, snow, fast lighting changes (sun, clouds, etc.), foliage movement, small animals (lots of foxes and rabbits!), and day-night transitions. These are the things you do not want your analytics to trigger on. Detecting is easy; not detecting what you do not want to detect is hard. The i-LIDS dataset is pretty good on that last part.
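That trade-off (real detections vs. nuisance triggers) is easy to quantify once you have hand-labeled ground truth for a test clip. A minimal sketch of one way to score it; the event times, matching-by-timestamp approach, and the 2-second tolerance are illustrative assumptions, not part of i-LIDS or any standard:

```python
def score_detections(detections, ground_truth, tolerance=2.0):
    """Match each detected event time (seconds) to at most one ground-truth
    event within `tolerance` seconds; everything left over is a false alarm
    (analytic fired on rain, foliage, animals...) or a miss."""
    unmatched = list(ground_truth)
    true_positives = 0
    false_alarms = 0
    for t in sorted(detections):
        hit = next((g for g in unmatched if abs(g - t) <= tolerance), None)
        if hit is None:
            false_alarms += 1      # nothing real happened near this trigger
        else:
            unmatched.remove(hit)  # each real event may only count once
            true_positives += 1
    misses = len(unmatched)        # real events the analytic never caught
    return true_positives, false_alarms, misses

# Hypothetical example: analytic fired at 10s, 11s, and 50s; the clip's
# labeled intrusions were at 10s and 30s.
tp, fa, miss = score_detections([10.0, 11.0, 50.0], [10.0, 30.0])
print(tp, fa, miss)  # 1 true positive, 2 false alarms, 1 miss
```

Tracking false alarms per hour of footage alongside the detection rate makes it obvious when an analytic only looks good because the test scenario was easy.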