Critical Or Pointless? Video To Thousandths Of A Second (0.000s)

Question for the group: In what situations is video searchable or indexable to thousandths of a second needed or useful?

Many, if not most, VMSes display three decimals of granularity in video search widgets. Why?

I remember 'searchable video to 0.001s' being written into a few RFPs several years ago, which at the time excluded several platforms. But other than that, I am unclear on the real value of this feature.

Thanks for sharing your thoughts.

Network Time Protocol (NTP) is a protocol for synchronizing and managing time over networks. Although NTPv4's timestamp format has sub-nanosecond resolution, in practice it coordinates time across systems at roughly the millisecond level.

I wonder if these systems have simply chosen to use NTP, and display the precision actually achievable with it, which is on the order of three decimal places.
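A quick back-of-the-envelope check in Python, as an illustration: the NTP wire format carries a 32-bit fraction-of-a-second field, so the format's resolution is far finer than what real network conditions let you achieve (the ~1 ms practical figure below is an assumption, not a measurement).

```python
# NTP timestamps carry seconds plus a 32-bit fraction of a second,
# so the wire format can represent time steps of 2**-32 s.
wire_resolution = 2 ** -32            # seconds, ~2.3e-10 (sub-nanosecond)

# Practical accuracy over a network is limited by variable, asymmetric
# path delay; ~1 ms is a commonly cited figure for a decent network.
practical_accuracy = 1e-3             # seconds (assumed, not measured)

print(f"format resolution:  {wire_resolution:.2e} s")
print(f"practical accuracy: {practical_accuracy:.0e} s")
# Three decimal places (0.001 s) line up with the practical figure,
# which may be why VMS search widgets stop there.
```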

To pick a nit further: that is actually 30 frames, with an implied starting frame at 0.000. Anyway, as stated, the value to the user is minimal unless there are cameras at 100fps or more. It does make sense on the development side, though, so the design doesn't have to change for 100fps+ cameras. Specifying it without 100fps+ cameras doesn't say much about the value of the system being spec'd... just another weird spec. Or is it?

With three decimals, at 30fps, it will be like 0.031, 0.061, 0.091.

Actually, just to pick a nit, you'd be counting 0.033 (one frame), 0.067, 0.1, 0.133, 0.167 ... up to 0.967 (29 frames).

Of course, that's also assuming no dropped frames, no variation in network latency (for IP cameras), and about a bazillion other factors...
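To make that frame-time arithmetic concrete, here's a small Python sketch (assuming a constant 30fps with frame 0 at t = 0, and none of the dropped-frame/latency caveats above):

```python
FPS = 30

# Ideal capture times for one second of 30fps video, frame 0 at t = 0.
frame_times = [round(i / FPS, 3) for i in range(FPS)]

print(frame_times[:5])   # [0.0, 0.033, 0.067, 0.1, 0.133]
print(frame_times[-1])   # 0.967 (the 29th frame after frame 0)

# Three decimals are enough to give every frame a distinct timestamp.
assert len(set(frame_times)) == FPS
```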

Right, that's why I stated most security applications could be rounded to hundredths. As a spec in video surveillance it seems strange, unless they were also spec'ing some really special 100fps+ cameras...

Sounds like one of those goofy things that gets put in RFPs by someone who wants to weigh the bid for someone they favor.

Seems utterly pointless if your video CAN'T go over 30fps anyway. Although, if you really need that kind of granularity, I suppose it's going to be more consistent than specifying video frames, assuming your frame rate varies from one camera to the next (i.e., some are 30fps, some are 10fps, some are 4fps, etc.).

The next point is: does it really matter if your time sync isn't accurate to that degree? Or, for that matter, if whatever you're comparing against isn't perfectly in sync as well? For example, if you need to match video to sales receipts, the POS system would have to be synced to within 1/1000 of a second of the DVR for that kind of resolution to be relevant.
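As a sketch of that point, here's a hypothetical nearest-frame lookup in Python that refuses a POS-to-video match when the clock-skew budget makes the pairing meaningless. The helper name and tolerances are made up for illustration:

```python
def nearest_frame(event_time, frame_times, max_skew):
    """Return the frame timestamp closest to a POS event time,
    or None if even the closest frame is outside the skew budget."""
    best = min(frame_times, key=lambda t: abs(t - event_time))
    return best if abs(best - event_time) <= max_skew else None

frames = [i / 30 for i in range(300)]        # 10 s of ideal 30fps timestamps

# With a generous 1 s skew budget, the match succeeds...
print(nearest_frame(4.210, frames, max_skew=1.0))     # 4.2 (closest frame)
# ...but demanding 1 ms accuracy fails unless the clocks truly agree.
print(nearest_frame(4.210, frames, max_skew=0.001))   # None
```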

Thanks, Joshua.

So, for verification of frames rates, how does the third decimal help?

With two decimals, at 30fps, it will be like 0.03, 0.06, 0.09.

With three decimals, at 30fps, it will be like 0.031, 0.061, 0.091.

I am sure there's some variation, instead of perfectly equal intervals every time, but I am not sure who would benefit from knowing that.

Is the use case only for 100fps+ or?

Verification of frame rates and the time of exact frames within video. It could be rounded to hundredths in most security cases - but then, if any cameras recorded on the platform run at 100fps or more, the programming would need to change. In machine vision, it has to go to hundred-thousandths of a second.
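A quick pigeonhole illustration in Python of why hundredths stop working once frame rates pass 100fps: 120 frames in one second cannot all get distinct two-decimal timestamps, while three decimals still tell them apart (120fps is just an example high-speed rate).

```python
FPS = 120  # e.g. a high-speed / machine-vision style camera

times = [i / FPS for i in range(FPS)]        # one second of ideal frame times

two_dec = [round(t, 2) for t in times]
three_dec = [round(t, 3) for t in times]

# Only 100 distinct hundredths exist in [0, 1), so 120 frames must collide.
assert len(set(two_dec)) < FPS
# Thousandths still give every frame its own timestamp at 120fps.
assert len(set(three_dec)) == FPS
```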