Verification of frame rate and the exact time of individual frames within video. Rounding to hundredths of a second would be enough in most security cases - but if any camera on the platform records at 100fps or more, the programming would need to change. In machine vision it has to go to hundred-thousandths of a second.
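A quick sketch of why hundredths of a second stop working at 100fps and above: the frame period drops to 0.01s or less, so two decimal places can no longer uniquely identify a frame. This is illustrative code, not taken from any actual VMS; the function names are made up.

```python
# Sketch: at what frame rate does rounding timestamps to hundredths
# of a second stop uniquely identifying frames? (Hypothetical helpers.)

def frame_times(fps, n):
    """Timestamps (seconds) of the first n frames at a constant fps."""
    return [i / fps for i in range(n)]

def collisions_at_hundredths(fps, n=100):
    """Count frames whose timestamps collide when rounded to 0.01 s."""
    rounded = [round(t, 2) for t in frame_times(fps, n)]
    return len(rounded) - len(set(rounded))

# 30 fps: frame period ~0.0333 s, so two decimal places are enough.
print(collisions_at_hundredths(30))   # 0 collisions
# 100 fps: frame period is exactly 0.01 s -- right at the limit;
# any timing jitter would start producing duplicate timestamps.
print(collisions_at_hundredths(100))  # 0 collisions
# 120 fps: frame period ~0.0083 s, shorter than the rounding step,
# so distinct frames share the same rounded timestamp.
print(collisions_at_hundredths(120))  # collisions > 0
```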
Sounds like one of those goofy things that gets put in RFPs by someone who wants to weight the bid toward someone they favor.
Seems utterly pointless if your video CAN'T go over 30fps anyway, although if you really need that kind of granularity, I suppose it's going to be more consistent than specifying video frames, assuming your framerate itself varies from one camera to the next (e.g., some are 30fps, some are 10fps, some are 4fps, etc.)
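To make the consistency point concrete, here's a toy sketch (camera names and rates are invented for illustration) showing that the same instant lands on a different frame number on every camera, which is exactly why a timestamp travels better than a frame count:

```python
# Sketch: why a timestamp is a more portable reference than a frame
# number when cameras record at different rates (hypothetical setup).

CAMERA_FPS = {"lobby": 30, "loading_dock": 10, "parking": 4}

def frame_at(camera, t_seconds):
    """Frame index containing time t on a given camera (constant fps assumed)."""
    return int(t_seconds * CAMERA_FPS[camera])

# The same instant, 12.5 s into the recordings, is a different frame
# number on every camera -- but the timestamp means the same thing everywhere:
for cam in CAMERA_FPS:
    print(cam, frame_at(cam, 12.5))
# lobby 375
# loading_dock 125
# parking 50
```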
The next question is: does it really matter if your time sync isn't accurate to that degree? Or, for that matter, if whatever you're comparing against isn't perfectly in sync as well? If you're matching video to sales receipts, for example, the POS system would have to be synced to within 1/1000s of the DVR for that kind of resolution to be relevant.
Network Time Protocol (NTP) is a network-based protocol for synchronizing and managing time over networks. Although NTPv4's timestamp format has sub-nanosecond resolution, in practice it can only coordinate time across systems to roughly the millisecond level.
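The gap between the format and the reality is worth quantifying. NTP timestamps carry a 32-bit fractional-second field, so the wire format itself resolves about 233 picoseconds; the "1 ms" figure below is a commonly quoted ballpark for real-world network sync, not something measured here:

```python
# Sketch: NTP wire-format resolution vs. realistic sync accuracy.
# NTP timestamps use a 32-bit fraction-of-a-second field, so the
# smallest representable step is 1 / 2**32 of a second.

format_resolution = 1 / 2**32          # seconds per LSB of the fraction field
print(f"{format_resolution * 1e12:.0f} ps")  # ~233 ps -- sub-nanosecond

# Typical achievable sync over a network is on the order of
# milliseconds (assumed ballpark), i.e. millions of times coarser
# than the format could in principle express.
typical_sync_error = 1e-3              # 1 ms, assumed real-world figure
print(f"{typical_sync_error / format_resolution:.0f}x coarser")
```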
I wonder if these systems have simply chosen to use NTP, and chosen to display the precision it can realistically achieve, which is on the order of three decimal places.