Recently, while helping a non-tech proprietor friend configure their newly acquired NVR, I had one of those unexpected moments that seem to come when you are explaining something you know comparatively well to someone who doesn't. And then they ask the question: "How come, if the frame rate is set to 10 FPS, are there many seconds with only 7 frames and some with over 11?" I gave a smug, assured answer that the configured frame rate is really just a target, and the indicated frame rate just an average. Then came the "well whatever, but it's way jerky" and "the old one didn't do that." Mind you, there is no integrator to blame for failed expectations; the patient self-medicated through the internet, on the strength of confidence gained mastering the old analog system. So we played with the frame rate a bit, but there was still significant "jerkiness" and the occasional quarter-second hiccup. We have all seen this to some degree, so maybe we're hardened to it, but seeing it through the eyes of someone naive can make you think again.
So how do you know what's normal jitter and what's indicative of something actually awry? 10%, 20%, or anything you can see at all? Does the recorder use the camera's per-frame timestamp to smooth out jitter on playback? Are there any IP cameras/NVRs known to be super smooth? As good as a consumer camcorder? Thanks.
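For anyone wanting to put a number on "how much jitter," here's a rough sketch of the math I have in mind: take the per-frame arrival times (these could come from, say, `ffprobe -show_frames` on an exported clip, or a packet capture) and express the standard deviation of the inter-frame intervals as a percentage of the mean interval. The timestamp lists below are made-up examples, not real camera data.

```python
# Rough sketch: quantify frame-timing jitter from a list of frame
# arrival times in seconds. Jitter is reported as the standard
# deviation of inter-frame intervals, as a percent of the mean interval.

def jitter_stats(timestamps):
    """Return (mean_fps, jitter_pct) for a list of frame times."""
    # Successive differences = inter-frame intervals.
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    mean = sum(intervals) / len(intervals)
    var = sum((x - mean) ** 2 for x in intervals) / len(intervals)
    return 1.0 / mean, 100.0 * (var ** 0.5) / mean

# Perfectly paced 10 FPS: jitter should be ~0%.
even = [i * 0.1 for i in range(11)]
print(jitter_stats(even))

# Alternating 50 ms / 150 ms gaps: still averages 10 FPS,
# but with 50% jitter -- visibly jerky despite the "correct" FPS.
uneven = [0.0, 0.05, 0.2, 0.25, 0.4, 0.45, 0.6]
print(jitter_stats(uneven))
```

The second example is exactly the situation I'm describing: the averaged FPS the NVR displays looks fine, while the frame spacing is all over the place.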