I'm embarrassed to ask this question, but I can't get it out of my mind.
I went to a seminar today put on by one of our manufacturers' reps, and it was excellent.
One of the manufacturers gave a presentation about the high frame rate their camera is capable of compared to other manufacturers'. His comments left me wondering whether I have been wrong all these years, and I need an answer.
Here's the scenario:
A typical camera is capable of 30 frames per second (FPS).
He asked this question: what happens to the frame rate if you turn on WDR?
Answer: it cuts the frame rate in half (15 FPS).
You program your VMS to record at 9 FPS.
That leaves you 6 FPS for everything else (15 FPS consumed by WDR plus 9 FPS for recording is 24 FPS, out of the camera's 30).
Your live stream is now 6 FPS because that is what you have left.
Now a customer logs in through the web app, but there are no frames left.
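For my own sanity, here is the presenter's arithmetic written out as a quick sketch. The FPS numbers are his; the "shared frame budget" model the code assumes is exactly the claim I am questioning, not something I am asserting is true:

```python
# A sketch of the presenter's "shared frame budget" arithmetic, using his
# numbers. This only reproduces his math; whether IP cameras actually
# divide frames this way is the question I am asking.

SENSOR_FPS = 30     # a typical camera's maximum
WDR_COST = 15       # his claim: WDR consumes half the frame rate
RECORDING_FPS = 9   # what the VMS is programmed to record

used = WDR_COST + RECORDING_FPS    # 15 + 9 = 24 FPS spoken for
remaining = SENSOR_FPS - used      # 30 - 24 = 6 FPS left for live viewing

print(f"Frames spoken for: {used} FPS")           # 24 FPS
print(f"Left for live streams: {remaining} FPS")  # 6 FPS

# Under this model, a second viewer (say, a customer logging in through
# the web app) would get 0 FPS -- which is the part that sounds wrong to me.
```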
He acknowledged that some cameras have dual streams, so it's still possible, but he went on to say that his camera can do 60 FPS, and that the next version, coming soon, will do 120 FPS. With frame rates that high, he said, you will never run short of frames, whatever your situation.
Is this how IP video works? Do the frames get divided up among the users live streaming and the recording server? I asked four respected industry associates this question today, and they all had the same reaction I did, which is why I am asking all of you.
Is this correct, and if so, how many of us don't know it?