Or am I doing something terribly wrong? I've noticed that several major VMS products really stress a client workstation's CPU when displaying multiple camera images for live viewing, much worse than in the old DVR days. I tend to believe it's a function of multi-megapixel cameras and H.264 compression.
One example is the control room of a correctional facility we installed with 2 monitoring client PCs (Dell i5 processors and Nvidia graphics cards). Each workstation has 2 large LCD displays where staff monitor the facility. If we place, for example, 16 camera views on each monitor, CPU utilization goes to 100%, yet network utilization is only 2-3%. So the bandwidth is low, but the decoding must really stress the CPU. The only solution we've found, after discussing it with the VMS company, is to slow the live view frame rate way down. That greatly reduces CPU load, but makes the live view very choppy. We've seen this with other VMSes as well.
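For what it's worth, a quick back-of-envelope sketch shows why the decoder gets hammered even at 2-3% network utilization. The numbers here are assumptions I'm plugging in for illustration (2 MP cameras at 30 fps full rate), not measurements from our site:

```python
# Rough decode-load estimate for one workstation in the setup above.
# Hypothetical assumptions: 2 MP cameras, 30 fps, 16 views x 2 monitors.
views = 16 * 2              # 16 camera views on each of 2 monitors
mp_per_camera = 2.0         # assumed 2-megapixel streams
fps_full = 30               # assumed full live frame rate
fps_reduced = 5             # a slowed-down live view rate

full_load = views * mp_per_camera * 1e6 * fps_full      # pixels/sec to decode
reduced_load = views * mp_per_camera * 1e6 * fps_reduced

print(f"Full rate:    {full_load / 1e9:.2f} gigapixels/sec")   # 1.92
print(f"Reduced rate: {reduced_load / 1e9:.2f} gigapixels/sec") # 0.32
```

Under those assumptions the CPU is being asked to decode nearly 2 gigapixels per second of H.264, which is why dropping the frame rate by 6x helps so dramatically even though the compressed bitstream barely touches the network.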
Has anyone else noticed this or discovered any cures for it (short of installing quad Xeons)?