When we tested several different VMS platforms, this was usually the case. Exacq was the only platform we found that wasn't a resource hog while viewing multiple HD cameras. Unfortunately, if you are using Milestone, your only options are to build an insanely fast box to run it, use lower-quality camera streams, or switch the video compression from H.264 to MJPEG.
As for why Exacq runs more efficiently than Milestone, it's simple. Exacq was built from the ground up without developer frameworks like Microsoft .NET or DirectX. Using those frameworks lets companies get to market quickly, but the result is bloatware: software dragging along all sorts of extra dependencies that aren't fully used. That's why the Exacq client install is 25MB while the Milestone Smart Client is 220MB. That's a whole lot of extra code running in the background and loaded into memory.
Why are you running "full HD" to the client? Presumably you mean 1080p, which is also likely the resolution of your monitors, meaning each camera is displayed at well under 1080p when you have 4 cameras up on 1 screen. Yet you're still moving all that data across the network and decoding it on the client. Can you set up the client to view a lower-res camera stream while still recording 1080p?
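To put rough numbers on that point, here is a back-of-the-envelope sketch (my own illustration, assuming four 1080p streams tiled 2x2 on a single 1080p monitor):

```python
# Pixels decoded vs. pixels actually displayed when tiling four
# full-HD streams on one full-HD monitor.

MAIN_W, MAIN_H = 1920, 1080      # camera main stream (1080p)
MON_W, MON_H = 1920, 1080        # monitor resolution
TILES = 4                        # 2x2 grid

decoded = TILES * MAIN_W * MAIN_H          # pixels the client must decode per frame
displayed = MON_W * MON_H                  # pixels the monitor can actually show
tile_w, tile_h = MON_W // 2, MON_H // 2    # each tile ends up 960x540

print(f"Decoded per frame:   {decoded:,} px")
print(f"Displayed per frame: {displayed:,} px")
print(f"Waste factor:        {decoded // displayed}x")

# A 960x540 substream per camera would cut client decode work 4x
sub = TILES * tile_w * tile_h
print(f"Substream decode:    {sub:,} px ({decoded // sub}x less)")
```

The decode waste grows with the grid: a 4x4 layout of 1080p streams decodes 16x more pixels than the monitor can display.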
Just to give a point of comparison, here are some findings when using Milestone with 4 streams configured as follows:
Stream 1: Axis Q6035-E 1080p, H.264, 15 fps, comp. 35
Stream 2: Axis P3367 5MP (running in 3MP mode), H.264, 10 fps, comp. 30
Stream 3 and 4: Axis P3364 720p, H.264, 15 fps, comp. 30
- Only workstations running a Core i5 or better are able to display all 4 streams on SmartClient without causing a significant slow-down
- Anything with less than a Core i5 requires a different Milestone view configured for down-sampling (image quality set to "Super-High" or "High")
- Upgrading RAM / video card does not seem to change performance, only CPU
- Even on a Core i7, I still get ~ 1/3 CPU usage across all cores, but there is little noticeable overall slow-down
- Try setting "Maximum Decoding Threads" to Auto (in SmartClient Options --> Advanced)
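As a rough relative comparison of what those four streams ask of the client CPU, here is their combined decode load in pixels per second. This is only a proxy (real H.264 decode cost also depends on bitrate, profile, and scene complexity), and the 2048x1536 frame size for the P3367's 3MP mode is my assumption:

```python
# Decode-load proxy (pixels decoded per second) for the four streams
# listed above. Treat as a relative comparison, not an absolute cost.

streams = [
    ("Q6035-E 1080p @ 15fps", 1920 * 1080, 15),
    ("P3367 3MP @ 10fps",     2048 * 1536, 10),  # 3MP mode ~ 2048x1536 (assumption)
    ("P3364 720p @ 15fps",    1280 * 720,  15),
    ("P3364 720p @ 15fps",    1280 * 720,  15),
]

total = 0
for name, pixels, fps in streams:
    rate = pixels * fps
    total += rate
    print(f"{name:24s} {rate / 1e6:6.1f} Mpx/s")

print(f"{'Total':24s} {total / 1e6:6.1f} Mpx/s")
```

Interestingly, the 3MP camera at 10 fps costs about the same to decode as the 1080p camera at 15 fps, while each 720p stream is less than half of either.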
FWIW, I have Milestone running on an Intel i7-3960X hex-core (a leftover machine I had). I tested 16 3MP cameras running and ended up with close to 90% CPU but no slowdown. I also have a completely dedicated network for the cameras.
With a sub-stream of 704x480 (which is what I use), CPU bounces around 20%.
"Maximum Decoding Threads" is set to Auto
IPVMU Certified | 04/16/14 02:02pm
Here are some questions:
1) Are you using camera side / server side motion detection?
2) Are you running the Smart Client from the same box you're recording to?
3) What is an example bitrate for one camera in your setup?
4) Which CODEC are you using? H.264? MJPEG?
Note that Milestone only supports multi-streaming (separate live and recording streams) in the Corporate edition.
Right now, when I reset to 2048x1536 live and watch the lawn crew trim the shrubbery, it is just under 5 Mbit/s. These are Hikvision bullets, so they are only 20 fps. I use 4 Tycon outdoor GigE PoE switches at the corners of the property. Each of the four goes back on a dedicated outdoor CAT5 run to the server, which has multiple NICs (we are an IT company, and I just had the i7 system around from a previous project).
I record full resolution on 22 cameras. I normally live-view them all at 704x480 across two 30-inch screens. I got about the same CPU load when I viewed the full streams in 16 browser windows. Since I'm not a professional IPVM integrator, I never did the formal measurements I would do for an IT integration project, and the machine was already in our integration lab, so these are FWIW-level comments.
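For what it's worth, a quick sketch of the aggregate live-view bandwidth those numbers imply, assuming every camera peaks near the ~5 Mbit/s figure above and the 22 cameras are spread evenly across the four switch uplinks (both assumptions mine):

```python
# Aggregate full-resolution live-view bandwidth, using the ~5 Mbit/s
# per-camera peak observed with motion in the scene.

PER_CAMERA_MBIT = 5        # observed peak, Mbit/s
CAMERAS = 22
SWITCHES = 4               # one dedicated uplink NIC per corner switch

total = PER_CAMERA_MBIT * CAMERAS
per_nic = total / SWITCHES  # assumes cameras are spread evenly

print(f"Total:   {total} Mbit/s")
print(f"Per NIC: {per_nic:.1f} Mbit/s (of 1000 Mbit/s GigE)")
```

Even at full resolution, the aggregate is a small fraction of a single GigE link, which is consistent with the bottleneck being client-side CPU decode rather than the network.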
Seneca | IPVMU Certified | 04/17/14 03:39pm
I test VMS Clients in our lab so that we can give a realistic quote to our customers.
The posts by Phil and Brian are very close to what I see in the lab.
The SmartClient is HIGHLY CPU centric ... and I found that going as high as dual E5 Xeons did NOT prove worth the cost. Go for a maxed-out i7 proc instead, as Phil mentioned.
Another thing to beware of is the interaction between the Client and Server. If you have defined an Image Quality setting of ANYTHING other than 'Full', this note in the manual applies:
While using a reduced image quality helps limit bandwidth use, it will—due to the need for re-encoding images—use additional resources on the surveillance system server.
This means that part of your issue could in fact be on the server side.
If your Viewstation has enough memory, you can play with the Video Buffering settings to help smooth out the image.
"...due to the need for re-encoding images—use additional resources on the surveillance system server."
Thank you for confirming my suspicions. For regular operation, I set the substream resolution on the camera and the substream resolution setting in XProtect to be identical, which cuts down the transcoding processor load.
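A quick sketch of why matching the substream saves so much work: besides sparing the server a decode-plus-re-encode (transcode) per frame, the client decodes roughly 9x fewer pixels at 704x480 than at 2048x1536 (the 2048x1536 figure is the full-resolution stream mentioned earlier in this thread):

```python
# Matching the camera substream to the XProtect view resolution lets
# the server pass the stream through untouched instead of transcoding
# every frame, and shrinks the client-side decode load.

FULL = 2048 * 1536   # main stream (recorded at full resolution)
SUB = 704 * 480      # substream (used for live view)

print(f"Full stream: {FULL:,} px/frame")
print(f"Substream:   {SUB:,} px/frame")
print(f"Ratio:       {FULL / SUB:.1f}x fewer pixels per frame to decode")
```

At the same frame rate, that pixel ratio carries straight through to the decode workload, which lines up with the CPU dropping from ~90% on full streams to ~20% on substreams as reported above.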
With single-socket i7 processors reaching 8 cores as of about August, high-resolution streams won't be an issue. HOWEVER, my biggest complaint with Milestone is that the internals that maintain the connection to the cameras over the network are extremely sensitive to any minor network glitch. Looking at detailed network stats did not show anything from an IT network perspective, but I would lose connection. As described above, with the dedicated, isolated multiple 1 GigE to 10 GigE NICs, all of my Hikvision cameras have stayed reliably connected.
Mike, thank you very much for the helpful tips. Because of the nature of our systems integration business, all our Windows machines are i7 quad- or hex-cores with at least 8 GB of memory; even our Ultrabooks have dual-core i7 processors.
Just because of the setup around here, we have not been running the web client and the Smart Client at the same time, but I will take your advice when I get a little time and move the display servers to their own dedicated machines.