What Real Impact Does Utilizing the Graphics Processing Unit (GPU) Make?

For manufacturer and developer engineers, how do you measure or anticipate how much benefit the GPU provides? Why is using the GPU such a benefit? If I have a 2 GHz CPU and a 500 MHz GPU, should one expect a 25% jump in processing efficiency? Is it essentially like just adding another core? Or is it typically only used for specific functions? Inquiring minds want to know.

For complex computing tasks, the GPU provides real benefits.

Regarding the "why," check the articles and other materials on the NVIDIA website.


From personal experience, using very rough numbers that hold today: one specific deep learning facial recognition pipeline needs around 300 ms per frame per core on an i7 (in real life, something like 12 fps total across cores), versus 35 ms per frame on an old GeForce GTX 650 with 384 CUDA cores (~25 fps). If you use something like a Tesla GPU accelerator for workstations, there will be 2,400 to 2,800 CUDA cores available.
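The arithmetic behind those numbers can be sketched roughly like this (a minimal sketch; the 4-core count for the i7 is an assumption, and real pipelines lose some throughput to overhead, which is why the quoted figures come out a bit lower):

```python
# Back-of-the-envelope throughput from the per-frame latencies quoted above.

def fps(ms_per_frame, parallel_units=1):
    """Frames per second given per-frame latency in ms and parallel workers."""
    return parallel_units * 1000.0 / ms_per_frame

cpu_fps = fps(300, parallel_units=4)  # i7: 300 ms/frame/core, 4 cores assumed
gpu_fps = fps(35)                     # GTX 650: 35 ms/frame through the GPU pipeline

print(f"i7 (4 cores assumed): ~{cpu_fps:.0f} fps")  # ~13 fps, near the ~12 fps quoted
print(f"GTX 650:              ~{gpu_fps:.0f} fps")  # ~29 fps, near the ~25 fps quoted
```

The point isn't the exact figures but the shape of the comparison: the GPU win comes from many slow cores working in parallel, not from clock speed, which is why comparing 2 GHz to 500 MHz is misleading.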

My understanding is that it can be significant, but it's almost a moot point because a lot of VMSes don't offload to the GPU anyway. Genetec has been making some noise about doing GPU decoding, but it requires NVIDIA Maxwell chipsets, and you're looking at another $200 at least (some are over $1,000) for the video card, and it requires a bigger power supply, in some cases much bigger.

We haven't tested GPU performance. We had a discussion about this in 2014 in which only Avigilon was cited as supporting NVIDIA CUDA. It's been on the list to potentially test, but it doesn't seem like much of significance has happened in GPU support, so it's pretty low priority.

My understanding is that it can be significant, but it's almost a moot point because a lot of VMSes don't offload to the GPU anyway.

But what about viewing clients?

FWIW, Genetec is saying in your link that they now support embedded Intel GPUs, like the new Milestone smart client.

Milestone's new 2016 client uses Intel's Quick Sync technology, which basically uses the on-board Intel graphics chip to enhance performance. They claim up to a 75% reduction in CPU load.
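To illustrate what a 75% reduction would mean for viewing-client capacity, here's a hypothetical sketch. The 8% CPU per stream for software decode is an assumed figure for illustration only, not a measured number:

```python
# Hypothetical illustration of the claimed 75% decode-offload saving.
# The per-stream CPU cost below is an assumption, not a benchmark.

SOFTWARE_DECODE_CPU = 8.0    # assumed CPU % per stream with software decode
QUICKSYNC_REDUCTION = 0.75   # the 75% reduction claimed above

quicksync_cpu = SOFTWARE_DECODE_CPU * (1 - QUICKSYNC_REDUCTION)  # 2% per stream

max_streams_software = int(100 // SOFTWARE_DECODE_CPU)  # 12 streams at full CPU
max_streams_quicksync = int(100 // quicksync_cpu)       # 50 streams at full CPU

print(max_streams_software, max_streams_quicksync)  # 12 50
```

In other words, a fixed percentage reduction per stream translates into a roughly 4x increase in how many streams a viewing client can display, under these assumptions.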

All nice, but I would rather they had made it work on NVIDIA cards. It can be interesting if you're using the right Intel CPUs.

It would be great, but the software just isn't there for it yet. Unfortunately, other than security video applications, there really isn't a market for it. The only other use I can think of is maybe some multi-screen jumbotrons.

It would be nice, though. If I had to ballpark it, I think a 2GB NVIDIA card for about $150 could handle about 64 720p cameras with no problem and leave the processor nice and cool. That's assuming all the software were perfect, though.
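As a rough sanity check on that ballpark, decoded 720p surfaces for 64 streams do fit comfortably in 2 GB. The buffer count per stream below is an assumption for illustration; real decoders vary:

```python
# Rough memory check: do 64 decoded 720p streams fit in a 2 GB card?
# Buffer count per stream is an assumption for illustration.

WIDTH, HEIGHT = 1280, 720
BYTES_PER_PIXEL = 1.5        # NV12 (4:2:0) decoded surface
BUFFERS_PER_STREAM = 8       # assumed decode/reference/display buffers
STREAMS = 64

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL  # ~1.38 MB per decoded frame
total_gb = STREAMS * BUFFERS_PER_STREAM * frame_bytes / 2**30

print(f"~{total_gb:.2f} GB of decoded surfaces")  # ~0.66 GB, well under 2 GB
```

Memory is unlikely to be the bottleneck here; the real limits are the card's hardware decode throughput and, as noted above, whether the VMS software actually offloads to it.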