Nvidia Deep Learning Developer Kit Examined

By Brian Karas, Published Apr 17, 2017, 10:29am EDT

Can building a deep learning video analytics application be as easy as a typical VMS installation? Maybe not quite, but Nvidia is aiming to make it much easier with developer kits and shared software libraries that can enable a reasonably capable Linux user to have an analytics application up and running in a day.

At ISC West 2017, Nvidia hosted a 90-minute "hands-on" development workshop featuring their development kits, with examples of training the unit to classify various images and to identify objects in live video.

IPVM attended this workshop to better understand how Nvidia is approaching the surveillance market and promoting their products to manufacturers and integrators.

Nvidia Workshop

****** ******** ~** ************ **** ****** *** ********** ****, ***** *****, ******* Ubuntu *** **** ******'* "******" **** ******** ******** kit ******* *********. 

********* **** *** ******* a ************ *** ****** of ********* ******* ** what ** ******* ********'* ****** **** ******** intro ******, ***** **** ********** on ****-***** ******** ** artificial ************/**** ********.

Pre-Trained Network

*** **** ******* **** pre-loaded **** ********* **** previous ************** ********, ***** Nvidia had *** * ****** of ****** ****** **** a ******** *******, ** a ******* ********* ******** to ** ********** ********. The ***** ******** ******* is ********* *************** ********* and *** **** ******** from **** ** *****, depending ** *** ****** of ****** ***** *** into ** *** *** analysis ********. *** ****** of *** ******* ** a ********** ***** **** that ******** * **** of ********** ********* *** each ****** *** ****** has **** ******* ** recognize/classify. ***** ***** *** be *********** ** ***** powered *******, **** ** those ********* *** *** processor, ******** ***** ******* to **** ********* *** same ******* ** ***** images ** ***** ******.
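
The train-once, deploy-elsewhere workflow this section describes is standard practice: a network is trained on a capable GPU machine, its learned parameters are saved to a file, and that file is reloaded on a less powerful device for inference only. Below is a minimal sketch of that idea, assuming PyTorch/torchvision rather than the workshop's actual Nvidia tooling; the five-class fruit setup and filename are hypothetical.

    # Minimal sketch, assuming PyTorch/torchvision (not the workshop's own
    # tooling): train on a GPU workstation, save the learned parameters,
    # then reload them on a weaker device that only needs to run inference.
    import torch
    import torchvision.models as models

    # --- On the training machine ---
    model = models.resnet18(num_classes=5)   # hypothetical 5 fruit classes
    # ... training loop over the labeled image set runs here ...
    torch.save(model.state_dict(), "fruit_classifier.pth")  # parameters only

    # --- On the deployment device (CPU-only inference is fine) ---
    deployed = models.resnet18(num_classes=5)
    deployed.load_state_dict(
        torch.load("fruit_classifier.pth", map_location="cpu"))
    deployed.eval()   # switch to inference mode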

Image Classification Examples

** **** ** *** workshop, ********* ***** **** static ****** ********** * single ***** ** ********* fruits (*******, ******, ***.) to * ****** **** would ******* ** ******* known ******* *** **** light ** *** ************* to *** ************** ****. Next, ****** **** ******** objects **** *** **** a ****** **** ***** output *********** ** ***** requested ******* **** ***** in *** *****. ** each ** ***** ***** the ******** **** *** the ***** *** ~*-* seconds.
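
For illustration, a single-image classification call of the kind attendees ran might look like the following. This is a hedged sketch assuming PyTorch/torchvision and the hypothetical fruit model from the sketch above; the workshop itself used Nvidia's supplied examples.

    # Hedged sketch (PyTorch/torchvision assumed; the workshop used Nvidia's
    # supplied demos): classify one static image and report the top label
    # with its confidence, as attendees did with the fruit photos.
    import torch
    import torchvision.models as models
    from PIL import Image
    from torchvision import transforms

    LABELS = ["apple", "banana", "orange", "pear", "grape"]  # hypothetical

    model = models.resnet18(num_classes=len(LABELS))
    model.load_state_dict(
        torch.load("fruit_classifier.pth", map_location="cpu"))
    model.eval()

    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
    ])

    def classify(path):
        """Return (label, confidence) for a single image file."""
        img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
        with torch.no_grad():
            probs = torch.softmax(model(img), dim=1)[0]
        conf, idx = probs.max(dim=0)
        return LABELS[int(idx)], float(conf)

    label, conf = classify("banana.jpg")
    print(f"{label}: {conf:.0%}")   # e.g. "banana: 97%"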

** *** ******* ********, the ******* ****** *** used ** ******* **** video. ******* ** ****** in ***** ** *** camera ******* ****** ** glass *******, ********* *** classifier *** **** ********** trained ** *********, *** supplied **** (* ****** script) ***** **** * red *** **** *** bottles. *** ****** ***** be ****** ** ****** the ***** ** ************ of *** ***, ****** a ***** ** *** objects ***** ** *** image, ** **** ***** programs ***** ** *********** a ***** ****** ****: 
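
The visible fragments indicate the supplied script drew a red box over the detected bottles and could be edited to change its appearance. A hedged stand-in using OpenCV (not Nvidia's actual demo code) shows how such a drawing routine typically works, with the color constant being the obvious line to edit:

    # Hedged sketch with OpenCV, standing in for the supplied demo script:
    # draw a box and label around each detection. The (B, G, R) constant is
    # the sort of line an attendee could edit to change the box color.
    import cv2

    BOX_COLOR = (0, 0, 255)   # red, in OpenCV's BGR order

    def draw_detections(frame, detections):
        """detections: list of (label, confidence, (x1, y1, x2, y2))."""
        for label, conf, (x1, y1, x2, y2) in detections:
            cv2.rectangle(frame, (x1, y1), (x2, y2), BOX_COLOR, 2)
            cv2.putText(frame, "%s %.0f%%" % (label, conf * 100),
                        (x1, y1 - 6), cv2.FONT_HERSHEY_SIMPLEX,
                        0.6, BOX_COLOR, 2)
        return frame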

****** *** **** ***** was ********* ** ~****, analysis **** *** **** slower, **** * ******* 2 ****** ***, ****** analyzing ****** ******* *********, though **** *** ********* due ** *** ********** nature ** *** **** scripts.
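
One common way to reconcile a full-rate display with roughly 2 fps analysis is to show every frame but run the detector only on every Nth one, reusing the last set of boxes in between. A hedged sketch of that pattern follows (OpenCV capture loop; run_detector is a hypothetical placeholder for the actual network call, and draw_detections is the helper sketched above):

    # Hedged sketch: display every frame, analyze every 15th (about 2
    # passes/sec if the camera delivers 30 fps). run_detector stands in
    # for a real network inference call.
    import cv2

    cap = cv2.VideoCapture(0)     # default camera
    detections, frame_no = [], 0

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_no % 15 == 0:
            detections = run_detector(frame)   # hypothetical detector
        draw_detections(frame, detections)     # reuse boxes between passes
        cv2.imshow("live", frame)
        frame_no += 1
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()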

Workshop Takeaways

*** ******** ********* ** demystify ******** ** **** neural ********, *** ******** how ****** ******** *** *********** tools *** ** ******** to ****** ******* ********/********* in *** ******** ******** employing **** **********. *******, several ********* ********* **** the ***** *****-******* ********, spending **** **** ****** to **** *********** *** execute ******* **** ******* with ****** **** ****** network ************.

Developer Kit Benefits

*** ***** ********** ** experimenting **** **** ********, Nvidia's developer **** *** ****** training ******* * *** to *** ******* ******* requiring ** ********* ********** in ****** ******** *** machine ******** ***********. *******, one ***** ***** **** significant ********** ** **** and ********** ******** ** attempt ** ****** * custom ********* *******/******** *** a ****-***** ***********. *****, understanding *** ************ ** the **************/******** *** ****** recognition ********* ***** ** helpful ** ********** **** neural *******-***** ******** *** customer ************.

Deep Learning DIY

***** ******* ** ********** with ****** **** ******** ***** can *** **** *********** ********* **** ****** *** ~$***. If *** ** *** want ** ***** $*** just ** **********, ** ****** *** **** ****** pre-loaded *** ** **** ** on ****** (note: requires AWS account). ************, ***** with ** ****** ******** **** and * ** *** Ubuntu ******* (****** *******, or ******* *******) *** ******** DIGITS *** ****** [link no longer available] (******** ************ in ****** ********* *******).
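
For anyone trying the local-PC route, a quick way to confirm the machine can actually see an Nvidia GPU before installing anything is to query the driver. This check is our suggestion, not an Nvidia-documented prerequisite:

    # Hedged sketch: confirm an Nvidia driver and GPU are visible before
    # attempting a DIGITS install. nvidia-smi ships with the driver.
    import shutil
    import subprocess

    if shutil.which("nvidia-smi") is None:
        print("No Nvidia driver found - use a dev kit, a GPU cloud "
              "instance, or install the driver for a local card first.")
    else:
        # "-L" lists the GPUs the driver can see, one per line
        print(subprocess.check_output(["nvidia-smi", "-L"], text=True))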

Nvidia Marketing Push To Security

**** ****** *** ******** may **** **** **** attendees ********, ** *********** Nvidia's marketing ***** ** **** awareness ** ********. *** company ******* ~$**,*** ***** of ********* **** *** room *** * ** minute *******. ****** **** ********* a *********** ******* ** their ***** ** ***** companies **********, *** ****** ********* like ********, ********* ****** ********** ** * variety ** ******** ** an ******* ** ********* the ********* ** ***** products.

***** ********* *********/******** *** ******** *** **** ********* *** security ******, *** *** not **** *** **** presence **** ****** *** ** ISC West. *******, **** like end-users and integrators are ***** **** ********* with *** ******* ****** of * ******, *** not *** ******** ********** inside, ** ******* ** be **** ** ****** ********* spend ** *** ******** industry **** ***** ********* sales ** ****** ******.

More On Deep Learning?

Comments (5)

Fascinating stuff. I am in no way a developer/coder, or even a moderately competent Linux user, but I do envy the opportunity to play around with this. Since Intel and Nvidia are heavily involved in deep learning, does AMD/ATI have any foothold at this point? Based on what I've read on here, it looks like Nvidia has a significant head start.

Nvidia started doing high-performance computing on GPUs about ten years ago, in fact before deep learning was widespread (to an extent, the availability of the Nvidia platform enabled deep learning to be pursued). So they have advantages in mind share, marketing, and software. AMD's deep learning offering is called Radeon Instinct; it may well be that the AMD hardware turns out to be better for some DL applications (and worse for others, i.e. "horses for courses"), but I'm not sure there's currently a good understanding of what those applications might be.

I saw the conference at ISC West and it was mind-blowing. I hope to see more articles from Nvidia about AI in general and intelligent cities.

Does anybody know if the code they used to demonstrate the TX-1 is available? A quick look on the Nvidia website didn't reveal anything. It would be fascinating to reproduce this.

Yes, I believe all the code is available. You'll need to sign up as a developer, and then you should be able to get it from the DIGITS download page.
