Dahua Emotion Recognition Analytics Tested

Rob Kilpatrick
Published Dec 21, 2020, 16:02

*** ******* ** ** ******** **** sophisticated ***********. *** *******, ***** ******* the ******* ** *********** ******* *****, calm, ********, *********, *****, ***, ******, and *********.

IPVM Image

** ****** ******** ***-*********-********** ***** *********, *********:

  • *** ******** ** ******* *********?
  • **** ******** **** ********* ** *******?
  • *** **** *** ***** ** ******** impact ********?
  • *** **** ********/*** ****** ********?
  • **** ****** **** ****** ********?
  • ** **** ********* ****** *********?

*******

*****'* ******* ********* **** ******** ********, though *** **** ****** ************** *** 'calm' *** ****** *** *** *******/************** faces, ***** ** *** **** ****** state *** ******. ***** *** ******** were ******** **** ** *** ****, while *****, *********, ***, ******, *** surprised ****** ******.

**** ***** ******** ** ******** ** the *********** *** **** ******** ***** offered:

IPVM Image

******* *** ****** ******** ** ******** emotions *** **** * **** ** detected, **** **** *******, ****** ****, or *** ***, ***** **** *** tendency ** ******** ******** ** "****" in ***** *********, *** ******** *** appear ********, ***** **** ******** *** likely ** ** ************** **** ** the ****.

Best ******** ** **** ***/******* ***** (*** ***** *********** ******)

** *** *******, ** ***** **** a ****** *****, **** ***, ** a *** ***** ** *********, **** subjects ****** *** ****** ** ***** range ** ********* ******** ******* **** as ***** ****** ** *** ***.

**** ** ***** **********, ******* ********* was ****** **********. *** **** *********** consistently ********** ******** **** *****:

IPVM Image

*** **** (******** *** ****** ********** users ** **** ***** **** **********, discussed *****):

IPVM Image

*******, ***** *********** **** ** *****, sad, *********, ** ****** **** **** often ******** ** ****, ** ********* happy.

IPVM Image

Accuracy ***** ** ******* ****** *** ***** ***

** ****** ** ********* ******* **** ~45°, **** ******** **** ********** ** Calm, ********** ** ***** ****** ****** expression.

IPVM Image

*** ****** **** ********** *** ***** of ********' ***** ** **** ****** testing:

IPVM Image

*********, ** *** ****, ******** **** only ********** ** **** ** *****.

IPVM Image

Incorrect ************** **** **** ********

***** *** ****** ******** ** ******** emotions *** **** * **** ** detected, ** ********* ** ******** **** with **** ** *** *******'* **** obscured. *** *******, **** ********* ******* by *********, *** ****** ***** ********** the *******, *** **** ** **** and *****.

IPVM Image

Reduced ******** ** ****** **** ****** ****

****** *******, ****** **** ****** **** expressing * ****** ******* **** ****** and ******* ****** ********** ****, ***** the ****** **** * ***** *** angry **** **** *** ********** ** angry.

IPVM Image

No *** ** ***** **************

** *** *****, ** **** *** ever ********** ** *** ** *****, instead, **** **** ****** ********** ** confused, *****, ** ****. *****, *** person *** ** ***** **** *** is ********** ** ****.

IPVM Image

Confused ******** ** ****** ******* ****** ****

****** ******* ******* *** ***** **** were ******* * ***** ******** **** that *** *** ******** ** ****** hair **** ********** ********.

IPVM Image

****** **** ****** **** ******** *** had ****** ****, ***** ******* ********* did *** ********** ****** *** *******.

IPVM Image

Disgusted ************ *********

*** ********* ******* *** ********* ******** on * ****** ******* ******* *** scene **** * ***** ********* **********.

IPVM Image

*******, **** ** *** **** *** disgusted ********** *** *** ********, **** on ** *********** ******* ** *** it ** ********.

IPVM Image

Simple *****

******* ********* ***** ** ******, *** user **** **** ** ***** ******** in ********, *** **** ** **** show *** ******** ********* *** ******* classification ** *** **** **** ******.

Tech ******* *** ***** ********

***** *** **** ******* ***** *** was ****** ** **** ******* ******* detection, ****** ******* * ******* **** of *** ****, ***** ***** ********** did *** ***** ** *** ******* to ******* ** *** ***********.

*******

*** ***** ***-*********-*** *** ** ***** online *** ~$*** ***.

Versions ****

*** ********* ******** **** **** ****** testing.

  • ***** ***-*********-***: **.***.*******.*.*, ***** ****: ****-**-**
Comments (11)
Undisclosed Manufacturer #1
Dec 21, 2020

For the life of me, I can't think of a real-world use case in the video security world where these analytics would improve a live monitoring or investigative/forensic situation, particularly in light of the inaccuracies and the impossibility of detecting someone's real emotional state from a video camera. I am "surprised" and "disgusted" at the same time more frequently than I'd prefer. How does that get resolved with any analytic?

There may be use cases in interrogations (lie detection perhaps) or marketing analysis, but that's about it... I think. And that's based on body language as much as facial expression.

LMK what I'm missing.

Andrew Myers
Dec 21, 2020

Here are a couple of uses:

If you're a super invasive retailer, you could use this to build profiles of shoppers and more accurately predict what they want.

If this were extended to microexpressions, you might be able to predict if someone in a crowd was about to start trouble. I'm pretty sure there was a paper several years ago about using emotional analytics to spot terrorists before they acted. (Though given some of the stories I've heard, the underlying emotions could be totally different than what you'd expect.)

Of course, if the analytic isn't accurate, then it's hard to see it being useful.

Mihai Simon
Dec 22, 2020

This was the funniest article in a long time... I love the faces (and the corresponding misclassifications). I bet you guys had a great time testing... thank you for making us smile at the end of 2020!

Humberto Macedo
Dec 22, 2020
Blue Tunnel Corp.

Any actor can make expressions of feelings they are not really feeling, and anyone can train to do so and CHEAT any system. It is much easier to train someone to perform as an actor than to train someone to fly an airplane; remember 9/11. Why insist on an easily deceived system that clearly will not deliver the expected results? Totally misleading. It is NOT SECURITY. It is a FAIRY TALE.

What really exists and works are tiny movements caused by facial muscles that DO NOT DEPEND on the person's will: from an internal emotional sensation, the person INVOLUNTARILY produces a tiny muscular movement on the face. This movement can be detected by experts in the subject, or by someone who has studied it to read people's reactions at business meetings and steer the discussion according to the receptivity (or not) of the people, expressed by these imperceptible and, again, INVOLUNTARY facial movements. This does exist; it is real.

But ONLY PEOPLE can perceive these muscle movements. TECHNOLOGY DOES NOT do this yet, for a very simple reason: there are MORE THAN 250,000 known micro movements, each one expressing a result, and each would need to be written into the algorithm to individualize these patterns for each person. That simple. Easy and quick to do, isn't it? 🤔

Kris Tibbitts
Dec 24, 2020

Yes, it would be impossible to get anywhere near 100% accuracy. But even if you can get a measly 50% accuracy when live monitoring or getting notifications, it could possibly prove helpful. If it prevents one violent act, I think it could all be worth it.

Kris Tibbitts
Dec 24, 2020

To be fair, it would be better to test this in real life. Trying to make expressions for feelings you are not really having is not an accurate test. I would love to see this used for a period of time in a real-life situation and then compare what it finds against the judgment of someone trained to read expressions. I do understand that this would be difficult. And I am sure it would not be much better, but it would be the right way to test it.

John Honovich
Dec 24, 2020
IPVM

Trying to make expressions for feelings you are not really having is not an accurate test

Kris, if I understand you correctly, you believe Dahua's system realized that Rob (below) was just faking being upset and that he was actually happy deep inside?

IPVM Image

While I am partially joking, my point is that any of these video-based emotion detection systems are inherently claiming to recognize the visible expression of emotions, not the actual emotional state of the person. That's fair, no?

To be clear, because of this, there is an inherent risk in any video-based system since it assumes the visible appearance of 'happiness' is equal to actual happiness when for many reasons people might give the appearance of being happy when they are not (or vice versa).
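
To make this concrete, here is a minimal, hypothetical sketch of what any analytic in this class structurally does; this is not Dahua's implementation, whose internals are not public, and the label set and stand-in model are assumptions for illustration only. It scores a face crop against a fixed set of expression labels and reports the top one:

    import numpy as np

    # Eight stereotypical expression labels, assumed for illustration
    # (the exact set the camera reports is redacted above).
    LABELS = ["angry", "calm", "confused", "disgusted",
              "happy", "sad", "scared", "surprised"]

    def classify_expression(face_crop, model):
        # `model` stands in for whatever network the camera runs; we only
        # assume it maps a face crop to one raw score (logit) per label.
        logits = model(face_crop)
        probs = np.exp(logits) / np.exp(logits).sum()  # softmax over labels
        best = int(np.argmax(probs))
        # The output is a claim about pixels (lip/brow geometry, etc.),
        # never about what the person actually feels.
        return LABELS[best], float(probs[best])

    # Toy stand-in model so the sketch runs end to end:
    rng = np.random.default_rng(0)
    fake_model = lambda crop: rng.normal(size=len(LABELS))
    label, confidence = classify_expression(rng.random((64, 64)), fake_model)
    print(label, round(confidence, 2))

Nothing in that pipeline observes the subject's internal state; it can only rank how closely the visible face matches each stereotype.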

That said, if a system categorizes Rob's expression above as happiness, it indicates it has some fundamental problems.

Btw, we tested this in real life as well, with various recorded videos, and the patterns were the same.

Kris Tibbitts
Dec 27, 2020

Yeah, I am not saying that it would turn out any different. Was Rob happy? When was the last time you saw an angry face actually look like that? Maybe the system was correct. (half joke) That is my point. It is just hard to judge the system when not using it the way it was intended. I would prefer to see real-world results rather than posed facial expressions. Also, I am sure it will get better over time. Again, nowhere near perfect, but better. Enough to possibly be a first line of defense.

John Honovich
Dec 27, 2020
IPVM

It is just hard to judge the system when not using it the way it was intended.

Kris, my point is that is the way it is intended. Dahua is not claiming that they can read minds. Dahua is claiming that when presented with an expression that stereotypically looks like 'calm', or 'anger' or 'surprise', etc., they will report appropriately.

IPVM Image

In this case, the happy icon represents the lips physically turned upward, which people typically associate with happiness. Rob's lips are turned downward, which typically is associated with being unhappy:

IPVM Image
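
As a rough illustration of that lip-direction cue (hypothetical landmark coordinates; real systems use learned features, and this only sketches the geometry of the heuristic): in image coordinates, mouth corners sitting higher than the lip center make an upturned, 'happy'-looking mouth, while corners sitting lower make a downturned one:

    # Hypothetical landmark-based smile/frown cue. Coordinates are in image
    # space, so a SMALLER y value means HIGHER in the frame.

    def mouth_curvature(left_corner, right_corner, lip_center):
        # Positive -> corners above the lip center (upturned, 'happy'-looking);
        # negative -> corners below it (downturned, 'unhappy'-looking).
        corner_y = (left_corner[1] + right_corner[1]) / 2
        return lip_center[1] - corner_y

    # Upturned mouth: corners (y=108) sit higher than the lip center (y=115).
    print(mouth_curvature((40, 108), (80, 108), (60, 115)))  # 7.0, smile-like

    # Downturned mouth, like Rob's above: corners (y=120) below center (y=112).
    print(mouth_curvature((40, 120), (80, 120), (60, 112)))  # -8.0, frown-like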

And, again, we tested it in 'real life scenarios' as well, with the same fundamental results, but we are showing these simple, ideal ones because if we show a real-life one where Rob is angry but he is farther away or at an angle or his head is partially obscured, people will complain that it's just because of the bad camera positioning.

Enough to possibly be a first line of defense.

Sure, if you want to use it, that's fine. Ultimately, like the flawed gender video analytics we tested, it depends on what accuracy one needs and how many mistakes one can tolerate.
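
To put rough numbers on that tradeoff, here is a quick back-of-the-envelope sketch; every figure below is invented for illustration, not taken from this test. The point is that for rare events, even a moderately accurate detector produces mostly false alarms:

    # Back-of-the-envelope alert math. Every number below is an assumption
    # for illustration, not a measurement from this test.
    faces_per_day = 10_000       # faces the analytic classifies daily (assumed)
    true_incidents = 2           # genuinely angry/threatening subjects (assumed)
    sensitivity = 0.50           # chance a real incident is flagged
    false_positive_rate = 0.05   # chance an ordinary face is flagged (assumed)

    true_alerts = true_incidents * sensitivity
    false_alerts = (faces_per_day - true_incidents) * false_positive_rate

    print(f"true alerts/day:  {true_alerts:.1f}")   # 1.0
    print(f"false alerts/day: {false_alerts:.0f}")  # 500
    # ~1 real alert buried in ~500 false ones: operators must triage
    # hundreds of alerts to catch a single genuine incident.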

Kris Tibbitts
Dec 27, 2020

Not saying that it works... But also, an emoji is not necessarily an actual representation of what the system may be looking for. What you consider the "stereotypical" expression may not be what the system is looking for. He looks to me like he has an itch in his mustache and is trying to scratch it without touching it. I think that would be more of an annoyed feeling, though. Regardless of how you did it and posted your findings for this article, I would still really be interested to know what it would do over a period of time in the real world. It is obviously something that you cannot recreate, so it is not fair to you guys. This is not a feature that I would ever need to use or sell, just interesting to see what is being tried with technology.

I wonder if it saw his beard as a smile. (serious) You should have him shave then try it again. (joke)

John Honovich
Dec 27, 2020
IPVM

I wonder if it saw his beard as a smile. (serious) You should have him shave then try it again. (joke)

We addressed this in the report, copied below for your convenience:

During testing, people with facial hair expressing a facial emotion were missed and instead mostly considered calm.