Dahua Emotion Recognition Analytics Tested
By Rob Kilpatrick, Published Dec 21, 2020, 11:02am EST (Research)

The advance of AI promises more sophisticated recognition. For example, Dahua markets the ability to distinguish between Angry, Calm, Confused, Disgusted, Happy, Sad, Scared, and Surprised.
We tested the Dahua IPC-HDBW7442H-IPZ running these analytics, examining:
- How accurate is emotion detection?
- Does accuracy vary depending on emotion?
- How does the angle of incidence impact accuracy?
- How does distance/PPF impact accuracy? (see the PPF sketch after this list)
- Does facial hair impact accuracy?
- Do face coverings affect detection?
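For reference, PPF (pixels per foot) depends on the camera's horizontal resolution and the width of the scene at the subject's distance. Below is a minimal sketch of that relationship; the ~90° horizontal field of view is an assumed example value, not a measured figure from this test, while 2688px matches the camera's 4MP stream width.

```python
import math

def ppf(horizontal_pixels: int, distance_ft: float, hfov_deg: float) -> float:
    """Pixels per foot at a given distance for a given horizontal field of view."""
    # Width of the scene covered at that distance, in feet.
    scene_width_ft = 2 * distance_ft * math.tan(math.radians(hfov_deg) / 2)
    return horizontal_pixels / scene_width_ft

# Assumed example: 2688px-wide (4MP) stream with a ~90 degree horizontal FoV.
for d in (5, 10, 20, 40):
    print(f"{d:>3} ft -> {ppf(2688, d, 90):.0f} PPF")
```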
Summary
Dahua's emotion detection made frequent mistakes. Its most common classification was 'Calm', applied to people with neutral/expressionless faces, which is the most common state for people. Happy and Confused were accurate most of the time, while Angry, Disgusted, Sad, Scared, and Surprised worked poorly.
This chart provides an overview of the performance for each category Dahua offered:
Because the camera attempts to classify emotions any time a face is detected, even when the face is blocked, facing away, or at low PPF, and because it tends to classify subjects as "Calm" in those instances, the analytic may appear more accurate than it is, since most subjects are likely to be expressionless most of the time.
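To make this base-rate effect concrete, here is a minimal sketch with assumed numbers (the frame counts are ours, illustrative only, not measurements from this test):

```python
# Assumed frame counts for illustration only (not data from this test):
# 90% of frames show neutral faces, 10% show actual emotions.
frames = {"calm": 900, "happy": 30, "angry": 25, "sad": 25, "surprised": 20}

# A classifier that outputs "Calm" for every frame is right only on the
# neutral frames, yet still scores 90% apparent accuracy.
always_calm_correct = frames["calm"]
total = sum(frames.values())
print(f"Apparent accuracy: {always_calm_correct / total:.0%}")  # -> 90%
```

In other words, a classifier that detects no actual emotion at all can still look highly accurate if neutral faces dominate the scene.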
Best Accuracy At High PPF/Shallow Angle (But Still Significant Issues)
In our testing, we began with a simple scene, well lit, at a low angle of incidence, with subjects facing the camera at close range to eliminate possible factors such as harsh angles or low PPF.
Even in these conditions, emotion detection was mostly inaccurate. The only expressions consistently detected correctly were Happy:
And Calm (although the camera classified subjects as Calm under many conditions, discussed below):
However, other expressions, such as Angry, Sad, Surprised, or Scared, were most often detected as Calm, or sometimes Happy.
Accuracy Worse At Harsher Angles and Lower PPF
At angles of incidence greater than ~45°, most subjects were classified as Calm, regardless of their actual facial expression.
The camera even classified the backs of subjects' heads as Calm during testing:
Similarly, at low PPFs, subjects were only classified as Calm or Happy.
Incorrect Classification When Face Obscured
Since the camera attempts to classify emotions any time a face is detected, it attempted to classify even with much of the subject's face obscured. For example, when partially blocked by cardboard, the camera still classified the subject, but only as Calm or Happy.
Reduced Accuracy On People With Facial Hair
During testing, emotions expressed by people with facial hair were frequently missed, with subjects mostly classified as Calm. Below, a person with a beard making an angry face was not classified as Angry.
No Sad Or Angry Classification
In our tests, no face was ever classified as Sad or Angry; instead, faces were mostly classified as Confused, Happy, or Calm. Below, the subject is making an angry face but is classified as Calm.
Confused Accurate On People Without Facial Hair
People walking through the scene showing a clearly confused expression, unobscured by facial hair, were accurately detected.
However, when confused subjects had facial hair, Dahua's emotion analytics did not accurately detect the emotion.
Inconsistent Disgusted Detection
The disgusted emotion was sometimes detected on a person walking through the scene with a clear disgusted expression.
However, most of the time the disgusted expression was not detected, even when subjects exaggerated it in an attempt to trigger classification.
Simple Setup
Emotion detection setup is simple: the user enables video metadata in the camera's settings, after which the metadata, including the emotion classification, is shown on the live view screen.
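For installs scripted rather than configured through the web UI, the same toggle can in principle be set over Dahua's HTTP CGI API. This is a hedged sketch only: configManager.cgi with action=setConfig is part of Dahua's documented HTTP API, but the exact config key for video metadata is firmware-dependent, and the "VideoMetadataRule[0].Enable" name below is a hypothetical placeholder.

```python
# Hedged sketch: toggling video metadata over Dahua's HTTP CGI API instead of
# the web UI. configManager.cgi with action=setConfig is part of Dahua's
# documented HTTP API, but the "VideoMetadataRule[0].Enable" key below is a
# hypothetical placeholder -- the actual config name varies by model/firmware,
# so consult the camera's HTTP API documentation.
import requests
from requests.auth import HTTPDigestAuth

CAMERA_IP = "192.168.1.108"  # Dahua's factory default address
AUTH = HTTPDigestAuth("admin", "your_password")

resp = requests.get(
    f"http://{CAMERA_IP}/cgi-bin/configManager.cgi",
    params={"action": "setConfig", "VideoMetadataRule[0].Enable": "true"},
    auth=AUTH,
    timeout=5,
)
print(resp.status_code, resp.text.strip())  # the camera replies "OK" on success
```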
Tech Support And Dahua Feedback
Dahua USA tech support tried but was unable to help improve emotion detection beyond recommending a clearer shot of the face. Dahua management did not reply to our request for comment on the performance.
Pricing
The Dahua IPC-HDBW7442H-IPZ can be found online for ~$920 USD.
Versions Used
The following versions were used during testing.
- Dahua IPC-HDBW7442H-IPZ: V2.800.0000000.5.R, Build Date: 2020-05-25
Comments (11)
For the life of me, I can't think of a real-world use case in the video security world where these analytics would improve a live monitoring or investigative/forensic situation, particularly in light of the inaccuracies and the impossibility of detecting someone's real emotional state from a video camera. I am "surprised" and "disgusted" at the same time more frequently than I'd prefer. How does that get resolved with any analytic?
There may be use cases in interrogations (lie detection perhaps) or marketing analysis but that's about it...I think. And that's based on body language as much as it is facial expression.
LMK what I'm missing.
This was the funniest article in a long time... I love the faces (and the corresponding misclassifications). I bet you guys had a great time testing... thank you for making us smile at the end of 2020!

Any actor can make expressions of feelings that they are not really feeling, and anyone can train to do so and to CHEAT any system. It is much easier to train someone to perform as an actor than it is to train someone to fly an airplane, remember 9/11. Why insist on an easily deceived system that clearly will not deliver the expected results? Totally misleading. It is NOT SECURITY. It is a FAIRY TALE.

What really exists and works are tiny movements caused by facial muscles that DO NOT DEPEND on the person's will. Which means that from an internal emotional sensation, the person INVOLUNTARILY produces a tiny muscular movement on the face. This movement is detected by people who are experts in the subject, or by someone who has studied this to perceive people's reactions at business meetings and conduct the themes according to the receptivity or not of the people, expressed by these imperceptible and again INVOLUNTARY facial movements. This does exist, it is real.

But ONLY PEOPLE can perceive these muscle movements, TECHNOLOGY DOES NOT do this yet for a very simple reason: there are MORE THAN 250,000 known micro movements, each one expressing a result, which would need to be written in the algorithm in order to individualize these patterns for each person. That simple. Easy and quick to do, isn't it? 🤔
To be fair, it would be better to test this in real life. Making expressions for feelings you are not really having is not an accurate test. I would love to see this used for a period of time in a real-life situation and then have its findings compared against those of someone trained to read expressions. I understand that this would be difficult, and I am sure the results would not be much better, but it would be the right way to test it.