Hi, what do you think about the fact that Verkada sometimes does not classify gender? The underlying question is: do video analytics users expect their system to provide a value even when the data is challenging (with a high risk of error), or instead to not classify any attribute that is uncertain?
David, thanks for your first comment, and a good question. To build on it: what are people expecting to use gender analytics for? Is it primarily demographics? Or is it "show me men wearing X", or something else?
A lot of this seems to be more entertainment (i.e., interesting to demo) than actual use. To be clear, I am not doubting that some users make use of gender recognition but I suspect most do not.
I think that not classifying gender all the time might be fine, but it would be better if there was a specific "Unknown" category, instead of simply not classifying. As it stands now, you can search for male or female, but if you want to see non-classified person detections you can only see them by turning all filters off, so you're looking at female, male, and unclassified all at once.
Thanks Ethan. I agree, the system should provide data when it has high confidence and, when it's uncertain, just say so, much like a witness to an event who can sometimes provide information with a high level of accuracy and sometimes cannot. IMHO, bad data is worse than no data. Allowing users to search specifically for "unknown" data becomes necessary with this approach.
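As a rough sketch of that idea (my own illustration, not Verkada's or any vendor's actual API or data model), the point is simply that "unknown" becomes a first-class, searchable value rather than a silently missing attribute:

```python
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional


class Gender(Enum):
    MALE = "male"
    FEMALE = "female"
    UNKNOWN = "unknown"  # stored explicitly so it can be searched directly


@dataclass
class PersonDetection:
    camera_id: str
    timestamp: float
    gender: Gender  # always populated; low-confidence results become UNKNOWN


def search(detections: List[PersonDetection],
           gender: Optional[Gender] = None) -> List[PersonDetection]:
    """Return detections matching the filter; gender=None means 'any'.

    Because UNKNOWN is a real value, a user can pull up exactly the
    unclassified detections instead of turning every filter off.
    """
    if gender is None:
        return list(detections)
    return [d for d in detections if d.gender == gender]


# Example: review only what the system could not classify.
# unclassified = search(all_detections, gender=Gender.UNKNOWN)
```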
We have been using an "Undetermined" category for some face attributes, such as gender, glasses, and face coverings.
It has been primarily used in search instead of alerts, the idea being that you start with a search as specific as possible (Male, with glasses, blue shirt, moving in X direction), and then back off some of the specifiers depending on what you are able to get from a given scene.
The ideal case, of course, is that we have enough detail to make a definitive decision on these factors, but the reality that David points out is that many times you cannot make a perfect decision, so we are testing "Undetermined" as a way to articulate when the product knows it is unable to make a hard decision. This can be due to low resolution, shadows and poor lighting, occlusions, or other factors.
Note that we have a gender category for both Face and Person, where person takes more general appearance factors into consideration (including hair).
As an example, you can see in this scene the product classifies my face as Undetermined gender, due in part to the low resolution and angle, but when using a Person search, accurately classifies me as Male (insert low-effort jokes here).
Overall, I think gender classification is one of the areas where we (the industry as well as Vintra) will likely see a lot of improvement in the next several years.
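To make the "Undetermined" behavior described above concrete, here is a minimal sketch of the general idea, assuming a simple confidence threshold (this is my own illustration, not Vintra's actual implementation, which may also weigh resolution, pose, and occlusion):

```python
def classify_gender(scores: dict, min_confidence: float = 0.8) -> str:
    """Map raw classifier scores to a label, falling back to 'Undetermined'.

    scores: e.g. {"Male": 0.55, "Female": 0.45} from a face or person model.
    The 0.8 threshold is purely illustrative; a real product would tune it
    per model and per attribute.
    """
    label, confidence = max(scores.items(), key=lambda kv: kv[1])
    return label if confidence >= min_confidence else "Undetermined"


# A blurry, partially occluded face might score {"Male": 0.58, "Female": 0.42}
# and come back as "Undetermined" rather than a weak "Male" guess.
print(classify_gender({"Male": 0.58, "Female": 0.42}))  # Undetermined
print(classify_gender({"Male": 0.93, "Female": 0.07}))  # Male
```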
I think data that is uncertain is useless for an end user. I would prefer manufacturers offer a reliable service/product rather than one whose veracity you have to question. I know false alarms in any type of system contribute to a culture that tends to ignore any output of that system. I think Verkada and others of that ilk are best served providing high-performing products versus beta testing on their customers. Gender analytics seem like a niche category. I can't fathom what value you get from an analytic classifying gender.
Say I am looking for a female with a red shirt and blue jeans. Being able to search the cameras for these descriptions can be useful, but I am not sure how many people are using this feature. This is something that can be done with Avigilon's Appearance Search.
I think we can all agree that a system with 100% accuracy would be best. But given that this will probably never happen (for a lot of reasons), the question is what the next best option is. It doesn't make the system useless if it solves your problem 90% of the time. Forensic search is a common usage of gender analytics.
"It doesn't make the system useless if it solves your problem 90% of the time."
David, it might 'work' 90% of the time, as long as most everyone has gender-stereotypical haircuts.
Edit: To add, I think this could anger people who are mischaracterized. It's one thing for computers to make mistakes; it's another if the viewer looks at the result and thinks it's obviously wrong and possibly offensive.
" I canโt fathom what value do you get from an analytic classifying gender."
Knowing the gender of a person crossing a camera's field of view could be useful for marketing strategy purposes, in some countries to enforce gender-separated entrances for cultural reasons, or to be alerted if a man attempts to intrude into a women-only area...
We were playing around with this a few years ago. My favorite misidentification was when the system identified our Sales Leader (a male in his mid 30's) as a female. We did some digging and it appears that the algorithm gave some extra weight to his skinny jeans. He's been gone from our company for about 2 years, and I still love talking about it.
I don't understand why there is a need to classify on sex. I have been telling manufacturers for a long while that sex classifications cheapen the overall analytics package when they do not work. Until it works near flawlessly I feel like gender classifications are just a "sure, why the hell not" feature that few ask for, few need, and does more harm than good.
For example, a few years back we had a pilot of an otherwise functional facial recognition platform that the customer rejected. The reason was not that the facial recognition analytic was faulty; the rate of correct identification was very high for 2016. The primary reason was that a single high-ranking team member was being classified as a 5'3" woman when this person was a 5'11" male. Hair length was not the root cause. This misidentification became a running joke to the point that it was the entire brand image of the product. While this was not the sole person being misclassified, it was definitely the wrong person. The sex and height classification item has since been removed from the facial recognition analytics package.
"I don't understand why there is a need to classify on sex."
The two main use cases I have seen are for post-event search and for retail analytics applications (which are also primarily post-event).
IMO, we are in the early days of the next phase of analytics, where systems will be able to provide more details about particular attributes of an object, and also start to figure out some context of an overall scene, not just that there are specific objects in the scene.
I would not sell a system today solely based on a demand for gender classification, as I agree it is unlikely to work reliably enough over time to satisfy a customer focused on that one specific use case. But when implemented correctly, it can augment other functionality for the user.
While this comment is rather insensitive, I do think these features sticking to "appearance" over gender identification leads to a better user experience.
"They were wearing a red shirt and blue jeans" is something that can be designed with a greater accuracy and less bias than "a woman stole my wallet."
Too funny. Since no one will say it, I will. All that comes to mind is that YouTube video of the guy/gal in GameStop throwing a fit, identifying as "ma'am".
Would it be possible to run that "ma'am" through the algorithm to see what it determines?
One purpose not yet mentioned was the desire for live video marketing based on the subject in front of the monitor.
By identifying age, gender, race, and other possible "style" attributes, you could display several out of thousands of programmed marketing messages.
In my case it would advertise retirement communities, golf pants and heating pads.
By chance is the person on the left available? Asking for a friend.
ACC currently employs age/gender as a facet in the descriptive search feature. With a highly interactive function like descriptive search, the generic age/gender models only have to be good enough to initiate a search and get a reasonable set of images from which to further refine the search. Once an image is selected from initial search results, the search quickly converges. The metric of success on searches is not so much initial accuracy as how quickly an operator can find what they are looking for. For the same reason, gender results are much more accurate in an appearance search that is initiated from live or recorded video. As ACC introduces features such as natural language search or demographic analysis, more accurate gender models will be used.
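To illustrate that "converging search" workflow (my own sketch, not ACC's code, and the field names are made up), a coarse attribute filter only needs to seed a candidate set; once the operator clicks a result, everything is re-ranked by appearance-embedding similarity to that selection:

```python
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def initial_search(detections, gender=None, shirt_color=None):
    """Coarse facet filter; the generic age/gender models only need to be
    good enough to return a reasonable candidate set."""
    results = detections
    if gender is not None:
        results = [d for d in results if d["gender"] == gender]
    if shirt_color is not None:
        results = [d for d in results if d["shirt_color"] == shirt_color]
    return results


def refine_from_selection(selected, detections, top_k=20):
    """Once the operator picks a result, re-rank all detections by appearance
    embedding similarity to it; the coarse attribute guesses no longer matter."""
    ranked = sorted(
        detections,
        key=lambda d: cosine_similarity(selected["embedding"], d["embedding"]),
        reverse=True,
    )
    return ranked[:top_k]


# candidates = initial_search(all_detections, gender="female", shirt_color="red")
# matches = refine_from_selection(candidates[0], all_detections)
```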
AGAIN. Facial recognition algorithms either meet physics and mathematics requirements or ARE NOT facial recognition technology. That simple!
Sean Nelson
It's too easy to post jokes on this topic and the pictures involved, so I'm not even going to do it.
Undisclosed Manufacturer #4
Better add Transgender to the filters or the wacko police are coming after you. Did you test a trans?
Undisclosed Distributor #9
But then, if the system gets it wrong (in their mind), they will sue.