False Facial Recognition Arrest Of Pregnant Woman

John Honovich
Aug 06, 2023
IPVM

Related: Facial Recognition-Based False Arrest Implicates Clearview AI, Reports NY Times; CNN Interesting Story: "A False Facial Recognition Match Sent This Innocent Black Man To Jail"; Fever Tablet Facial Recognition Misidentifies Teenager, Alleges Racial Profiling.

Interesting NY Times report: Eight Months Pregnant and Arrested After False Facial Recognition Match - The New York Times

Porcha Woodruff thought the police who showed up at her door to arrest her for carjacking were joking. She is the first woman known to be wrongfully accused as a result of facial recognition technology.

Details on the match:

A woman who matched the description given by the victim dropped off his phone at the same BP gas station, the police report said.

A detective with the police department’s commercial auto theft unit got the surveillance video from the BP gas station, the police report said, and asked a crime analyst at the department to run a facial recognition search on the woman.

According to city documents, the department uses a facial recognition vendor called DataWorks Plus to run unknown faces against a database of criminal mug shots; the system returns matches ranked by their likelihood of being the same person. A human analyst is ultimately responsible for deciding if any of the matches are a potential suspect. The police report said the crime analyst gave the investigator Ms. Woodruff’s name based on a match to a 2015 mug shot. Ms. Woodruff said in an interview that she had been arrested in 2015 after being pulled over while driving with an expired license.
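
As a rough sketch of the workflow described above, the snippet below shows a generic embedding-based 1:N search that returns candidates ranked by similarity for a human analyst to review. The function name, cosine-similarity scoring, and data layout are illustrative assumptions, not DataWorks Plus's actual product or API.

    # Hypothetical sketch: rank mugshot gallery entries against a probe face
    # taken from surveillance video, the way a 1:N search returns ranked
    # candidates. Not DataWorks Plus code; names and scoring are assumptions.
    import numpy as np

    def rank_candidates(probe_embedding, gallery):
        """Return (person_id, similarity) pairs sorted best-first.

        probe_embedding: 1-D feature vector from the surveillance frame.
        gallery: list of (person_id, embedding) pairs from the mugshot database.
        """
        scored = []
        for person_id, emb in gallery:
            sim = float(np.dot(probe_embedding, emb) /
                        (np.linalg.norm(probe_embedding) * np.linalg.norm(emb)))
            scored.append((person_id, sim))
        # The system only ranks possibilities; deciding whether any candidate
        # is a lead remains the human analyst's and investigator's job.
        return sorted(scored, key=lambda s: s[1], reverse=True)

Even the top-ranked candidate here is just the most similar mugshot in the gallery, which is exactly why the downstream human review matters.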

The facial recognition company is DataWorks Plus, which LinkedIn shows as having 60 employees:

[IPVM Image: DataWorks Plus LinkedIn profile]

The NY Times explains that the police used an older 2015 mugshot of the falsely arrested woman rather than her newer driver's license photo:

[IPVM Image: 2015 mugshot alongside the newer driver's license photo]

Matching people by facial recognition is fundamentally difficult because (1) many people look like many other people and (2) even the same person can look significantly different depending on lighting, camera angle, facial expression, etc. For example, at first, I wasn't sure that the two photos above were of the same person, but the NY Times says they are.

I suspect the police make the right match most of the time, but it is very hard to never make mistakes with this, and when they do, the damage to the person falsely arrested can be severe.

John Evans
Aug 08, 2023
JCI / Software House

It's easy to blame the technology, but the shoddy police work done in this instance was absolutely galling. The simple fact that they ignored that she was 7.5 months pregnant at the time of the incident and never took the time to ask the victim whether the woman he was with was pregnant beggars belief.

Chris McKechnie
Aug 09, 2023

I think a lot of the time, analytics like facial recognition are sold as amazing technologies that are perfect. What I tend to tell people when selling facial recognition is that it's like a person holding a photo up to a screen and comparing the two, and it's about as accurate, too, because humans can make mistakes. What people have to remember is that these are tools, and they are only as good as the people using them. If used incorrectly (as this clearly was), the technology is going to produce very poor results. Additionally, there is no excuse for the enormous oversight of the woman in question being visibly pregnant and the eyewitness not mentioning it.

I've worked with enough of these systems to see that when facial recognition reviews footage or pictures, it gives probabilities and is only as good as the information provided to it. If it's given pixel junk, it's going to have a lower probability of being accurate, versus an 8MP, straight-on shot of someone's face compared against a high-resolution photo on file. I'm guessing their threshold might be really low because they're desperate to try and find someone to hold accountable.
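
To put the threshold point in concrete terms, here is a toy example with made-up similarity scores; neither the numbers nor the cutoffs come from the Detroit case or any specific vendor.

    # Toy illustration: lowering the acceptance threshold turns weak,
    # low-probability candidates into "hits". Scores are invented.
    ranked_candidates = [  # best-first output of a 1:N search
        ("person_A", 0.62),
        ("person_B", 0.58),
        ("person_C", 0.41),
    ]

    def hits(candidates, threshold):
        """Keep only candidates whose similarity clears the threshold."""
        return [(pid, score) for pid, score in candidates if score >= threshold]

    print(hits(ranked_candidates, threshold=0.80))  # [] -> no usable lead
    print(hits(ranked_candidates, threshold=0.40))  # all three pass, including weak matches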

When it comes down to it, these technologies are meant to generate a lead, something in the toolkit of an investigator, not a substitute for a human officer. All of that said, I think the most concerning part of the story is that it made it all the way to court. So many people just let this slide: the police, the prosecutor, the judge. All of them looked at this and said, "Yep, this makes sense."
