A Detroit woman has filed a lawsuit against the city after facial recognition technology incorrectly identified her as a robbery suspect, leading to her arrest while she was eight months pregnant. The case adds to growing concerns about the accuracy and bias of AI-powered identification systems used by law enforcement agencies.
Porcha Woodruff, 32, was preparing her two children for school on February 16, 2023, when six police officers arrived at her home with an arrest warrant for carjacking and robbery. Woodruff, visibly pregnant, initially believed the officers were joking. Despite her protests and her pointing to her visible pregnancy, officers handcuffed her in front of her children and detained her for 11 hours.
The arrest stemmed from a facial recognition match generated by Detroit Police Department software. Detective LaShauntia Oliver ran video footage from a gas station through the system, which returned Woodruff’s photo as a potential match. Oliver then used an eight-year-old booking photo of Woodruff in a lineup presented to the robbery victim, who identified her as the perpetrator.
Critical details were overlooked during the investigation. The video footage showed a woman who was clearly not pregnant. Woodruff's current driver's licence photo, which would have shown her pregnancy, was available but not used. When Woodruff and her fiancé urged officers to check whether the suspect described in the warrant was pregnant, they refused.
Woodruff was released on a $100,000 personal bond and required hospital treatment for dehydration and contractions following her detention. The Wayne County Prosecutor’s Office dismissed the case on March 6, 2023, citing insufficient evidence.
The lawsuit, filed in the U.S. District Court for the Eastern District of Michigan, names Detective Oliver and the city as defendants. It alleges negligence, false arrest, and violations of Michigan's Elliott-Larsen Civil Rights Act. Woodruff's legal team argues that the department's reliance on facial recognition technology without adequate verification procedures caused her significant harm.
Woodruff is the sixth person known to report a false arrest stemming from a faulty facial recognition match, and the first woman. All six individuals are Black. Research from the National Institute of Standards and Technology indicates that facial recognition algorithms are 10 to 100 times more likely to misidentify Black and Asian individuals than white individuals.
Former Detroit Police Chief James Craig previously acknowledged that, if used as the sole basis for identification, the department's facial recognition software would misidentify suspects about 96 per cent of the time.
In response to these incidents, the Detroit Police Department has revised its policies. Officers may no longer arrest individuals based solely on facial recognition results. Photo lineups generated from such searches also cannot serve as the primary basis for arrests without additional corroborating evidence.
The case highlights ongoing debates about AI deployment in criminal justice. Civil liberties organisations, including the ACLU, have called for federal moratoriums on facial recognition technology in policing. They argue that current systems lack sufficient accuracy safeguards and perpetuate racial disparities in law enforcement.
