
Failing to disclose use of facial recognition technology in criminal cases deprives defendants of a fair trial, Appellate Court of Maryland rules

September 2, 2025

In a criminal case, the prosecution must disclose evidence that is favorable to the defense and that could affect the case’s outcome. This precedent, set in Brady v. Maryland, is intended to ensure a fair trial. Yet prosecutors often skirt the rule by withholding information about their use of facial recognition technology to identify suspects. In August, the Appellate Court of Maryland reaffirmed the State’s disclosure obligations in Johnson v. State of Maryland, holding that a trial court abused its discretion when it denied Mr. Johnson’s motion to dismiss his case after the State failed to timely disclose its use of facial recognition technology. In its decision reversing Mr. Johnson’s conviction, the Appellate Court agreed with an amicus brief filed by the Public Justice Center, Maryland Criminal Defense Attorneys’ Association, and the Baltimore Action Legal Team.

Mere days before Mr. Johnson’s trial, the State disclosed its use of facial recognition technology to identify him without providing additional details, such as the specific software used or whether the technology generated, or should have generated, leads beyond Mr. Johnson. The Montgomery County Circuit Court judge, concerned by the late disclosure, offered the defense a continuance (an opportunity to postpone the trial) for further discovery but denied Mr. Johnson’s request for dismissal based on Brady or discovery violations. Mr. Johnson chose to proceed to trial, however, because he had already been in jail for a year and had recently been acquitted of unrelated burglary charges. The trial court convicted Mr. Johnson, and he appealed.

In support of Mr. Johnson, the PJC and allies filed an amicus brief to address the documented issues with facial recognition technology. Written by PJC Murnaghan Appellate Advocacy Fellow Sahar Atassi, the brief showed that despite the increasing prevalence of this technology, its use is rarely disclosed to individuals facing prosecution, even though studies have highlighted its susceptibility to error, inconsistencies across different programs, and racial biases. The risk of misidentification—especially among people of color—raises serious concerns. Because law enforcement’s reluctance to disclose the use of facial recognition technology prevents meaningful scrutiny, depriving people of their right to a defense and a fair trial, we argued that the State’s failure to timely or sufficiently disclose its use of facial recognition technology constitutes a Brady violation and a breach of discovery obligations.

In addition, we argued that, when considering whether the offered continuance was an adequate remedy for the Brady or discovery violation, the Court should consider how pretrial detention itself undermines the fairness of the proceeding and harms individuals, their families, and communities. Detaining a person before trial makes it more difficult to mount a defense, increases the odds of conviction, and lengthens jail and prison sentences. It also causes loss of jobs, homes, and child custody; debt; and harm to physical and mental health while incarcerated. By removing people from their communities, pretrial detention burdens families and harms the education and future income of the children of those detained. These harms fall hardest on Black and brown communities, as law enforcement and the courts disproportionately arrest Black and Latine people, hold them pretrial, and impose higher bail and harsher punishment on them than on white people with similar criminal histories and charges.

The Appellate Court reversed Mr. Johnson’s conviction, agreeing with concerns raised in our amicus brief, including that the State’s late disclosure deprived him of a meaningful opportunity to challenge the AI-generated identification. Because the State, as it maintained at trial and conceded on appeal, was unable to produce any further information about the facial recognition system, including the name of the program used or how that technology generated Mr. Johnson’s identification, the Court concluded that the proposed continuance offered no path to a fair trial. Notably, the opinion acknowledged the growing concerns surrounding the reliability of facial recognition technology and AI and emphasized the need for meaningful scrutiny when such technologies are used in criminal prosecutions.

Thank you to PJC paralegals Carolina Paul and Omar Arar, institutional giving manager Robin McNulty, and former administrative coordinator Becky Reynolds for their assistance with the brief.