True Crime

Facial Recognition Jailed an Innocent Grandmother for 163 Days

Mar 29, 2026
On July 14, 2025, Angela Lipps was babysitting four children at her home in Carter County, Tennessee, when a team of U.S. Marshals arrived at her door. They arrested her at gunpoint. The charge: bank fraud in Fargo, North Dakota, a city she had never visited, in a state where she knew no one. The sole basis for the warrant was a facial recognition match that turned out to be wrong.

Fargo police had been investigating a series of fraud incidents from April and May 2025. Surveillance footage showed a woman using a fake U.S. Army military ID to withdraw thousands of dollars from a Fargo bank. Investigators sent a still image from that footage to a facial recognition company. The software returned a match: Angela Lipps, a 50-year-old grandmother of five, living more than 1,200 miles away.

A Fargo detective reviewed the match against Lipps’ social media photos and Tennessee driver’s license. In his charging document, he wrote that she appeared to be the suspect based on “facial features, body type and hairstyle and color.” No one from the Fargo Police Department ever called Lipps to ask a single question before seeking an arrest warrant.

What followed was 163 days in jail.

108 Days Without a Voice

After her arrest, Lipps was held in a Tennessee jail as a fugitive from North Dakota. She was denied bail. She could not challenge the charges because they existed in a jurisdiction more than a thousand miles away. For 108 days, she sat in that cell while Fargo officers made no effort to retrieve her, interview her, or verify whether the facial recognition match was correct.

On October 30, officers from North Dakota finally transported Lipps to Fargo. The next day, she made her first court appearance. It was also the first time anyone from law enforcement spoke to her about the case.

Nearly two months later, on December 19, Fargo police finally sat down with Lipps for a formal interview. By that point, she had been incarcerated for 158 days. Her defense attorney, Jay Greenwood, had already obtained her bank records. They showed, conclusively, that Lipps was in Tennessee at the time the fraud occurred in Fargo.

On Christmas Eve, the charges were dismissed.

What She Lost

Lipps was released from jail with no money and no coat, in a North Dakota December. Fargo police provided no assistance with her return trip. The F5 Project, a Fargo-based nonprofit that assists people affected by incarceration, stepped in: founder Adam Martin drove Lipps to Chicago so she could make her way back to Tennessee. Her defense attorneys paid for a hotel.

By the time Lipps returned home, 163 days had passed. She had lost her house. She had lost her car. She had lost her dog. She had missed holidays with her grandchildren. A GoFundMe was created to help with expenses.

What the Police Said

Fargo Police Chief Dave Zibolski claimed investigators had “conducted additional investigative steps independent of AI” before seeking the warrant, but declined to specify what those steps were, citing an ongoing investigation. Defense attorney Greenwood was blunt in his assessment: “He’s just using an artfully crafted way of saying, ‘Yeah, our police officers looked at her social media profile and it seemed to check out.'”

The charges were dismissed “without prejudice,” meaning they could theoretically be refiled. Zibolski stated that Lipps “has not been eliminated in our investigation thus far.” As of March 2026, no apology has been issued.

On March 11, 2026, the same day the story broke publicly, Zibolski announced his retirement. The Fargo Board of City Commissioners subsequently held a closed-door executive session to discuss “reasonably predictable or pending civil or criminal litigation involving Angela Lipps.” Attorney Eric Rice of St. Paul, Minnesota, is investigating potential civil rights violations on Lipps’ behalf.

Facial Recognition and the Pattern of Wrongful Arrests

The Lipps case is not the first time facial recognition has put the wrong person in handcuffs. It is at least the eighth documented case in the United States.

In 2020, Robert Williams was arrested outside his Detroit home, in front of his wife and daughters, after a flawed identification process matched him to blurry surveillance footage from a shoplifting case. He spent 30 hours in detention. In 2023, Porcha Woodruff, eight months pregnant, was arrested in Detroit on a carjacking charge based on the same technology. Randal Quran Reid, a Georgia resident who had never been to Louisiana, was jailed for nearly a week after Clearview AI’s software matched him to a crime in Jefferson Parish. Nijeer Parks spent ten days in a New Jersey jail in 2019 after being falsely matched to a crime there.

At least seven of the eight documented cases involve Black individuals. Lipps’ case may be the first publicly reported instance involving a white woman, which says something about how long the technology’s failures went unnoticed in other communities.

The Technology’s Known Flaws

The problems with facial recognition in law enforcement are not new information. In December 2019, the National Institute of Standards and Technology published the most comprehensive evaluation to date: 189 algorithms from 99 developers, tested against 18.27 million images of 8.49 million people. The results were stark. Many algorithms were 10 to 100 times more likely to produce a false positive match on Black or East Asian faces than on white faces. African American women showed the highest misidentification rates, and American Indian individuals had the highest false positive rates among U.S.-developed systems.

NIST researcher Patrick Grother noted that algorithms developed with more diverse training data produced more equitable results, and that the most accurate systems also tended to be the fairest. The problem is that police departments do not necessarily use the most accurate systems, and even accurate systems produce false positives that require human verification, which is exactly the step that keeps failing.
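The base-rate arithmetic behind that point can be sketched in a few lines of Python. The gallery size and per-comparison false positive rates below are illustrative assumptions, not figures from the NIST report; only the 100x gap between the two rates echoes the upper end of the disparity NIST measured.

```python
# Illustrative base-rate sketch: why a one-to-many face search needs
# human corroboration. The gallery size and false positive rates are
# assumed numbers for illustration, not values from the NIST study.

def expected_false_matches(gallery_size: int, fpr: float) -> float:
    """Expected number of innocent people flagged when one probe image
    is compared against every face in a gallery."""
    return gallery_size * fpr

GALLERY = 10_000_000  # assumed driver's-license-scale photo database

baseline = expected_false_matches(GALLERY, 1e-6)  # roughly 10 false hits
worse = expected_false_matches(GALLERY, 1e-4)     # 100x higher rate: ~1,000

print(f"baseline group: ~{baseline:.0f} false matches per search")
print(f"100x-worse group: ~{worse:.0f} false matches per search")
```

Even at the optimistic rate, a single search against a large gallery is expected to surface multiple innocent look-alikes, which is why every guideline treats a match as a lead to be verified rather than an identification.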

Every facial recognition vendor and every law enforcement guideline says the same thing: a match is an investigative lead, not probable cause. It requires corroboration. As Manjeet Rege, director of the Center for Applied Artificial Intelligence at the University of St. Thomas, told MPR News about the Lipps case: “That is where a human would come in and say, ‘OK, now let’s look at other information. Is this person actually at that location?’”

In every documented wrongful arrest, this step was skipped or performed so superficially that it might as well not have happened. Scrolling through someone’s Facebook photos and deciding they look similar is not corroboration. It is confirmation bias with a badge.

The Structural Problem

More than 20 U.S. jurisdictions, including San Francisco, Boston, and Pittsburgh, have banned or restricted police use of facial recognition. The bans exist because the failure mode is predictable: a probabilistic tool generates a lead, officers treat it as a conclusion, and the burden of proof shifts to the accused, who may lack the resources to prove their innocence from inside a jail cell.

Lipps was fortunate only in the narrowest sense. She eventually found attorneys willing to pull her bank records. Many defendants, particularly those relying on public defenders with crushing caseloads, might not have been so lucky. The technology’s error rate is one problem. The institutional willingness to act on its output without basic verification is the larger one.

A phone call would have sufficed. A single phone call to Angela Lipps before filing the warrant would have revealed that she lived in Tennessee, had no connection to North Dakota, and had never held a fake military ID. That call was never made. Instead, a grandmother spent 163 days in jail, lost everything she owned, and was released on Christmas Eve without so much as a ride home.

What Happens Next

An attorney is investigating potential civil rights violations. The Fargo City Commission has held a closed-door meeting about anticipated litigation. The police chief who oversaw the case announced his retirement the same day the story became public. More than 20 U.S. cities have banned police use of facial recognition. Fargo is not among them.
