Human trafficking detection has a problem that no algorithm can solve: the people most likely to be trafficked are the people machines are least likely to see. Despite billions invested in artificial intelligence and pattern recognition systems, official records capture as little as 6% of actual trafficking victims in some jurisdictions[s]. The gap between technological promise and investigative reality has never been wider.
The International Labour Organization estimates that 27.6 million people were trapped in forced labor on any given day in 2021, generating $236 billion in illegal profits annually[s]. Those profits have risen 37% since 2014, even as AI tools have become more sophisticated. The systems designed to find victims are failing at scale.
Human Trafficking Detection and the Bias Problem
AI tools are trained on data that reflects historical power imbalances. Facial recognition datasets overrepresent white, male faces because they draw from public figures and media coverage[s]. The result: systems that perform worst on the populations most vulnerable to trafficking.
Research confirms the pattern. AI systems are biased toward white, conventionally attractive, gender-binary presenting individuals because these are the people who have dominated publicly visible data[s]. Marginalized groups, who make up the majority of trafficking victims, are the exact people AI is least likely to accurately identify. Human trafficking detection fails precisely where it is needed most.
The Digital Ghost
There is a fundamental flaw in using AI to scan for trafficking victims online. Traffickers deliberately erase their victims’ digital presence. They confiscate phones and documents, isolate victims from family and friends, and ensure they leave no trace that algorithms could find[s]. These “digital ghosts” have no social media posts to analyze, no advertisements to scrape, no patterns to match. An algorithm cannot find signal in data that does not exist.
Meanwhile, law enforcement lacks the training to identify cases manually. A National Institute of Justice study found that officers across multiple jurisdictions were unable to identify labor trafficking[s]. The crimes are complex, the victims are often unaware they qualify as trafficking victims, and the offenses get recorded as something else entirely.
When Detection Tools Cause Harm
Human Rights Watch has warned that surveillance systems designed for human trafficking detection often cause the harm they claim to prevent[s]. Women of color, migrants, and queer people face profiling and persecution under regimes that cannot distinguish between consensual adult sex work and trafficking. A 2022 study found that ad-scraping tools were “ineffective” and “exacerbate harm” due to misalignment between developers and the communities they claim to assist.
The privacy risks are equally severe. AI scrapes data from public sources with no ability to assess what should remain confidential[s]. Survivors, whose autonomy has already been violated, find their personal information exposed in places they never consented to.
Traffickers Fight Back
As detection tools evolve, so do the criminals. Traffickers now weaponize the same AI that investigators use against them. They deploy translation tools to craft culturally nuanced recruitment messages across multiple languages, making their deception increasingly difficult to detect[s]. They use deepfakes, AI-generated images, and voice synthesis to impersonate trusted figures. More than 20,000 AI-generated child sexual abuse material (CSAM) images were discovered on a single dark web forum in 2024.
The National Center for Missing and Exploited Children reported a 1,325% increase in CyberTipline reports involving generative AI in 2024 alone[s]. The technology intended to save victims is being turned against them.
What Would Work
The problem is not that AI is useless. The problem is where it points. Current systems focus on victims rather than perpetrators, on advertisements rather than money flows, on pattern matching rather than network disruption. Researchers argue that AI should target the infrastructure of trafficking operations: blockchain transactions, transport routes, housing facilities, and the “digital dust” traffickers leave behind[s].
The technology exists to follow every cryptocurrency wallet tied to a trafficking advertisement on the open or dark web. But over half of the AI tools cataloged by Tech Against Trafficking were no longer available as of 2024, their companies either out of business or no longer supporting the platforms. Without sustained investment in human trafficking detection that targets perpetrators rather than victims, the gap will only grow.
Human trafficking detection systems fail at rates that would be unacceptable in any other law enforcement domain. A National Institute of Justice study across three jurisdictions found that official records captured as little as 14% to 18% of potential trafficking victims, with police records specifically capturing no more than 6%[s]. The undercount stems from three compounding failures: lack of officer training, inadequate offense coding systems, and the complexity of proving force, fraud, or coercion.
The International Labour Organization’s 2024 report quantifies the scale: 27.6 million people in forced labor globally, generating $236 billion in illegal profits[s]. Profits have increased 37% since 2014, with per-victim revenue rising from $8,269 to nearly $10,000 (inflation-adjusted). Sexual exploitation accounts for 73% of total illegal profits despite representing only 27% of victims, due to the massive differential in per-victim revenue: $27,252 versus $3,687 for other forms of forced labor.
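The quoted splits are internally consistent, which a few lines of arithmetic can confirm. The sketch below is our own check, not the ILO’s: it back-solves victim counts from the cited profit shares and per-victim revenues.

```python
# Consistency check of the ILO 2024 figures quoted above (our arithmetic,
# not the report's). We back-solve victim counts from the cited profit
# shares and per-victim revenues.

total_profits = 236e9         # USD/year, illegal profits from forced labor
share_profits_sexual = 0.73   # sexual exploitation's share of those profits
per_victim_sexual = 27_252    # USD/year per victim, sexual exploitation
per_victim_other = 3_687      # USD/year per victim, other forced labor

profits_sexual = share_profits_sexual * total_profits        # ~$172B
profits_other = (1 - share_profits_sexual) * total_profits   # ~$64B

victims_sexual = profits_sexual / per_victim_sexual          # ~6.3M
victims_other = profits_other / per_victim_other             # ~17.3M

share = victims_sexual / (victims_sexual + victims_other)
print(f"Implied victim share for sexual exploitation: {share:.0%}")  # ~27%
# Note: the implied total (~23.6M) sits below the 27.6M headline,
# likely because the ILO profit estimates exclude state-imposed forced labor.
```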
Algorithmic Bias in Human Trafficking Detection
MIT researcher Joy Buolamwini’s work on facial recognition bias has direct implications for trafficking investigations. Government benchmark datasets like IJB-A, despite explicit efforts at geographic diversity, were more than 80% lighter-skinned individuals[s]. The bias cascades through multiple mechanisms: public figure datasets inherit the “power shadows” of who holds political office; media attention skews image availability toward already-visible populations; and facial detection algorithms fail more often on darker-skinned faces, filtering them out of training sets entirely.
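One step in that cascade is easy to make concrete. The toy calculation below (with invented detection rates, not measured ones) shows how a face detector that fails more often on darker-skinned faces compounds an existing dataset skew before any recognition model is trained.

```python
# Toy illustration (detection rates invented, not measured) of how a face
# detector that fails more often on darker-skinned faces amplifies an
# existing dataset skew before recognition training even begins.

candidates = {"lighter": 80_000, "darker": 20_000}   # images entering the pipeline
detector_recall = {"lighter": 0.95, "darker": 0.70}  # hypothetical detection rates

kept = {g: n * detector_recall[g] for g, n in candidates.items()}
total = sum(kept.values())
for group, n in kept.items():
    print(f"{group}: {n / total:.1%} of the resulting training set")
# lighter: 84.4%, darker: 15.6% -- the 80/20 input skew widens further,
# and the recognition model inherits the narrowed distribution.
```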
The DARPA Memex program, which developed one of the most sophisticated trafficking detection pipelines over three years, acknowledged this directly. Researchers found that “automatic trafficking detection is an important application of AI for social good” but “also provides cautionary lessons for deploying predictive machine learning algorithms without appropriate de-biasing”[s]. Their system, integrated into a search platform containing over 100 million advertisements and used by 200+ law enforcement agencies, required extensive post-hoc bias mitigation.
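The Memex publications do not spell out the exact mitigation used, but one common post-hoc approach is per-group threshold calibration: choose a separate decision threshold for each group so that false-positive rates meet the same target. A minimal sketch with synthetic scores:

```python
# Minimal sketch of one common post-hoc mitigation: per-group decision
# thresholds chosen so false-positive rates meet the same target.
# Scores are synthetic; the Memex papers do not publish their exact method.
import numpy as np

rng = np.random.default_rng(0)

def pick_threshold(neg_scores, target_fpr=0.01):
    """Smallest score threshold whose false-positive rate is <= target."""
    return float(np.quantile(neg_scores, 1 - target_fpr))

# Classifier scores on *non-trafficking* ads, per group. The model happens
# to score group_b's benign ads higher -- the bias we want to neutralize.
neg_scores = {
    "group_a": rng.normal(0.30, 0.10, 10_000),
    "group_b": rng.normal(0.45, 0.15, 10_000),
}

# A single global threshold meets the 1% target overall but concentrates
# the false positives on group_b; per-group calibration equalizes them.
global_t = pick_threshold(np.concatenate(list(neg_scores.values())))
per_group_t = {g: pick_threshold(s) for g, s in neg_scores.items()}

for g, s in neg_scores.items():
    print(f"{g}: global FPR={np.mean(s > global_t):.3f}, "
          f"calibrated FPR={np.mean(s > per_group_t[g]):.3f}")
```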
The Digital Ghost Problem
AI-driven human trafficking detection operates on a fundamental assumption: that victims leave digital traces. Traffickers systematically eliminate those traces. They confiscate electronics, control all communication, and separate victims from any support network that might notice their absence[s]. Victims become “digital ghosts” with no footprint for algorithms to analyze.
The technical constraints compound this problem. End-to-end encryption, while essential for legitimate privacy, “thwarts the ability for others to see the conversations from the mobile platform to a mobile platform, thus making the examination of the device that much more vital”[s]. NCMEC reported that its overall CyberTipline reports dropped from 36.2 million in 2023 to 20.5 million in 2024, not because crimes decreased, but because “some platforms aren’t reporting as they should” due to encryption and reduced submissions[s].
Surveillance Systems That Harm
Human Rights Watch documented how human trafficking detection tools often “fail to distinguish between consensual adult sex work and human trafficking,” leading to surveillance of marginalized communities[s]. A Department of Homeland Security campaign instructed hotel staff to report “signs of trafficking” based on indicators like requesting additional towels, waiting at a bar, or using cash. These tropes cause disproportionate surveillance of poor, racialized, and transgender sex workers while conflating standard safety tactics with trafficking indicators.
A 2022 academic study on ad-scraping technologies found “misalignment between developers, users of the platform, and sex industry workers they are attempting to assist,” concluding these approaches are “ineffective” and “exacerbate harm”[s]. The systems are optimized for metrics that do not correlate with victim identification.
The Adversarial AI Arms Race
Traffickers have adopted AI faster than law enforcement. The 2025 Trafficking in Persons Report notes that criminals “weaponize AI to enhance their operations, using translation tools to craft culturally nuanced messages that resonate with victims in their native language”[s]. AI-generated deepfakes, synthetic voice, and text-to-image tools enable sextortion and CSAM production at scale, with over 20,000 AI-generated CSAM images discovered on a single dark web forum in 2024.
NCMEC’s 2024 data showed child sex trafficking reports increased 55% year-over-year, while generative AI reports surged 1,325%[s]. The Thorn analysis noted that while AI-generated content “remains a small percentage of total reports, it’s a clear signal that AI-generated child sexual abuse material (AIG-CSAM) is growing.”
Resource Allocation Failure
Department of Justice data reveals a structural problem: 75% of the $361 million granted in fiscal year 2022 for countering trafficking went to victim services[s]. Victim services are essential, but this allocation mirrors a counternarcotics model that does not transfer: in drug trafficking, reducing demand (treating consumers) makes sense because consumers choose to participate. In human trafficking, victims are the “product,” not the consumer. Applying demand-reduction logic to trafficking victims is a category error.
Tech Against Trafficking cataloged over 300 tools for fighting trafficking, of which only about 10% used AI. As of 2024, over half of those AI tools were no longer available[s]. Without sustained government investment, the most promising technologies disappear.
Where AI Could Work
The more appropriate application of AI would target traffickers rather than victims: blockchain analysis of cryptocurrency wallets tied to trafficking advertisements, transport pattern detection, housing facility identification, and the “digital dust” that criminals cannot fully eliminate[s]. Generative AI could even automate the creation of search warrants and subpoenas from digitally collected evidence, reducing administrative burden while protecting civil liberties.
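As a concrete illustration of that wallet-level approach, the sketch below (invented wallet IDs and amounts; a real pipeline would ingest actual chain data) builds a payment graph from wallets scraped out of advertisements and follows the money to shared cash-out points.

```python
# Sketch of the wallet-level approach described above. Wallet IDs and
# amounts are invented; a real pipeline would ingest actual chain data.
import networkx as nx

# Directed payment graph: edge (a, b) with an amount means a paid b.
g = nx.DiGraph()
payments = [
    ("ad_wallet_1", "hop_1", 0.8),
    ("ad_wallet_2", "hop_1", 1.1),   # two ad wallets feed the same hop
    ("hop_1", "cashout_exchange", 1.7),
    ("ad_wallet_3", "hop_2", 0.4),
    ("hop_2", "cashout_exchange", 0.4),
]
for src, dst, amt in payments:
    g.add_edge(src, dst, amount=amt)

# Seed wallets scraped from trafficking advertisements.
seeds = ["ad_wallet_1", "ad_wallet_2", "ad_wallet_3"]

# Everything reachable from a seed is candidate infrastructure; shared
# downstream wallets tie otherwise unconnected ads into one operation.
downstream = set().union(*(nx.descendants(g, s) for s in seeds))
print("candidate network:", sorted(downstream))
# -> ['cashout_exchange', 'hop_1', 'hop_2']
```

Traffickers can erase a victim’s social media presence, but not the money’s path: shared cash-out points are exactly the perpetrator-side signal that victim-focused systems miss.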
The technology exists. The funding does not. Until human trafficking detection shifts from victim surveillance to perpetrator interdiction, the 94% gap will persist.