Machine learning technology is opening up new strategies to find and prosecute the men who profit from the worst of the illegal sex trade

“Hey there. You available?”

“Yes I am… how long did you want company for?”

“Was hoping for an hour, maybe half. Depends.”

“Half. 125 hour. 200 donation.”

“Gotcha. Can you stay for half hour then?”

“You’re not a cop or affiliated with any law enforcement, right?”

“Ha. No. You?”

In the thrum of texts that flit across cell towers every day, arranging sex from online ads, this was just another exchange. What followed, however, was anything but ordinary.

On June 4, 2015, a 33-year-old woman pulled off a road in Pennsylvania’s Lehigh Valley and knocked on the door of room 209 at the Staybridge Suites. She expected an assignation with the man who had sent the earlier texts. What she got, once she accepted payment, was a sting. The man was a state trooper.

So far, a pretty normal prostitution bust. But then things took a turn. The woman told the troopers that she actually hadn’t been the one texting. The texter was a man she knew as “GB,” and he was waiting for her in a silver Acura out in the hotel parking lot. She also wasn’t the only woman GB was working. There were others, including a teen the woman once saw GB smack to the ground. There were drugs involved, too.

When troopers found GB in the car, he was on his phone posting a sex ad on the classified advertising site Backpage. There was another young woman with him. The troopers arrested him, took the phone, and wrote him up on charges of promoting prostitution and related offenses.

But Lehigh County Deputy District Attorney Robert Schopf saw the potential for something more. After further investigation, Schopf amended the charges to include sex trafficking in individuals and geared up for a hard legal fight. Unlike promoting prostitution — often a misdemeanor in Pennsylvania, and one that carries a maximum sentence of seven years even when it rises to a felony — trafficking adults is always a felony, with a penalty of up to 10 years. But for that, Schopf would have to prove that GB, aka Cedric Boswell, now 46, of Easton, Pennsylvania, had not only knowingly promoted the women for sex, but had actually led them into it by fraud, force, or coercion. And to do that, Schopf would have to convince a jury that would be thinking, as he put it, “‘Well, wait a minute. She walked out of that hotel room every day. Where are these chains?’”

Boswell’s case seemed no different. A forensics dump of his cell phone produced 6,306 images, mostly of scantily clad women. It also turned up a promising favicon, or favorite icon: a shortcut to Backpage, the now-defunct website where sex workers, pimps, and traffickers could place ads for women selling sexual services. At the time, Backpage was the Amazon of the escort industry, the first place people would go to sell sex — and the first place law enforcement would go to track them down. In fact, Backpage was where the Pennsylvania State Troopers had found the original ad that led them to the GB bust.

The phone was full of evidence, but what the police had was circumstantial. The challenge for prosecutors was connecting those photos to actual escort ads. “We just don’t have that amount of time to manually look through the [images on the phone] and then try to compare: Is this the girl that I’m seeing on Backpage?” says Julia Kocis, director of Lehigh County’s Regional Intelligence and Investigation Center (RIIC), a web-based system run by the DA’s office that aggregates local crime records and supports law enforcement. “Where do I even begin to find them?”

But Kocis had an idea. She had heard about a new tool called Traffic Jam that was specifically designed to aid sex trafficking investigations with the help of artificial intelligence. Traffic Jam had been created by Emily Kennedy, and had grown out of her study of machine learning for criminal investigations as a research analyst at the Robotics Institute at Pittsburgh’s Carnegie Mellon University — as well as what Kennedy called “a lot of talking to detectives about their pain points” around sex trafficking cases.

Kocis, who was by then working closely with Schopf on the case, decided to try a free trial of Traffic Jam. “I ran the images and phone number through the tool,” she says, “and it brought back the ads he’d posted in minutes. Then I futzed around with it, and it showed a map of where the phone number was used to post girls at different locations, and over time.”
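The kind of lookup Kocis describes can be sketched in a few lines: index every scraped ad by the phone numbers it contains, so a number pulled from a seized phone instantly surfaces the ads posted with it, along with the cities they ran in. This is a toy illustration with invented data and helper names, not Traffic Jam’s actual code.

```python
import re
from collections import defaultdict

def normalize_phone(raw):
    """Strip punctuation and spacing so '(610) 555-0142' and '6105550142' match."""
    digits = re.sub(r"\D", "", raw)
    return digits[-10:] if len(digits) >= 10 else digits  # keep last 10 digits (US)

def build_index(ads):
    """Map each normalized phone number to every ad that contains it."""
    index = defaultdict(list)
    for ad in ads:
        # Find phone-like runs of digits, dots, dashes, parens, and spaces.
        for raw in re.findall(r"[\d\(\)\-\. ]{7,}", ad["text"]):
            key = normalize_phone(raw)
            if len(key) == 10:
                index[key].append(ad)
    return index

# Invented example data -- not real ads.
ads = [
    {"id": 1, "city": "Allentown", "text": "Call (610) 555-0142 anytime"},
    {"id": 2, "city": "Scranton",  "text": "610.555.0142 -- new in town"},
    {"id": 3, "city": "Reading",   "text": "txt 484 555 0199"},
]
index = build_index(ads)

# One number from a phone dump links two ads in two different cities.
hits = index[normalize_phone("1-610-555-0142")]
print([(ad["id"], ad["city"]) for ad in hits])  # [(1, 'Allentown'), (2, 'Scranton')]
```

Plotting each hit’s city (and the ad’s post date, if scraped) is what turns a list like this into the map of movement over time that Kocis describes.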

Kocis couldn’t believe how effective the tool was — and neither could Schopf. “Julia sent it to me and it was just awesome,” he says. “When you have something that critical, [the Traffic Jam evidence] is a smoking gun. I mean, there was absolutely no way for [Boswell] to get out from underneath that evidence.”

The 33-year-old woman from the original sting was reluctant to testify, but she agreed after Schopf told her about the corroborating evidence obtained with the help of Traffic Jam. After two hours of deliberation, on April 19, 2016, a jury found Cedric Boswell guilty of several crimes, including trafficking in individuals. He is currently serving a sentence of 13 to 26 years in state prison.


Traffic Jam is part of a cluster of new tech tools bringing machine learning and artificial intelligence to the fight against sex trafficking, a battle that over the nearly 20 years since the Trafficking Victims Protection Act was signed into law has been stuck in a weary stalemate. A.I. is no magic solution to a highly complex problem, but for early adopters it is catalyzing the rate of sex trafficking investigations, allowing one DA’s office to conduct 10 times as many investigations as they used to, while making them 50 to 60% faster. Driven by everyone from Traffic Jam’s Kennedy — whose program has helped crack at least 600 cases — and celebrities like Demi Moore and Ashton Kutcher to the government’s Defense Advanced Research Projects Agency (DARPA) and major corporations like IBM, these machine learning tools work the crime from every angle: finding victims, following money trails, and confronting johns. They can do in seconds what could take investigators months or even years, assuming they ever get to it at all. And the A.I. never takes a coffee break.


The technology has arrived in time to help investigators navigate a newly chaotic underworld. After growing pressure from Congress, federal authorities last April shut down Backpage, a move they hoped would stymie traffickers passing their victims off as willing sex workers. But the closure — which, it should be noted, was opposed by many sex workers who are in the field by choice and who feared it would force them to return to walking the streets — seems to have backfired. According to several new A.I. tools, the commercial sex business is not only more robust than ever, it has now splintered. A Traffic Jam analysis showed there were 133,000 new sex ads a day on Backpage before it closed. Six months later, that number had risen to 146,000 on leading escort sites, taking into account the common practice of smaller sites reposting ads. “Some new sites were getting so much traffic they kept crashing,” says Kennedy. “I think it will continue to spread out, but always evolve.”

The result is that law enforcement agencies, which relied for years on Backpage for leads, now don’t know where to start their investigations. “Traffickers adapt and they learn what you’re doing,” says Nic McKinley, the founder of DeliverFund, a nonprofit that assists with sex trafficking investigations. “If your technology can’t also learn, then you’ve got a problem. A.I. has to be part of the solution.”

It increasingly is. In 2014, when Kennedy founded Traffic Jam, Kutcher and Moore launched their sex trafficking investigation tool Spotlight, which is used by more than 8,000 law enforcement officers. That same year DARPA launched a $70 million effort dubbed Memex that funded 17 partners to develop other search and analytic tools to help fight human trafficking. The campaign ended in 2017, but several new machine learning tools came out of the effort that are assisting the National Center for Missing & Exploited Children (NCMEC), the Manhattan DA’s office, and the Homeland Security Investigations (HSI) field office in Boston (at least according to DARPA — HSI doesn’t confirm its investigative tools). “DARPA would send folks up to literally sit next to our analysts and watch as they went step by step through an investigation,” says Assistant District Attorney Carolina Holderness, chief of Manhattan’s human trafficking unit. “Then [they’d] go directly back to the software developers and adjust the tool.”

Holderness’s team tried out two Memex products, but TellFinder, the one they settled on, was created by the nonprofit arm of a Toronto software company called Uncharted that makes tools for law enforcement. David Schroh, who heads up TellFinder’s six-person team, says they were drawn to the project after realizing how little technology was being employed to fight sex trafficking. Similar to Traffic Jam but with a broader scope, TellFinder slaloms through terabytes of the deep web to search where Google can’t. Along the way, its machine learning algorithms match language patterns in similarly written ads and recognize faces. “We make it very easy for someone trying to look for a person to find all the ads she’s been in,” says Chris Dickson, TellFinder’s technical lead.
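The “language patterns” Dickson mentions can be approximated with something as simple as a bag-of-words cosine similarity: ads written by the same poster tend to reuse distinctive phrasing, so ads whose word-count vectors point in nearly the same direction are likely candidates for a match. The sketch below is a toy version with invented ad text; TellFinder’s actual models are proprietary and far more sophisticated.

```python
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase and split on anything that isn't a letter, digit, or apostrophe."""
    return re.findall(r"[a-z0-9']+", text.lower())

def cosine(a, b):
    """Cosine similarity between two word-count vectors (Counters)."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def similarity(ad1, ad2):
    return cosine(Counter(tokenize(ad1)), Counter(tokenize(ad2)))

# Invented ad text for illustration only.
a = "new in town sweet and discreet call now dont miss out"
b = "sweet and discreet new in town call me dont miss out"
c = "experienced professional upscale incall appointments only"

print(round(similarity(a, b), 2))  # 0.91 -- same distinctive phrasing, likely same poster
print(round(similarity(a, c), 2))  # 0.0  -- no shared wording
```

In production such scoring is typically done over millions of ads at once, which is why tools in this space lean on scalable nearest-neighbor search rather than pairwise comparison; the underlying idea, though, is the same.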

Even in its pilot stage, TellFinder’s speed proved it could make or break a case. In November 2012, a man named Benjamin Gaston kidnapped a 28-year-old woman who had advertised sex online. He took her wallet and phone, raped her, and forced her into prostitution. Desperate after two days of captivity, she tried to escape out the window, only to plummet six stories to the ground. “She survived,” says Holderness, “but broke every bone in her body, just about.”

By the time of the trial in 2014, the woman had made a miraculous recovery. She told the jury that after the fall she had never again engaged in prostitution. But the defense then produced recent sex ads that made it appear she had willingly gone back to the life. The ads would not only undermine her credibility as a witness, they also bolstered the defense’s argument that she’d only jumped out of the window because she was emotionally unstable and a drug abuser.


Back at the DA’s office, Holderness’s team put the ads in TellFinder. “We knew right away they were being posted in places where the woman was not located,” she says. “Someone else was using her photograph, using her ad, to get customers for herself, which is a common practice.” The defense backed down, and Gaston got 50 years to life.

“To be able to do that overnight — it actually saved that whole prosecution,” says Holderness.