On Thursday, authorities in Canada announced the bust of an enormous international child pornography operation. It was the end of a three-year investigation into a website that trafficked in illicit videos of young boys. More than 300 people have been arrested in connection with the videos, 76 of them in the United States.
Although busts like this one end with press conferences and high-profile trials, they begin far away from the public eye, with one of the most difficult jobs in the world: content moderation.
The rise of Internet porn has created a shadow industry of people whose job it is to screen vast numbers of images for child pornography.
Richard Brown knows how difficult it can be to see this kind of content. He used to be in charge of the Internet Crimes Against Children task force for the New Jersey State Police. Part of his job was to look through the hard drives of suspects, image by image. And it was hard to forget what he saw.
"I have 2 boys," he says, "and I remember being ultra-protective of my boys during the time that I was involved in this type of work, and I think that's pretty common."
Now, Brown is a law enforcement liaison for the International Center for Missing and Exploited Children. That organization is developing a program that can help police departments automate the reviewing of images of child sexual abuse.
Internet search providers like Google and Microsoft are also investing in similar programs. Samantha Doerr, the director of public affairs and child protection at Microsoft, explains that automation is important because "unlike any other kind of offensive content online, the image itself is a crime scene, and every new viewing of that image is a re-victimization of that child."
The Microsoft system, known as PhotoDNA, was co-developed by a team at Dartmouth and has since been donated to the National Center for Missing and Exploited Children. Earlier this year, Twitter began using it to scan every photo that's uploaded to its site.
The program works by scanning known images of child pornography and giving them a unique signature that goes into a database. If that image appears on another site, it is instantly flagged and removed. Google has its own proprietary tagging system that works in a similar way.
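The matching step described above can be sketched in a few lines of code. This is a simplified illustration, not Microsoft's actual PhotoDNA algorithm: real systems use "robust" perceptual hashes that still match after an image is resized or recompressed, whereas the cryptographic hash below only matches byte-identical files. All function and variable names here are hypothetical.

```python
import hashlib

# A database of signatures for known, human-reviewed images.
# PhotoDNA-style systems use perceptual hashes that survive resizing
# and recompression; this sketch uses SHA-256, which matches only
# byte-identical files.
known_signatures = set()

def signature(image_bytes: bytes) -> str:
    """Compute a unique signature for an image's raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def register_known_image(image_bytes: bytes) -> None:
    """Add a reviewed image's signature to the database."""
    known_signatures.add(signature(image_bytes))

def should_flag(uploaded_bytes: bytes) -> bool:
    """Flag an upload if its signature matches a known image."""
    return signature(uploaded_bytes) in known_signatures

# Placeholder bytes stand in for real image data:
register_known_image(b"example-known-image-bytes")
print(should_flag(b"example-known-image-bytes"))  # True
print(should_flag(b"some-other-image"))           # False
```

The key design point is that once an image has been reviewed and its signature stored, every subsequent upload can be checked automatically, so no person has to view the image again.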
But in order for an image to be identified as child pornography in the first place, a person has to review it. The people who do that work for tech companies are employed all over the world, and very little is known about them, says Sarah Roberts of Western University in Ontario, Canada.
Roberts studies the workers who are part of the new content moderation industry, and she explains that one reason so little is known about them is that most companies require their employees to sign nondisclosure agreements.
"They're precluded from speaking to the media, and it is difficult to reach out and find them," Roberts says. "I think there's an aspect of trauma that can often go along with this work and many workers would rather go home and tune out not talk about it. So I think the unknown aspect of this is by design. It's no mistake that it's difficult to find workers who will talk to you about this."
Many of the workers Roberts has spoken to anonymously have said they feel stigmatized because of the content they come in contact with through their jobs.
"It's exacting a toll on these workers, and because this industry is so new and the need for this work is so new, I think the jury is out as to what the real implications are going to be for these people later on in their life," she says.
But the demand for content moderators is only growing. In March, the eight tech companies that belong to the Technology Coalition, an industry group working against online child exploitation, released guidelines on how to support employees who come in contact with child pornography as part of their jobs.
The guidelines suggest employees take their minds off traumatic content by, for example, taking a 15-minute walk or engaging in a hobby. They also say companies should have a counseling hotline for employees.
Roberts says providing resources may not be enough.
"If someone were to access some of these support services," she says, "there may be an implicit suggestion that they're not cut out for the kind of work they're trying to do for a living."
Copyright 2021 NPR. To see more, visit https://www.npr.org.