Issue of the Week: Human Rights

A Criminal Underworld of Child Abuse, The Daily podcast, The New York Times

We return now to revisiting our posts on the comments and articles in The New York Times' Exploited series on child sexual abuse on the internet, the most extraordinary series to date on the worst, most out-of-control abuse in history, inflicted on hundreds of millions of children, starting with babies:

Issue of the Week: Human Rights, Disease

Published March 5, 2020

Tragically, our greatest fears, expressed in our post of February 5 of this year on the global danger of pandemics, have increasingly come true.

We noted at the time that the first discovered case of Coronavirus (Covid-19) in the US was here in Seattle, where we live.

Little did we know that only weeks later, we would be living at the epicenter of the outbreak in the US, where the great majority of cases and deaths have occurred so far.

It spreads daily elsewhere.

From the poorest of the poor to the richest of the rich, Coronavirus is here.

Living in this initial epicenter in the US is a humbling experience. Life is changing radically here by the day, or by the hour.

Hopefully, the worst of what we know, or don’t know, will not occur. Hopefully, after bringing tragedy for too many, this experience will bring us closer to desperately needed change, here and around the world, as we outlined in our February 5 post, and in all the inter-related issues our work has focused on from the start.

Today, the Director General of the World Health Organization (WHO), Dr. Tedros Adhanom Ghebreyesus, addressed a media briefing on Covid-19.

The following is an excerpt:

This epidemic can be pushed back, but only with a collective, coordinated and comprehensive approach that engages the entire machinery of government. 

We are calling on every country to act with speed, scale and clear-minded determination. 

Although we continue to see the majority of cases in a handful of countries, we are deeply concerned about the increasing number of countries reporting cases, especially those with weaker health systems. 

However, this epidemic is a threat for every country, rich and poor. As we have said before, even high-income countries should expect surprises. The solution is aggressive preparedness. 

We’re concerned that some countries have either not taken this seriously enough, or have decided there’s nothing they can do. 

We are concerned that in some countries the level of political commitment and the actions that demonstrate that commitment do not match the level of the threat we all face. 

This is not a drill. 

This is not the time to give up. 

This is not a time for excuses. 

This is a time for pulling out all the stops.

The degree to which this call is being heeded varies vastly among different nations and governments, and within them.

We will of course revisit this issue as it unfolds–what our experience has been and will be, and what the experience of all humanity has been and will be in this global outbreak.

Today, however, we have a different issue to revisit.

In 2016, the World Health Organization announced “1 in 2 children aged 2-17 years suffered violence in the past year”.

This remains an active ongoing statistic on the WHO site.

Half of all children, as we’ve said often.

And this doesn’t even include newborns and one-year-olds.

From WHO:

Violence against children includes all forms of violence against people under 18 years old. For infants and younger children, violence mainly involves child maltreatment (i.e. physical, sexual and emotional abuse and neglect) at the hands of parents and other authority figures. Boys and girls are at equal risk of physical and emotional abuse and neglect, and girls are at greater risk of sexual abuse. As children reach adolescence, peer violence and intimate partner violence, in addition to child maltreatment, become highly prevalent.

Violence against children can be prevented. Preventing and responding to violence against children requires that efforts systematically address risk and protective factors at all four interrelated levels of risk (individual, relationship, community, society).

Imagine the media worldwide covering, every day, as the top headline, the Director General of the World Health Organization saying:

More than half of all children in the world, from birth to 18, are being sexually abused, physically abused and abused in other ways, starting with infants and young children at the hands of their parents, and then other authority figures. 

Then imagine ongoing stories, pictures as appropriate and updates, several times a day.

Then:

We are calling on every country to act with speed, scale and clear-minded determination.

This is not a drill. 

This is not the time to give up. 

This is not a time for excuses. 

This is a time for pulling out all the stops.

Pause here.

Imagine that actually happening.

As we’ve brought to the forefront before, globally, the above adds up to hundreds of millions of children sexually abused, and even more hundreds of millions physically abused, every year, ongoing.

Again, as reported before:

A Government of India study found that half of all children in India (soon to be the most populous nation on earth) are sexually abused. No government agency would want to acknowledge something so horrid.

And as also reported before:

In the US, the Centers for Disease Control and Prevention (CDC), in its study of Adverse Childhood Experiences, found that one in four girls and one in six boys are sexually abused, with even more physically abused–the former more by men and the latter more by women. But both do both, just as both girls and boys are the victims.

Two weeks ago, on two consecutive days, The Daily podcast from The New York Times ran a two-part episode, A Criminal Underworld of Child Abuse, featuring Michael H. Keller, an investigative reporter at The Times, and Gabriel J.X. Dance, an investigations editor at The Times.

Keller and Dance were the reporters on the unprecedented three dominant front-page articles in the Sunday Times on child sexual abuse and the internet, starting with infants, published from September through December, with Nellie Bowles joining them on the third article. This itself is human violence at its most unimaginable, but the level of accompanying torture revealed in the articles (which we have also covered before from other reports) is equally unfathomable.

We posted pieces on all three articles, here, here and here, which included links to our other pieces on the issue going back over a year before.

We don’t know how to overstate what we keep saying can’t be overstated.

The sexual abuse and other abuse of children is the single worst and most critical issue facing us as a species. Nurturing and protecting our children, especially during the early years, has more impact on the brain and development than anything else. As we’ve said over and over, a species that won’t protect its children won’t protect anything in the end, and won’t survive.

The Times’ pieces have been another watershed in the movement to deal with this issue (there was a fourth piece in the series that, although not a Sunday front-page headline, was an important story on possible headway toward solutions).

The podcasts are further illuminating, heart-wrenching and enraging, and constitute the next reports in the series, which appears to be an investigation that will continue.

Here’s an excerpt from our first piece on the first article:

There is no more influential or widely read front page article in the world than the lead in the Sunday New York Times.

This one is particularly striking for a number of reasons.

First, it is sandwiched between two other headlines, a common practice at the Times.

But the other two stories aren’t common.

One is on the impeachment inquiry about President Donald Trump.

The other is about the 70th anniversary of the founding of The People’s Republic of China.

Two critical stories about the first and second most powerful and influential nations on earth, at the crossroads internally, with each other, and in the world.

But these headlines are lost, as the graphic for the main story covers the majority of the page, with 88 images of children sexually abused on the internet, a collage of artistic photography obscuring the images for privacy, which only increases the impact.

Placed strategically between the images are the following three statements in black boxes and white letters:

Last year, tech companies reported over 45 million online photos and videos of children being sexually abused — more than double what they found the previous year.

Each image shown here documents a crime. The photos are in a format analysts devised to protect the abused.

Twenty years ago, the online images were a problem; 10 years ago, an epidemic. Now, the web is overrun with them.

With the headline in the print edition, “Child Sex Abuse on the Internet: Stolen Innocence Gone Viral.”

The Times deserves a special salute for putting the failure to protect children front and center among global issues where it belongs.

The subject is the ugliest of humanity’s blights, which the article points out is enabled by society-wide avoidance, because it’s too “ugly” a “mirror”.

Never forget those words as you read on.

What this meant, in addition to the unique additional plague the digital age has brought to this issue, is that the statistics on child sexual abuse, as many have suspected, are vastly understated, in part for reasons not dissimilar from those that keep adult survivors from reporting their childhood victimization for decades, on average.

And as far as children go, as the Times pointed out editorially, it’s hard enough for adults to report; for children, the obstacles are usually impassable.

Until or unless a scenario such as the one described above occurs, with WHO, the media and society acting with intellectual and moral integrity.

And following strategies such as WHO’s INSPIRE: Seven strategies for ending violence against children, which “identifies a select group of strategies that have shown success in reducing violence against children. They are: implementation and enforcement of laws; norms and values; safe environments; parent and caregiver support; income and economic strengthening; response and support services; and education and life skills.”

Sound familiar?

But none of this can happen effectively without first facing the issue.

We’ve come a long way on this issue. No one can afford to appear to be on the wrong side of it in 2020. Many care deeply and are working hard to end it. But as we’ve noted often, the greatest resistance, consciously and unconsciously, to all momentous social movements, comes just before they succeed. And the moment of success depends on the choices of every one of us. Many people who say they care, even some who have worked on the issue, have either effectively enabled it or been unwilling to do what is necessary. The degree to which the issue is enmeshed with almost everyone directly or indirectly, starting in families and then throughout society, has an enormous impact–especially with this issue. The impulse to avoid is insidious. But the number of people who not only understand the issue, but who live the solution, is growing. There are some realities that require you to choose what kind of human being you are and what your legacy will be. Basic decency is not optional. It’s not a job that ends as long as we draw breath.

We are all called to be the freedom riders, risking it all, as long as one child screams, too often silently, for our help.

Part one of the Times podcast starts with:

It’s not altogether uncommon in investigations for us to turn up information that is shocking and disturbing. The challenge is when, in the course of your reporting, you come across something so depraved and so shocking that it demands attention. People have to know about this, but nobody wants to hear about it.

Part two of the series ends with:

A few weeks ago, the National Center for Missing and Exploited Children released new data for 2019. It said that the number of photos, videos and other materials related to online child sexual abuse grew by more than 50 percent to 70 million.

The following are a few excerpts of comments on the podcasts.

In our work and in our lives, personally and professionally, we have heard for decades, from countless survivors and countless others, increasingly, up to the moment we write this, virtually the same words as those below:

-Sickening. What’s wrong with this world? I’m increasingly thinking that the internet amplifies the worst of humanity. Thank you for your investigative reporting. I had no idea. We need to resolve this.

-Thank you for this reporting. We need to get this issue heard about more, and then find ways to take action. This should be the next #metoo movement.

-This story is the tip of the iceberg. I have no idea how this problem continues since we know about it. I worked in this area for a short time. It should be on equal footing with foreign military action and Corona virus.

-Thank you for reporting this story and for sharing it on The Daily. I personally know five friends who were sexually abused as children – at the youngest at the age of 4 – by their fathers and in one case their uncle. I see how it cripples the foundations of their childhood and the trauma haunts them the rest of their lives. We need to be more aware of this problem and have more conversations about it. Thank you again for your work in bringing more light to the issue.

-This country needs to put the money, the energy, the intention on the right things. And protecting children from this monstrosity is second to nothing.

-Gutted after hearing this. Thank you for reporting on this. So hard to listen to. But I won’t look away.

-I was sexually abused as a child. I am thankful the technology wasn’t available to post pictures and videos. …I desperately hope your story drives some sort of action.

-Thank you for this jaw-dropping investigative reporting. After listening to Part 1 yesterday, I woke with nightmares. Perhaps again tonight. Bless these precious, innocent children who have no voice to protect themselves.

-Incredible reporting on a story that couldn’t possibly be more urgently important. Thank you for bringing this to the attention of the public.

-If there was one issue that could unite our government, it should be coming together to resolve this issue and protect the sanctity of childhood innocence.

-I believe we are complicit if we do not act.

-These are the children of our future. If not us, then who?

Here are the podcasts, links and transcripts:

A Criminal Underworld of Child Abuse, Part 1

Hosted by Michael Barbaro, produced by Jazmín Aguilera, Annie Brown and Marc Georges, and edited by Larissa Anderson and Mike Benoist

The Daily, The New York Times, Wednesday, February 19, 2020

Child sexual abuse imagery online is now a problem on an almost unfathomable scale.

Gabriel Dance

It’s not altogether uncommon in investigations for us to turn up information that is shocking and disturbing. The challenge is when, in the course of your reporting, you come across something so depraved and so shocking that it demands attention. People have to know about this, but nobody wants to hear about it.

How do you tell that story?

Michael Barbaro

From the New York Times, I’m Michael Barbaro. This is “The Daily.”

Today: A monthslong Times investigation uncovers a digital underworld of child sexual abuse imagery that is hiding in plain sight. In part one, my colleagues Michael Keller and Gabriel Dance on the almost unfathomable scale of the problem — and just how little is being done to stop it.

It’s Wednesday, February 19.

Gabriel, tell me how this investigation first got started.

Gabriel Dance

So it all began with a tip. Early last year, we got a tip from a guy, and this guy was looking up bullets.

Michael Barbaro

Bullets for guns.

Gabriel Dance

Bullets for guns. And he was actually looking for a very specific weight of bullet on Microsoft’s Bing search engine. And while he was looking up these bullets, he started getting results of children being sexually abused.

And the guy was horrified. He didn’t understand why he was seeing these images, couldn’t stand to look at them. And so he reported it to Bing — and heard nothing. And full of outrage, he writes us a letter to our tip line and described what he was looking for, described the kind of images he was getting back. He says, New York Times, can you please look into this for me?

So I actually emailed my colleague Michael Keller and asked him to look into it.

Michael Keller

So in the tip, they had listed the search terms they’d used. So we tried to replicate it. We put it into Bing. And we saw a flash on the screen of images of children. And so I wrote back to Gabe and said, yeah, this checks out. You could type words into Bing and get back explicit images of children.

Michael Barbaro

So this is not the dark web. This is just a regular, commonplace search engine.

Gabriel Dance

That’s right. So a few things went through my head. First of all, we need to document this. Because, as most of us know, things on the internet change all the time. It’s possible they came down soon after, etc. But we were very unsure what kind of legal liabilities we had when it came to documenting anything regarding this imagery. So we emailed Dave McCraw, who’s the head counsel at The New York Times, to ask him, you know, what can we do? What can’t we do? How do we go about investigating where this imagery is online?

Michael Barbaro

And doing it without somehow violating the law.

Gabriel Dance

That’s right. And David wrote back immediately and said, there is no journalistic privilege when investigating this. You have no protections. And you have to report it immediately to the F.B.I.

Michael Keller

And so that’s what we did. We submitted a report both to the F.B.I. and also to the National Center for Missing and Exploited Children, which is the kind of government-designated clearinghouse for a lot of these reports.

Michael Barbaro

And what did they tell you?

Michael Keller

They weren’t able to tell us anything about the report we submitted. But it made us wonder, how common is it that they get these kinds of reports? How many images are out there? How many images are flagged to them each year? And they were able to tell us that. And that number was frankly shocking to us.

The handful of images that the tipster stumbled across was just a tiny portion of what the National Center sees every single day. They told us that in 2018 alone, they received over 45 million images and videos.

Michael Barbaro

Wow.

Gabriel Dance

45 million images a year. That’s more than 120,000 images and videos of children being sexually abused every day. Every single day. But to put it in perspective, 10 years ago, there were only 600,000 images and videos reported to the National Center. And at that time, they were calling it an epidemic.

Michael Barbaro

So in just a decade, it went from 600,000 reports to 45 million?

Gabriel Dance

Yeah. So we were really curious — how does a problem called an epidemic 10 years ago become such a massive issue now?

Michael Keller

And one of the first things we learned was that we did try and tackle it back then.

Archived Recording (Stone Phillips)

The national epidemic of grown men using the internet to solicit underage teens for sex. As more and more parents become aware of the dangers, so have lawmakers in Washington.

Michael Keller

In the mid to late 2000s, as the internet was being more widely adopted, this issue of online child sexual abuse really got on the radar of Congress. There was even a bill being introduced by Debbie Wasserman Schultz.

Archived Recording (Debbie Wasserman Schultz)

The internet has facilitated an exploding multibillion-dollar market for child pornography.

Michael Keller

There were multiple hearings.

Archived Recording (Flint Waters)

I’m here today to testify about what many of my law enforcement colleagues are not free to come here and tell you.

Michael Keller

They heard from law enforcement.

Archived Recording (Flint Waters)

We are overwhelmed. We are underfunded. And we are drowning in a tidal wave of tragedy.

Michael Keller

They were overwhelmed with the number of reports that were coming in.

Archived Recording

Unless and until the funding is made available to aggressively investigate and prosecute possession of child pornography, federal efforts will be hopelessly diluted.

Michael Keller

They in many cases had the tools to see where offenders were, but not enough staff to actually go out and arrest the perpetrators.

Archived Recording (Flint Waters)

We don’t have the resources we need to save these children.

Archived Recording (Alicia Kozakiewicz)

Hello. Thank you for inviting me to speak today. My name is Alicia Kozakiewicz. A Pittsburgh resident, I am 19 years old and a sophomore in college.

Michael Keller

There was also a very chilling testimony from a victim of child sexual abuse.

Archived Recording (Alicia Kozakiewicz)

For the benefit of those of you who don’t know, don’t remember those headlines, I am that 13-year-old girl who was lured by an internet predator and enslaved by a sadistic pedophile monster. In the beginning, I chatted for months with Christine, a beautiful, red-haired 14-year-old girl, and we traded our school pictures. Too bad that hers were fake. Yeah, Christine was really a middle-aged pervert named John. And he had lots of practice at his little masquerade because he had it all down. The abbreviations, the music, the slang, the clothes — he knew it all. John slash Christine was to introduce me to a great friend of hers. This man was to be my abductor, my torturer. I met him on the evening of January 1, 2002. Imagine, suddenly you’re in the car, terrified, and he’s grabbing onto your hand and crushing it. And you cry out, but there’s no one to hear. In between the beatings and the raping, he will hang you by your arms while beating you, and he will share his prized pictures with his friends over the internet.

The boogeyman is real, and he lives on the net. He lived in my computer, and he lives in yours. While you are sitting here, he’s at home with your children.

Task forces all over this country are poised to capture him, to put him in that prison cell with the man who hurt me. They can do it. They want to do it. Don’t you?

[Music]

Michael Keller

Alicia’s testimony really moved people. People responded. And eventually, about a year later, the bill passes unanimously.

Michael Barbaro

And what is this new law supposed to do?

Gabriel Dance

So this law, the 2008 Protect Our Children Act, is actually a pretty formidable law with some pretty ambitious goals.

Michael Keller

It was supposed to, for the first time ever, secure tens of millions of dollars in annual funding for investigators working on this issue. And it required the Department of Justice to really study the problem and put out reports to outline a strategy to tackle it.

Michael Barbaro

And what has happened since this ambitious law was put into place?

Michael Keller

In many ways, the problem has only gotten worse. Even though the number of reports has grown into the millions, funding is still pretty much what it was 10 years ago. And even though the government was supposed to do these regular reports, they’ve only done two in 10 years. And that’s an issue, because if you don’t have anyone studying the size of the problem, you don’t have anyone raising alarm bells and saying, hey, we need more resources for this.

Michael Barbaro

So they didn’t study it. And they didn’t increase the funding in a way that would match the scale at which the problem is growing.

Michael Keller

Yeah. It really looked like they had this law in 2008, and then everyone really took their eye off the ball. So we called Congresswoman Debbie Wasserman Schultz, who was one of the leading proponents of this bill, to figure out what happened.

Gabriel Dance

And we are really gobsmacked to hear that she’s unaware of the extent of the failings. She sends a letter to Attorney General William Barr laying out a lot of our findings, requesting an accounting. As far as we know, she hasn’t heard anything.

Michael Barbaro

So even the person who co-wrote the law was unaware that it was pretty much failing.

Michael Keller

She knew about the funding, but even she didn’t know that things had gotten this bad.

And we wanted to figure out now, 10 years later, what kind of effect is this having on law enforcement, on the people on the ground working these cases? And what we heard from them really shows what happens when everyone looks away.

Michael Barbaro

We’ll be right back.

Gabriel, Michael — what happens when you start reaching out to law enforcement?

Gabriel Dance

So Mike and I probably spoke with about 20 different Internet Crimes Against Children task forces. And these are the law enforcement agencies responsible for investigating child sexual abuse. To be honest, most of the time, as an investigative reporter, generally law enforcement — I mean, generally anybody, but especially law enforcement — is not particularly interested in speaking with us. Usually, they don’t see much to gain. But surprisingly, when it came to this issue, they were not only willing to speak with us, but they were interested in speaking with us.

Michael Barbaro

Why do you think that was?

Gabriel Dance

It’s partly because we were using the right terminology. And by that I mean, we were asking them about child sexual abuse imagery, not child pornography.

Michael Barbaro

And what exactly is the distinction?

Gabriel Dance

Well, legally, they’re basically the same. But for the people who deal with this type of crime day in, day out, who see these images and who speak with survivors, they know that calling it child pornography implies a bunch of things that are generally incorrect.

Michael Keller

One is that it equates it with the adult pornography industry, which is legal and made up of consenting adults, whereas children cannot consent to the sexual behavior. The other thing is that the crimes depicted are heinous, and that looking at each one of these images is essentially looking at a crime scene.

Gabriel Dance

And for that reason, they prefer to call it child sexual abuse imagery. But I think they also talked to us, because for the law enforcement who deal with this, they very much feel that the issue is under-covered and under-discussed, especially considering the seriousness of the crime. I mean, we had the kind of coordination and cooperation from these law enforcement officers that we rarely see from anybody whatsoever. They let us go out on raids with them. They provided us with detailed reports. They talked to us about specific cases. They were really, really open, because they felt that as a nation, we were turning our backs on children, essentially.

Michael Barbaro

And once you have that access, what do you find?

Michael Keller

What we learned talking with all these law enforcement officers was just how this world operates. A lot of the departments told us about the high levels of turnover they have. We had one commander who said, back when he was an investigator, he saw one image that was so shocking to him, he quit and served a tour in Iraq. That was his escape.

Gabriel Dance

To even see this imagery once changes your life. And these people look at it all day long. And then on top of that, they have to deal with the fact that their funding has not gone up whatsoever. They’re being funded at a level that means they can’t do proactive investigations anymore. So they’re not sitting in chat rooms trying to catch what many of them think are the worst criminals. They’re unable to do anything, really, other than respond to the utter flood of reports coming in from the National Center. And because of the sheer number of reports coming in, they’re forced to make some truly terrible decisions on how to prioritize who they’re investigating.

Michael Barbaro

Like what?

Gabriel Dance

The F.B.I. told us that in addition to, of course, looking for anybody who’s in immediate danger, they have to prioritize infants and toddlers. When we first got into this, we didn’t even consider the idea of infants. And to hear that the F.B.I. — and later L.A.P.D. would say the same thing — we’re prioritizing infants and toddlers, and essentially not able to respond to reports of anybody older than that. I mean, it really left us pretty speechless.

Michael Keller

So we’re learning a lot from speaking with law enforcement. But they also only have a small part of the picture. We also are thinking about this tip that we got, where the tipster was able just to find these images on search engines very easily. And so we still have this big question of how easy is it to come across this material.

Michael Barbaro

And how do you go about answering that question?

Michael Keller

When we initially started trying to replicate the tipster’s search, we had to stop because we didn’t want to be searching for these illegal images. But then we discovered a technology that would allow us to keep investigating without having to look at the images. It’s called PhotoDNA. And it essentially creates a unique digital fingerprint for each image. And as the National Center is receiving these reports and identifying these illegal images of child sexual abuse, they keep track of these digital fingerprints. And other companies can tap into that database to see, is this image that I have, is it in that database of known images? And we stumbled upon a service from Microsoft that actually allows you to do just that. It’ll take a URL of an image and tell you if it matches an image already in that database. So we wrote a computer program that would replicate that initial search from the tipster, record all of the URLs. The key part of it, though, was that it blocked all images.
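(An aside for readers curious about the mechanics Keller describes: below is a minimal, hypothetical sketch in Python of how such a URL-only, hash-matching check might work. The endpoint, header, field names and response shape are illustrative assumptions, not the Times’ actual program or Microsoft’s documented PhotoDNA interface; the essential point is that only image URLs are submitted and checked against a database of known digital fingerprints, so no imagery is ever downloaded or displayed.)

import requests

# Hypothetical sketch: check image URLs against a hash-matching service
# (in the spirit of PhotoDNA) without ever downloading or rendering the
# images. The endpoint, header and field names below are assumptions.
MATCH_ENDPOINT = "https://example-hash-matching-service/match"  # placeholder
API_KEY = "YOUR-SUBSCRIPTION-KEY"  # issued by the service provider

def url_matches_known_image(image_url: str) -> bool:
    """Send only the URL to the matching service and report whether it
    corresponds to a previously identified, fingerprinted image."""
    response = requests.post(
        MATCH_ENDPOINT,
        headers={"Subscription-Key": API_KEY},
        json={"DataRepresentation": "URL", "Value": image_url},
        timeout=30,
    )
    response.raise_for_status()
    # Assumed response shape: {"IsMatch": true/false, ...}
    return bool(response.json().get("IsMatch", False))

if __name__ == "__main__":
    # URLs gathered by a crawler configured to record links while blocking
    # all image loads, as described in the episode.
    candidate_urls = [
        "https://example.com/image-1.jpg",
        "https://example.com/image-2.jpg",
    ]
    for url in candidate_urls:
        status = "match" if url_matches_known_image(url) else "no match"
        print(url, "->", status)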

Michael Barbaro

So suddenly you can now search for these images, and find where they live on the internet, without illegally looking at them.

Michael Keller

Right. So we started doing this test across a number of different search engines.

Gabriel Dance

And to sit there and watch as the program starts ticking away, and see the first time it flashes that it got a match, and then to see a match again —

Michael Keller

Bing, from Microsoft.

Gabriel Dance

— and match again —

Michael Keller

Yahoo.

Gabriel Dance

— and match again —

Michael Keller

Another one called DuckDuckGo.

Gabriel Dance

— I mean, I think both our jaws dropped.

Michael Keller

We didn’t find any on Google. But on other search engines powered by Microsoft data, we found a number of matches.

Gabriel Dance

Dozens, dozens of images.

Michael Barbaro

You’re saying the company that came up with this technology to help track these images is the very same company whose search engine allows people to find them, view them, keep viewing them.

Gabriel Dance

Right. As soon as we started running this computer program using Microsoft to detect illegal imagery, we were finding illegal imagery on Microsoft. So it was very clear that they were not using their own services to protect users from this type of image.

Michael Keller

And Microsoft told us that this was the result of a bug that they fixed. But about a week later, we reran the test and found even more images.

Michael Barbaro

Wow. So it sounds like you’re finding all these images on Microsoft-powered search engines. So how much of this, in the end, is just a Microsoft problem?

Michael Keller

It’s more complicated than that. We were performing this limited search just to test search engines on images. But we’re also, at the same time, reading over 10,000 pages of court documents. These are search warrants and subpoenas from cases where people were caught trading this material.

Gabriel Dance

And what becomes clear pretty quickly is that every major tech company is implicated. In page after page after page, we see social media platforms —

Michael Keller

Facebook, Kik, Tumblr.

Gabriel Dance

We see cloud storage companies —

Michael Keller

Google Drive, Dropbox. We read one case where an offender went on a Facebook group to ask for advice from other people — say, hey, how do I share this? How do I get access to it? How do I get access to children? And they say, download Kik to talk to the kids and download Dropbox. And we can share links with you.

Michael Barbaro

Wow.

Gabriel Dance

And from these documents, it becomes clear that the companies know. I mean, there’s so many cases for each company that they all know. And so the question becomes, what are they doing about it?

[Music]

Michael Barbaro

Tomorrow on “The Daily,” a victim’s family asks that same question.

. . .

A Criminal Underworld of Child Abuse, Part 2

Hosted by Michael Barbaro, produced by Marc Georges, Annie Brown and Jazmín Aguilera, and edited by Larissa Anderson and Michael Benoist

The Daily, The New York Times, Thursday, February 20, 2020

Images of victims of child sexual abuse recirculate on the internet — seemingly forever. What are tech companies doing to stop this?

Michael Barbaro

From The New York Times, I’m Michael Barbaro. This is “The Daily.” Yesterday, my colleagues Michael Keller and Gabriel Dance described the government’s failure to crack down on the explosive growth of child sexual abuse imagery online. Today: The role of the nation’s biggest tech companies and why — despite pleas from victims — those illicit images can still be found online.

It’s Thursday, February 20.

Michael Keller

Would it be possible — I don’t want it to get too sticky, but for the sound, the —

Lawyer

I can turn it down, we’ll just get warm.

Michael Keller

If it’s too warm —

Michael Keller

Last summer, Gabriel and I got on a plane and flew out to the West Coast to meet in a lawyer’s office to speak with the family that she’d been representing.

Gabriel Dance

As Mike said, like, once we started looking into it, there’s so many facets.

Gabriel Dance

So we explained to them a little bit about our reporting and why it was so important that we speak with them.

Gabriel Dance

Yeah, we’re here to answer your questions, too, I mean, as best we can.

Michael Barbaro

And who is this family?

Michael Keller

All we can say is that this is a mom and a stepdad who live on the West Coast. And that’s because they only felt comfortable speaking with us if we could protect their privacy.

Stepfather

I mean, we started this not knowing anything about it. And I might get emotional here but — you know, as parents, we’re trying to figure out what’s the best way for our kid to deal with this.

Michael Keller

And they started to tell us a story about what happened to their daughter.

Stepfather

It was August 21, 2013. I was at work. She was shopping with our two middle children and —

Gabriel Dance

So one day, six years ago, the mom is out with her kids, doing some back to school shopping, and she gets a call from the police. And they tell her she has to come in immediately.

Mother

Just odd. An odd call —

Stepfather

It was very weird to get a call.

Mother

— to get a phone call from a detective saying, you need to come down to the station right now. We need to talk to you. We feel your kids are in danger. And I’m like, what?

Stepfather

She called me panicked, going they want to talk to us. Go talk to them. We don’t have anything — we’re not criminals, so.

Mother

It was just odd.

Gabriel Dance

So she goes into the sheriff’s office.

Mother

And there’s this F.B.I. agent, introduces himself. He said, you know, we think your children are in danger from their father, particularly the youngest one. And I was just shocked and had no idea what they were talking about. No idea. Mind you, I had my two other kids who I was shopping with, were in the room next door, playing or doing something, I don’t know what they do. And he just went on to say that we’re going to start investigating him, or we’ve been investigating him.

Stepfather

And there’ll be an agent at our house Friday to tell us more.

Mother

Mhm.

Stepfather

And she showed up Friday morning, August 23, 9:00 in the morning. Can I talk to you outside? And we talk outside. And she drops this bomb on us.

Gabriel Dance

And what the agent tells her is that her children’s biological father had been sexually abusing her youngest child, starting at the age of four years old and going on for four years. And not only that, he had been documenting this abuse with photos and videos and had been posting them online.

Mother

So I remember that moment. I mean, I think I kind of doubled over or something. I was just shocked. I asked, when did this start? How long has this been going on? How did I not know? I’m mom. So all of those questions and all those feelings, and I mean everything just came crashing down.

Michael Barbaro

What does this couple tell their children at this point?

Michael Keller

The F.B.I. agent said, actually, it’s better if I’m the one to tell your kids.

Mother

In her experience, it would be best for the news to come from her, as opposed to me telling the kids directly that their dad was arrested, because kids might blame me and point the finger. I said, yeah, whatever you think.

Stepfather

She passed around her F.B.I. badge and showed the kids and built a rapport with them, and then —

Mother

She said, you know, your dad has been arrested. I don’t think we even talked about what it was for or why he was arrested, just that he was arrested and they’re still investigating.

Stepfather

Right.

Mother

You know, even to this day, I think about — again, I was married to this guy. How did I miss that? Still. And this is six years later. You know, how did I miss that? So there is a piece of that guilt that’s always going to be there, I think. [SIGHS]

Stepfather

Yeah.

But it isn’t your fault.

Mother

And just —

Stepfather

It’s not your fault.

Mother

(CRYING) And how could somebody do that to their own child?

I still — I don’t think I’ll ever understand a person like that.

[Music]

Mother

She’s just now developmentally dealing with the effects of it. She’s angry, and she’s acting out and —

Michael Keller

Her daughter is now a young teenager —

Mother

She’s in counseling now, you know, so —

Michael Keller

— and has a counselor, and is not only dealing with all of the normal things that young teenagers have to deal with, but the trauma of being sexually abused at such a young age.

Mother

So that’s something she’s just now having to learn how to say no, how to inform others that she’s not comfortable with something. And she’s only 13.

Michael Keller

And meanwhile, even though the physical abuse ended six years ago, the images continue to circulate to this day. And the parents know that because they get notified whenever someone is arrested with these images in their possession.

Michael Barbaro

And why would they be notified? What’s the point? It feels like it would just be a horrendous reminder every time that this is still happening.

Michael Keller

One of the abilities that victims of this crime have is to seek restitution from the offenders. So when they’re notified, they and their lawyers can petition the court to get a certain sum of money, which is a really good thing, and helps to rectify some of the damage that was done.

Gabriel Dance

But Mike, you’re right.

Mother

Oh my gosh, this person in Kansas and New York. Somebody in Wisconsin saw my kid, and oh my god.

Gabriel Dance

It is a brutal, double-edged sword, these notifications.

Mother

Oh my god. There’s people out there who know about this, and they can watch it over and over.

Gabriel Dance

And for this young woman, her parents and their lawyer received more than 350 notices in just the last four years.

Michael Barbaro

Wow. So something on the order of 100 times a year, somebody is convicted of having looked at these photos of their daughter?

Gabriel Dance

That’s right.

[Music]

Michael Keller

What do you think should be kind of in the front of our minds? What do you think is really important for us to understand?

Mother

The internet technology companies need to really do something about this, because — I don’t know. It’s just —

Stepfather

I don’t know enough about technology, but I just — where is this crap, you know?

Gabriel Dance

Certainly for the stepfather, his question, which is a legitimate question, is —

Stepfather

How do we get it off?

Gabriel Dance

— why can’t they just take it down?

Stepfather

How do we, how do we make it go away?

Michael Keller

And that’s the same question we had. Why, six years later, are these images and videos still showing up on some of the largest technology platforms in the world?

Stepfather

Figure it out. We’ve got this technology. We can go to the moon and Mars and stuff. We can get this crap off the web.

Michael Keller

And are the companies doing enough to prevent it?

[Music]

Michael Barbaro

We’ll be right back.

Gabriel, Michael, how do you begin to answer these parents’ very reasonable questions about why these images of their child’s sexual abuse keep showing up online?

Gabriel Dance

So before we can answer the question of why these images keep showing up, we needed to understand where the images were online and what companies were responsible for people sharing them. But the National Center, which you’ll remember is the designated clearinghouse for this information, wouldn’t tell us.

Michael Barbaro

But they know, right? So why wouldn’t they tell you? Why wouldn’t they give you that information?

Gabriel Dance

Part of the reason why they don’t divulge these numbers is because the companies are not required by law to look for these images.

Michael Barbaro

It’s voluntary.

Gabriel Dance

It’s voluntary to look for them.

Michael Barbaro

Right. Without the help of these companies, they have no idea where these images are, where they’re coming from, how many of them there are.

Gabriel Dance

That’s right. And the National Center is concerned that if they disclose these numbers, that they might damage those relationships which they depend on.

Michael Barbaro

Mhm.

Gabriel Dance

But then we start to hear anecdotally that there is one company responsible for the majority of the reports. And that was Facebook. So we are doing everything we can to run down that number. But very few people know it. However, we ultimately do find somebody who has documentation that reveals that number.

So in 2018, the National Center received over 18.4 million reports of illegal imagery. And the number this person provides us shows that of those 18.4 million, nearly 12 million came from Facebook Messenger.

Michael Barbaro

Wow. So the vast majority of them.

Gabriel Dance

Almost two out of every three reports came from Facebook Messenger.

Michael Barbaro

So just that one service of Facebook.

Gabriel Dance

That’s right, just the chat application. This doesn’t include groups or wall posts or any of the other public information that you might post or share. This is specifically from the chat application.

Michael Keller

But then, after we reported that number, the D.O.J. actually comes out and says that Facebook in total — so Messenger plus the other parts of the platform — is responsible for nearly 17 million of the 18.4 million reports that year.

Michael Barbaro

Wow. This is a Facebook problem.

Michael Keller

That’s what the numbers at first glance would suggest. But we realized we needed to talk to people that really understood this to know what conclusions to come to from these numbers.

[Phone Ringing]

Alex Stamos

Hello.

Gabriel Dance

So we called up somebody who would know better than almost anybody.

Gabriel Dance

Alex.

Gabriel Dance

Alex Stamos.

Alex Stamos

Hey.

Gabriel Dance

Gabe Dance. Mike Keller here.

Michael Keller

Hey, how’s it going?

Alex Stamos

I’m doing OK. I’m getting back to my office.

Gabriel Dance

Alex was the chief security officer at Facebook for several years. He’s now a professor at Stanford University.

Michael Barbaro

So he’s somebody who very much would have seen this happening, would’ve understood what was going on inside Facebook when it comes to child sexual abuse.

Gabriel Dance

Absolutely.

Michael Keller

You’ve obviously worked on this area for years. Facebook is, Facebook Messenger is responsible for about 12 million of the 18.4 million reports last year to the National Center. That seems like a lot. Can you help us understand what’s happening on that platform?

Alex Stamos

Yeah, so we’ve been discussing one very important number — 18.4 million, which is that the number of —

Gabriel Dance

So Stamos tells us something that actually is a little counterintuitive. That this huge number of reports coming from the company —

Alex Stamos

That’s not because Facebook has the majority of abuse. It’s because Facebook does the most detection.

Gabriel Dance

It’s them working the hardest to find this type of content and report it.

Alex Stamos

I expect that pretty much any platform that allows the sharing of files is going to be absolutely infested with child sexual abuse. If everybody was doing the same level of detection, we’d probably be in the hundreds of millions of reports.

Michael Barbaro

What is he telling you? That of the 18 million, the reason why Facebook has so many is because other companies are not reporting this at all?

Gabriel Dance

Essentially, yes. What he’s saying is that Facebook is reporting such a high number of images and videos because they’re looking for them. And that a lot of other companies aren’t even doing that.

Alex Stamos

Facebook checks effectively every image that transfers across the platform in an unencrypted manner to see whether or not it is known child sexual abuse material.

Gabriel Dance

Every single time somebody uploads a photo or a video to Facebook, it’s scanned against a database of previously identified child sexual abuse imagery. And in doing so, Facebook is finding far, far more of this content than any other company.

Michael Barbaro

So he’s saying don’t shoot the messenger here.

Gabriel Dance

That’s what he’s saying. So as of now, this is the best method these companies have to identify and remove the imagery.

Michael Barbaro

Mhm. What Facebook is doing?

Gabriel Dance

That’s right.

[Music]

Michael Barbaro

So why doesn’t every technology company do this? It seems pretty straightforward.

Gabriel Dance

Well, the short answer is that it’s not baked into the business model.

Michael Barbaro

Mhm.

Gabriel Dance

That this is not something that helps them grow their base of users. It’s not something that provides any source of revenue. And it’s in fact something that works against both of those things, if done correctly.

Michael Barbaro

What do you mean?

Gabriel Dance

Well, every time Facebook detects one of these images, they shut down that account. But most —

Michael Barbaro

You’re deleting your own users.

Gabriel Dance

That’s right. You’re deleting your own users. And it costs money to delete your own users.

Michael Barbaro

Right. You have to spend money to hire people to find accounts that are doing something wrong that you’re then going to lose as a customer.

Gabriel Dance

You got it. So both of those things fly in the face of almost all of these companies’ business models. And Stamos actually told us something else interesting.

Alex Stamos

The truth is that the tech industry is still pretty immature at the highest levels about the interaction between executives.

Gabriel Dance

And that’s that these companies aren’t really working together to solve this problem.

Alex Stamos

You know, if you look at, say, the banking industry, these big investment banks, the C.E.O.s hate each other. But they understand that their boats all rise and fall together, and so they are able to work together on what kind of regulation they’d like to see. But a lot of the top executives at the tech companies really kind of personally despise one another. And it is very difficult to get them to agree to anything from a policy perspective.

Michael Keller

And in our reporting, we found some pretty egregious disparities in how different companies police this on their platforms. Amazon, who has a massive cloud storage business, for example — they handle millions of uploads and downloads a day — they don’t scan whatsoever. Apple has an encrypted messaging platform, so that doesn’t get scanned. They also choose not to scan their photos in iCloud. Snapchat, Yahoo, they don’t scan for videos at all, even though everyone knows video is a big part of the problem. And it’s not because the solutions don’t exist, they just have chosen not to implement them.

And now Facebook, the company looking for this content most aggressively, is starting to rethink its policy in doing that. Over the last few years, a lot of tech companies have realized that privacy is an important feature that a lot of their customers are expecting. And citing privacy concerns, Facebook announced that it will soon encrypt its entire Messenger platform, which would effectively blind them, preventing any type of automated scanning of chats — which, again, were responsible for nearly 12 million of those 18.4 million reports.

Michael Barbaro

So they would stop searching for this child sexual abuse material?

Michael Keller

Right. They would limit their own ability to be aware of it.

Michael Barbaro

And privacy is important enough that they would handicap their ability to find this criminal conduct and these horrible photos?

Michael Keller

Based on the criticism that a lot of tech companies have received over the last few years, moving towards encryption is a really attractive option for them. Because it lets them say, we really care about the privacy of your conversations, and we’re going to make that more secure.

Michael Barbaro

Gabriel, it occurs to me that the debate over privacy is enormous and loud. We’ve done episodes of “The Daily” about it, many episodes of “The Daily” about it. But the child sexual abuse subject is not as well-known. It’s not as widely debated. It’s the first time we’re talking about it. Is that reflected in this decision, the attention that these two subjects get?

Gabriel Dance

It is. And in fact, it’s one of the main reasons we chose this line of reporting. The issue of child sexual abuse imagery really brings the privacy issue to a head, because we’re faced with these very, very stark decisions that we’re discussing here right now. Which is that, is it more important to encrypt the communications on a platform where it’s well known that children interact with adults? Is it worth it to encrypt those communications to protect people’s privacy when we know what the ramifications for children are?

Michael Keller

But the child protection question is also a privacy question. And when you are Facebook and you’re saying, we’re going to encrypt conversations for the privacy of our users, the child advocates will say, well, but you’re doing nothing to protect the privacy of the child in the image.

Michael Barbaro

In fact, you’re making it harder for their life to ever be private.

Michael Keller

Exactly.

And this means that next year, or whenever Facebook moves ahead with its plan to encrypt, they won’t be sending nearly 17 million reports to the National Center. They’ll be sending far fewer.

Michael Barbaro

Right.

Michael Keller

And that means for the family that we spoke with on the West Coast, who is receiving about 100 notifications a year that someone has been caught with images and videos of their daughter, they’ll likely be getting far fewer notifications. But not because people aren’t looking at these images. They still will be — the family just won’t know about it.

Mother

That’s my big thing. I want people to care about this, because there is a human factor to this obviously. We have to force people to look at it, you know, the tech companies. They have to do something about it.

Michael Barbaro

So where does that leave this family now?

Gabriel Dance

They feel abandoned and angry with the tech companies.

Mother

They have to do something about it, because knowing that that’s out there, I don’t know, it’s just being traumatized all over again.

Gabriel Dance

They keep getting these notifications that their daughter’s imagery has been found.

Stepfather

It’s ongoing. It’s lifelong. There’s nothing they can do about being a victim for the rest of their life.

Gabriel Dance

And the way they describe it is it’s like getting hit by a truck.

Stepfather

They can’t stop being in a car wreck every day.

Gabriel Dance

Only to get back up and get hit by a truck time after time after time.

Stepfather

There’s no, there’s no other way to say it. She’s — that will be there forever until it’s removed, until somebody comes up with a way to take that off.

Gabriel Dance

And so, what this family has decided is that —

Stepfather

She doesn’t know.

Gabriel Dance

— they’re not telling their daughter. They’re not going to tell her that her images are online.

Mother

There’s no good it would do. There’s no benefit, at least from our perspective, to tell her. I mean, she needs to worry about soccer and —

Stepfather

Making the team. She worries about sleepovers and wrestling, grades.

Mother

She doesn’t need to be worrying about the worst part of her life available on the internet.

Stepfather

I don’t want her to be fearful of what other people —

Mother

Might see.

Stepfather

— might be seeing of her —

Mother

Be recognized.

Stepfather

— when they do a Google search for a bullet, up pops her image. That’s horrible.

Gabriel Dance

But there is a clock ticking.

Stepfather

I just found out when she turns 18, it isn’t a choice. The F.B.I. will get ahold of her, because she’s an adult victim now.

Gabriel Dance

When she turns 18, the federal government is going to start sending her the notifications. So what they’re hoping is that in the four years until that happens, the tech companies are going to solve this problem.

Michael Barbaro

Which is to say, get these images offline for good.

Gabriel Dance

That’s what they hope.

Stepfather

My motivation for this is we have to explain to our daughter, this is on the internet. And she has to live with that. But being able to tell her — you know, if you can tell me in five years, this will be something that was, not is, that’d be great.

Gabriel Dance

The dad asked us the question, can you tell me that when I talk to my daughter about this when she turns 18, that I can tell her that these horrific images used to be online but no longer are?

Michael Barbaro

And what did you say?

Gabriel Dance

What I wished I could tell them was yes.

Gabriel Dance

Well, I think that’s why we’re doing this story —

Mother

Yeah.

Gabriel Dance

— to be honest with you, is —

Gabriel Dance

What I did tell him was that that was the point of us doing this reporting. It was the hope that something would change, and that in five years, those images would no longer be there.

Gabriel Dance

But once we publish this article, and we see how they respond, and we see how, not only tech companies —

Gabriel Dance

But from everything we’ve learned, it’s only getting worse.

Stepfather

What if it was your daughter?

Mother

Yeah, you know, put yourself in that kid’s shoes.

Stepfather

What if you were the person they’re looking at the rest of your life? If we can tell that this happened, instead of this is happening, the world would be a lot better off. She’ll be a lot better off.

[Music]

Michael Barbaro

Michael, Gabriel, thank you.

Michael Keller

Thank you.

Gabriel Dance

Thank you.

Michael Barbaro

A few weeks ago, the National Center for Missing and Exploited Children released new data for 2019. It said that the number of photos, videos and other materials related to online child sexual abuse grew by more than 50 percent to 70 million.