“Tech Companies Detect a Surge in Online Videos of Child Sexual Abuse”, The New York Times
By Gabriel J.X. Dance and Michael H. Keller
In a first, videos outnumbered photos in reports to the authorities last year. Facebook found the most imagery, the bulk of it on its Messenger app.
The number of reported photos, videos and other materials related to online child sexual abuse grew by more than 50 percent last year, an indication that many of the world’s biggest technology platforms remain infested with the illegal content.
Nearly 70 million images and videos were reported to the National Center for Missing and Exploited Children, a federally designated clearinghouse for the imagery, which works with law enforcement agencies.
The record number was driven by a surge in illegal videos, which have always been popular among sexual predators but are now more readily detected by some companies. Over 41 million videos were reported; the number five years ago was under 350,000. The companies flagged many of the same images and videos multiple times as they were shared among users.
[Chart: In 2019, there were more videos of abuse reported than photos]
Facebook reported nearly 60 million photos and videos, more than 85 percent of the total. The number reflects both its immense user base and its aggressive approach to rooting out the material, but shows that offenders continue to exploit the platform. About half of the content was not necessarily illegal, according to the company, and was reported to help law enforcement with investigations. Instagram, owned by Facebook, was responsible for an additional 1.7 million photos and videos.
In a statement, Antigone Davis, Facebook’s global head of safety, said “the size and expertise of our team, together with our sophisticated technology, have made us industry leaders in detecting, removing and reporting these images, and thwarting people from sharing them.”
Snapchat, Twitter and other social media companies also submitted reports of imagery. So did companies offering search engines and cloud storage, among them Google and Microsoft. Apple, Dropbox and the chat platform Discord also detected the illegal content.
In all, 164 companies submitted reports.
“These numbers show that any service provider that allows individuals to host images and videos are susceptible to child sexual exploitation material being posted,” said John Shehan, a vice president at the national center.
He confirmed the numbers released on Friday reflected all content reported to the center, including material that “may not meet the legal definition of child pornography.”
Still, the numbers do not paint a complete picture of the problem: The industry has been plagued by uneven and inconsistent detection practices, as The Times reported last year. Some cloud storage services, including those owned by Amazon and Microsoft, do not scan for the illegal content at all, while other companies, like Snap, scan for photos but not videos.
The data shows broad disparities in the tech industry. Google reported more than 3.5 million combined images and videos; Yahoo more than two million; and Imgur, a photo-sharing site, more than 260,000. Dropbox, Microsoft, Snap and Twitter are the only other companies that reported more than 100,000 images and videos last year.
Apple reported dramatically fewer images than most other tech giants, just over 3,000 in total, and zero videos. These figures reflect the company’s inability to scan material sent through its messaging app, which is encrypted, as well as the fact that it does not scan its file storage service, iCloud. Amazon, whose cloud services handle millions of uploads and downloads every second, sent no images or videos to the national center.
Senator Richard Blumenthal of Connecticut, who has sponsored child protection legislation and was recently part of a bipartisan group of lawmakers who asked 36 tech companies to detail their efforts in this area, called the numbers “appalling and astonishing.”
“The disparate data reported here shows that we clearly cannot rely on tech companies to self-police,” he said.
Alex Stamos, who served as chief of information security at both Facebook and Yahoo, said the numbers were a reflection of companies that have put more effort into finding and removing the material from their platforms.
“I hope these numbers encourage people to do more, not less,” Mr. Stamos said.
[Chart: Among imagery reported from tech companies, Facebook dominates]
Last year, the total number of reports filed with the national center actually decreased, falling to 16.9 million from 18.4 million in 2018. That was at least in part because tech companies improved their reporting process by bundling photos and videos instead of flagging them individually.
A single report usually includes multiple photos and videos — for example, when the material is found in someone’s email account — so the overall growth in reported imagery may signal “those that are sharing it are sharing in larger volumes,” said Mr. Shehan of the national center.
Some companies that made a small number of reports ended up finding a large volume of imagery. Dropbox, for instance, made roughly 5,000 reports last year but found over 250,000 photos and videos.

For victims of child sexual abuse, the recirculating imagery can cause lasting trauma. Online offenders are known to seek out children in the photos and videos, even into adulthood. Victims, or the parents of abused minors, also receive legal notices when their images are found during investigations, serving as constant reminders of their pain.
“To know that these images are online and that other people are enjoying your degradation for sexual gratification in some ways means you are forever being abused,” said Alicia Kozakiewicz, a survivor of child sexual abuse who has been a longtime internet safety educator.
The growth in reported imagery, however, does not offer insights into whether more of the illegal content is being newly produced and posted online. Most imagery is detected by tech companies through automated scans that only recognize previously flagged material. And detecting videos, which last year for the first time surpassed the number of photos, is particularly difficult because the industry lacks a common standard for identifying them.
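The "automated scans" described above are, in broad strokes, fingerprint lookups against databases of previously identified material; perceptual-hashing tools such as Microsoft's PhotoDNA are widely used so that resized or re-encoded copies still match. The sketch below is a simplified illustration of the idea, not any company's actual pipeline: the names KNOWN_HASHES, fingerprint and is_known_material are hypothetical, and a plain SHA-256 digest stands in for a real perceptual hash.

```python
import hashlib

# Hypothetical database of fingerprints for previously flagged material.
# In practice this would be a perceptual-hash database supplied by a
# clearinghouse; a cryptographic hash like SHA-256, used here for
# simplicity, only matches byte-identical copies of a file.
KNOWN_HASHES: set[str] = set()

def fingerprint(file_bytes: bytes) -> str:
    """Return a SHA-256 digest of an uploaded file's contents."""
    return hashlib.sha256(file_bytes).hexdigest()

def is_known_material(file_bytes: bytes) -> bool:
    """Flag an upload only if its fingerprint matches previously
    identified content. Newly produced, never-before-seen imagery
    passes undetected, which is why growth in reports says little
    about how much new material is being created."""
    return fingerprint(file_bytes) in KNOWN_HASHES
```

The limitation the article describes falls directly out of this design: the scan can only confirm membership in a set of known fingerprints, so it reveals recirculation of old material far more readily than the appearance of new material.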
The number of reported videos spiked in 2018 when Facebook ramped up its detection efforts. The company was responsible for more than 90 percent of reports that year, according to law enforcement officials.
The continued growth in reported images from Facebook is sure to increase pressure on the company, which has been generally lauded for finding and reporting the content, but announced last year that it intended to encrypt its Messenger app. In 2019, Messenger was responsible for over 80 percent of all reports made by Facebook. Encryption would make it much more difficult to detect the illegal imagery on Messenger, which was also the largest source of reported material in 2018.
In September, The Times reported that the number of reports to the national center had grown exponentially, and that the federal response was lacking despite a 2008 law meant to address what was then called an “epidemic.” Throughout the country, law enforcement groups charged with investigating the crimes have been overwhelmed.
Legislation introduced in December would extend the length of time companies are required to retain information about illegal imagery in order to give law enforcement more opportunity to investigate. A bipartisan group of lawmakers said the bill was in response to a Times investigation revealing that cases often went cold after companies deleted the data. A draft of other proposed legislation is aimed at making companies follow a set of best practices to police imagery on their platforms or risk greater legal liability.
Even as the number of reported images and videos continues to grow, it remains difficult to assess the scope of the problem. While more companies are making efforts to detect the content, encrypted technologies and the dark web allow predators to continue trading imagery in secret.
“If all of the companies involved were looking as aggressively as Facebook, that number of reports could be 50 million or 100 million,” Mr. Stamos said.
Gabriel Dance is the deputy investigations editor. He was previously interactive editor for The Guardian and was part of the team awarded the 2014 Pulitzer Prize for Public Service for coverage of widespread secret surveillance by the N.S.A. @gabrieldance
Michael H. Keller is a reporter and data journalist specializing in technology on the investigative team. Before joining The Times, he worked at Bloomberg News and Newsweek and was a fellow at the Tow Center for Digital Journalism at Columbia University. @mhkeller