{"id":9209,"date":"2020-02-09T23:59:05","date_gmt":"2020-02-10T07:59:05","guid":{"rendered":"https:\/\/worldcampaign.net\/?p=9209"},"modified":"2020-02-10T05:11:02","modified_gmt":"2020-02-10T13:11:02","slug":"post1-86","status":"publish","type":"post","link":"https:\/\/worldcampaign.net\/?p=9209","title":{"rendered":"&#8220;Tech Companies Detect a Surge in Online Videos of Child Sexual Abuse&#8221;, The New York Times"},"content":{"rendered":"<p>By <span class=\"css-1baulvz\">Gabriel J.X. Dance<\/span> and <span class=\"css-1baulvz last-byline\">Michael H. Keller, Feb. 7, 2020<\/span><\/p>\n<p id=\"article-summary\" class=\"css-1ifw933 e1wiw3jv0\"><em>In a first, videos outnumbered photos in reports to the authorities last year. Facebook found the most imagery, the bulk of it on its Messenger app.<\/em><\/p>\n<section class=\"meteredContent css-1r7ky0e\">\n<div class=\"css-1fanzo5 StoryBodyCompanionColumn\">\n<div class=\"css-53u6y8\">\n<p class=\"css-exrw3m evys1bk0\">The number of reported photos, videos and other materials related to online child sexual abuse grew by more than 50 percent last year, an indication that many of the world\u2019s biggest technology platforms remain infested with the illegal content.<\/p>\n<p class=\"css-exrw3m evys1bk0\">Nearly 70 million images and videos were reported to <a class=\"css-1g7m0tk\" title=\"\" href=\"https:\/\/www.missingkids.org\/home\" target=\"_blank\" rel=\"noopener noreferrer\">the National Center for Missing and Exploited Children<\/a>, a federally designated clearinghouse for the imagery that works with law enforcement agencies.<\/p>\n<p class=\"css-exrw3m evys1bk0\">The record number was driven by a surge in illegal videos, which have always been popular among sexual predators but are now more readily detected by some companies. Over 41 million videos were reported; the number five years ago was under 350,000. 
The companies flagged many of the same images and videos multiple times as they were shared among users.<\/p>\n<\/div>\n<\/div>\n<section id=\"07childabuse-numbers-images-video\" class=\"interactive-content interactive-size-scoop css-174j8de\" data-id=\"100000006964960\">\n<header id=\"interactive-header\" class=\"css-cl76n0 interactive-header\">\n<h2 id=\"interactive-headline\" class=\"css-1su19vv interactive-headline\">In 2019, there were more videos of abuse reported than photos<\/h2>\n<\/header>\n<div class=\"css-17ih8de interactive-body\" data-sourceid=\"100000006964960\">\n<p class=\"css-exrw3m evys1bk0\">The center identified to The New York Times the companies that had detected the imagery, the first time detailed company information had been released.<\/p>\n<\/div>\n<\/section>\n<div class=\"css-1fanzo5 StoryBodyCompanionColumn\">\n<div class=\"css-53u6y8\">\n<p class=\"css-exrw3m evys1bk0\">Facebook reported nearly 60 million photos and videos, more than 85 percent of the total. The number reflects both its immense user base and its aggressive approach to rooting out the material, but shows that offenders continue to exploit the platform. About half of the content was not necessarily illegal, according to the company, and was reported to help law enforcement with investigations. 
Instagram, owned by Facebook, was responsible for an additional 1.7 million photos and videos.<\/p>\n<p class=\"css-exrw3m evys1bk0\">In a statement, Antigone Davis, Facebook\u2019s global head of safety, said \u201cthe size and expertise of our team, together with our sophisticated technology, have made us industry leaders in detecting, removing and reporting these images, and thwarting people from sharing them.\u201d<\/p>\n<p class=\"css-exrw3m evys1bk0\">\u201cWe will continue to develop the best solutions to keep more children safe,\u201d she added.<\/p>\n<p class=\"css-exrw3m evys1bk0\">Snapchat, Twitter and other social media companies also submitted reports of imagery. So did companies whose services include search engines and cloud storage, including Google and Microsoft. Apple, Dropbox and the chat platform Discord also detected the illegal content.<\/p>\n<p class=\"css-exrw3m evys1bk0\">In all, 164 companies submitted reports.<\/p>\n<p class=\"css-exrw3m evys1bk0\">\u201cThese numbers show that any service provider that allows individuals to host images and videos are susceptible to child sexual exploitation material being posted,\u201d said John Shehan, a vice president at the national center.<\/p>\n<\/div>\n<\/div>\n<div class=\"css-1fanzo5 StoryBodyCompanionColumn\">\n<div class=\"css-53u6y8\">\n<p class=\"css-exrw3m evys1bk0\">He confirmed the numbers released on Friday reflected all content reported to the center, including material that \u201cmay not meet the legal definition of child pornography.\u201d<\/p>\n<p class=\"css-exrw3m evys1bk0\">Still, the numbers do not paint a complete picture of the problem: The industry has been plagued by uneven and inconsistent detection practices, <a class=\"css-1g7m0tk\" title=\"\" href=\"https:\/\/www.nytimes.com\/interactive\/2019\/09\/28\/us\/child-sex-abuse.html\">as The Times<\/a> <a class=\"css-1g7m0tk\" title=\"\" 
href=\"https:\/\/www.nytimes.com\/interactive\/2019\/11\/09\/us\/internet-child-sex-abuse.html\">reported last year<\/a>. Some cloud storage services, including those owned by Amazon and Microsoft, do not scan for any of the illegal content at all, while other companies, like Snap, scan for photos but not videos.<\/p>\n<p class=\"css-exrw3m evys1bk0\">The data shows broad disparities in the tech industry. Google reported more than 3.5 million combined images and videos; Yahoo more than two million; and Imgur, a photo-sharing site, more than 260,000. Dropbox, Microsoft, Snap and Twitter are the only other companies that reported more than 100,000 images and videos last year.<\/p>\n<p class=\"css-exrw3m evys1bk0\">Apple reported dramatically fewer images than most other tech giants, just over 3,000 in total, and zero videos. These figures reflect the company\u2019s inability to scan material sent through its messaging app, which is encrypted, as well as the fact that it does not scan its file storage service, iCloud. 
Amazon, whose cloud services handle millions of uploads and downloads every second, sent no images or videos to the national center.<\/p>\n<p class=\"css-exrw3m evys1bk0\">Senator Richard Blumenthal of Connecticut, who has sponsored child protection legislation and was recently part of a bipartisan group of lawmakers who asked <a class=\"css-1g7m0tk\" title=\"\" href=\"https:\/\/twitter.com\/mhkeller\/status\/1196818679683530752\" target=\"_blank\" rel=\"noopener noreferrer\">36 tech companies<\/a> to detail their efforts in this area, called the numbers \u201cappalling and astonishing.\u201d<\/p>\n<p class=\"css-exrw3m evys1bk0\">\u201cThe disparate data reported here shows that we clearly cannot rely on tech companies to self-police,\u201d he said.<\/p>\n<p class=\"css-exrw3m evys1bk0\">Alex Stamos, who served as chief of information security at both Facebook and Yahoo, said the numbers were a reflection of companies that have put more effort into finding and removing the material from their platforms.<\/p>\n<\/div>\n<\/div>\n<div class=\"css-1fanzo5 StoryBodyCompanionColumn\">\n<div class=\"css-53u6y8\">\n<p class=\"css-exrw3m evys1bk0\">\u201cI hope these numbers encourage people to do more, not less,\u201d Mr. 
Stamos said.<\/p>\n<\/div>\n<aside class=\"css-ew4tgv\"><\/aside>\n<\/div>\n<section id=\"07childabuse-numbers-company-counts\" class=\"interactive-content interactive-size-scoop css-174j8de\" data-id=\"100000006964976\">\n<header id=\"interactive-header\" class=\"css-cl76n0 interactive-header\">\n<h2 id=\"interactive-headline\" class=\"css-1su19vv interactive-headline\">Among imagery reported from tech companies, Facebook dominates<\/h2>\n<\/header>\n<\/section>\n<div class=\"css-1fanzo5 StoryBodyCompanionColumn\">\n<div class=\"css-53u6y8\">\n<p class=\"css-exrw3m evys1bk0\">Last year, there was actually a decrease in the total number of reports filed with the national center, falling to 16.9 million from 18.4 million in 2018. That was at least in part because tech companies improved their reporting process by bundling photos and videos instead of flagging them individually.<\/p>\n<p class=\"css-exrw3m evys1bk0\">A single report usually includes multiple photos and videos \u2014 for example, when the material is found in someone\u2019s email account \u2014 so the overall growth in reported imagery may signal \u201cthose that are sharing it are sharing in larger volumes,\u201d said Mr. Shehan of the national center.<\/p>\n<p class=\"css-exrw3m evys1bk0\">Some companies that made a small number of reports ended up finding a large volume of imagery. Dropbox, for instance, made roughly 5,000 reports last year but found over 250,000 photos and videos. For victims of child sexual abuse, the recirculating imagery can cause lasting trauma. Online offenders are known to seek out children in the photos and videos, even into adulthood. 
Victims, or the parents of abused minors, also receive legal notices when their images are found during investigations, serving as constant reminders of their pain.<\/p>\n<p class=\"css-exrw3m evys1bk0\">\u201cTo know that these images are online and that other people are enjoying your degradation for sexual gratification in some ways means you are forever being abused,\u201d said Alicia Kozakiewicz, a survivor of child sexual abuse who has been a longtime internet safety educator.<\/p>\n<p class=\"css-exrw3m evys1bk0\">The growth in reported imagery, however, does not offer insights into whether more of the illegal content is being newly produced and posted online. Most imagery is detected by tech companies through automated scans that only recognize previously flagged material. And detecting videos, which last year for the first time surpassed the number of photos, is particularly difficult because the industry lacks a common standard for identifying them.<\/p>\n<p class=\"css-exrw3m evys1bk0\">The number of reported videos spiked in 2018 when Facebook ramped up its detection efforts. The company was responsible for more than 90 percent of reports that year, according to law enforcement officials.<\/p>\n<\/div>\n<\/div>\n<div><\/div>\n<div class=\"css-1fanzo5 StoryBodyCompanionColumn\">\n<div class=\"css-53u6y8\">\n<p class=\"css-exrw3m evys1bk0\">The continued growth in reported images from Facebook is sure to increase pressure on the company, which has been generally lauded for finding and reporting the content, but announced last year that it<a class=\"css-1g7m0tk\" title=\"\" href=\"https:\/\/www.nytimes.com\/2019\/03\/06\/technology\/facebook-privacy-blog.html\"> intended to encrypt its Messenger app<\/a>. In 2019, Messenger was responsible for over 80 percent of all reports made by Facebook. 
Encryption would make it much more difficult to detect the illegal imagery on Messenger, which was also the largest source of reported material in 2018.<\/p>\n<p class=\"css-exrw3m evys1bk0\">In September, <a class=\"css-1g7m0tk\" title=\"\" href=\"https:\/\/www.nytimes.com\/interactive\/2019\/09\/28\/us\/child-sex-abuse.html\">The Times reported<\/a> that the number of reports to the national center had grown exponentially, and that the federal response was lacking despite a 2008 law meant to address what was then called an \u201cepidemic.\u201d Throughout the country, law enforcement groups charged with investigating the crimes have been overwhelmed.<\/p>\n<p class=\"css-exrw3m evys1bk0\">Legislation <a class=\"css-1g7m0tk\" title=\"\" href=\"https:\/\/www.nytimes.com\/2019\/12\/10\/us\/legislation-child-sexual-abuse.html\">introduced in December<\/a> would extend the length of time companies are required to retain information about illegal imagery in order to give law enforcement more opportunity to investigate. A bipartisan group of lawmakers said the bill was in response to a <a class=\"css-1g7m0tk\" title=\"\" href=\"https:\/\/www.nytimes.com\/interactive\/2019\/09\/28\/us\/child-sex-abuse.html\">Times investigation<\/a> revealing that cases often went cold after companies deleted the data. A draft of other <a class=\"css-1g7m0tk\" title=\"\" href=\"https:\/\/www.bloomberg.com\/news\/articles\/2020-01-30\/lindsey-graham-proposal-could-expose-apple-facebook-to-lawsuits\" target=\"_blank\" rel=\"noopener noreferrer\">proposed legislation<\/a> is aimed at making companies follow a set of best practices to police imagery on their platforms or risk greater legal liability.<\/p>\n<p class=\"css-exrw3m evys1bk0\">Even as the number of reported images and videos continues to grow, it remains difficult to assess the scope of the problem. 
While more companies are making efforts to detect the content, encrypted technologies and the dark web allow predators to continue trading imagery in secret.<\/p>\n<p class=\"css-exrw3m evys1bk0\">\u201cIf all of the companies involved were looking as aggressively as Facebook, that number of reports could be 50 million or 100 million,\u201d Mr. Stamos said.<\/p>\n<\/div>\n<\/div>\n<div>\n<div class=\"css-1wtvwtv epkadsg3\">\n<div class=\"css-ckpga5 epkadsg1\">Read The Times\u2019s investigations into online child sexual abuse.<\/div>\n<div class=\"css-15g2oxy epkadsg2\">\n<div class=\"css-2b3w4o e16ij5yr6\">\n<div class=\"css-i9gxme e16ij5yr4\">\n<div class=\"css-1j8dw05 e16ij5yr2\">The Internet Is Overrun With Images of Child Sexual Abuse. What Went Wrong?<\/div>\n<time class=\"css-x7rtpa e16638kd0\" datetime=\"2019-09-28T23:45:50-04:00\">Sept. 28, 2019<\/time><\/div>\n<div class=\"css-1vm5oi9 e16ij5yr0\"><img decoding=\"async\" class=\"css-32rbo2 e16ij5yr1\" src=\"https:\/\/static01.nyt.com\/images\/2019\/09\/28\/multimedia\/csam-promo\/csam-hppromo-threeByTwoSmallAt2X.jpg\" \/><\/div>\n<\/div>\n<div class=\"css-2b3w4o e16ij5yr6\">\n<div class=\"css-i9gxme e16ij5yr4\">\n<div class=\"css-1j8dw05 e16ij5yr2\">Child Abusers Run Rampant as Tech Companies Look the Other Way<\/div>\n<time class=\"css-x7rtpa e16638kd0\" datetime=\"2019-11-09T08:34:27-05:00\">Nov. 9, 2019<\/time><\/div>\n<div class=\"css-1vm5oi9 e16ij5yr0\"><img decoding=\"async\" class=\"css-32rbo2 e16ij5yr1\" src=\"https:\/\/static01.nyt.com\/images\/2019\/10\/23\/multimedia\/00child-top\/00child-top-threeByTwoSmallAt2X-v2.jpg\" \/><\/div>\n<\/div>\n<div class=\"css-2b3w4o e16ij5yr6\">\n<div class=\"css-i9gxme e16ij5yr4\">\n<div class=\"css-1j8dw05 e16ij5yr2\">Video Games and Online Chats Are \u2018Hunting Grounds\u2019 for Sexual Predators<\/div>\n<time class=\"css-x7rtpa e16638kd0\" datetime=\"2019-12-07T15:24:40-05:00\">Dec. 7, 2019<\/time><\/div>\n<div class=\"css-1vm5oi9 e16ij5yr0\"><img decoding=\"async\" class=\"css-32rbo2 e16ij5yr1\" src=\"https:\/\/static01.nyt.com\/images\/2019\/12\/08\/us\/08videogames-image-promo\/08videogames-image-promo-threeByTwoSmallAt2X-v5.png\" \/><\/div>\n<\/div>\n<div class=\"css-2b3w4o e16ij5yr6\">\n<div class=\"css-i9gxme e16ij5yr4\">\n<div class=\"css-1j8dw05 e16ij5yr2\">Fighting the Good Fight Against Online Child Sexual Abuse<\/div>\n<time class=\"css-x7rtpa e16638kd0\" datetime=\"2019-12-22T09:33:26-05:00\">Dec. 22, 2019<\/time><\/div>\n<div class=\"css-1vm5oi9 e16ij5yr0\"><img decoding=\"async\" class=\"css-32rbo2 e16ij5yr1\" src=\"https:\/\/static01.nyt.com\/images\/2019\/12\/22\/multimedia\/00exploited-arachnid-screenshot\/00exploited-arachnid-screenshot-threeByTwoSmallAt2X.png\" \/><\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/section>\n<div class=\"bottom-of-article\">\n<div class=\"css-1ubp8k9\"><\/div>\n<div class=\"css-1jp38cr\">\n<div class=\"css-19hdyf3 e1e7j8ap0\">\n<div>\n<p><em>Gabriel Dance is the deputy investigations editor. He was previously interactive editor for The Guardian and was part of the team awarded the 2014 Pulitzer Prize for Public Service for coverage of widespread secret surveillance by the N.S.A. <span class=\"css-4w91ra\"><a class=\"css-1rj8to8\" href=\"https:\/\/twitter.com\/gabrieldance\" target=\"_blank\" rel=\"noopener noreferrer\"><span class=\"css-0\">@<\/span>gabrieldance<\/a><\/span><\/em><\/p>\n<\/div>\n<\/div>\n<div class=\"css-19hdyf3 e1e7j8ap0\">\n<div>\n<p><em>Michael H. Keller is a reporter and data journalist specializing in technology on the investigative team. Before joining The Times, he worked at Bloomberg News and Newsweek and was a fellow at the Tow Center for Digital Journalism at Columbia University. 
<span class=\"css-4w91ra\"><a class=\"css-1rj8to8\" href=\"https:\/\/twitter.com\/mhkeller\" target=\"_blank\" rel=\"noopener noreferrer\"><span class=\"css-0\">@<\/span>mhkeller<\/a><\/span><\/em><\/p>\n<p><a href=\"https:\/\/www.nytimes.com\/2020\/02\/07\/us\/online-child-sexual-abuse.html?searchResultPosition=1\">The New York Times<\/a><\/p>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>By Gabriel J.X. Dance and Michael H. Keller, Feb. 7, 2020 In a first, videos outnumbered photos in reports to the authorities last year. Facebook found the most imagery, the bulk of it on its Messenger app. The number of reported photos, videos and other materials related to online child sexual abuse grew by more [&hellip;]<\/p>\n","protected":false},"author":1001004,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[53],"tags":[],"_links":{"self":[{"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/posts\/9209"}],"collection":[{"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/users\/1001004"}],"replies":[{"embeddable":true,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=9209"}],"version-history":[{"count":3,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/posts\/9209\/revisions"}],"predecessor-version":[{"id":9241,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/posts\/9209\/revisions\/9241"}],"wp:attachment":[{"href":"https:\/\/worldcampaign.net\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=9209"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=
9209"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=9209"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}