{"id":8594,"date":"2019-11-09T22:17:11","date_gmt":"2019-11-10T06:17:11","guid":{"rendered":"https:\/\/worldcampaign.net\/?p=8594"},"modified":"2019-11-11T11:20:42","modified_gmt":"2019-11-11T19:20:42","slug":"issue-of-the-week-62","status":"publish","type":"post","link":"https:\/\/worldcampaign.net\/?p=8594","title":{"rendered":"Issue of the Week: Human Rights"},"content":{"rendered":"<p><img loading=\"lazy\" decoding=\"async\" class=\"alignleft size-medium wp-image-8603\" src=\"https:\/\/worldcampaign.net\/wp-content\/uploads\/2019\/11\/00child-top-superJumbo-4-300x282.jpeg\" alt=\"\" width=\"300\" height=\"282\" srcset=\"https:\/\/worldcampaign.net\/wp-content\/uploads\/2019\/11\/00child-top-superJumbo-4-300x282.jpeg 300w, https:\/\/worldcampaign.net\/wp-content\/uploads\/2019\/11\/00child-top-superJumbo-4-150x141.jpeg 150w, https:\/\/worldcampaign.net\/wp-content\/uploads\/2019\/11\/00child-top-superJumbo-4-768x722.jpeg 768w, https:\/\/worldcampaign.net\/wp-content\/uploads\/2019\/11\/00child-top-superJumbo-4-1024x963.jpeg 1024w, https:\/\/worldcampaign.net\/wp-content\/uploads\/2019\/11\/00child-top-superJumbo-4.jpeg 1366w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-size: 8pt;\"><em>Child Sex Abusers Elude Flimsy Digital Safeguards<\/em>, The New York Times, November 10, 2019<\/span><\/p>\n<p>&nbsp;<\/p>\n<p>The second installment of The New York Times special investigative series,<em> Exploited<\/em>, on child sexual abuse and its exploitation on the internet, is here.<\/p>\n<p>As we noted when the series began, nothing quite like this has been done before.<\/p>\n<p>Again, the Times and the reporters,\u00a0Michael H. Keller and Gabriela J.X. 
Dance, deserve special admiration and gratitude from all of us&#8211;and we mean all of humanity&#8211;for bringing this most horrible of crimes and its digital aspect to the eyes of the world in a way only the Times can.<\/p>\n<p>As we&#8217;ve pointed out for years, half of all children are abused, and probably at least half of those are sexually abused.<\/p>\n<p>As we&#8217;ve also pointed out for years, one second of not acting to prevent the sexual abuse and other abuse of a billion children, and of not helping adult survivors in every way necessary, is another second of enabling the abuse, ongoing trauma and destruction of their lives.<\/p>\n<p>And, as we&#8217;ve also pointed out for years, the human species will not survive unless this is ended.<\/p>\n<p>Lastly, as we&#8217;ve pointed out for years, the digital aspect of this horror has multiplied it in incalculable ways, making it, like everything else in the digital age, globally available in real time to everyone.<\/p>\n<p>A global shared experience that needs to be wiped out above all others.<\/p>\n<p><em>Child Abusers Run Rampant as Tech Companies Look the Other Way<\/em> is posted online by The New York Times today, and is the dominating front-page article in tomorrow&#8217;s Sunday print edition, under the headline,\u00a0<em>Child Sex Abusers Elude Flimsy Digital Safeguards.<\/em><\/p>\n<p>As with the first article, <em>The Internet Is Overrun With Images of Child Sexual Abuse. 
What Went Wrong?,\u00a0<\/em>which also dominated the front page of the Sunday edition on September 29, the visual is deeply moving and heart-rending: this time, a photo of two sisters, survivors of child sexual abuse, hugging and comforting each other, their faces unidentifiable.<\/p>\n<p>We posted the first article and commentary on\u00a0<a href=\"https:\/\/worldcampaign.net\/?p=8214\">September 28, 2019<\/a>.<\/p>\n<p>Be certain to read and experience the first article first.<\/p>\n<p>As the Times notes:<\/p>\n<p><em>Articles in this series examine the explosion in online photos and videos of children being sexually abused. They include graphic descriptions of some instances of the abuse.<\/em><\/p>\n<p>Here&#8217;s today&#8217;s extraordinary new segment:<\/p>\n<p><a href=\"https:\/\/www.nytimes.com\/interactive\/2019\/11\/09\/us\/internet-child-sex-abuse.html\">&#8220;Child Abusers Run Rampant as Tech Companies Look the Other Way&#8221;<\/a><\/p>\n<div class=\"rad-cover headline-image-topper\">\n<header class=\"rad-header header-black\">\n<div class=\"rad-header-wrapper\">\n<p class=\"rad-summary\"><em>Though platforms bar child sexual abuse imagery on the web, criminals are exploiting gaps. Victims are caught in a living nightmare, confronting images again and again.<\/em><\/p>\n<p class=\"rad-byline-pubdate\"><span class=\"rad-byline\">By\u00a0Michael H. Keller and Gabriela J.X. Dance, The New York Times, Sunday, November 10, 2019<\/span><time class=\"rad-pubdate\"><\/time><\/p>\n<\/div>\n<\/header>\n<\/div>\n<div class=\"rad-story-body\">\n<p class=\"paragraph\">The two sisters live in fear of being recognized. One grew out her bangs and took to wearing hoodies. The other dyed her hair black. Both avoid looking the way they did as children.<\/p>\n<p class=\"paragraph\">Ten years ago, their father did the unthinkable: He posted explicit photos and videos of them on the internet, just 7 and 11 at the time. 
Many captured violent assaults in their Midwestern home, including him and another man drugging and raping the 7-year-old.<\/p>\n<p class=\"paragraph\">The men are now in prison, but in a cruel consequence of the digital era, their crimes are finding new audiences. The two sisters are among the first generation of child sexual abuse victims whose anguish has been preserved on the internet, seemingly forever.<\/p>\n<p class=\"paragraph\">This year alone, photos and videos of the sisters were found in over 130 child sexual abuse investigations involving mobile phones, computers and cloud storage accounts.<\/p>\n<p class=\"paragraph\">The digital trail of abuse \u2014 often stored on Google Drive, Dropbox and Microsoft OneDrive \u2014 haunts the sisters relentlessly, they say, as does the fear of a predator recognizing them from the images.<\/p>\n<p class=\"paragraph\">\u201cThat\u2019s in my head all the time \u2014 knowing those pictures are out there,\u201d said E., the older sister, who is being identified only by her first initial to protect her privacy. \u201cBecause of the way the internet works, that\u2019s not something that\u2019s going to go away.\u201d<\/p>\n<p class=\"paragraph\">Horrific experiences like theirs are being recirculated across the internet because search engines, social networks and cloud storage are rife with opportunities for criminals to exploit.<\/p>\n<p class=\"paragraph\">The scope of the problem is only starting to be understood because the tech industry has been more diligent in recent years in identifying online child sexual abuse material, with a record 45 million photos and videos flagged last year.<\/p>\n<p class=\"paragraph\">But the same industry has consistently failed to take aggressive steps to shut it down, an investigation by The New York Times found. 
Approaches by tech companies are inconsistent, largely unilateral and pursued in secret, often leaving pedophiles and other criminals who traffic in the material with the upper hand.<\/p>\n<p class=\"paragraph\">The companies have the technical tools to stop the recirculation of abuse imagery by matching newly detected images against databases of the material. Yet the industry does not take full advantage of the tools.<\/p>\n<p class=\"paragraph\">Amazon, whose cloud storage services handle millions of uploads and downloads every second, does not even look for the imagery. Apple does not scan its cloud storage, according to federal authorities, and encrypts its messaging app, <a href=\"https:\/\/www.nytimes.com\/2019\/10\/02\/technology\/encryption-online-child-sex-abuse.html\">making detection virtually impossible<\/a>. Dropbox, Google and Microsoft\u2019s consumer products scan for illegal images, but only when someone shares them, not when they are uploaded.<\/p>\n<p class=\"paragraph\">And other companies, including Snapchat and Yahoo, look for photos but not videos, even though illicit video content has been exploding for years. (When asked about its video scanning, a Dropbox spokeswoman in July said it was not a \u201ctop priority.\u201d On Thursday, the company said it had begun scanning some videos last month.)<\/p>\n<p class=\"paragraph\">The largest social network in the world, Facebook, thoroughly scans its platforms, accounting for over 90 percent of the imagery flagged by tech companies last year, but the company is not using all available databases to detect the material. 
And Facebook has announced that the main source of the imagery, Facebook Messenger, will eventually be encrypted, vastly limiting detection.<\/p>\n<p class=\"paragraph\">\u201cEach company is coming up with their own balance of privacy versus safety, and they don\u2019t want to do so in public,\u201d said Alex Stamos, who served as chief of information security at both Facebook and Yahoo. \u201cThese decisions actually have a humongous impact on children\u2019s safety.\u201d<\/p>\n<p class=\"paragraph\">Tech companies are far more likely to review photos and videos and other files on their platforms for facial recognition, malware detection and copyright enforcement. But some businesses say looking for abuse content is different because it can raise significant privacy concerns.<\/p>\n<p class=\"paragraph\">Tech companies are loath to be seen going through someone\u2019s photos and videos, and imagery flagged in automated scans is almost always reviewed by a person later.<\/p>\n<p class=\"paragraph\">\u201cOn the one hand, there is an important imperative to protect personal information,\u201d said Sujit Raman, an associate deputy attorney general in the Justice Department. \u201cOn the other hand, there is so much stuff on the internet that is very damaging.\u201d<\/p>\n<p class=\"paragraph\">The main method for detecting the illegal imagery was created in 2009 by Microsoft and Hany Farid, now a professor at the University of California, Berkeley. The software, known as PhotoDNA, can use computers to recognize photos, even altered ones, and compare them against databases of known illegal images. Almost none of the photos and videos detected last year would have been caught without systems like PhotoDNA.<\/p>\n<p class=\"paragraph\">But this technique is limited because no single authoritative list of known illegal material exists, allowing countless images to slip through the cracks. 
The most commonly used database is kept by a federally designated clearinghouse, which compiles digital fingerprints of images reported by American tech companies. Other organizations around the world maintain their own.<\/p>\n<p class=\"paragraph\">Even if there were a single list, however, it would not solve the problems of newly created imagery flooding the internet, or the surge in live-streaming abuse.<\/p>\n<p class=\"paragraph\">For victims like E. and her sister, the trauma of the constantly recirculating photos and videos can have devastating effects. Their mother said both sisters had been hospitalized for suicidal thoughts.<\/p>\n<p class=\"paragraph\">\u201cEvery hope and dream that I worked towards raising my children \u2014 completely gone,\u201d she said. \u201cWhen you\u2019re dealing with that, you\u2019re not worried about what somebody got on a college-entrance exam. You just want to make sure they can survive high school, or survive the day.\u201d<\/p>\n<p class=\"paragraph\">And because online offenders are known to seek out abused children, even into adulthood, the sisters do not speak publicly about the crimes against them. Their emotional conversations with The Times were the first time they\u2019ve spoken publicly about the abuse.<\/p>\n<p class=\"paragraph\">\u201cYou get your voice taken away,\u201d E. said. \u201cBecause of those images, I don\u2019t get to talk as myself. It\u2019s just like, Jane Doe.\u201d<\/p>\n<p class=\"paragraph\"><strong>Searching for Abuse<\/strong><\/p>\n<p class=\"paragraph\">Joshua Gonzalez, a computer technician in Texas, was arrested this year with over 400 images of child sexual abuse on his computer, including some of E. and her sister.<\/p>\n<p class=\"paragraph\">Mr. 
Gonzalez told the authorities that he had used Microsoft\u2019s search engine, Bing, to find some of the illegal photos and videos, according to court documents.<\/p>\n<p class=\"paragraph\">Microsoft had long been at the forefront of combating abuse imagery, even creating the PhotoDNA detection tool a decade ago. But many criminals have turned to Bing as a reliable tool of their own.<\/p>\n<p class=\"paragraph\">A <a href=\"https:\/\/techcrunch.com\/2019\/01\/10\/unsafe-search\/\">report in January<\/a> commissioned by TechCrunch found explicit images of children on Bing using search terms like \u201cporn kids.\u201d In response to the report, Microsoft said it would ban results using that term and similar ones.<\/p>\n<p class=\"paragraph\">The Times created a computer program that scoured Bing and other search engines. The automated script repeatedly found images \u2014 dozens in all \u2014 that Microsoft\u2019s own PhotoDNA service flagged as known illicit content. Bing even recommended other search terms when a known child abuse website was entered into the search box.<\/p>\n<p class=\"paragraph\">While The Times did not view the images, they were reported to the National Center for Missing and Exploited Children and the Canadian Center for Child Protection, which work to combat online child sexual abuse.<\/p>\n<p class=\"paragraph\">One of the images, the Canadian center said, showed a naked girl on her back spreading her legs \u201cin an extreme manner.\u201d The girl, about 13, was recognized by the center\u2019s analysts, who regularly review thousands of explicit images to help identify and rescue exploited children and scrub footage from the internet. The analysts said the authorities had already removed the girl from danger.<\/p>\n<p class=\"paragraph\">Similar searches by The Times on DuckDuckGo and Yahoo, which use Bing results, also returned known abuse imagery. 
In all, The Times found 75 images of abuse material across the three search engines before stopping the computer program.<\/p>\n<p class=\"paragraph\">Both DuckDuckGo and Yahoo said they relied on Microsoft to filter out illegal content.<\/p>\n<p class=\"paragraph\">After reviewing The Times\u2019s findings, Microsoft said it uncovered a flaw in its scanning practices and was re-examining its search results. But subsequent runs of the program found even more.<\/p>\n<p class=\"paragraph\">A spokesman for Microsoft described the problem as a \u201cmoving target.\u201d<\/p>\n<p class=\"paragraph\">\u201cSince the NYT brought this matter to our attention, we have found and fixed some issues in our algorithms to detect unlawful images,\u201d the spokesman said.<\/p>\n<p class=\"paragraph\">Hemanshu Nigam, who served as a director overseeing child safety at Microsoft from 2000 to 2006, called the findings a \u201cmajor failing,\u201d and added that \u201cit looks like they\u2019re not using their own tools.\u201d Mr. Nigam said it showed the company was seemingly unaware of how its platforms could be manipulated by criminals.<\/p>\n<p class=\"paragraph\">Child abusers are well aware of Bing\u2019s vulnerabilities, according to court records and interviews with law enforcement officials. Going back years, pedophiles have used Bing to find illegal imagery and have also deployed the site\u2019s \u201creverse image search\u201d feature, which retrieves pictures based on a sample photo.<\/p>\n<p class=\"paragraph\">The same computer program, when run by The Times on Google\u2019s search engine, did not return abuse content. But separate documentation provided by the Canadian center showed that images of child sexual abuse had also been found on Google and that the company had sometimes resisted removing them.<\/p>\n<p class=\"paragraph\">One image captured the midsections of two children, believed to be under 12, forced into explicit acts with each other. 
It is part of a known series of photos showing the children being sexually exploited.<\/p>\n<p class=\"paragraph\">The Canadian center asked Google to take down the image in August last year, but Google said it did not meet its threshold for removal, the documents show. The analysts pressed for nine months until Google relented.<\/p>\n<p class=\"paragraph\">Another image, found in September 2018, depicts a woman touching the genitals of a naked 2-year-old girl. Google declined to take down the photo, stating in an email to the Canadian analysts that while it amounted to pedophilia, \u201cit\u2019s not illegal in the United States.\u201d<\/p>\n<p class=\"paragraph\">When The Times later asked Google about the image and others identified by the Canadians, a spokesman acknowledged that they should have been removed, and they subsequently were. The spokesman also said that the company did not believe any form of pedophilia was legal, and that it had been a mistake to suggest otherwise.<\/p>\n<div id=\"interactive-00childporn-epidemic-collage-11\" class=\"rad-interactive full_bleed\" data-id=\"100000006808878\" data-slug=\"00childporn-epidemic-collage-11\">\n<div class=\"rad-interactive-wrapper\">\n<div class=\"g-container g-image-cluster\">\n<div class=\"g-image-wrap g-img-medium col2 g-top\">\n<div class=\"g-aspect-ratio\"><picture><source srcset=\"https:\/\/static01.nyt.com\/images\/2019\/11\/06\/us\/politics\/00exploited-canada\/00exploited-canada-master1050.jpg 1x, https:\/\/static01.nyt.com\/images\/2019\/11\/06\/us\/politics\/00exploited-canada\/00exploited-canada-master1050.jpg 2x\" media=\"(min-width: 640px)\" \/><img decoding=\"async\" src=\"https:\/\/static01.nyt.com\/images\/2019\/11\/06\/us\/politics\/00exploited-canada\/00exploited-canada-master1050.jpg\" srcset=\"https:\/\/static01.nyt.com\/images\/2019\/11\/06\/us\/politics\/00exploited-canada\/00exploited-canada-master675.jpg 1x, 
https:\/\/static01.nyt.com\/images\/2019\/11\/06\/us\/politics\/00exploited-canada\/00exploited-canada-master1050.jpg 2x\" alt=\"null\" \/><\/picture><\/div>\n<\/div>\n<div class=\"g-mastercaption g-mastercaption col5\">\n<p class=\"g-body g-desktop\"><span style=\"font-size: 8pt;\">Lianna McDonald is executive director of the Canadian Center for Child Protection. She said the group was baffled by Google\u2019s resistance to removing explicit images in some cases.<\/span><\/p>\n<p class=\"credit\"><span style=\"font-size: 8pt;\">Kholood Eid for The New York Times<\/span><\/p>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<p class=\"paragraph\">A week after the images were removed, the Canadian center reported two additional images to Google. One was of a young girl, approximately 7, with semen covering her face. The other was of a girl, between 8 and 11, with her legs spread, exposing her genitals. Google told the Canadian center that neither image met \u201cthe reporting threshold,\u201d but later agreed to remove them.<\/p>\n<p class=\"paragraph\">\u201cIt baffles us,\u201d said Lianna McDonald, the center\u2019s executive director.<\/p>\n<p class=\"paragraph\"><strong>Criminals Everywhere<\/strong><\/p>\n<p class=\"paragraph\">The problem is not confined to search engines.<\/p>\n<p class=\"paragraph\">Pedophiles often leverage multiple technologies and platforms, meeting on chat apps and sharing images on cloud storage, according to a review of hundreds of criminal prosecutions.<\/p>\n<p class=\"paragraph\">\u201cThe first thing people need to understand is that any system that allows you to share photos and videos is absolutely infested with child sexual abuse,\u201d said Mr. Stamos, the former security chief at Facebook and Yahoo, who is now a professor at Stanford.<\/p>\n<p class=\"paragraph\">Criminals often discuss in online forums and chat groups how to exploit vulnerabilities in platforms, the criminal cases show. 
They carefully follow the prosecutions of people who have been found with explicit imagery and learn from them. There are even online manuals that explain in graphic detail how to produce the images and avoid getting caught.<\/p>\n<p class=\"paragraph\">The digital trail that has followed one young abuse victim, a girl who was raped by her father over four years starting at age 4, is sadly representative of the pattern.<\/p>\n<p class=\"paragraph\">The girl, now a teenager living on the West Coast, does not know that footage of her abuse is on the internet. Her mother and stepfather wish it would stay that way.<\/p>\n<p class=\"paragraph\">\u201cWe\u2019re just afraid of all the negative impacts that it might have \u2014 because I\u2019ve spoken with other moms whose daughters know their images are online and they\u2019re train wrecks,\u201d her mother said. \u201cShe doesn\u2019t need to be worrying about most likely the worst part of her life available on the internet.\u201d<\/p>\n<p class=\"paragraph\">Her stepfather also worries. \u201cTo her, the internet is looking up puppies.\u201d<\/p>\n<p class=\"paragraph\">Sex offenders frequently share photos and videos of the girl\u2019s abuse on sites that appear on Bing and elsewhere. When the images are detected, the F.B.I. notifies the girl\u2019s family or their lawyer. 
Over the past four years, her family says, they have received over 350 notifications about cases across the country, including in Florida, Kansas, Kentucky, Michigan, Minnesota and Texas.<\/p>\n<p class=\"paragraph\">Images of the girl surfaced in a case reported to the authorities by a woman who had been conversing with a <a href=\"https:\/\/www.mlive.com\/news\/ann-arbor\/2015\/09\/ann_arbor_man_accused_of_wanti.html\">Michigan man on Facebook Messenger<\/a>.<\/p>\n<p class=\"paragraph\">The man had proposed that the woman and her children live as nudists, while also suggesting to her that incest was normal. He offered to move in with her along with his 13-year-old daughter, whom, he said, he had orally raped the night before.<\/p>\n<p class=\"paragraph\">The man, Snehal Yogeshkumar Shah, had also been communicating on Messenger with other abusers, who recommended he download the <a href=\"https:\/\/www.forbes.com\/sites\/thomasbrewster\/2019\/03\/06\/exclusive-the-fbi-took-over-the-online-identity-of-a-pedophile-letting-child-porn-spread-for-18-months\/#63cbfdb05cb1\">Kik messaging app<\/a> and create <a href=\"https:\/\/www.theatlantic.com\/technology\/archive\/2019\/01\/meme-accounts-are-fighting-child-porn-instagram\/579730\/\">a Dropbox account<\/a> to store his illicit material. The police found more than 400 illegal photos and videos in his Dropbox account and on his iPhone, including some of the West Coast girl. They also found chats on Kik between him and two young teenagers containing explicit imagery. He is now in prison.<\/p>\n<p class=\"paragraph\">Images of the girl also emerged in an investigation into Anthony Quesinberry, an Army specialist in San Antonio who shared abuse content on Yik Yak, a now-shuttered social networking app. 
He told investigators that he\u2019d been addicted to the imagery for years and had obtained it using search engines, <a href=\"https:\/\/melmagazine.com\/en-us\/story\/tumblr-child-pornography-problem\/\">Tumblr<\/a>, Dropbox, <a href=\"https:\/\/www.forbes.com\/sites\/thomasbrewster\/2017\/08\/03\/kik-has-a-massive-child-abuse-problem\/#777bf5441a14\">Kik<\/a> and the live-streaming video platform Omegle. He was sentenced to more than 16 years.<\/p>\n<p class=\"paragraph\">The girl\u2019s mother said she could see the effects of the trauma years later. Sometimes, her daughter becomes inexplicably angry. More often, she can seem detached, as if nothing bothers her. There \u201care things, developmentally, that she is just having to learn now,\u201d said her mother, who agreed to be interviewed with her lawyer and only if her name and location were withheld.<\/p>\n<p class=\"paragraph\">When the girl turns 18, she will become the legal recipient of reports about the material. At that point, her mother and stepfather hope, she will be better able to handle the news. 
They also hold out hope that the tech companies will have managed to remove the images from the internet by then.<\/p>\n<div id=\"interactive-00childporn-epidemic-collage-8\" class=\"rad-interactive full_bleed\" data-id=\"100000006796535\" data-slug=\"00childporn-epidemic-collage-8\">\n<div class=\"rad-interactive-wrapper\">\n<div class=\"g-container g-image-cluster\">\n<div class=\"g-image-wrap g-img-medium col2 g-top\">\n<div class=\"g-aspect-ratio\"><picture><source srcset=\"https:\/\/static01.nyt.com\/images\/2019\/10\/29\/multimedia\/00child-parents\/00child-parents-master1050.jpg 1x, https:\/\/static01.nyt.com\/images\/2019\/10\/29\/multimedia\/00child-parents\/00child-parents-master1050.jpg 2x\" media=\"(min-width: 640px)\" \/><img decoding=\"async\" src=\"https:\/\/static01.nyt.com\/images\/2019\/10\/29\/multimedia\/00child-parents\/00child-parents-master1050.jpg\" srcset=\"https:\/\/static01.nyt.com\/images\/2019\/10\/29\/multimedia\/00child-parents\/00child-parents-master675.jpg 1x, https:\/\/static01.nyt.com\/images\/2019\/10\/29\/multimedia\/00child-parents\/00child-parents-master1050.jpg 2x\" alt=\"null\" \/><\/picture><\/div>\n<\/div>\n<div class=\"g-mastercaption g-mastercaption col5\">\n<p class=\"g-body g-desktop\"><span style=\"font-size: 8pt;\">Over the past four years, the West Coast girl\u2019s mother and stepfather have been sent over 350 F.B.I. reports about images of her on the internet. 
Their daughter will start receiving them when she turns 18.<\/span><\/p>\n<p class=\"credit\"><span style=\"font-size: 8pt;\">Kholood Eid for The New York Times<\/span><\/p>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<p class=\"paragraph\">\u201cI would love to be able to tell her they were online,\u201d her mother said, \u201cbut they are not anymore.\u201d<\/p>\n<p class=\"paragraph\"><strong>\u2018Pictures Are Forever\u2019<\/strong><\/p>\n<p class=\"paragraph\">Other parents are resigned to the possibility that the images may remain online forever.<\/p>\n<p class=\"paragraph\">In a foster family with multiple victims, one teenage daughter recently went on antidepressants to cope with feelings that her abuse was her fault. Another daughter found the courage to begin dating eight years after her abuse. But when her worried brother recently went online to test whether the imagery was still available, agents from the F.B.I. visited their home. It is illegal to view child sexual abuse material, regardless of intentions.<\/p>\n<p class=\"paragraph\">\u201cI don\u2019t fool myself into thinking I\u2019m doing great anymore,\u201d said the foster mother, who requested anonymity to protect the privacy of her children. \u201cI\u2019m just surviving day by day.\u201d<\/p>\n<p class=\"paragraph\">Two of her foster daughters were filmed being raped by their father, while others were abused but not photographed or filmed. The difference, she said, can be profound over time.<\/p>\n<p class=\"paragraph\">\u201cThey\u2019re angry that those pictures are forever,\u201d she said.<\/p>\n<p class=\"paragraph\">To discourage them from posting on social media and risking being recognized, she has told her children their abusive images live on the internet. 
\u201cI don\u2019t think they\u2019ll ever be totally wiped out,\u201d she said.<\/p>\n<p class=\"paragraph\">So far, the tech industry has proved her right.<\/p>\n<p class=\"paragraph\">It has been 10 years since PhotoDNA was developed at Microsoft, yet the industry\u2019s efforts to detect and remove known illegal photos remain uneven and cloaked in secrecy.<\/p>\n<p class=\"paragraph\">The industry\u2019s response to video content has been even more wanting, according to interviews, internal company emails and reviews of thousands of court records. There is no common standard for identifying illegal video content, and many major platforms \u2014 including AOL, Snapchat and Yahoo \u2014 do not even scan for it. AOL and Yahoo did not respond to requests for comment about their video policies. A Snap spokesman said the company was working with industry partners to develop a solution.<\/p>\n<p class=\"paragraph\">A spokesman for Kik, before it was sold last month, said the company was also working on the issue; on Friday, the new owners said it now scanned for video.<\/p>\n<p class=\"paragraph\">\u201cOver all, the tech companies are doing the minimal amount necessary to maintain public respect for their organizations,\u201d said Chad M.S. Steel, who teaches computer forensics at George Mason University and has assisted federal investigators in abuse-related cases. \u201cBut they\u2019re not doing nearly as much as they could based on the technologies available.\u201d<\/p>\n<p class=\"paragraph\">Tech companies have known for years that videos of children being sexually abused are shared on their platforms, according to former employees at Microsoft, Twitter, Tumblr and other companies. One former Twitter employee described gigabytes of illegal videos appearing more quickly than they could be taken down on Vine, the video service since shuttered by Twitter.<\/p>\n<p class=\"paragraph\">That was in 2013, when fewer than 50,000 videos were reported. 
Last year, tech companies referred more than 22 million to the National Center for Missing and Exploited Children, the nonprofit clearinghouse mandated by the federal government to act as a repository for the imagery.<\/p>\n<p class=\"paragraph\">Efforts to tackle the urgent problem of video content have run into roadblocks of the companies\u2019 own making. Google, for example, developed video-detection technology that it makes available to other companies, and Facebook also has a system. But the two cannot share information because the fingerprints generated by each technology are not compatible.<\/p>\n<p class=\"paragraph\">In 2017, the tech industry approved a process for sharing video fingerprints to make it easier for all companies to detect illicit material, according to confidential emails and other documents that were part of a project run by the Technology Coalition, a group focused on child safety issues that includes most major companies.<\/p>\n<p class=\"paragraph\">One document notes the project\u2019s justification: \u201cVideo has become as easy to create as images and no standard solution\/process has been adopted by industry.\u201d<\/p>\n<p class=\"paragraph\">But the plan has gone nowhere.<\/p>\n<p class=\"paragraph\">The lack of action across the industry has allowed untold videos to remain on the internet. Of the center\u2019s 1.6 million fingerprints, less than three percent are for videos.<\/p>\n<p class=\"paragraph\">Photos and videos are each being handled in ways that give criminals great leeway. 
None of the largest cloud storage platforms \u2014 including Amazon Web Services, Dropbox, Google Drive and Microsoft\u2019s OneDrive and Azure \u2014 scan for abuse material when files are uploaded, according to law enforcement officials, former employees and public statements by the companies.<\/p>\n<p class=\"paragraph\">While the files may be scanned later, when users share them, for example, some criminals have avoided detection by sharing their account logins rather than the files themselves.<\/p>\n<p class=\"paragraph\">A Florida man, Gregory Householder, told investigators that he had used online platforms for eight years and had regularly shared logins to Dropbox accounts with other offenders. Mr. Householder said he knew he was committing a crime, but did not believe he would get caught.<\/p>\n<p class=\"paragraph\">A spokesman for Amazon, which does not scan for abuse imagery whatsoever, said that the \u201cprivacy of customer data is critical to earning our customers\u2019 trust,\u201d and noted that the company had a policy that prohibited illegal content. Microsoft Azure also said it did not scan for the material, citing similar reasons.<\/p>\n<p class=\"paragraph\">Privacy concerns were raised by other companies, including Dropbox and Google. A Dropbox spokeswoman said scanning raised a specter that privacy advocates could take issue with.<\/p>\n<p class=\"paragraph\">Some companies, like Dropbox and Google, invoked security concerns when asked about their detection and removal practices. A spokesman for Apple declined to specify how it scanned its platforms, saying the information could help criminals.<\/p>\n<p class=\"paragraph\">Several digital forensic experts and law enforcement officials said the companies were being disingenuous in invoking security. Mr. 
Stamos, the former Facebook and Yahoo security chief, said the companies just \u201cdon\u2019t want to advertise that they are open for business\u201d to criminals.<\/p>\n<p class=\"paragraph\">\u201cIf they\u2019re saying, \u2018It\u2019s a security problem,\u2019 they\u2019re saying that they don\u2019t do it,\u201d Mr. Stamos said.<\/p>\n<p class=\"paragraph\"><strong>An Uncertain Future<\/strong><\/p>\n<p class=\"paragraph\">A heinous case in Pennsylvania warns of a tsunami of new, hard-to-detect abuse content through live-streaming platforms.<\/p>\n<p class=\"paragraph\">More than a dozen men from around the world were logged in to the business conference software Zoom. They were chatting while watching a live stream that had nothing to do with work: A man was sexually assaulting a 6-year-old boy.<\/p>\n<p class=\"paragraph\">One of the viewers, according to court documents, asked the assailant to spread the boy\u2019s buttocks. Another to \u201cspit in his face.\u201d A third to \u201crape him.\u201d<\/p>\n<p class=\"paragraph\">The boy was orally raped and violently penetrated while some of the men, appearing on cameras of their own, cheered and masturbated for the others to see.<\/p>\n<p class=\"paragraph\">The men also broadcast prerecorded clips of young children \u2014 including infants \u2014 being raped, beaten and urinated on.<\/p>\n<p class=\"paragraph\">During the trial, an investigator said that offenders often knew that live streams are harder to detect and leave no record.<\/p>\n<p class=\"paragraph\">\u201cThat\u2019s why they go to Zoom,\u201d said the federal prosecutor in the case, Austin Berry, during his closing remarks. 
\u201cIt\u2019s the Netflix of child pornography.\u201d Prosecutions in other cases have involved live streaming on Apple\u2019s FaceTime, Facebook, Omegle, Skype, YouNow and others.<\/p>\n<p class=\"paragraph\">None of the major tech companies is able to detect, much less stop, the live streaming through automated imagery analysis, although a technology executive at Zoom said the company had made significant improvements since the Pennsylvania case using other methods.<\/p>\n<p class=\"paragraph\">And while Facebook, Google and Microsoft have said they are developing technologies that will find new photos and videos on their platforms, it could take years to reach the precision of fingerprint-based detection of known imagery, according to interviews.<\/p>\n<p class=\"paragraph\">Men in the Pennsylvania case were caught in 2015 only because Janelle Blackadar, a detective constable with the Toronto police, discovered the broadcast while conducting an undercover investigation. The detective recorded the stream using screen-capturing technology, and within hours alerted Special Agent Austin Berrier of Homeland Security Investigations.<\/p>\n<p class=\"paragraph\">The 6-year-old boy was rescued the next day, and 14 men from multiple states have since been sentenced to prison.<\/p>\n<p class=\"paragraph\">In January, the assailant, William Byers Augusta, 20, received a sentence of up to 90 years.<\/p>\n<p class=\"paragraph\">The Pennsylvania state prosecutor in the case, describing the offenders as monsters, <a href=\"https:\/\/www.pennlive.com\/news\/2017\/02\/byers_augusta_recorded_rapes.html\">said in court<\/a> that Mr. 
Augusta had \u201cencouraged people all over the world to tune in.\u201d<\/p>\n<p class=\"paragraph\"><em>Produced by Rich Harris, Virginia Lozano, Adriana Rami\u0107 and Rumsey Taylor.<\/em><\/p>\n<p class=\"paragraph\">(To report online child sexual abuse or find resources for those in need of help, contact the <a href=\"https:\/\/www.missingkids.org\/gethelpnow\" target=\"_blank\">National Center for Missing and Exploited Children<\/a> at 1-800-843-5678.)<\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; Child Sex Abusers Elude Flimsy Digital Safeguards, The New York Times, November 10, 2019 &nbsp; The second installment of The New York Times special investigative series, Exploited, on child sexual abuse, and its exploitation on the internet, is here. As we noted when the series began, nothing [&hellip;]<\/p>\n","protected":false},"author":1001004,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[55],"tags":[],"_links":{"self":[{"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/posts\/8594"}],"collection":[{"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/users\/1001004"}],"replies":[{"embeddable":true,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=8594"}],"version-history":[{"count":7,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/posts\/8594\/revisions"}],"predecessor-version":[{"id":8621,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/posts\/8594\/revisions\/8621"}],"wp:attachment":[{"href":"https:\/\/worldcampaign.net\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=8594"}],"wp:t
erm":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=8594"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=8594"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}