Issue of the Week: Human Rights
In our first post of the new year, we explained how we arrived at the three major issues we intend to focus on. The first, child sexual abuse and other violence against children, which affects half the children in the world and has long been a major focus of our work, is the subject of today’s post.
The recent appearance of major tech leaders before the US Congress, focused largely on how their companies have enabled child sexual abuse, may be seen in hindsight as a seminal moment, not unlike the appearance of Big Tobacco CEOs before Congress that led to the most significant action ever taken against that industry.
The main players in that situation were state attorneys general. It may be starting that way again.
New Mexico’s attorney general appears to be leading the way.
The Guardian reports:
Internal company documents obtained by the attorney general’s office as part of its investigation have also revealed that the company estimates about 100,000 children using Facebook and Instagram receive online sexual harassment each day.
That is more than 36 million a year.
And that’s just one company and two platforms.
The following articles from The Guardian tell the story:
“Meta is the world’s ‘single largest marketplace for paedophiles’, says New Mexico attorney general”
Raúl Torrez is taking the company to court and expects further details to emerge about its knowledge of child sexual exploitation on its platforms
Katie McQue, 31 Jan 2024
The New Mexico attorney general, Raúl Torrez, who has launched legal action against Meta for child trafficking on its platforms, says he believes the social media company is the “largest marketplace for predators and paedophiles globally”.
While Meta’s CEO Mark Zuckerberg and other social media executives are being questioned on Wednesday in a congressional hearing about their role in online child sexual exploitation, Torrez tells the Guardian he believes that what his own investigation has already uncovered is “just the tip of the iceberg when it comes to how widespread and well known this problem was inside the company”.
In December 2023, Torrez launched legal action against Meta, claiming that the company has allowed its social media platforms to become marketplaces for child predators.
The lawsuit claims that Meta allows and fails to detect the trafficking of children and “enabled adults to find, message and groom minors, soliciting them to sell pictures or participate in pornographic videos”, concluding that “Meta’s conduct is not only unacceptable; it is unlawful”.
Torrez says that he has been shocked by the findings of his team’s investigations into online child sexual exploitation on Meta’s platforms, which included having undercover officers pose as children on Facebook and Instagram.
“There was an explosion of sexual interest from users attracted to the undercover accounts that confirmed the scale and pervasiveness of what turned out to be this unregulated space, where unconnected adults had very quickly expressed the kind of sexual interest that we were concerned about,” he says.
Internal company documents obtained by the attorney general’s office as part of its investigation have also revealed that the company estimates about 100,000 children using Facebook and Instagram receive online sexual harassment each day.
Torrez’s lawsuit is yet to enter the “discovery” stage, where both parties will gain access to the information, evidence and witnesses the other side will be presenting at trial.
“Once we get into discovery, we will be getting more documents from the company and conducting depositions, and systematically going through all of the relevant business units and executives to understand who knew what and when,” says Torrez.
He says his work on online child sexual exploitation is “deeply personal” after spending years as a prosecutor working on crimes against children.
“I have profound memories of interacting with child victims. That motivated me to make this a central component of the case we put together.”
The idea of the lawsuit came to him after reading media coverage of Meta’s role in child sexual exploitation, including a Guardian investigation that found it was failing to detect or report the use of Facebook and Instagram for child trafficking. If it progresses, the New Mexico lawsuit is expected to take years to conclude.
Meta, like all other social media platforms, has so far avoided accountability for illegal acts committed on their platforms, sheltering behind a clause in the 1996 Communications Decency Act aimed at regulating online pornographic content.
Section 230 of the act states that providers of “interactive computer services” – which includes the owners of social media platforms and website hosts – should not be treated as the publisher of material posted by users.
“A great deal of the issues we’re seeing now and the harm that’s being committed has been facilitated by that act,” Torrez says, while criticising government partisanship for not reforming it. “This is emblematic of some of the polarisation and some of the gridlock we’re seeing at the federal level.”
Torrez wants his lawsuit to provide a medium to usher in new regulations. “Fundamentally, we’re trying to get Meta to change how it does business and prioritise the safety of its users, particularly children.”
A Meta spokesperson says: “Child exploitation is a horrific crime and online predators are determined criminals. We use sophisticated technology, hire child safety experts, report content to the National Center for Missing & Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators. In one month alone, we disabled more than half a million accounts for violating our child safety policies.
“We work aggressively to fight child exploitation both on and off our platforms, and to support law enforcement in its efforts to arrest and prosecute the criminals behind it.”
- Information and support for anyone affected by rape or sexual abuse issues is available from the following organisations. In the UK, Rape Crisis offers support on 0808 500 2222 in England and Wales, 0808 801 0302 in Scotland, or 0800 0246 991 in Northern Ireland. In the US, Rainn offers support on 800-656-4673. In Australia, support is available at 1800Respect (1800 737 732). Other international helplines can be found at ibiblio.org/rcip/internl.html.
. . .
“How Facebook and Instagram became marketplaces for child sex trafficking”
The long read, by Katie McQue and Mei-Ling McNamara
Our two-year investigation suggests that the tech giant Meta is struggling to prevent criminals from using its platforms to buy and sell children for sex
Content warning – the following article contains descriptions of child sexual abuse, exploitation and trafficking
Maya Jones* was only 13 when she first walked through the door of Courtney’s House, a drop-in centre for victims of child sex trafficking in Washington DC. “She was so young, but she was already so broken by what she’d been through,” says Tina Frundt, the founder of Courtney’s House. Frundt, one of Washington DC’s most prominent specialists in countering child trafficking, has worked with hundreds of young people who have suffered terrible exploitation at the hands of adults, but when Maya eventually opened up about what she had been through, Frundt was shaken.
Maya told Frundt that when she was 12, she had started receiving direct messages on Instagram from a man she didn’t know. She said the man, who was 28, told her she was really pretty. According to Frundt, Maya told her that after she started chatting with the man, he asked her to send him naked photos. She told Frundt that he said he would pay her $40 for each one. He seemed kind and he kept giving Maya compliments, which made her feel special. She decided to meet him in person.
Then came his next request: “Can you help me make some money?” According to Frundt, Maya explained that the man asked her to pose naked for photos, and to give him her Instagram password so that he could upload the photos to her profile. Frundt says Maya told her that the man, who was now calling himself a pimp, was using her Instagram profile to advertise her for sex. Before long, sex buyers started sending direct messages to her account, wanting to make a date. Maya told Frundt that she had watched, frozen, what was taking place on her account, as the pimp negotiated prices and logistics for meetings in motels around DC. She didn’t know how to say no to this adult who had been so nice to her. Maya told Frundt that she hated having sex with these strangers but wanted to keep the pimp happy.
One morning three months after she first met the man, Frundt says that Maya was found by a passerby lying crumpled on a street in south-east DC, half-naked and confused. The night before, Maya told her, a sex buyer had taken her somewhere against her will, and she later recalled being gang-raped there for hours before being dumped on the street. “She was traumatised, and blamed herself for what happened. I had to work with her a lot to help her realise this was not her fault,” said Frundt when we visited Courtney’s House last summer.
Frundt, who has helped hundreds of children like Maya since she opened Courtney’s House in 2008, says that the first thing she now does when a young person is referred to her is to ask for their Instagram handle. Other social media platforms are also used to exploit the young people in her care, but she says Instagram is the one that comes up most often.
In the 20 years since the birth of social media, child sexual exploitation has become one of the biggest challenges facing tech companies. According to the United Nations Office on Drugs and Crime (UNODC), the internet is used by human traffickers as “digital hunting fields”, allowing them access to both customers and potential victims, with children being targeted by traffickers on social media platforms. The biggest of these, Facebook, is owned by Meta, the tech giant whose platforms, which also include Instagram, are used by more than 3 billion people worldwide. In 2020, according to a report by US-based not-for-profit the Human Trafficking Institute, Facebook was the platform most used to groom and recruit children by sex traffickers (65%), based on an analysis of 105 federal child sex trafficking cases that year. The HTI analysis ranked Instagram second most prevalent, with Snapchat third.
Grooming and child sex trafficking, though often researched and discussed together, are distinct acts. “Grooming” refers to the period of manipulation of a victim prior to their exploitation for sex or for other purposes. “Child sex trafficking” is the sexual exploitation of a child specifically as part of a commercial transaction. When the pimp was flattering and chatting with Maya, he was grooming her; when he was selling her to other adults for sex, he was trafficking her.
Though people often think of “trafficking” as the movement of victims across or within borders, under international law the term refers to the use of force, fraud or coercion to obtain labour or to buy and sell non-consensual sex acts, whether or not travel is involved. Because, under international law, children cannot legally consent to any kind of sex act, anyone who profits from or pays for a sex act from a child – including profiting from or paying for photographs depicting sexual exploitation – is considered a human trafficker.
Meta has numerous policies in place to try to prevent sex trafficking on its platforms. “It’s very important to me that everything we build is safe and good for kids,” Mark Zuckerberg, Meta’s founder, wrote in a memo to staff in 2021. In a statement responding to a detailed list of the allegations in this piece, a Meta spokesperson said: “The exploitation of children is a horrific crime – we don’t allow it and we work aggressively to fight it on and off our platforms. We proactively aid law enforcement in arresting and prosecuting the criminals who perpetrate these grotesque offences. When we are made aware that a victim is in harm’s way, and we have data that could help save a life, we process an emergency request immediately.” The statement cited the group director of intelligence at the charity Stop the Traffik, a former deputy director of the UK’s Serious Organised Crime Agency, who has said “millions are safer and traffickers are increasingly frustrated” because of their work with Meta.
But over the past two years, through interviews, survivor testimonies, US court documents and human trafficking reporting data, we have heard repeated claims that Facebook and Instagram have become major sales platforms for child trafficking. We have interviewed more than 70 sources, including survivors and their relatives, prosecutors, child protection professionals and content moderators across the US in order to understand how sex traffickers are using Facebook and Instagram, and why Meta is able to deny legal responsibility for the trafficking that takes place on its platforms.
While Meta says it is doing all it can, we have seen evidence that suggests it is failing to report or even detect the full extent of what is happening, and many of those we interviewed said they felt powerless to get the company to act.
The survivors
Courtney’s House sits on a quiet residential street on the outskirts of Washington DC. Inside, Frundt and her team have tried to make the modest two-storey house feel like a family home, with comfortable sofas and photos on the mantlepiece. Frundt, who was herself trafficked as a child in the 1980s and 90s, is now one of Washington DC’s most experienced and respected anti-trafficking advocates. Warm and ferociously protective of the children in her care, she is contracted by the city’s child protection services to identify trafficked children going through the court system, and she regularly attends court hearings for the youth in her care. She also helps train the FBI and local law enforcement sex-trafficking units on how to spot traffickers on online platforms, including Instagram. “When I was trafficked long ago I was advertised in the classified sections of freesheet newspapers,” Frundt told us. “Now my youth here are trafficked on Instagram. It’s exactly the same business model but you just don’t have to pay to place an ad.”
The children who are referred to Frundt, usually by the police or social services, have been sexually exploited and controlled: by a boyfriend, a pimp, a family member. Some of them are as young as nine. Almost without exception, they have childhoods scarred by sexual abuse, poverty and violence. This makes them perfect targets for sexual predators. “They are all looking for love and affirmation and a sense that they mean something,” said Frundt.
Almost all the young people who come to Courtney’s House are children of colour. They are, Frundt said, battling stereotypes that pressure them to become sexualised too early and make them vulnerable to traffickers. A 2017 study by the Georgetown Law Center on Poverty and Inequality found that adults typically regard Black girls as less innocent and more knowledgeable about sex than their white peers. The same study showed that Black girls are often perceived to be older than they are.
Most of the time, Frundt says, the children who come to Courtney’s House are still being trafficked when they walk through the door. Even in cases where they have escaped their exploiters, she said, explicit videos and photos of them often continue to circulate online. Traffickers will lock victims out of their accounts, preventing them from taking down images posted to their profiles.
When we asked Frundt if she could show us examples of young people in her care who she says are currently being trafficked on Instagram, she pulled out her phone and scrolled through post after post of explicit images and videos of girls as young as 14 or 15. Most of the photos and videos seemed to have been taken by someone else. Frundt said that these posts were being used as a way of advertising the girls for potential sex buyers, who would send a direct message to buy explicit content or to arrange a meet up.
At one point, our conversation was interrupted by the arrival of five teenage girls. They had come back from school, and they gathered around the kitchen table, chatting and playing music on their phones while Frundt served them casserole. After they had eaten, we asked if we could talk to them about their experiences: had any of them been sexually exploited on social media or had explicit videos or pictures posted of them?
They glanced at each other and burst out laughing. Yes, they said, of course. All the time. One girl said she felt that “nobody at Instagram cares, they don’t care what’s posted. They don’t care shit about us.”
Frundt claims that she is constantly asking Instagram to close accounts and take down exploitative content of kids in her care. “I even have law enforcement calling me up asking, ‘Tina, can you get Instagram to do something?’. If I can’t get Instagram to act, what hope is there for anyone else?”
When we put these concerns to Meta, a spokesperson said: “We take all allegations and reports of content involving children extremely seriously and have diligently responded to requests from Courtney’s House. Our ability to remove content or delete accounts requires sufficient information to determine that the content or user violates our policies.”
Frundt says that in 2020 and 2021 she had discussions with Instagram about conducting staff training to help prevent child trafficking on its platforms. She says the training didn’t go ahead as, after a long back and forth, on a video call Instagram executives said that they wouldn’t pay Frundt her standard fee of $3,000, instead allegedly offering $300. When we put this to Meta, they did not deny it.
The court documents and the prosecutors
What makes social media platforms so powerful as a tool for traffickers – far more powerful than the back pages of a newspaper in which Frundt was advertised as a teenager – is the way that they make it possible to identify and cultivate relationships with both victims and potential sex buyers. Traffickers can advertise and negotiate deals by using different features of the same platform: sellers sometimes post publicly about the girls they have available, and then switch to private direct messages to discuss prices and locations with buyers.
US court documents provide a graphic insight into how these platforms can be used. In one case prosecuted in Arizona in 2019, Mauro Veliz, a 31-year-old who was convicted of conspiracy to commit sex trafficking of a child, exchanged messages on Facebook Messenger with Miesha Tolliver, who also received jail time for sex trafficking. Tolliver told Veliz that she had one girl available for sex, and photographs of two more, before saying that the girls were aged 17, 16 and 14.
Veliz: “How much is it for all of them?”
Tolliver: “The 14 [year-old] will cost the most … a couple of hundred for her but [$] 150 for the rest”
The 14-year-old, Tolliver told Veliz, was “new to the sex game”.
Tolliver: “The 1 on the right … is 16 with a fat ass … the other [is] 15 with huge tits”
The court transcripts then state that multiple sexually explicit images of the girls were sent to Veliz.
Tolliver: “do you want me to bring 1 of the girls with me so you guys can fuck?”
[ … ]
Veliz: “is your girl nervous? Or have you told her yet?”
Tolliver: “… shes still young and doesn’t understand how ppl like it”
Tolliver and Veliz exchanged more messages, arranging for Veliz to meet the girl in a hotel in California two days later.
The final message submitted to the court was from Veliz to Tolliver. “We’re finished she’s in the restroom,” it said.
Luke Goldworm, a former assistant district attorney in Boston, Massachusetts, who has investigated and prosecuted human trafficking cases for years, says that he has encountered numerous exchanges like this one. From 2019 until he left the job in October 2022, he said, his department’s caseload of child-trafficking crimes on social media platforms increased by about 30% each year. “We’re seeing more and more people with significant criminal records move into this area. It’s incredibly lucrative,” he said. A trafficker can make up to $1,000 a night. Many of the victims he saw were just 11 or 12, he said, and most of them were Black, Latinx or LGBTQI+.
According to Goldworm, while his investigations involved every social media platform, Meta platforms were the ones he encountered most often. Six other prosecutors in several different states told us that, in their experience, Facebook and Instagram are being widely used to groom and traffic children. Five of these prosecutors spoke of their anger over what they felt were Meta’s unnecessary delays in complying with judge-signed warrants and subpoenas needed to gather evidence on sex trafficking cases. “We get a higher rate of rejected warrants from Facebook than any other electronic service provider,” claimed Gary Ernsdorff, senior deputy prosecuting attorney for King County, Washington state. “What I find frustrating is that the exchange can delay rescuing a victim by a month.”
Three of these prosecutors described experiences where they say the company would cite technicalities, picking faults with wording and format, and slowing down investigations. In response, the company said that these claims were “false”, adding that between January and June last year, it “provided data in nearly 88% of requests from the US government”.
The responsibility for reporting
Meta acknowledges that human traffickers use its platforms, but insists that it is doing everything in its power to stop them. By law, the company is required to report any child sexual abuse imagery shared over its platforms to the National Center for Missing & Exploited Children (NCMEC), which receives federal funding to act as a nationwide clearing house for leads about child abuse. Meta is a major funder of NCMEC, and holds a seat on the organisation’s board.
From January to September 2022, Facebook reported more than 73.3m pieces of content under “child nudity and physical abuse” and “child sexual exploitation” and Instagram reported 6.1m. “Meta leads the industry in using the most sophisticated technology to detect both known and previously unknown child exploitation content,” said a company spokesperson. Of the 34m pieces of child sexual exploitation content removed from Facebook and Instagram in the final three months of 2022, 98% was detected by Meta itself.
But the vast majority of the content that Meta reports falls under child sexual abuse materials (CSAM) – which includes photos and videos of pornographic content – rather than sex trafficking. Unlike with child sexual abuse imagery, there is no legal requirement to report child sex trafficking, so NCMEC must rely on all social media companies to be proactive in searching for and reporting it. This legal inconsistency – the fact that child sexual abuse imagery must be reported, but reporting child sex trafficking is not legally required – is a major problem, says Staca Shehan, vice-president of the analytical services division at NCMEC. “It’s concerning across the board how little trafficking is being reported,” Shehan says. Social media companies “are prioritising what’s [legally] required”.
“I think everyone could do more,” Shehan says. “The volume of child sexual abuse material (CSAM) and volume of trafficking [being reported] is like apples and oranges.” According to Shehan, one further reason for this disparity, beyond the differing legal requirements, is technological. “Child sexual abuse material is that much easier to detect. There are so many technology tools that have been developed that allow for the automated detection of that crime.”
A NCMEC spokesperson told us that if social media companies are not reporting child sex trafficking, it allows this crime to thrive online. Reporting trafficking, they emphasised, is crucial for rescuing victims and punishing offenders.
Between 2009 and 2019, Meta reported just three cases as suspected child sex trafficking in the US to NCMEC, according to records disclosed in a subpoena request seen by the Guardian.
A spokesperson for NCMEC confirmed this figure, but clarified that a number of child trafficking cases during the same time period were reported by Meta under other “incident types”, such as child pornography or enticement. “I think one of the things to be aware of is that there’s sort of a singular tag that’s used for reporting,” Antigone Davis, head of global safety at Meta, emphasised to us in a recent interview. “And so just because something isn’t tagged as sex trafficking doesn’t mean that it isn’t being reported.”
A Meta spokesperson claimed that over the past decade, the company had reported “tens of thousands of accounts which violated our policies against child sex trafficking and commercial child sexual abuse material to NCMEC.” When we put these claims to NCMEC, it said that it had not received “tens of thousands” of reports of child trafficking from Meta, but had received that number related to child abuse imagery.
Hany Farid is a professor at the University of California, Berkeley, who helped invent the PhotoDNA technology that Meta uses to identify harmful content. He believes Meta, which is currently valued at more than $500bn, could do more to combat child trafficking. It could, for instance, be investing more to develop better tools to “flag suspicious words and phrases on unencrypted parts of the platform – including coded language around grooming,” he said. “This is, fundamentally, not a technological problem, but one of corporate priorities.” (There is a separate debate about how to handle encryption. Meta’s plans to encrypt direct messages on Facebook Messenger and Instagram have recently drawn criticism from law enforcement agencies, including the FBI and Interpol.)
In response to Farid’s claims and further questions from the Guardian, Meta did not specify how much money it has invested in technologies to detect child sex trafficking, but said that it had “focused on using AI and machine learning on non-private, unencrypted parts of its platforms to identify harmful content and accounts and make it easier for people to report messages to the company so we can take action, including referrals to law enforcement”. Davis also emphasised that Meta constantly works with partners to improve its anti-trafficking safeguards. For instance, she mentioned that “we’ve been able to identify the kinds of searches that people do when they’re searching for trafficking content, so that when people search for that, we will pop up with information to divert them or to let them know that what they’re doing is illegal activity”.
These efforts have failed to satisfy some of Meta’s own investors. In March, several pension and investment funds that own Meta stock launched legal action against the company in Delaware over its alleged failure to act on “systemic evidence” that its platforms are facilitating sex trafficking and child sexual exploitation. By offering insufficient explanation of how it is tackling these crimes, the complaint says, the board has failed to protect the interests of the company. Meta has rejected the basis for the lawsuit. “Our goal is to prevent people who seek to exploit others from using our platform,” the company said.
The moderators
As well as software, Meta uses teams of human moderators to identify cases of child grooming and sex trafficking. Until recently, Anna Walker* worked the night shift in an office of a Meta subcontractor. She would start each shift filled with dread. “We were just, like, shoved in a dark room to look at the stuff,” she said.
Walker’s job was to review interactions between adults and children on Facebook Messenger and Instagram direct messenger that had been flagged as suspicious by Meta’s AI software. Walker claims she and her team struggled to keep pace with the huge backlog of cases. She says she saw cases of adults grooming children and then making plans to meet them for sex, as well as discussions about payment in exchange for sex.
Walker’s managers would pass on such cases to Meta to decide if action should be taken against the user. In some cases, Walker claims: “Months would pass and then the automatic bot would send me an email saying it was closing this case, because nobody’s taken action on it.” She added: “I would cry to my manager about [the children I saw] and how I want to help. But it felt like nobody would pay attention to these horrible things.”
We talked to six other moderators who worked for companies that Meta subcontracted between 2016 and 2022. All made similar claims to Walker. Their efforts to flag and escalate possible child trafficking on Meta platforms often went nowhere, they said. “On one post I reviewed, there was a picture of this girl that looked about 12, wearing the smallest lingerie you could imagine,” said one former moderator. “It listed prices for different things explicitly, like, a blowjob is this much. It was obvious that it was trafficking,” she told us. She claims that her supervisor later told her no further action had been taken in this case.
When we put these claims to Meta, a spokesperson said that moderators such as Walker do not typically get feedback on whether their flagged content has been escalated. They stressed that if a moderator does not hear back about a flagged case, that does not mean no action has been taken.
Five of the moderators claimed that it was harder to get cases escalated or content taken down if it was posted on closed Facebook groups or Facebook Messenger. Meta “would be less stringent about something taking place behind ‘closed doors’,” claimed one team leader. “With Messenger, we really couldn’t make any moves unless the language and content was really obvious. If it was four guys who trusted each other and it was in a group it could just live on for ever.” Meta said these allegations “appear to be misleading and inaccurate” and said it uses technology to find child sexualisation content in private Facebook groups and on Messenger.
In 2021, former Facebook employee and whistleblower Frances Haugen leaked internal documents that seem to support the moderators’ claims. These documents, which numbered thousands of pages, detailed how the company managed harmful content. In one memo from the Haugen leak, the company states that “Messenger groups with less than 32 people should be treated with a full expectation of privacy”.
Matias Cruz*, who worked as a content moderator from 2018 to 2020, reviewing Spanish-language posts on Facebook, believes that the criteria that Meta was using to recognise trafficking were too narrow to keep up with traffickers, who would constantly switch codewords to avoid detection. According to Cruz, traffickers would say: “‘I have this cabra [Spanish for goat] for sale,’ and it’d be some really ridiculous price. Sometimes they would just outright say [the price] for a night or two, or ‘an hour’.” It was obvious what was going on, said Cruz, but “the managers would claim it was too vague, so in the end they would just leave it up”.
Cruz and three other moderators we spoke to claimed that in examples like this, where their managers felt there was insufficient evidence to escalate the case, moderators could receive lower accuracy scores, which in turn would affect their performance assessments. “We would take negative hits on our accuracy scores to try to get some help to these people,” Cruz said.
The limits of the law
While the law requires Meta to report any child exploitation imagery detected on its platforms, the company is not legally responsible for crimes that occur on its platform, because of a law created almost three decades ago, in the early days of the internet. In 1996, the US Congress passed the Communications Decency Act, which was primarily intended to ensure online pornographic content was regulated. But section 230 of the act states that providers of “interactive computer services” – which includes the owners of social media platforms and website hosts – should not be treated as the publisher of material posted by users. This section was included in the act to ensure the free flow of information while protecting the growing tech industry from being crushed by litigation.
Whereas a newspaper, say, must legally defend what it publishes, section 230 means that a company like Meta, which hosts the content of others, may not be held liable for what appears on its platforms. Section 230 therefore positions internet service providers as fundamentally neutral: offering forums in which illegal, harmful or false content may be posted and circulated, but ultimately not responsible for that content. Since the passing of the act, tech companies such as Meta have argued successfully in courts across the US that section 230 provides them with complete immunity from prosecution for any illegal content published on their platforms, as long as they are unaware of that content’s existence.
The debate around section 230 has become highly polarised. Those who want section 230 amended say that the legal safe harbour it has provided for internet companies means they have no incentive to root out illegal content on their sites. In an op-ed published in the Wall Street Journal in January, President Biden spoke out in favour of the section’s reform. “I’ve long said we must fundamentally reform section 230,” he wrote, calling for “bipartisan action by Congress to hold big tech accountable.”
However, tech companies, along with internet freedom groups, argue that changes to section 230 could lead to censorship and an erosion of privacy, particularly for private, encrypted content. These arguments over section 230 are being put to the test in a landmark case that has reached the US supreme court, which focuses on how far YouTube can be considered culpable for the videos it recommends to its users. A ruling is due by the end of June.
The consequences
Kyle Robinson is one year into serving a 10-year sentence at a federal prison in Massachusetts for sex trafficking two teenagers, one only 14 years old. We spoke to him in January over the muffled line of the prison’s payphone, our conversation interrupted by prison staff monitoring the call. Referring to himself as a pimp, Robinson described how he sought out damaged girls from care homes and on social media as a way to make money.
Instagram, he said, was his platform of choice. “I find the girls that have pride in themselves, but maybe don’t have the confidence, the self-esteem,” he claimed. “I make her feel special. I give her validation, social skills, her ‘hotential’, if you know what I mean.”
Once he had identified his targets, Robinson claimed that he would “coach” them and advertise them on their Instagram accounts and his own. He would talk to potential buyers through direct messages, offering to send video snippets of the girls in return for “a small deposit” – about $20 – so that the buyers could see what they would be getting. If a buyer decided to meet a girl, he would pay her the rest of the money later, via CashApp, he said. Robinson would then take most of that money.
To crack down on such cases of child sexual exploitation, last June Meta announced new policies including age verification software that will require users under 18 to provide proof of age through uploading an ID, recording a video selfie, or asking mutual friends on Facebook to confirm their age. When we asked Tina Frundt about these new measures, she was sceptical. The kids she works with had already found workarounds; a 14-year-old, for example, might use a video selfie made by her 18-year-old friend, and pretend that it’s her own.
Even after children have been referred to Courtney’s House, they continue to be vulnerable to traffickers. One night in June 2021, Frundt says she got a call from Maya, telling her she had arrived home safe. Frundt was relieved: she knew that Maya had spent the evening with a 43-year-old man who had been contacting her on Instagram.
Frundt says that Maya, now 15, was in a fragile state: over the previous few months, her mental health had been in sharp decline and she had told Frundt she’d been feeling suicidal. Photos and explicit videos taken by a pimp showing her having sex were being circulated and sold on Instagram. Sex buyers were contacting her relentlessly through her direct messages. “She didn’t know how to make it stop or how to say no,” Frundt recalled.
That night, on the phone, Frundt told Maya that she loved her and that they would talk in the morning. “That’s the last time I ever spoke to her,” said Frundt. The older man had given Maya drugs. When Maya’s mother went to wake her daughter the next morning, she found her dead.
A picture of Maya that still hangs on the wall of Courtney’s House shows a baby-faced teenage girl with brown curls and a huge smile. Two years after her death, Frundt continues to grieve for her caring “girly girl” who loved makeup, board games and dancing to her favourite Megan Thee Stallion songs. “Losing one of our youth, it changes you for ever. You can never forgive yourself,” she said.
Before Maya died, Frundt claims she spoke to Instagram on a video call, asking them to remove the exploitative content her trafficker had circulated. Frundt says that when Maya died, the videos of her being exploited were still on the platform.
In July 2021, a representative from an anti-trafficking organisation sent an email to Instagram’s head of youth policy, informing her of Maya’s death. Frundt was copied in on the email. It asked why Meta’s tools designed to detect grooming had not flagged a 43-year-old man contacting a young girl. Four days later, the company sent a brief reply. If Instagram was provided with details about the alleged trafficker’s account, it would investigate.
But Frundt says that it was too late. “She had already passed,” she says. “They could have done something to help her but they didn’t. She was gone.”
Names marked with an asterisk have been changed to preserve anonymity.
In the US, call or text the Childhelp abuse hotline on 800-422-4453. In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800, or Bravehearts on 1800 272 831, and adult survivors can contact Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International.