Message of the Day: Human Rights, Economic Opportunity, Personal Growth

Whistleblower: Facebook Prioritizing Growth Over Safety, 60 Minutes, 10.3.21

 

We have been posting new articles to this entry every day, changing them as the story unfolded. The Wall Street Journal's investigative series, The Facebook Files, started it. Today, Sunday, October 3, 60 Minutes on CBS brought it to an apex, for now, revealing the identity and face of the whistleblower behind the Journal series.

Without further comment, here are the transcript of and links to the 60 Minutes program, followed by the Wall Street Journal piece published today on the whistleblower:

“Whistleblower: Facebook is misleading the public on progress against hate speech, violence, misinformation”

October 3, 2021 (updated October 4)

Frances Haugen says in her time with Facebook she saw, “conflicts of interest between what was good for the public and what was good for Facebook.” Scott Pelley reports.

Her name is Frances Haugen. That is a fact that Facebook has been anxious to know since last month when an anonymous former employee filed complaints with federal law enforcement. The complaints say Facebook’s own research shows that it amplifies hate, misinformation and political unrest—but the company hides what it knows. One complaint alleges that Facebook’s Instagram harms teenage girls. What makes Haugen’s complaints unprecedented is the trove of private Facebook research she took when she quit in May. The documents appeared first, last month, in the Wall Street Journal. But tonight, Frances Haugen is revealing her identity to explain why she became the Facebook whistleblower.

Frances Haugen: The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money.

Frances Haugen is 37, a data scientist from Iowa with a degree in computer engineering and a Harvard master’s degree in business. For 15 years she’s worked for companies including Google and Pinterest.

Frances Haugen: I’ve seen a bunch of social networks and it was substantially worse at Facebook than anything I’d seen before.

Scott Pelley: You know, someone else might have just quit and moved on. And I wonder why you take this stand.

Frances Haugen: Imagine you know what’s going on inside of Facebook and you know no one on the outside knows. I knew what my future looked like if I continued to stay inside of Facebook, which is person after person after person has tackled this inside of Facebook and ground themselves to the ground.

Scott Pelley: When and how did it occur to you to take all of these documents out of the company?

Frances Haugen: At some point in 2021, I realized, “Okay, I’m gonna have to do this in a systemic way, and I have to get out enough that no one can question that this is real.”

Frances Haugen

She secretly copied tens of thousands of pages of Facebook internal research. She says evidence shows that the company is lying to the public about making significant progress against hate, violence and misinformation. One study she found, from this year, says, “we estimate that we may action as little as 3-5% of hate and about 6-tenths of 1% of V & I [violence and incitement] on Facebook despite being the best in the world at it.”

Scott Pelley: To quote from another one of the documents you brought out, “We have evidence from a variety of sources that hate speech, divisive political speech and misinformation on Facebook and the family of apps are affecting societies around the world.”

Frances Haugen: When we live in an information environment that is full of angry, hateful, polarizing content it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other, the version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.

‘Ethnic violence’ including Myanmar in 2018 when the military used Facebook to launch a genocide.

Frances Haugen told us she was recruited by Facebook in 2019. She says she agreed to take the job only if she could work against misinformation because she had lost a friend to online conspiracy theories.

Frances Haugen: I never wanted anyone to feel the pain that I had felt. And I had seen how high the stakes were in terms of making sure there was high quality information on Facebook.

At headquarters, she was assigned to Civic Integrity which worked on risks to elections including misinformation. But after this past election, there was a turning point.

Frances Haugen: They told us, “We’re dissolving Civic Integrity.” Like, they basically said, “Oh good, we made it through the election. There wasn’t riots. We can get rid of Civic Integrity now.” Fast forward a couple months, we got the insurrection. And when they got rid of Civic Integrity, it was the moment where I was like, “I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.”

Facebook says the work of Civic Integrity was distributed to other units. Haugen told us the root of Facebook’s problem is in a change that it made in 2018 to its algorithms—the programming that decides what you see on your Facebook news feed.

Frances Haugen: So, you know, you have your phone. You might see only 100 pieces of content if you sit and scroll on for, you know, five minutes. But Facebook has thousands of options it could show you.

The algorithm picks from those options based on the kind of content you’ve engaged with the most in the past.

Frances Haugen: And one of the consequences of how Facebook is picking out that content today is it is — optimizing for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarizing, it’s easier to inspire people to anger than it is to other emotions.

Scott Pelley: Misinformation, angry content– is enticing to people and keep–

Frances Haugen: Very enticing.

Scott Pelley:–keeps them on the platform.

Frances Haugen: Yes. Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money.

Haugen says Facebook understood the danger to the 2020 Election. So, it turned on safety systems to reduce misinformation—but many of those changes, she says, were temporary.

Frances Haugen: And as soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety.

And that really feels like a betrayal of democracy to me.

Facebook says some of the safety systems remained. But after the election, Facebook was used by some to organize the January 6th insurrection. Prosecutors cite Facebook posts as evidence: photos of armed partisans and text including, “by bullet or ballot restoration of the republic is coming!” Extremists used many platforms, but Facebook is a recurring theme.

After the attack, Facebook employees raged on an internal message board copied by Haugen. “…Haven’t we had enough time to figure out how to manage discourse without enabling violence?” We looked for positive comments and found this, “I don’t think our leadership team ignores data, ignores dissent, ignores truth…” but that drew this reply, “welcome to Facebook! I see you just joined in November 2020… we have been watching… wishy-washy actions of company leadership for years now.” “…Colleagues… cannot conscience working for a company that does not do more to mitigate the negative effects of its platform.”

Scott Pelley: Facebook essentially amplifies the worst of human nature.

Frances Haugen: It’s one of these unfortunate consequences, right? No one at Facebook is malevolent, but the incentives are misaligned, right? Like, Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction. And the more anger that they get exposed to, the more they interact and the more they consume.

That dynamic led to a complaint to Facebook by major political parties across Europe. This 2019 internal report obtained by Haugen says that the parties, “…feel strongly that the change to the algorithm has forced them to skew negative in their communications on Facebook… leading them into more extreme policy positions.”

Scott Pelley: The European political parties were essentially saying to Facebook the way you’ve written your algorithm is changing the way we lead our countries.

Frances Haugen: Yes. You are forcing us to take positions that we don’t like, that we know are bad for society. We know if we don’t take those positions, we won’t win in the marketplace of social media.

Evidence of harm, she says, extends to Facebook’s Instagram app.

Scott Pelley: One of the Facebook internal studies that you found talks about how Instagram harms teenage girls. One study says 13.5% of teen girls say Instagram makes thoughts of suicide worse; 17% of teen girls say Instagram makes eating disorders worse.

Frances Haugen: And what’s super tragic is Facebook’s own research says, as these young women begin to consume this– this eating disorder content, they get more and more depressed. And it actually makes them use the app more. And so, they end up in this feedback cycle where they hate their bodies more and more. Facebook’s own research says it is not just that Instagram is dangerous for teenagers, that it harms teenagers, it’s that it is distinctly worse than other forms of social media.

Facebook said, just last week, it would postpone plans to create an Instagram for younger children.

Last month, Haugen’s lawyers filed at least 8 complaints with the Securities and Exchange Commission which enforces the law in financial markets. The complaints compare the internal research with the company’s public face—often that of CEO Mark Zuckerberg—who testified remotely to Congress last March.

Mark Zuckerberg testimony on March 25:

We have removed content that could lead to imminent real-world harm. We have built an unprecedented third-party fact checking program. The system isn’t perfect. But it is the best approach that we have found to address misinformation in line with our country’s values.

One of Frances Haugen’s lawyers is John Tye. He’s the founder of a Washington legal group called “Whistleblower Aid.”

Scott Pelley: What is the legal theory behind going to the SEC? What laws are you alleging have been broken?

John Tye: As a publicly-traded company, Facebook is required to not lie to its investors or even withhold material information. So, the SEC regularly brings enforcement actions, alleging that companies like Facebook and others are making material misstatements and omissions that affect investors adversely.

Scott Pelley: One of the things that Facebook might allege is that she stole company documents.

John Tye: The Dodd-Frank Act, passed over ten years ago at this point, created an Office of the Whistleblower inside the SEC. And one of the provisions of that law says that no company can prohibit its employees from communicating with the SEC and sharing internal corporate documents with the SEC.

Frances Haugen: I have a lot of empathy for Mark. And Mark has never set out to make a hateful platform. But he has allowed choices to be made where the side effects of those choices are that hateful, polarizing content gets more distribution and more reach.

Facebook declined an interview. But in a written statement to 60 Minutes it said, “every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.”

“If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago.”

Facebook is a $1 trillion company. Just 17 years old, it has 2.8 billion users, which is 60% of all internet-connected people on Earth. Frances Haugen plans to testify before Congress this week. She believes the federal government should impose regulations.

Frances Haugen: Facebook has demonstrated they cannot act independently. Facebook, over and over again, has shown it chooses profit over safety. It is subsidizing, it is paying for its profits with our safety. I’m hoping that this will have had a big enough impact on the world that they get the fortitude and the motivation to actually go put those regulations into place. That’s my hope.

Produced by Maria Gavrilovic and Alex Ortiz. Broadcast associate, Michelle Karim. Edited by Michael Mongulla.

. . .

“The Facebook Whistleblower, Frances Haugen, Says She Wants to Fix the Company, Not Harm It”

By Jeff Horwitz, Oct. 3, 2021

The former Facebook employee says her goal is to help prompt change at the social-media giant

The former Facebook Inc. employee who gathered documents that formed the foundation of The Wall Street Journal’s Facebook Files series said she acted to help prompt change at the social-media giant, not to stir anger toward it.

Frances Haugen, a former product manager hired to help protect against election interference on Facebook, said she had grown frustrated by what she saw as the company’s lack of openness about its platforms’ potential for harm and unwillingness to address its flaws. She is scheduled to testify before Congress on Tuesday. She has also sought federal whistleblower protection with the Securities and Exchange Commission.

In a series of interviews, Ms. Haugen, who left the company in May after nearly two years, said that she had come into the job with high hopes of helping Facebook fix its weaknesses. She soon grew skeptical that her team could make an impact, she said. Her team had few resources, she said, and she felt the company put growth and user engagement ahead of what it knew through its own research about its platforms’ ill effects.

Toward the end of her time at Facebook, Ms. Haugen said, she came to believe that people outside the company—including lawmakers and regulators—should know what she had discovered.

“If people just hate Facebook more because of what I’ve done, then I’ve failed,” she said. “I believe in truth and reconciliation—we need to admit reality. The first step of that is documentation.”

The Journal podcast: The Facebook Files, Part 6: The Whistleblower

In a written statement, Facebook spokesman Andy Stone said, “Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.”

Ms. Haugen, 37 years old, resigned from Facebook in April. She stayed on another month to hand off some projects. She also sifted through the company’s internal social network, called Facebook Workplace, for instances where she believed the company had failed to be responsible about users’ welfare.

She said she was surprised by what she found. The Journal’s series, based in part on the documents she gathered as well as interviews with current and former employees, describes how the company’s rules favor elites; how its algorithms foster discord; and how drug cartels and human traffickers use its services openly. An article about Instagram’s effects on teenage girls’ mental health was the impetus for a Senate subcommittee hearing last week in which lawmakers described the disclosures as a “bombshell.”

Ms. Haugen kept expecting to be caught, she said, as she reviewed thousands of documents over several weeks. Facebook logs employees’ activities on Workplace, and she was exploring parts of its network that, while open, weren’t related to her job.

She said that she began thinking about leaving messages for Facebook’s internal security team for when they inevitably reviewed her search activity. She liked most of her colleagues, she said, and knew some would feel betrayed. She knew the company would as well, but she thought the stakes were high enough that she needed to speak out, she said.

On May 17, shortly before 7 p.m., she logged on for the last time and typed her final message into Workplace’s search bar to try to explain her motives.

“I don’t hate Facebook,” she wrote. “I love Facebook. I want to save it.”

Facebook’s headquarters in Menlo Park, Calif.

PHOTO: IAN BATES FOR THE WALL STREET JOURNAL

Rule Follower

Ms. Haugen was born and raised in Iowa, the daughter of a doctor father and a mother who left behind an academic career to become an Episcopal priest. She said that she prides herself on being a rule-follower. For the last four Burning Man celebrations, the annual desert festival popular with the Bay Area tech and art scene, she served as a ranger, mediating disputes and enforcing the community’s safety-focused code.

Ms. Haugen previously worked at Alphabet Inc.’s Google, Pinterest Inc. and other social networks, specializing in designing algorithms and other tools that determine what content gets served to users. Google paid for her to attend Harvard and get her master’s in business administration. She returned to the company in 2011 only to be confronted with an autoimmune disorder.

“I came back from business school, and I immediately started decaying,” she said. Doctors were initially baffled. By the time she was diagnosed with celiac disease, she had sustained lasting damage to nerves in her hands and feet, leaving her in pain. She went from riding a bicycle as much as 100 miles a day to struggling to move around.

Ms. Haugen resigned from Google at the beginning of 2014. Two months later, a blood clot in her thigh landed her in the intensive care unit.

A family acquaintance hired to assist her with errands became her main companion during a year she spent largely homebound. The young man bought groceries, took her to doctors’ appointments, and helped her regain the capacity to walk.

“It was a really important friendship, and then I lost him,” she said.

The friend, who had once held liberal political views, was spending increasing amounts of time reading online forums about how dark forces were manipulating politics. In an interview, the man recalled Ms. Haugen as having unsuccessfully tried to intervene as he gravitated toward a mix of the occult and white nationalism. He severed their friendship and left San Francisco before later abandoning such beliefs, he said.

Ms. Haugen’s health improved, and she went back to work. But the loss of her friendship changed the way she thought about social media, she said.

“It’s one thing to study misinformation, it’s another to lose someone to it,” she said. “A lot of people who work on these products only see the positive side of things.”

Recruited

When a Facebook recruiter got in touch at the end of 2018, Ms. Haugen said, she replied that she might be interested if the job touched on democracy and the spread of false information. During interviews, she said, she told managers about her friend and how she wanted to help Facebook prevent its own users from going down similar paths.

She started in June 2019, part of the roughly 200-person Civic Integrity team, which focused on issues around elections world-wide. While it was a small piece of Facebook’s overall policing efforts, the team became a central player in investigating how the platform could spread political falsehoods, stoke violence and be abused by malicious governments.

Ms. Haugen was initially asked to build tools to study the potentially malicious targeting of information at specific communities. Her team, comprising her and four other new hires, was given three months to build a system to detect the practice, a schedule she considered implausible. She didn’t succeed, and received a poor initial review, she said. She recalled a senior manager telling her that people at Facebook accomplish what needs to be done with far fewer resources than anyone would think possible.

Around her, she saw small bands of employees confronting large problems. The core team responsible for detecting and combating human exploitation—which included slavery, forced prostitution and organ selling—included just a few investigators, she said.

“I would ask why more people weren’t being hired,” she said. “Facebook acted like it was powerless to staff these teams.”

Mr. Stone of Facebook said, “We’ve invested heavily in people and technology to keep our platform safe, and have made fighting misinformation and providing authoritative information a priority.”

Ms. Haugen said the company seemed unwilling to accept initiatives to improve safety if that would make it harder to attract and engage users, discouraging her and other employees.

“What did we do? We built a giant machine that optimizes for engagement, whether or not it is real,” read a presentation from the Connections Integrity team, an umbrella group tasked with “shaping a healthy public content ecosystem,” in the fall of 2019. The presentation described viral misinformation and societal violence as among the results.

Samidh Chakrabarti, left, and other Facebook employees at work on Oct. 17, 2018, ahead of a runoff election in Brazil.

PHOTO: DAVID PAUL MORRIS/BLOOMBERG NEWS

Ms. Haugen came to see herself and the Civic Integrity team as an understaffed cleanup crew.

She worried about the dangers that Facebook might pose in societies gaining access to the internet for the first time, she said, and saw Myanmar’s social media-fueled genocide as a template, not a fluke.

She talked about her concerns with her mother, the priest, who advised her that if she thought lives were on the line, she should do what she could to save those lives.

Facebook’s Mr. Stone said that the company’s goal was to provide a safe, positive experience for its billions of users. “Hosting hateful or harmful content is bad for our community, bad for advertisers, and ultimately, bad for our business,” he said.

On Dec. 2, 2020, the founder and chief of the team, Samidh Chakrabarti, called an all-hands teleconference meeting. From her San Francisco apartment, Ms. Haugen listened to him announce that Facebook was dissolving the team and shuffling its members into other parts of the company’s integrity division, the broader group tasked with improving the quality and trustworthiness of the platform’s content.

Mr. Chakrabarti praised what the team had accomplished “at the expense of our family, our friends and our health,” according to Ms. Haugen and another person at the talk. He announced he was taking a leave of absence to recharge, but urged his staff to fight on and to express themselves “constructively and respectfully” when they see Facebook at risk of putting short-term interests above the long-term needs of the community. Mr. Chakrabarti resigned in August. He didn’t respond to requests for comment.

That evening after the meeting, Ms. Haugen sent an encrypted text to a Journal reporter who had contacted her weeks earlier. Given her work on a team that focused in part on counterespionage, she was especially cautious and asked him to prove who he was.

The U.S. Capitol riot came weeks later, and she said she was dismayed when Facebook publicly played down its connection to the violence despite widespread internal concern that its platforms were enabling dangerous social movements.

Mr. Stone of Facebook called any implication that the company caused the riot absurd, noting the role of public figures in encouraging it. “We have a long track record of effective cooperation with law enforcement, including the agencies responsible for addressing threats of domestic terrorism,” he said.

In March, Ms. Haugen left the Bay Area to take up residence in Puerto Rico, expecting to continue working for Facebook remotely.

Open Forums

Ms. Haugen had expected there wouldn’t be much left on Facebook Workplace that wasn’t already either written about or hidden away. Workplace is a regular source of leaks, and for years the company has been tightening access to sensitive material.

To her surprise, she found that attorney-client-privileged documents were posted in open forums. So were presentations to Chief Executive Mark Zuckerberg, sometimes in draft form, with notes from top company executives included.


Virtually any of Facebook’s more than 60,000 employees could have accessed the same documents, she said.

To guide her review, Ms. Haugen said she traced the careers of colleagues she admired, tracking their experiments, research notes and proposed interventions. Often the work ended in frustrated “badge posts,” goodbye notes that included denunciations of Facebook’s failure to take responsibility for harms it caused, she said. The researchers’ career arcs became a framework for the material that would ultimately be provided to the SEC, members of Congress and the Journal.

The more she read, she said, the more she wondered if it was even possible to build automated recommendation systems safely, an unpleasant thought for someone whose career focused on designing them. “I have a lot of compassion for people spending their lives working on these things,” she said. “Imagine finding out your product is harming people—it’d make you unable to see and correct those errors.”

The move to Puerto Rico brought her stint at Facebook to a close sooner than she had planned. Ms. Haugen said Facebook’s human resources department told her it couldn’t accommodate anyone relocating to a U.S. territory. In mid-April, she agreed to resign the following month.

Ms. Haugen continued gathering material from inside Facebook through her last hour with access to the system. She reached out to lawyers at Whistleblower Aid, a Washington, D.C., nonprofit that represents people reporting corporate and government misbehavior.

In addition to her coming Senate testimony and her SEC whistleblower claim, she said she’s interested in cooperating with state attorneys general and European regulators. While some have called for Facebook to be broken up or stripped of content liability protections, she disagrees. Neither approach would address the problems uncovered in the documents, she said—that despite numerous initiatives, Facebook didn’t address or make public what it knew about its platforms’ ill effects.

Mr. Stone of Facebook said, “We have a strong track record of using our research—as well as external research and close collaboration with experts and organizations—to inform changes to our apps.”

In Ms. Haugen’s view, allowing outsiders to see the company’s research and operations is essential. She also argues for a radical simplification of Facebook’s systems and for limits on promoting content based on levels of engagement, a core feature of Facebook’s recommendation systems. The company’s own research has found that “misinformation, toxicity, and violent content are inordinately prevalent” in material reshared by users and promoted by the company’s own mechanics.

“As long as your goal is creating more engagement, optimizing for likes, reshares and comments, you’re going to continue prioritizing polarizing, hateful content,” she said.

Beyond that, she has some business ideas she’d like to pursue—and she would like to think about something other than Facebook.

“I’ve done a really good job figuring out how to be happy,” she said. “Talking about things that make you sad all the time is not the way to make yourself happy.”