{"id":12561,"date":"2021-09-30T05:04:13","date_gmt":"2021-09-30T12:04:13","guid":{"rendered":"https:\/\/worldcampaign.net\/?p=12561"},"modified":"2021-10-07T02:00:32","modified_gmt":"2021-10-07T09:00:32","slug":"message-of-the-day-120","status":"publish","type":"post","link":"https:\/\/worldcampaign.net\/?p=12561","title":{"rendered":"Message of the Day: Human Rights, Economic Opportunity, Personal Growth"},"content":{"rendered":"<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-12576\" src=\"https:\/\/worldcampaign.net\/wp-content\/uploads\/2021\/09\/image-15-300x169.jpeg\" alt=\"\" width=\"300\" height=\"169\" srcset=\"https:\/\/worldcampaign.net\/wp-content\/uploads\/2021\/09\/image-15-300x169.jpeg 300w, https:\/\/worldcampaign.net\/wp-content\/uploads\/2021\/09\/image-15-150x84.jpeg 150w, https:\/\/worldcampaign.net\/wp-content\/uploads\/2021\/09\/image-15.jpeg 640w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-12574\" src=\"https:\/\/worldcampaign.net\/wp-content\/uploads\/2021\/09\/image-13-300x169.jpeg\" alt=\"\" width=\"300\" height=\"169\" srcset=\"https:\/\/worldcampaign.net\/wp-content\/uploads\/2021\/09\/image-13-300x169.jpeg 300w, https:\/\/worldcampaign.net\/wp-content\/uploads\/2021\/09\/image-13-150x84.jpeg 150w, https:\/\/worldcampaign.net\/wp-content\/uploads\/2021\/09\/image-13.jpeg 640w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/p>\n<p><span style=\"font-size: 8pt;\"><em>Whistleblower: Facebook Prioritizing Growth Over Safety<\/em>, 60 Minutes, 10.3.21<\/span><\/p>\n<p>&nbsp;<\/p>\n<p>We started posting articles every day for this post, changing them by the day as the story unfolded. The Wall Street Journal investigative series, <em>Facebook Files<\/em>, started it. 
Today, Sunday, October 3, 60 Minutes on CBS brought it to an apex, for now, with the revelation of the identity and appearance of the whistleblower behind the Journal series.<\/p>\n<p>Without further comment, the transcript and links to the 60 Minutes program, followed by the Wall Street Journal piece today on the whistleblower, follow:<\/p>\n<p><a href=\"https:\/\/www.cbsnews.com\/news\/facebook-whistleblower-frances-haugen-misinformation-public-60-minutes-2021-10-03\/\">&#8220;Whistleblower: Facebook is misleading the public on progress against hate speech, violence, misinformation&#8221;<\/a><\/p>\n<p>October 3, 2021 (updated October 4)<\/p>\n<p class=\"dek\"><em>Frances Haugen says in her time with Facebook she saw, &#8220;conflicts of interest between what was good for the public and what was good for Facebook.&#8221; Scott Pelley reports.<\/em><\/p>\n<p>Her name is Frances Haugen. That is a fact that Facebook has been anxious to know since last month when an anonymous former employee filed complaints with federal law enforcement. The complaints say Facebook&#8217;s own research shows that it amplifies hate, misinformation and political unrest\u2014but the company hides what it knows. One complaint alleges that Facebook&#8217;s Instagram harms teenage girls. What makes Haugen&#8217;s complaints unprecedented is the trove of private Facebook research she took when she quit in May. The documents appeared first, last month, in the Wall Street Journal. 
But tonight, Frances Haugen is revealing her identity to explain why she became the Facebook whistleblower.<\/p>\n<ul>\n<li><span class=\"link\"><a href=\"https:\/\/www.cbsnews.com\/news\/facebook-statement-60-minutes-whistleblower-2021-10-03\/\" target=\"_blank\" data-invalid-url-rewritten-http=\"\">Facebook&#8217;s response to 60 Minutes&#8217; report, &#8220;The Facebook Whistleblower&#8221;<\/a><\/span><\/li>\n<li><span class=\"link\"><a href=\"https:\/\/www.cbsnews.com\/news\/facebook-whistleblower-polarizing-divisive-content-60-minutes-2021-10-03\/\" target=\"_blank\" data-invalid-url-rewritten-http=\"\">Facebook whistleblower says company incentivizes &#8220;angry, polarizing, divisive content&#8221;<\/a><\/span><\/li>\n<\/ul>\n<p>Frances Haugen: The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money.<\/p>\n<p>Frances Haugen is 37, a data scientist from Iowa with a degree in computer engineering and a Harvard master&#8217;s degree in business. For 15 years she&#8217;s worked for companies including Google and Pinterest.<\/p>\n<p>Frances Haugen: I&#8217;ve seen a bunch of social networks and it was substantially worse at Facebook than anything I&#8217;d seen before.<\/p>\n<p>Scott Pelley: You know, someone else might have just quit and moved on. And I wonder why you take this stand.<\/p>\n<p>Frances Haugen: Imagine you know what&#8217;s going on inside of Facebook and you know no one on the outside knows. 
I knew what my future looked like if I continued to stay inside of Facebook, which is person after person after person has tackled this inside of Facebook and ground themselves to the ground.<\/p>\n<p>Scott Pelley: When and how did it occur to you to take all of these documents out of the company?<\/p>\n<p>Frances Haugen: At some point in 2021, I realized, &#8220;Okay, I&#8217;m gonna have to do this in a systemic way, and I have to get out enough that no one can question that this is real.&#8221;<\/p>\n<figure class=\"embed embed--type-image is-image embed--float-none embed--size-medium\" data-ads=\"{&quot;extraWordCount&quot;:50}\"><span class=\"img embed__content\"><img loading=\"lazy\" decoding=\"async\" class=\" lazyloaded\" src=\"https:\/\/cbsnews2.cbsistatic.com\/hub\/i\/r\/2021\/10\/03\/af641300-196c-4a6b-8fe5-a8f3d393f7c6\/thumbnail\/620x349\/d471cc2f64a8b1542cd509d3fb5f1ac2\/facebookvideo.jpg#\" srcset=\"https:\/\/cbsnews2.cbsistatic.com\/hub\/i\/r\/2021\/10\/03\/af641300-196c-4a6b-8fe5-a8f3d393f7c6\/thumbnail\/620x349\/d471cc2f64a8b1542cd509d3fb5f1ac2\/facebookvideo.jpg 1x, https:\/\/cbsnews3.cbsistatic.com\/hub\/i\/r\/2021\/10\/03\/af641300-196c-4a6b-8fe5-a8f3d393f7c6\/thumbnail\/1240x698\/234fb8a94ff20cf3e759173571f2c27e\/facebookvideo.jpg 2x\" alt=\"facebookvideo.jpg \" width=\"620\" height=\"349\" data-srcset=\"https:\/\/cbsnews2.cbsistatic.com\/hub\/i\/r\/2021\/10\/03\/af641300-196c-4a6b-8fe5-a8f3d393f7c6\/thumbnail\/620x349\/d471cc2f64a8b1542cd509d3fb5f1ac2\/facebookvideo.jpg 1x, https:\/\/cbsnews3.cbsistatic.com\/hub\/i\/r\/2021\/10\/03\/af641300-196c-4a6b-8fe5-a8f3d393f7c6\/thumbnail\/1240x698\/234fb8a94ff20cf3e759173571f2c27e\/facebookvideo.jpg 2x\" \/><\/span><figcaption class=\"embed__caption-container\"><span class=\"embed__caption\">\u00a0 Frances Haugen<\/span><\/figcaption><\/figure>\n<p>She secretly copied tens of thousands of pages of Facebook internal research. 
She says evidence shows that the company is lying to the public about making significant progress against hate, violence and misinformation. One study she found, from this year, says, &#8220;we estimate that we may action as little as 3-5% of hate and about 6-tenths of 1% of V &amp; I [violence and incitement] on Facebook despite being the best in the world at it.&#8221;<\/p>\n<p>Scott Pelley: To quote from another one of the documents you brought out, &#8220;We have evidence from a variety of sources that hate speech, divisive political speech and misinformation on Facebook and the family of apps are affecting societies around the world.&#8221;<\/p>\n<p>Frances Haugen: When we live in an information environment that is full of angry, hateful, polarizing content it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other, the version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.<\/p>\n<p>&#8216;Ethnic violence&#8217; including Myanmar in 2018 when the military used Facebook to launch a genocide.<\/p>\n<p>Frances Haugen told us she was recruited by Facebook in 2019. She says she agreed to take the job only if she could work against misinformation because she had lost a friend to online conspiracy theories.<\/p>\n<p>Frances Haugen: I never wanted anyone to feel the pain that I had felt. And I had seen how high the stakes were in terms of making sure there was high quality information on Facebook.<\/p>\n<p>At headquarters, she was assigned to Civic Integrity which worked on risks to elections including misinformation. But after this past election, there was a turning point.<\/p>\n<p>Frances Haugen: They told us, &#8220;We&#8217;re dissolving Civic Integrity.&#8221; Like, they basically said, &#8220;Oh good, we made it through the election. There wasn&#8217;t riots. 
We can get rid of Civic Integrity now.&#8221; Fast forward a couple months, we got the insurrection. And when they got rid of Civic Integrity, it was the moment where I was like, &#8220;I don&#8217;t trust that they&#8217;re willing to actually invest what needs to be invested to keep Facebook from being dangerous.&#8221;<\/p>\n<p>Facebook says the work of Civic Integrity was distributed to other units. Haugen told us the root of Facebook&#8217;s problem is in a change that it made in 2018 to its algorithms\u2014the programming that decides what you see on your Facebook news feed.<\/p>\n<p>Frances Haugen: So, you know, you have your phone. You might see only 100 pieces of content if you sit and scroll on for, you know, five minutes. But Facebook has thousands of options it could show you.<\/p>\n<p>The algorithm picks from those options based on the kind of content you&#8217;ve engaged with the most in the past.<\/p>\n<p>Frances Haugen: And one of the consequences of how Facebook is picking out that content today is it is &#8212; optimizing for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarizing, it&#8217;s easier to inspire people to anger than it is to other emotions.<\/p>\n<p>Scott Pelley: Misinformation, angry content&#8211; is enticing to people and keep&#8211;<\/p>\n<p>Frances Haugen: Very enticing.<\/p>\n<p>Scott Pelley:&#8211;keeps them on the platform.<\/p>\n<p>Frances Haugen: Yes. Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they&#8217;ll click on less ads, they&#8217;ll make less money.<\/p>\n<p>Haugen says Facebook understood the danger to the 2020 Election. 
So, it turned on safety systems to reduce misinformation\u2014but many of those changes, she says, were temporary.<\/p>\n<p>Frances Haugen: And as soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety.<\/p>\n<p>And that really feels like a betrayal of democracy to me.<\/p>\n<p>Facebook says some of the safety systems remained. But, after the election, Facebook was used by some to organize the January 6th insurrection. Prosecutors cite Facebook posts as evidence<em>&#8212;<\/em>photos of armed partisans and text including, &#8220;by bullet or ballot restoration of the republic is coming!&#8221; Extremists used many platforms, but Facebook is a recurring theme.<\/p>\n<p>After the attack, Facebook employees raged on an internal message board copied by Haugen. &#8220;\u2026Haven&#8217;t we had enough time to figure out how to manage discourse without enabling violence?&#8221; We looked for positive comments and found this, &#8220;I don&#8217;t think our leadership team ignores data, ignores dissent, ignores truth\u2026&#8221; but that drew this reply, &#8220;welcome to Facebook! I see you just joined in November 2020\u2026 we have been watching\u2026 wishy-washy actions of company leadership for years now.&#8221; &#8220;\u2026Colleagues\u2026 cannot conscience working for a company that does not do more to mitigate the negative effects of its platform.&#8221;<\/p>\n<p>Scott Pelley: Facebook essentially amplifies the worst of human nature.<\/p>\n<p>Frances Haugen: It&#8217;s one of these unfortunate consequences, right? No one at Facebook is malevolent, but the incentives are misaligned, right? Like, Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction. 
And the more anger that they get exposed to, the more they interact and the more they consume.<\/p>\n<p>That dynamic led to a complaint to Facebook by major political parties across Europe. This 2019 internal report obtained by Haugen says that the parties, &#8220;\u2026feel strongly that the change to the algorithm has forced them to skew negative in their communications on Facebook\u2026 leading them into more extreme policy positions.&#8221;<\/p>\n<p>Scott Pelley: The European political parties were essentially saying to Facebook the way you&#8217;ve written your algorithm is changing the way we lead our countries.<\/p>\n<p>Frances Haugen: Yes. You are forcing us to take positions that we don&#8217;t like, that we know are bad for society. We know if we don&#8217;t take those positions, we won&#8217;t win in the marketplace of social media.<\/p>\n<p>Evidence of harm, she says, extends to Facebook&#8217;s Instagram app.<\/p>\n<p>Scott Pelley: One of the Facebook internal studies that you found talks about how Instagram harms teenage girls. One study says 13.5% of teen girls say Instagram makes thoughts of suicide worse; 17% of teen girls say Instagram makes eating disorders worse.<\/p>\n<p>Frances Haugen: And what&#8217;s super tragic is Facebook&#8217;s own research says, as these young women begin to consume this&#8211; this eating disorder content, they get more and more depressed. And it actually makes them use the app more. And so, they end up in this feedback cycle where they hate their bodies more and more. 
Facebook&#8217;s own research says it is not just that Instagram is dangerous for teenagers, that it harms teenagers, it&#8217;s that it is distinctly worse than other forms of social media.<\/p>\n<p>Facebook said, just last week, it would postpone plans to create an Instagram for younger children.<\/p>\n<p>Last month, Haugen&#8217;s lawyers filed at least 8 complaints with the Securities and Exchange Commission which enforces the law in financial markets. The complaints compare the internal research with the company&#8217;s public face\u2014often that of CEO Mark Zuckerberg\u2014who testified remotely to Congress last March.<\/p>\n<p><em>Mark Zuckerberg testimony on March 25:<\/em><\/p>\n<p><em>We have removed content that could lead to imminent real-world harm. We have built an unprecedented third-party fact checking program. The system isn&#8217;t perfect. But it is the best approach that we have found to address misinformation in line with our country&#8217;s values.<\/em><\/p>\n<p>One of Frances Haugen&#8217;s lawyers is John Tye. He&#8217;s the founder of a Washington legal group called &#8220;Whistleblower Aid.&#8221;<\/p>\n<p>Scott Pelley: What is the legal theory behind going to the SEC? What laws are you alleging have been broken?<\/p>\n<p>John Tye: As a publicly-traded company, Facebook is required to not lie to its investors or even withhold material information. So, the SEC regularly brings enforcement actions, alleging that companies like Facebook and others are making material misstatements and omissions that affect investors adversely.<\/p>\n<p>Scott Pelley: One of the things that Facebook might allege is that she stole company documents.<\/p>\n<p>John Tye: The Dodd-Frank Act, passed over ten years ago at this point, created an Office of the Whistleblower inside the SEC. 
And one of the provisions of that law says that no company can prohibit its employees from communicating with the SEC and sharing internal corporate documents with the SEC.<\/p>\n<p>Frances Haugen: I have a lot of empathy for Mark, and Mark has never set out to make a hateful platform. But he has allowed choices to be made where the side effects of those choices are that hateful, polarizing content gets more distribution and more reach.<\/p>\n<p>Facebook declined an interview. But in a written statement to 60 Minutes it said, &#8220;every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.&#8221;<\/p>\n<p>&#8220;If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago.&#8221;<\/p>\n<p>Facebook is a $1 trillion company. Just 17 years old, it has 2.8 billion users, which is 60% of all internet-connected people on Earth. Frances Haugen plans to testify before Congress this week. She believes the federal government should impose regulations.<\/p>\n<p>Frances Haugen: Facebook has demonstrated they cannot act independently. Facebook, over and over again, has shown it chooses profit over safety. It is subsidizing, it is paying for its profits with our safety. I&#8217;m hoping that this will have had a big enough impact on the world that they get the fortitude and the motivation to actually go put those regulations into place. That&#8217;s my hope.<\/p>\n<p><em>Produced by Maria Gavrilovic and Alex Ortiz. Broadcast associate, Michelle Karim. Edited by Michael Mongulla.<\/em><\/p>\n<p style=\"text-align: center;\"><strong>. . 
.<\/strong><\/p>\n<p style=\"text-align: left;\"><a href=\"https:\/\/www.wsj.com\/articles\/facebook-whistleblower-frances-haugen-says-she-wants-to-fix-the-company-not-harm-it-11633304122\">&#8220;The Facebook Whistleblower, Frances Haugen, Says She Wants to Fix the Company, Not Harm It&#8221;<\/a><\/p>\n<p style=\"text-align: left;\">By Jeff Horwitz, Oct. 3, 2021<\/p>\n<p style=\"text-align: left;\"><em>The former Facebook employee says her goal is to help prompt change at the social-media giant<\/em><\/p>\n<p>The former <a href=\"https:\/\/www.wsj.com\/market-data\/quotes\/FB\">Facebook<\/a><span class=\"company-name-type\"> Inc.<\/span> employee who gathered documents that formed the foundation of <a class=\"icon none\" href=\"https:\/\/www.wsj.com\/articles\/the-facebook-files-11631713039?mod=article_inline\" target=\"_blank\">The Wall Street Journal\u2019s Facebook Files series<\/a> said she acted to help prompt change at the social-media giant, not to stir anger toward it.<\/p>\n<p>Frances Haugen, a former product manager hired to help protect against election interference on Facebook, said she had grown frustrated by what she saw as the company\u2019s lack of openness about its platforms\u2019 potential for harm and unwillingness to address its flaws. She is scheduled to testify before Congress on Tuesday. She has also sought federal whistleblower protection with the Securities and Exchange Commission.<\/p>\n<div class=\"paywall\">\n<p>In a series of interviews, Ms. Haugen, who left the company in May after nearly two years, said that she had come into the job with high hopes of helping Facebook fix its weaknesses. She soon grew skeptical that her team could make an impact, she said. Her team had few resources, she said, and she felt the company put growth and user engagement ahead of what it knew through its own research about its platforms\u2019 ill effects.<\/p>\n<p>Toward the end of her time at Facebook, Ms. 
Haugen said, she came to believe that people outside the company\u2014including lawmakers and regulators\u2014should know what she had discovered.<\/p>\n<p>\u201cIf people just hate Facebook more because of what I\u2019ve done, then I\u2019ve failed,\u201d she said. \u201cI believe in truth and reconciliation\u2014we need to admit reality. The first step of that is documentation.\u201d<\/p>\n<div class=\"media-object type-InsetPodcast wrap scope-web,mobileapps scope-web,mobileapps article__inset article__inset--type-InsetPodcast article__inset--wrap\" data-layout=\"wrap \" data-layout-mobile=\"\">\n<div class=\"media-object-podcast\">\n<div id=\"\" class=\"audioplayer\" data-audio=\"B311B3D8-B50A-425F-9EB7-12A9C4278ACD\" data-theme=\"wsj-article\" data-show-header=\"true\" data-show-subscribe=\"true\">\n<div id=\"audio-tag-inner-audio-B311B3D8-B50A-425F-9EB7-12A9C4278ACD\" class=\"audio-player-inner audio-size-sm audio-size-vsm\">\n<div id=\"audio-podcastName-audio-B311B3D8-B50A-425F-9EB7-12A9C4278ACD\" class=\"audio-podcastName\">\n<div class=\"audio-podcastName-icon\"><\/div>\n<div class=\"audio-podcastName-text\">THE JOURNAL.<\/div>\n<\/div>\n<p><a href=\"https:\/\/www.wsj.com\/podcasts\/the-journal\"><img decoding=\"async\" class=\"audio-img\" src=\"https:\/\/images.wsj.net\/im-83637?height=60\" alt=\"The Journal\" \/><\/a><a class=\"audio-title\" href=\"https:\/\/www.wsj.com\/podcasts\/the-journal\/the-facebook-files-part-6-the-whistleblower\/B311B3D8-B50A-425F-9EB7-12A9C4278ACD\">The Facebook Files, Part 6: The Whistleblower<\/a><\/p>\n<div class=\"controls-container\"><\/div>\n<p>In a written statement, Facebook spokesman Andy Stone said, \u201cEvery day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. 
To suggest we encourage bad content and do nothing is just not true.\u201d<\/p>\n<p>Ms. Haugen, 37 years old, resigned from Facebook in April. She stayed on another month to hand off some projects. She also sifted through the company\u2019s internal social network, called Facebook Workplace, for instances where she believed the company had failed to be responsible about users\u2019 welfare.<\/p>\n<p>She said she was surprised by what she found. The Journal\u2019s series, based in part on the documents she gathered as well as interviews with current and former employees, describes how <a class=\"icon none\" href=\"https:\/\/www.wsj.com\/articles\/facebook-files-xcheck-zuckerberg-elite-rules-11631541353?mod=article_inline\" target=\"_blank\">the company\u2019s rules favor elites<\/a>; how <a class=\"icon none\" href=\"https:\/\/www.wsj.com\/articles\/facebook-algorithm-change-zuckerberg-11631654215?mod=article_inline\" target=\"_blank\">its algorithms foster discord<\/a>; and how <a class=\"icon none\" href=\"https:\/\/www.wsj.com\/articles\/facebook-drug-cartels-human-traffickers-response-is-weak-documents-11631812953?mod=article_inline\" target=\"_blank\">drug cartels and human traffickers use its services openly<\/a>. An article about <a class=\"icon none\" href=\"https:\/\/www.wsj.com\/articles\/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739?mod=article_inline\" target=\"_blank\">Instagram\u2019s effects on teenage girls\u2019 mental health<\/a> was the impetus for a <a class=\"icon none\" href=\"https:\/\/www.wsj.com\/livecoverage\/facebook-hearing-live-updates\" target=\"_blank\">Senate subcommittee hearing<\/a> last week in which lawmakers described the disclosures as a \u201cbombshell.\u201d<\/p>\n<p>Ms. Haugen kept expecting to be caught, she said, as she reviewed thousands of documents over several weeks. 
Facebook logs employees\u2019 activities on Workplace, and she was exploring parts of its network that, while open, weren\u2019t related to her job.<\/p>\n<p>She said that she began thinking about leaving messages for Facebook\u2019s internal security team for when they inevitably reviewed her search activity. She liked most of her colleagues, she said, and knew some would feel betrayed. She knew the company would as well, but she thought the stakes were high enough that she needed to speak out, she said.<\/p>\n<p>On May 17, shortly before 7 p.m., she logged on for the last time and typed her final message into Workplace\u2019s search bar to try to explain her motives.<\/p>\n<p>\u201cI don\u2019t hate Facebook,\u201d she wrote. \u201cI love Facebook. I want to save it.\u201d<\/p>\n<div class=\"media-object type-InsetMediaIllustration bleed scope-web|mobileapps article__inset article__inset--type-InsetMediaIllustration article__inset--bleed\" data-layout=\"bleed \" data-layout-mobile=\"\">\n<figure class=\" media-object-image enlarge-image img-bleed article__inset__image \">\n<div class=\"image-container responsive-media article__inset__image__image\" data-subtype=\"photo\"><img decoding=\"async\" title=\"\" src=\"https:\/\/images.wsj.net\/im-410501?width=1260&amp;height=840\" sizes=\"(max-width: 639px) 100vw, (max-width: 979px) 100vw, (max-width: 1299px) 960px, 1260px\" srcset=\"https:\/\/images.wsj.net\/im-410501?width=639&amp;size=1.5 639w, https:\/\/images.wsj.net\/im-410501?width=939&amp;size=1.5 939w, https:\/\/images.wsj.net\/im-410501?width=960&amp;size=1.5 960w, https:\/\/images.wsj.net\/im-410501?width=1260&amp;size=1.5 1260w, https:\/\/images.wsj.net\/im-410501?width=1260&amp;size=1.5&amp;pixel_ratio=1.5 1890w, https:\/\/images.wsj.net\/im-410501?width=1260&amp;size=1.5&amp;pixel_ratio=2 2520w, https:\/\/images.wsj.net\/im-410501?width=1260&amp;size=1.5&amp;pixel_ratio=3 3780w\" alt=\"\" \/><\/div><figcaption class=\"wsj-article-caption 
article__inset__image__caption\">\n<h4 class=\"wsj-article-caption-content\">Facebook\u2019s headquarters in Menlo Park, Calif.<\/h4>\n<p><span class=\"wsj-article-credit article__inset__image__caption__credit\"><span class=\"wsj-article-credit-tag\">PHOTO: <\/span>IAN BATES FOR THE WALL STREET JOURNAL<\/span><\/p>\n<\/figcaption><\/figure>\n<\/div>\n<p><strong>Rule Follower<\/strong><\/p>\n<p>Ms. Haugen was born and raised in Iowa, the daughter of a doctor father and a mother who left behind an academic career to become an Episcopal priest. She said that she prides herself on being a rule-follower. For the last four Burning Man celebrations, the annual desert festival popular with the Bay Area tech and art scene, she served as a ranger, mediating disputes and enforcing the community\u2019s safety-focused code.<\/p>\n<p>Ms. Haugen previously worked at <a href=\"https:\/\/www.wsj.com\/market-data\/quotes\/GOOG\">Alphabet<\/a><span class=\"company-name-type\"> Inc.\u2019s<\/span> Google, <a href=\"https:\/\/www.wsj.com\/market-data\/quotes\/PINS\">Pinterest<\/a><span class=\"company-name-type\">Inc.<\/span> and other social networks, specializing in designing algorithms and other tools that determine what content gets served to users. Google paid for her to attend Harvard and get her master\u2019s in business administration. She returned to the company in 2011 only to be confronted with an autoimmune disorder.<\/p>\n<p>\u201cI came back from business school, and I immediately started decaying,\u201d she said. Doctors were initially baffled. By the time she was diagnosed with celiac disease, she had sustained lasting damage to nerves in her hands and feet, leaving her in pain. She went from riding a bicycle as much as 100 miles a day to struggling to move around.<\/p>\n<p>Ms. Haugen resigned from Google at the beginning of 2014. 
Two months later, a blood clot in her thigh landed her in the intensive care unit.<\/p>\n<p>A family acquaintance hired to assist her with errands became her main companion during a year she spent largely homebound. The young man bought groceries, took her to doctors\u2019 appointments, and helped her regain the capacity to walk.<\/p>\n<p>\u201cIt was a really important friendship, and then I lost him,\u201d she said.<\/p>\n<p>The friend, who had once held liberal political views, was spending increasing amounts of time reading online forums about how dark forces were manipulating politics. In an interview, the man recalled Ms. Haugen as having unsuccessfully tried to intervene as he gravitated toward a mix of the occult and white nationalism. He severed their friendship and left San Francisco before later abandoning such beliefs, he said.<\/p>\n<p>Ms. Haugen\u2019s health improved, and she went back to work. But the loss of her friendship changed the way she thought about social media, she said.<\/p>\n<p>\u201cIt\u2019s one thing to study misinformation, it\u2019s another to lose someone to it,\u201d she said. \u201cA lot of people who work on these products only see the positive side of things.\u201d<\/p>\n<p><strong>Recruited<\/strong><\/p>\n<p>When a Facebook recruiter got in touch at the end of 2018, Ms. Haugen said, she replied that she might be interested if the job touched on democracy and the spread of false information. During interviews, she said, she told managers about her friend and how she wanted to help Facebook prevent its own users from going down similar paths.<\/p>\n<p>She started in June 2019, part of the roughly 200-person Civic Integrity team, which focused on issues around elections world-wide. While it was a small piece of Facebook\u2019s overall policing efforts, the team became a central player in investigating how the platform could spread political falsehoods, stoke violence and be abused by malicious governments.<\/p>\n<p>Ms. 
Haugen was initially asked to build tools to study the potentially malicious targeting of information at specific communities. Her team, comprising her and four other new hires, was given three months to build a system to detect the practice, a schedule she considered implausible. She didn\u2019t succeed, and received a poor initial review, she said. She recalled a senior manager telling her that people at Facebook accomplish what needs to be done with far less resources than anyone would think possible.<\/p>\n<p>Around her, she saw small bands of employees confronting large problems. The core team responsible for detecting and combating human exploitation\u2014which included slavery, forced prostitution and organ selling\u2014included just a few investigators, she said.<\/p>\n<p>\u201cI would ask why more people weren\u2019t being hired,\u201d she said. \u201cFacebook acted like it was powerless to staff these teams.\u201d<\/p>\n<p>Mr. Stone of Facebook said, \u201cWe\u2019ve invested heavily in people and technology to keep our platform safe, and have made fighting misinformation and providing authoritative information a priority.\u201d<\/p>\n<p>Ms. Haugen said the company seemed unwilling to accept initiatives to improve safety if that would make it harder to attract and engage users, discouraging her and other employees.<\/p>\n<p>\u201cWhat did we do? We built a giant machine that optimizes for engagement, whether or not it is real,\u201d read a presentation from the Connections Integrity team, an umbrella group tasked with \u201cshaping a healthy public content ecosystem,\u201d in the fall of 2019. 
The presentation described viral misinformation and societal violence as among the results.<\/p>\n<div class=\"wsj-immersive-ad-container\"><\/div>\n<div class=\"media-object type-InsetMediaIllustration bleed scope-web|mobileapps article__inset article__inset--type-InsetMediaIllustration article__inset--bleed\" data-layout=\"bleed \" data-layout-mobile=\"\">\n<figure class=\" media-object-image enlarge-image img-bleed article__inset__image \">\n<div class=\"image-container responsive-media article__inset__image__image\" data-subtype=\"photo\"><img decoding=\"async\" title=\"\" src=\"https:\/\/images.wsj.net\/im-410596?width=1260&amp;height=840\" sizes=\"(max-width: 639px) 100vw, (max-width: 979px) 100vw, (max-width: 1299px) 960px, 1260px\" srcset=\"https:\/\/images.wsj.net\/im-410596?width=639&amp;size=1.5 639w, https:\/\/images.wsj.net\/im-410596?width=939&amp;size=1.5 939w, https:\/\/images.wsj.net\/im-410596?width=960&amp;size=1.5 960w, https:\/\/images.wsj.net\/im-410596?width=1260&amp;size=1.5 1260w, https:\/\/images.wsj.net\/im-410596?width=1260&amp;size=1.5&amp;pixel_ratio=1.5 1890w, https:\/\/images.wsj.net\/im-410596?width=1260&amp;size=1.5&amp;pixel_ratio=2 2520w, https:\/\/images.wsj.net\/im-410596?width=1260&amp;size=1.5&amp;pixel_ratio=3 3780w\" alt=\"\" \/><\/div><figcaption class=\"wsj-article-caption article__inset__image__caption\">\n<h4 class=\"wsj-article-caption-content\">Samidh Chakrabarti, left, and other Facebook employees at work on Oct. 17, 2018, ahead of a runoff election in Brazil.<\/h4>\n<p><span class=\"wsj-article-credit article__inset__image__caption__credit\"><span class=\"wsj-article-credit-tag\">PHOTO: <\/span>DAVID PAUL MORRIS\/BLOOMBERG NEWS<\/span><\/p>\n<\/figcaption><\/figure>\n<\/div>\n<p>Ms. 
Haugen came to see herself and the Civic Integrity team as an understaffed cleanup crew.<\/p>\n<p>She worried about the dangers that Facebook might pose in societies gaining access to the internet for the first time, she said, and saw Myanmar\u2019s <a class=\"icon none\" href=\"https:\/\/www.wsj.com\/articles\/banned-from-facebook-myanmars-top-general-finds-russian-refuge-1535631150?mod=article_inline\" target=\"_blank\">social media-fueled genocide<\/a> as a template, not a fluke.<\/p>\n<p>She talked about her concerns with her mother, the priest, who advised her that if she thought lives were on the line, she should do what she could to save those lives.<\/p>\n<p>Facebook\u2019s Mr. Stone said that the company\u2019s goal was to provide a safe, positive experience for its billions of users. \u201cHosting hateful or harmful content is bad for our community, bad for advertisers, and ultimately, bad for our business,\u201d he said.<\/p>\n<p>On Dec. 2, 2020, the founder and chief of the team, Samidh Chakrabarti, called an all-hands teleconference meeting. From her San Francisco apartment, Ms. Haugen listened to him announce that Facebook was dissolving the team and shuffling its members into other parts of the company\u2019s integrity division, the broader group tasked with improving the quality and trustworthiness of the platform\u2019s content.<\/p>\n<p>Mr. Chakrabarti praised what the team had accomplished \u201cat the expense of our family, our friends and our health,\u201d according to Ms. Haugen and another person at the talk. He announced he was taking a leave of absence to recharge, but urged his staff to fight on and to express themselves \u201cconstructively and respectfully\u201d when they saw Facebook at risk of putting short-term interests above the long-term needs of the community. Mr. Chakrabarti resigned in August. He didn\u2019t respond to requests for comment.<\/p>\n<p>That evening after the meeting, Ms. 
Haugen sent an encrypted text to a Journal reporter who had contacted her weeks earlier. Given her work on a team that focused in part on counterespionage, she was especially cautious and asked him to prove who he was.<\/p>\n<p>The U.S. Capitol riot came weeks later, and she said she was dismayed when Facebook publicly played down its connection to the violence despite widespread internal concern that its platforms were enabling dangerous social movements.<\/p>\n<p>Mr. Stone of Facebook called any implication that the company caused the riot absurd, noting the role of public figures in encouraging it. \u201cWe have a long track record of effective cooperation with law enforcement, including the agencies responsible for addressing threats of domestic terrorism,\u201d he said.<\/p>\n<p>In March, Ms. Haugen left the Bay Area to take up residence in Puerto Rico, expecting to continue working for Facebook remotely.<\/p>\n<p><strong>Open Forums<\/strong><\/p>\n<p>Ms. Haugen had expected there wouldn\u2019t be much left on Facebook Workplace that wasn\u2019t already either written about or hidden away. Workplace is a regular source of leaks, and for years the company has been tightening access to sensitive material.<\/p>\n<p>To her surprise, she found that attorney-client-privileged documents were posted in open forums. 
So were presentations to Chief Executive <a href=\"https:\/\/www.wsj.com\/topics\/person\/mark-zuckerberg\">Mark Zuckerberg<\/a> \u2014sometimes in draft form, with notes from top company executives included.<\/p>\n<div class=\"media-object type-InsetMediaIllustration inline scope-web|mobileapps article__inset article__inset--type-InsetMediaIllustration article__inset--inline\" data-layout=\"inline \" data-layout-mobile=\"\">\n<figure class=\" media-object-image enlarge-image img-inline article__inset__image \">\n<div class=\"image-container responsive-media article__inset__image__image\" data-subtype=\"photo\"><img decoding=\"async\" title=\"\" src=\"https:\/\/images.wsj.net\/im-410503?width=700&amp;height=1050\" sizes=\"(max-width: 639px) 100vw, (max-width: 979px) 620px, (max-width: 1299px) 700px, 700px\" srcset=\"https:\/\/images.wsj.net\/im-410503?width=620&amp;size=0.6666666666666666 620w, https:\/\/images.wsj.net\/im-410503?width=639&amp;size=0.6666666666666666 639w, https:\/\/images.wsj.net\/im-410503?width=700&amp;size=0.6666666666666666 700w, https:\/\/images.wsj.net\/im-410503?width=700&amp;size=0.6666666666666666&amp;pixel_ratio=1.5 1050w, https:\/\/images.wsj.net\/im-410503?width=700&amp;size=0.6666666666666666&amp;pixel_ratio=2 1400w, https:\/\/images.wsj.net\/im-410503?width=700&amp;size=0.6666666666666666&amp;pixel_ratio=3 2100w\" alt=\"\" \/><\/div><figcaption class=\"wsj-article-caption article__inset__image__caption\">\n<h4 class=\"wsj-article-caption-content\">In Ms. Haugen\u2019s view, allowing outsiders to see the company\u2019s research and operations is essential.<\/h4>\n<p><span class=\"wsj-article-credit article__inset__image__caption__credit\"><span class=\"wsj-article-credit-tag\">PHOTO: <\/span>STEPHEN VOSS FOR THE WALL STREET JOURNAL<\/span><\/p>\n<\/figcaption><\/figure>\n<\/div>\n<p>Virtually any of Facebook\u2019s more than 60,000 employees could have accessed the same documents, she said.<\/p>\n<p>To guide her review, Ms. 
Haugen said she traced the careers of colleagues she admired, tracking their experiments, research notes and proposed interventions. Often the work ended in frustrated \u201cbadge posts,\u201d goodbye notes that included denunciations of Facebook\u2019s failure to take responsibility for harms it caused, she said. The researchers\u2019 career arcs became a framework for the material that would ultimately be provided to the SEC, members of Congress and the Journal.<\/p>\n<p>The more she read, she said, the more she wondered if it was even possible to build automated recommendation systems safely, an unpleasant thought for someone whose career focused on designing them. \u201cI have a lot of compassion for people spending their lives working on these things,\u201d she said. \u201cImagine finding out your product is harming people\u2014it\u2019d make you unable to see and correct those errors.\u201d<\/p>\n<p>The move to Puerto Rico brought her stint at Facebook to a close sooner than she had planned. Ms. Haugen said Facebook\u2019s human resources department told her it couldn\u2019t accommodate anyone relocating to a U.S. territory. In mid-April, she agreed to resign the following month.<\/p>\n<p>Ms. Haugen continued gathering material from inside Facebook through her last hour with access to the system. She reached out to lawyers at Whistleblower Aid, a Washington, D.C., nonprofit that represents people reporting corporate and government misbehavior.<\/p>\n<p>In addition to her coming Senate testimony and her SEC whistleblower claim, she said she\u2019s interested in cooperating with state attorneys general and European regulators. While some have called for Facebook to be broken up or stripped of content liability protections, she disagrees. 
Neither approach would address the problems uncovered in the documents, she said\u2014that despite numerous initiatives, Facebook didn\u2019t address or make public what it knew about its platforms\u2019 ill effects.<\/p>\n<p>Mr. Stone of Facebook said, \u201cWe have a strong track record of using our research\u2014as well as external research and close collaboration with experts and organizations\u2014to inform changes to our apps.\u201d<\/p>\n<p>In Ms. Haugen\u2019s view, allowing outsiders to see the company\u2019s research and operations is essential. She also argues for a radical simplification of Facebook\u2019s systems and for limits on promoting content based on levels of engagement, a core feature of Facebook\u2019s recommendation systems. The company\u2019s own research has found that <a class=\"icon none\" href=\"https:\/\/www.wsj.com\/articles\/facebook-algorithm-change-zuckerberg-11631654215?mod=article_inline\" target=\"_blank\">\u201cmisinformation, toxicity, and violent content are inordinately prevalent\u201d<\/a> in material reshared by users and promoted by the company\u2019s own mechanics.<\/p>\n<p>\u201cAs long as your goal is creating more engagement, optimizing for likes, reshares and comments, you\u2019re going to continue prioritizing polarizing, hateful content,\u201d she said.<\/p>\n<p>Beyond that, she has some business ideas she\u2019d like to pursue\u2014and she would like to think about something other than Facebook.<\/p>\n<p>\u201cI\u2019ve done a really good job figuring out how to be happy,\u201d she said. \u201cTalking about things that make you sad all the time is not the way to make yourself happy.\u201d<\/p>\n<p class=\"articleTagLine\">\u2014Design by Andrew Levinson. 
A color filter has been used on some photos.<\/p>\n<p class=\"articleTagLine\"><strong><em>the facebook files<\/em><\/strong><\/p>\n<p class=\"articleTagLine\"><em>A series offering an unparalleled look inside the social-media giant\u2019s failings\u2014and its unwillingness or inability to address them.<\/em><\/p>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>Whistleblower: Facebook Prioritizing Growth Over Safety, 60 Minutes, 10.3.21 &nbsp; We started posting articles every day for this post, changing them by the day as the story unfolded. The Wall Street Journal investigative series, Facebook Files, started it. 
Today, Sunday, October 3, 60 Minutes on CBS brought it to an apex, for now, with the [&hellip;]<\/p>\n","protected":false},"author":1001004,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[54],"tags":[],"_links":{"self":[{"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/posts\/12561"}],"collection":[{"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/users\/1001004"}],"replies":[{"embeddable":true,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=12561"}],"version-history":[{"count":4,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/posts\/12561\/revisions"}],"predecessor-version":[{"id":12596,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/posts\/12561\/revisions\/12596"}],"wp:attachment":[{"href":"https:\/\/worldcampaign.net\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=12561"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=12561"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=12561"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}