Message of the Day: Human Rights, Economic Opportunity, War, Population, Disease, Hunger, Environment, Personal Growth

The Facebook Dilemma, Frontline, PBS, October 29-30, 2018


The End Of Civilization As We Knew It, Part Eleven.

“If we survive the process of the forces of global exploitation being defeated by the inexorable movement toward global equality, the tools that connect the world are making it a brave new world of addicts in denial while the architects and even the architecture of the new world technology itself can become the ultimate exploiters. The lines between our intelligence and artificial intelligence and who controls what are harder and harder to locate. Science fiction becomes more and more reality every day. This presents challenges to human rights and even the definition of being human in countless ways.”

–We Are One: A Reflection, World Campaign, Spring, 2016

We have referenced 2016 as the point in time from which we launched our reflections on the end of civilization as we knew it, as explained in Part One and since. We will continue to expand on that.

A brief comment on the last week before proceeding.

In the US, guns, again. As we noted before, the moment almost came at Parkland, in a process in which the revolt against this kind of weaponry will soon, in the eyes of history, overcome. But how much more blood first? Eleven murdered in a synagogue whose congregation, among other things, demonstrated care for refugees, uniting the anti-Semitic, anti-Muslim and anti-immigrant bigotry of the murderer. Bombs mailed to two former US presidents, to the last presidential candidate who won the popular vote but lost the election, and to a number of other prominent opponents of President Trump, attempted by a deranged supporter, another bigot. It would have been an unprecedented group assassination. Troops threatened to be sent to the Mexican border to stop a fake invasion by desperate people nowhere near the border. An unconstitutional threat to change the constitution by executive order. The Reagan-Gorbachev treaty vastly reducing nuclear weapons threatened to be torn up. The anchor of stability in Europe, Angela Merkel, beginning an exit, unimaginable at a moment of enormous instability. Hope in Ireland, which left the anachronism of blasphemy behind as it has led toward a progressive future in vote after vote in recent years. Continued lying from the Saudis and a Middle East in growing danger while more children die from war and hunger. A virtual fascist elected in Brazil, because of the usual dynamics of desperation and rage covered before, threatening human rights and the planet itself.

Only a partial list.

Back to our current reflection on the end of civilization as we knew it.

The above quote, from our reflection in Spring 2016, was at that point the most recent of many over the years in the same vein.

Sometimes we like being prescient. Sometimes we don’t.

For the current installment on the end of civilization as we knew it, the last before the mid-term elections in the US, which will be pivotal one way or another in US and world history, we turn again to what has been, in our view, the best ongoing television documentary journalism series for decades: Frontline on PBS.

Last night and tonight, Frontline is airing one of its rare two-part series, episodes 4 and 5 of season 37: The Facebook Dilemma: A Two-Night Special Event.

This is not just a review of the impact of social media on the 2016 election in the US, the 2018 election next week, elections and policies globally before and since, genocide, those with power and those without, and the global conflicts over power in their various forms.

That would be enough to take a good hard look at, right?

Well, this is all that and more.

As noted in previous reflections, we live in the digital age, which we’ve covered extensively.

This two-night blockbuster from Frontline is an expanded eye-opening look at the power of the digital age and how it has changed and is changing everything, for all humanity and all life on earth.

It's like taking a breath and suddenly realizing you're submerged in ice water as it fills your lungs. Not because the program, in and of itself an exquisitely crafted journalistic art form, creates this reaction, but because it's a wake-up call to reality.

Want a chance to get to the surface and live?

Watch the program.

The Facebook Dilemma: A Two-Night Special Event

Here’s the online introduction:

“The promise of Facebook was to create a more open and connected world. But from the company’s failure to protect millions of users’ data, to the proliferation of “fake news” and disinformation, mounting crises have raised the question: Is Facebook more harmful than helpful? On Monday, Oct. 29, and Tuesday, Oct. 30, 2018, FRONTLINE presents The Facebook Dilemma. This major, two-night event investigates a series of warnings to Facebook as the company grew from Mark Zuckerberg’s Harvard dorm room to a global empire. With dozens of original interviews and rare footage, The Facebook Dilemma examines the powerful social media platform’s impact on privacy and democracy in the U.S. and around the world.”

The remainder of our post consists of content, in print and video, related to the broadcasts.

Tonight, “The Facebook Dilemma” Begins: A Note From FRONTLINE’s Executive Producer

October 29, 2018, by Raney Aronson-Rath

I’m often asked, “How do you choose topics for FRONTLINE documentaries?”

In the case of our new, two-part documentary, The Facebook Dilemma, which begins tonight at a special time (9/8c; check local PBS listings), our investigation was sparked by the extraordinary impact Facebook has had here and abroad. We were particularly interested in a central question: what did Facebook know about the consequences of its mission to connect the world, and how did it address these issues?

As we dug deeper, producers James Jacoby and Anya Bourg found many people, from voices across the globe — in Egypt, Ukraine and the Philippines — to former company insiders, who had sounded alarms to the highest levels of the company, but who said their warnings had gone unheeded. And what started as a basic accountability question became an examination of Facebook’s core algorithm – the “secret sauce,” as one insider told us – and the role that technology companies like Facebook play in our democracy.

Our team on the film also includes Dana Priest. Dana, a Pulitzer Prize-winning investigative reporter at The Washington Post and a professor at the University of Maryland’s Philip Merrill College of Journalism, was drawn to this project because, in her mind, Facebook had built the largest intelligence-gathering operation in the world, and yet was operating with very little oversight.

In addition to our two-part film on what we found, which will air tonight, Oct. 29, and tomorrow, Oct. 30, we will also soon be adding new content to our Transparency Project, where you can explore our interviews in depth, and see more of the context to our reporting.

This investigation wouldn’t have been possible without the amazing team at FRONTLINE — and, of course, the support of viewers like you. Thank you for giving us the opportunity to ask tough questions, and follow the story, wherever it might lead.

Part one of The Facebook Dilemma premieres tonight at 9/8c on PBS (check your local listings) and online, and tune in for part two tomorrow at 10/9c.

“The Facebook Dilemma” Continues Tonight: A Note From Our Executive Producer

October 30, 2018, by Raney Aronson-Rath

As you saw in part one of The Facebook Dilemma last night, the social media platform has had historic and far-reaching impact both at home and abroad.

Our challenge, from the start, was how to distill such a sprawling and consequential story – and a year’s worth of deep investigative reporting – into a coherent, fair and well-told narrative.

What we came to see was that although Facebook has more than two billion users, on many levels the story was about leadership, and the decisions made by a man on a mission to make the world “more open and connected.” So while this two-hour special covers years and many topics, Mark Zuckerberg is central to it all.

Our team collected an extensive archive of Zuckerberg’s public comments, appearances, and interviews over the years. His rise from college student to leader of a more-than-$400-billion company has been uniquely public and richly documented, so we had extensive footage and his own words to draw on to tell this story.

Tonight, in part two of The Facebook Dilemma, you’ll see how Zuckerberg responded after the 2016 election, when Facebook came under scrutiny for its role in disseminating disinformation — including false stories propagated by Russian operatives seeking to exploit divisions in American society and influence the election.

You’ll see how Facebook is preparing for next week’s midterm elections, how it’s trying to tackle a range of issues including hate speech and the spread of misinformation, and the way Zuckerberg views his company’s role in our country and our world.

It’s a story that impacts every one of us – whether we have Facebook accounts or not.

We hope you’ll join us tonight for the gripping conclusion of The Facebook Dilemma. If you missed part one, stream it online now. And, watch part two starting tonight at 10/9c on PBS stations (check local listings) and online.

Russian Disinformation on Facebook Targeted Ukraine Well Before the 2016 U.S. Election

In partnership with The Washington Post

By Dana Priest, James Jacoby and Anya Bourg, October 28, 2018


KIEV, Ukraine — In the spring of 2015, Ukrainian President Petro Poroshenko was desperate for Mark Zuckerberg’s help. His government had been urging Facebook to stop the Kremlin’s spreading of misinformation on the social network to foment distrust in his new administration and to promote support of Russia’s invasion and occupation of parts of Ukraine.

To get Zuckerberg’s attention, the president posted a question for a town hall meeting at Facebook’s Silicon Valley headquarters. There, a moderator read it aloud.

“Mark, will you establish a Facebook office in Ukraine?” the moderator said, chuckling, according to a video of the assembly. The room of young employees rippled with laughter. But the government’s suggestion was serious: It believed that a Kiev office, staffed with people familiar with Ukraine’s political situation, could help solve Facebook’s high-level ignorance about Russian information warfare.

“You know, over time it’s something that we might consider,” the chief executive responded. “So thank you for — the Ukrainian president — for writing in. I don’t think we’ve gotten that one before.”

In the three years since then, officials here say the company has failed to address most of their concerns about Russian online interference that predated similar interference in the 2016 U.S. presidential election. The tactics identified by officials, such as coordinated activity to overwhelm Facebook’s system and the use of impostor accounts, are the same as in the 2016 contest — and continue to challenge Facebook ahead of next month’s midterm elections.

“I was explicitly saying that there are troll factories, that their posts and reposts promoted posts and news that are fake,” Dmytro Shymkiv, then deputy minister of the presidential administration, said he told Facebook executives in June 2015. “They are promoted on your platform. By very often fake accounts. Have a look.”

Facebook has launched major reforms to its platform and processes since the 2016 U.S. presidential election made the company — and American users of Facebook — aware of how Russian actors were abusing it to influence politics far beyond their borders. But Ukraine’s warnings two years earlier show how the social media giant has been blind to the misuse of Facebook, in particular in places where it is hugely popular but has no on-the-ground presence. There is still no Facebook office in Ukraine.

Facebook officials defend their response to Ukrainian officials. They said Shymkiv did not raise the issue of Russian misinformation and other tactics in the meeting but that he talked instead about the company’s standards for removing content. They also said what they were alerted to in Ukraine was not a preview of what happened in the United States during the 2016 election.

Activists, officials and journalists from countries including Ukraine, the Philippines and Myanmar who reported abuses say Facebook took little or no action, according to an investigation for the documentary The Facebook Dilemma, airing Monday and Tuesday on FRONTLINE PBS. It was not until after evidence emerged that fake accounts from Russia were used to influence the 2016 U.S. election that the company acted, some said. This article is based on reporting done for the film.

“That was the moment when suddenly I got a lot of calls and questions,” said Shymkiv, who left the government recently to return to private industry. “Because we were one of the first ones who actually told them that this is happening.”

In the past year, Facebook has begun to double the number of employees — to 20,000 — tasked with removing hateful speech and fake accounts.

Facebook contracts out the work of finding misinformation to small, local nonprofit organizations, while engineers build automated tools to tackle the problem on a large scale. Such misinformation is “downranked” — moved down in the news feed — unless it also violates other community standards such as being spam, hate speech or inciting violence, in which case it is removed.

Facebook has said Russia’s manipulation of political messages on its platform during the U.S. presidential election caught it by surprise. In Ukraine and elsewhere, Facebook had been seen as a force for good, bolstering democracy and enabling free speech. It played an outsized role in Ukraine’s 2014 Maidan Revolution, helping communities orchestrate the delivery of medical care and supplies to the revolutionaries and the sharing of tactics for resisting police and troops.

In interviews, company executives said they were slow to act on other evidence that Facebook was causing what they called “real-world harm.”

“Mark has said this, that we have been slow to really understand the ways in which Facebook might be used for bad things. We’ve been really focused on the good things,” said Naomi Gleit, one of Facebook’s longest-serving employees and now vice president for social good. “It’s possible that we could have done more sooner, and we haven’t been as fast as we needed to be, but we’re really focused on it now.”

A team set up to safeguard the upcoming U.S. midterm elections will be reviewing and removing inappropriate posts in real time. Facebook in August removed 652 fake accounts and pages with ties to Russia and Iran aimed at influencing political debates — and an additional 82 Iran-backed accounts on Friday. False narratives about the Central American migrant caravan and mailed pipe bombs were rampant on the network this week.

Complaints and harm done overseas, where 90 percent of Facebook’s 2.2 billion users live, were not company priorities, experts say, and may have led to missed signals before the 2016 U.S. election.

“Facebook’s tactic is to say, ‘Oh, we were blindsided,’ when in fact people had been warning them — pleading, begging — for years,” said Zeynep Tufekci, associate professor at the University of North Carolina at Chapel Hill, who began urging Facebook to remove false rumors during the 2011 Arab Spring revolutions. “The public record here is that they are a combination of unable and unwilling to grasp and deal with this complexity.”

Some former Facebook employees say that they were aware early on of Russian online interference in Ukraine, but either did not have a full picture of the interference or were unable to move the warnings high enough up the chain of command.

Alex Stamos, Facebook’s recently departed chief security officer, said the company had acted in Ukraine against Russia’s traditional cyber unit, the military intelligence agency GRU, which later stole emails from the Democratic National Committee. “We knew that they were active during the Ukraine crisis” in 2014, he said in an interview, referring to the pro-democratic Maidan Revolution and subsequent Russian invasion. “We had taken action against a number of their accounts and shut down their activity.”

But, he said, “we had not picked up on the kind of the completely independent disinformation actors” behind phony accounts circulating false news and posts, the sort of activity Shymkiv and other officials were flagging.

Elizabeth Linder, until 2016 Facebook’s government and policy specialist for Europe, the Middle East and Africa, based in London, said disinformation was “absolutely hugely worrisome to countries, especially in Eastern Europe” before the U.S. elections.

But “in a company that’s built off numbers and metrics and measurements, anecdotes sometimes got lost along the way,” she said. “And that was always a real challenge and always bothered me.”

As Facebook pushed into new markets around the world, in some places becoming in effect the internet by serving as the primary source of information online, it took few measures to assure that its product would be properly used, critics said.

“They built the city but then they didn’t put any traffic lights in, so the cars kept crashing into each other,” said Maria Ressa, editor of Rappler, a prominent journalism website in the Philippines, which Facebook last month contracted to identify fake news and hate speech in the country.

In an August 2016 meeting with Facebook in Singapore, Ressa showed three Facebook employees how close supporters of President Rodrigo Duterte were using the platform to circulate disinformation and call for violence against critics. Facebook had taught Duterte’s campaign how to use its platform to communicate with voters — training it offered other campaigns in other countries, too.

She said she warned them that the same type of disinformation campaign could happen in the upcoming U.S. elections. “I was hoping they would kick into action when I mentioned that,” she said.

But Facebook didn’t remove the accounts until she went public with her findings two months later and became the target of rape and death threats, she said. “They need to take action now or they need to leave our countries,” Ressa said.

Facebook’s failure to heed the pleas of civil society groups on the ground in Myanmar, also known as Burma, as far back as 2015 has had an even more devastating result.

That was the year Australian tech entrepreneur David Madden, who was living in Myanmar, traveled to Facebook’s headquarters in Menlo Park, Calif., and gave a seminar for employees describing how the platform had become a megaphone for Buddhist leaders calling for killing and expelling the Muslim Rohingya minority. Facebook removed the particular posts Madden complained about at the time but “what we had not done until more recently is proactively investigate coordinated abuse and networks of bad actors and bad content on the platform,” the company said last week.

In March, the United Nations declared that Facebook had a “determining role” in the genocide. “Facebook has now turned into a beast, and not what it originally intended,” U.N. investigator Yanghee Lee said.

“I think we were too slow to build the right technical tools that can help us find some of this content and also work with organizations on the ground in a real-time fashion,” said Monika Bickert, head of global policy management.

As in many countries, Facebook had no employees or partnerships on the ground. It says this is changing but still refuses to disclose how many are deployed country by country — something of great concern to Ukraine, Myanmar and other nations that suspect its content moderators are biased, inadequately trained or lack the necessary language and cultural fluency.

“We are working here in Menlo Park,” said Gleit, Facebook’s vice president for social good. “To the extent that some of these issues and problems manifest in other countries around the world, we didn’t have sufficient information and a pulse on what was happening.” Hiring more people overseas “can give us that insight that we may not get from being here.”

But, she said, “It’s not that we were like, wow, we could do so much more here and decided not to. I think we… were just a bit idealistic.”

In Ukraine, Russian information warfare was in full swing on Facebook and a Russian social media network during the revolution in 2014, government officials say. There was a daily flood of fake news condemning the revolution and trying to legitimize the invasion by claiming Ukraine was an Islamic State safe haven, a hotbed for Chechen terrorists and led by Nazis.

“We tried to monitor everything, but it was a tsunami,” recalled Dmytro Zolotukhin, then working for the new Ukrainian government’s Information Analysis Center of the National Security and Defense Council, which investigated online disinformation. “Thousands of reports of fake news on fake pages came in.” With the help of hackers and other cyber experts, he says he traced some of these accounts back to the Kremlin, which was also amplifying the false claims on dozens of fake online publications.

After the revolution in 2014, and again in 2017, Facebook suddenly banned dozens of accounts owned by pro-democracy leaders. Zolotukhin and others concluded that Russian bots were probably combing past comments and posts looking for banned terms and sending the names and URLs of the account owners to Facebook with complaints.

Another problem was that someone — they believe it to be Russia — created impostor Facebook accounts of real government ministries and politicians, including Poroshenko. The impostor accounts posted incorrect and inflammatory information meant to make the government look bad, said Zolotukhin, now the deputy minister of information policy. He and others begged Facebook through its public portal to add verification checks next to the real accounts and remove the fakes. But usually no action was taken.

“I asked for six months for my verification,” said Artem Bidenko, state secretary of the Information Ministry, who said someone had created a fake account using his name.

All this overwhelmed the new Ukrainian government, which was dealing with corruption in its ranks, a Russian invasion and the continuing onslaught of Russian propaganda. Shymkiv and others met to figure out how to get Facebook’s attention when they learned of the May 2015 town hall meeting with the Facebook CEO.

One town hall question — with a record 45,000 likes — asked whether the Ukrainian accounts were the victim of “mass fake abuse reports.” Zuckerberg replied that he personally had looked into it. “There were a few posts that tripped our rule against hate speech,” he said. He did not say whether Facebook had checked on the authenticity or origin of the ban requests.

A month later, Facebook sent Gabriella Cseh, its head of public policy for Central and Eastern Europe, based in Prague, to meet with Shymkiv, Bidenko and others in Kiev.

Shymkiv said he told Cseh that the government believed Russia was using Facebook accounts with fake names to post fictitious, inflammatory news reports and to engage in online discussions to stir up political divisions.

Facebook needed to send a team to investigate, he said. Ukraine’s stability as a new democracy was at stake.

Bidenko said Cseh agreed he could email her the names of civic leaders who believed their accounts had been wrongfully banned.

“People would come in here with tears in their eyes,” said Bidenko, seated in his crumbling Soviet-era office. “They would say, ‘I wrote nothing bad and they banned me.’ I would write to Gabriella.”

At the end of the meeting, according to Shymkiv, Cseh promised to review the cases, which Facebook says it did. Then she handed him a copy of its Community Standards policy, available online.

This appeals process worked well for about two years, Bidenko said.

But Cseh has been silent, Bidenko said, since an email she sent him on April 13, 2018, two days after Zuckerberg testified on Capitol Hill and public scrutiny of Facebook intensified. He figures she and the company became too busy with other problems to respond. But to his astonishment, she also unfriended him.

“I was like, what!? Why is Gabriella unfriending me?” he said. “Maybe I became a nuisance.”

Facebook declined to allow Cseh to be interviewed and didn’t respond to a question about why she unfriended Bidenko. In a statement, the company said: “Gabi has previously made it clear to Mr. Bidenko that she might not respond to every single one of his messages, but that doesn’t mean she isn’t escalating the issues he flags to the appropriate internal teams.”

In August, Zolotukhin met with Facebook officials and said he reiterated the same concerns. He sent them a list of pages that still needed verification marks and they complied soon thereafter.

Bidenko, Zolotukhin, hackers and journalists are eager to open their laptops and scroll through what they say is fabricated news that sometimes includes gruesome videos. “Phosphorus burns everything: Ukrainian militia is using illegal weapons,” said a repost of a YouTube video from 2017. “Executioners were harvesting internal organs for sale,” read a post from a Russian website.

More than 2,000 Ukrainians have been killed and an active war continues, making Russia’s continued clandestine attacks via Facebook an urgent national security matter.

Facebook recently posted a job for a public policy manager for Ukraine — based in Warsaw.

“Facebook is trying to stay on the sidelines” of the war between Ukraine and Russia, Zolotukhin said. “But now it is not about saying you’re for democracy. It’s about fighting for democracy.”

Priest is a Washington Post reporter and a professor at the Merrill College of Journalism at the University of Maryland. Jacoby and Bourg are producers for FRONTLINE PBS.

WATCH: Facebook and “The Data Dilemma”

By Patrice Taddonio, Digital Writer & Audience Development Strategist, October 29, 2018

You’ve heard a lot about the data Facebook gathers on its users.

In “The Data Dilemma,” examine what Facebook knows, and how it knows it – from the platform’s use of so-called “shadow profiles,” to the main ways Facebook tracks you on the web, even when you’re not on Facebook.

This digital video accompanies FRONTLINE’s new, two-night investigation of Facebook. For more on Facebook, data, and privacy, watch The Facebook Dilemma.

Is Facebook Ready for the 2018 Midterms?

By Patrice Taddonio, Digital Writer & Audience Development Strategist, October 30, 2018

Following the 2016 election, Facebook came under scrutiny from the press, Congress and the public for its role in disseminating disinformation — including false stories propagated by Russian operatives seeking to exploit divisions in American society and influence the election.

In advance of next week’s midterms, as FRONTLINE reports in part two of The Facebook Dilemma, the social media giant has mobilized an election team to monitor disinformation and delete fake accounts that may be trying to influence voters.

Will it work?

In the above scene from The Facebook Dilemma, FRONTLINE correspondent James Jacoby sits down with Facebook’s head of cybersecurity policy, Nathaniel Gleicher, who runs the election team. 

“Is there going to be real-time monitoring on election day of what’s going on on Facebook, and how are you gonna actually find things that may sow distrust in the election?” Jacoby asks.

“Absolutely. We’re gonna have a team on election day focused on that problem, and one thing that’s useful here is we’ve already done this in other elections,” Gleicher answers.

“And you’re confident that you can do that here?” Jacoby asks.

“I think that, yes, I’m confident that we can do this here,” Gleicher says.

In the clip, Jacoby also talks with Naomi Gleit, Facebook’s VP of social good, and one of the company’s longest-serving employees. He asks her what standard the public should hold Facebook to.

“I think the standard, the responsibility, what I’m focused on is amplifying good and minimizing the bad,” Gleit says. “And we need to be transparent about what we’re doing on both sides and you know, I think this is an ongoing discussion.”

“What’s an ongoing discussion?” Jacoby asks.

“How we’re doing on minimizing the bad,” Gleit responds.

Part one of The Facebook Dilemma — FRONTLINE’s investigation of the social network’s impact on privacy and democracy — is now streaming online. Watch part two of The Facebook Dilemma starting Tuesday, Oct. 30, at 10/9c on PBS stations (check local listings) and online.

. . .

To be continued.