Facebook has spent the past few months scrambling to ban Holocaust denial, the QAnon mass delusion, and right-wing extremist groups. To outside observers, it appears the company is finally reckoning with the vast landscape of hate and disinformation it has helped create.
But Facebook CEO Mark Zuckerberg recently told employees at a companywide meeting the real reason the social network was cracking down: the US presidential election.
In an all-hands conference call with Facebook employees last Thursday, the 36-year-old billionaire said the company had made the policy changes to address the volatile situation surrounding the US election and its aftermath. The way the company operates has not changed, according to Zuckerberg, who holds majority voting power at Facebook.
“Once we’re past these events, and we’ve resolved them peacefully, I wouldn’t expect that we continue to adopt a lot more policies that are restricting of a lot more content,” Zuckerberg said in audio of the conference call obtained by BuzzFeed News.
While observers have speculated that Facebook’s new policies against potentially violent and conspiratorial content could mean that it’s turned a corner — or that the company is preparing for a Biden presidency and possible government regulation — Zuckerberg’s comments are an indication that the new rules are only stopgap measures. The 3 billion people who use at least one Facebook-owned product should not expect more rule changes after the election, Zuckerberg said.
“The basic answer is that this does not reflect a shift in our underlying philosophy or strong support of free expression,” he said. “What it reflects is, in our view, an increased risk of violence and unrest, especially around the elections, and an increased risk of physical harm, especially around the time when we expect COVID vaccines to be approved over the coming months.”
While the social network has created new policies to address health misinformation, hate, and militant groups that incite violence, Zuckerberg has not been as candid in public Facebook posts as he has in remarks to his own employees. In September, he provided a general update on the company’s approach to the US election, which was followed this month by a post explaining its changed stance on Holocaust denial content.
He had previously posted a video in August explaining the “operational mistake” of allowing a militant group to remain on the platform in the lead-up to the shooting deaths of two protesters in Kenosha, Wisconsin. But he only published that video, which was recorded at an August all-hands meeting, after BuzzFeed News reported the contents of his talk and his admission of the company’s failure to act.
On Thursday, he tried to provide more clarity to more than 50,000 workers, some of whom pressed him on why it took so long to take action against Holocaust denial. In 2018, Zuckerberg famously said in an interview that while he abhorred such rhetoric, Facebook allowed Holocaust denial content because doing so showed the company stood for free expression.
“There are a lot of things that I think that people say that are deeply offensive, and that are hurtful or even hateful,” Zuckerberg said. “But you know, where I think that we should draw the line is around when something has the likelihood to contribute to real-world violence or harm. And what we’ve seen over the last several years is a rise in anti-Semitic violence, both in the United States and across the world.”
“There’s an increased risk of these kind of different attacks, especially around this flashpoint around the election,” he added.
Facebook spokesperson Liz Bourgeois reiterated that the policy changes did not reflect a shift in the company’s principles.
“We remain committed to free expression while also recognizing the current environment requires clearer guardrails to minimize harm,” she said.
For some, Zuckerberg’s comments may reinforce the charge that the company only makes policy changes when faced with major US events or press scrutiny. In an explosive internal memo posted last month, fired data scientist Sophie Zhang wrote that the company prioritized action against political or electoral misinformation in the US and Western Europe.
“It’s an open secret within the civic integrity space that Facebook’s short-term decisions are largely motivated by PR and the potential for negative attention,” she wrote in her note, which was first obtained by BuzzFeed News.
In his talk, Zuckerberg cited countries outside the US and Europe where the company had supposedly taken action to ban certain types of speech to prevent real-world harm.
“We’ve had ongoing work in a number of countries that we consider at risk — countries at risk of ongoing civil conflict, places like Myanmar, or Sri Lanka, or Ethiopia — where the determination that we’ve made alongside human rights groups and local groups on the ground is that a wider band of speech and expression would lead potentially to more incitement of violence or different issues,” he said.
In 2018, United Nations investigators found that Facebook had been culpable and slow to act against hate speech that fueled the genocide of Rohingya Muslims, an ethnic minority that had lived primarily in Myanmar’s Rakhine state. Facebook was also used to organize deadly violence against Muslims in Sri Lanka. Violent rhetoric on the social network is also pushing Ethiopia closer to genocide, according to Vice.
“The idea that this election is the only one that matters is nuts,” said David Kaye, a former UN special rapporteur on freedom of expression and clinical professor of law at the University of California, Irvine. “The platform has an enormous and a more significant impact in a lot of places around the world.”
“To say we’re done does not strike me as a plausible response to the possible violence that could be done in the next couple of months or years,” he added.
In his remarks about the election, Zuckerberg mostly avoided discussion of the company’s decision to limit the spread of an unconfirmed New York Post story about Joe Biden’s son Hunter Biden, only suggesting that it could have been part of “a big misinformation dump.”
He did, however, comment on the current presidential polling, which favors Biden.
“It looks like potentially Biden’s margin of leading in the polls has increased, which may lead to a result, which is not close, which, could be potentially helpful — if there’s just a decisive victory from someone — that could be helpful for clarity and for not having violence or civil unrest after the election,” Zuckerberg said. Facebook is currently running an online voting information center to tell people that results may not be settled by the end of election night due to the large increase in mail-in voting during the pandemic.
Beyond the election, Zuckerberg addressed other employee concerns, including when they’d be asked to return to the office. While some Facebook content moderation contractors have been required to come back to offices in California and Texas, Facebook’s CEO said a return was not a priority for full-time workers and that remote work “was the only responsible thing to do during a pandemic.”
On Monday, The Intercept reported that a Facebook contractor employed by Accenture who had returned to an office in Austin had contracted the novel coronavirus. That person is now in self-quarantine.
When asked, Zuckerberg also discussed his views on a recent report on antitrust issues from a House of Representatives subcommittee. After noting that he believed Facebook operates in a competitive environment that includes Twitter, Snapchat, and TikTok, he said his company would adopt a policy preventing employees from discussing antitrust issues in internal forums and messages.
He called a policy at Google that prohibits employees from discussing antitrust issues because of ongoing federal investigations “quite prudent” and noted that Facebook would be taking the same approach.
“Given that, you know, anything that any of you say internally is, of course, available to be subpoenaed or used in any of these investigations, I just think we should make sure that people aren’t just, you know, mouthing off about this and saying things that may reflect inaccurate data, or generally just are kind of incomplete,” he said, minutes after noting how Facebook stood for free expression. “You shouldn’t be emailing about these things and you shouldn’t really be discussing this in non-privileged forums across the company.”