Facebook has created a “playbook” to help its employees rebut criticism that the company’s products fuel political polarization and social division.
The document, which cites a range of academic studies but does not include recent data from the company’s own research teams, was posted to Facebook’s internal Workplace discussion forum by Chief Product Officer Chris Cox and Pratiti Raychoudhury, vice president of research, earlier this week. During a Thursday webinar for employees, Cox said the document would “equip all of you to go home and have dinner” with friends and family and explain why public perceptions of Facebook are wrong.
In the paper, titled “What We Know About Polarization,” Cox and Raychoudhury call polarization “an albatross public narrative for the company.”
“The implicit argument is that Facebook is contributing to a social problem of driving societies into contexts where they can’t trust each other, can’t share common ground, can’t have conversation about issues, and can’t share a common view on reality,” they write, adding that “the media narrative in this case is generally not supported by the research.”
During Thursday’s webinar, Pablo Barberá, a research scientist at the company, denied that Facebook meaningfully contributes to polarization and suggested that political polarization could even be a good thing.
“If we look back at history, a lot of the major social movements and major transformations, for example, the extension of civil rights or voting rights in this country have been the result of increasing polarization,” he told employees.
The paper and accompanying webinar are an indication of how Facebook will publicly address the perception that it is harmful to society, an issue that has become increasingly urgent in the aftermath of a divisive presidential election and an attempted coup exacerbated by the misinformation and hate speech that had proliferated on the platform.
The presentation came more than two months after rioters — many of them stoked by misinformation on the social network about Joe Biden’s presidential win — stormed the US Capitol, and after several employees left Facebook in frustration, claiming it was not doing enough to address hate and extremism on its platform. Notably, the paper does not address internal data created or cited by departing employees that shows how divisive political content on the platform can radicalize people and degrade the overall Facebook user experience.
When shown the internal document by BuzzFeed News, one former Facebook employee who studied polarization at the company said, “This memo is corporate gaslighting disguised as a research brief.” They left the company last year and asked to remain anonymous for fear of retribution.
“It suggests that because other factors have contributed to increased polarization, Facebook itself is somehow less likely to have contributed to the problem,” they said. “It also ignores a large body of reporting and research — including research conducted by internal Facebook teams — which demonstrates the polarizing effects of Facebook’s products, including (and perhaps especially) Facebook Groups.”
In response, Facebook spokesperson Liz Bourgeois said in a statement that “better understanding and combating polarization is a focus of a number of teams across Facebook.”
She added, “Given much of the peer-reviewed research about polarization counters the views of our critics, it’s unsurprising that yesterday’s event is being mischaracterized to suggest we don’t acknowledge the role we have to play in its causes and solutions.”
Daniel Kreiss, a principal researcher at the Center for Information, Technology, and Public Life at the University of North Carolina at Chapel Hill, described Facebook’s paper as “generally in line with the state of the research literature.” But he offered this caveat: “With Facebook, it’s never just research. It’s research that’s running alongside the interests of a multiple billion dollar entity.”
He said the research cited in the company’s paper does not refute the idea that Facebook contributes to polarization. “I would say that it doesn’t seem like Facebook gets entirely off the hook,” Kreiss told BuzzFeed News.
Cox’s Thursday presentation follows a series of high-profile departures from Facebook’s civic integrity team, which was tasked with protecting users. As BuzzFeed News has reported, a number of those employees torched the company in their internal departure notes for failing to take stronger action against hate and misinformation.
“We’ve known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and groups,” a departing employee wrote last August about the QAnon mass delusion. “In the meantime, the fringe group/set of beliefs has grown to national prominence with QAnon congressional candidates and QAnon hashtags and groups trending in the mainstream. We were willing to act only *after* things had spiraled into a dire state.”
Facebook’s polarization playbook makes no mention of that employee’s findings — or any internal data collected by the social giant’s large research teams. During the presentation, Barberá attributed polarization in the US over the last four decades to increasing inequality and racial divisions, growing disagreements among politicians, and partisan news outlets on TV and the internet.
He also noted that some countries where Facebook operates differ from the US in that they have experienced less polarization. “There’s not a lot of clear evidence here,” he said.
Cox contrasted the fragmented media environment of the US, one of the most polarized countries according to Facebook’s research, with places like the Philippines and Myanmar, where he said the “internet is incredibly important” to presenting views outside of state-controlled media. (Philippine President Rodrigo Duterte relied on Facebook misinformation campaigns to win an election, while the United Nations found that the site’s spread of hate speech abetted genocide against Rohingya Muslims in Myanmar.)
“The growing body of research on this issue suggests that social media cannot be clearly identified in social science research as a root cause of polarization, but we have lots more work to do to understand its effects (and what we should do about it),” said the Facebook paper.
Key to Facebook’s arguments is the idea that not all polarization is inherently bad. The company’s paper presented two types of polarization: issue polarization (divergence over a single topic) and the more harmful affective polarization (in which one group rejects another because of differences in politics, religion, ethnicity, or class).
“Issue polarization is not necessarily bad,” Barberá said during the presentation, noting that it can lead to political action. “When things get intense politically, that also gets people engaged in the political process. Like, when you see that the parties are presenting different platforms, you’ll get more interested because your vote can make a difference.”
But as Kreiss observed, there is a troubling counterpoint here. “One thing social media platforms absolutely do is make it easier to have a global white supremacist movement than it was 30 or 40 years ago.”
Cox spoke highly of the community-building potential of Facebook Groups, which the company sees as the future of interaction on the platform. Although the social network stopped recommending political groups just before the election based on findings that they contributed to misinformation, the Facebook CPO commended the ability of Boy Scout troops, parent teacher associations, and dog walking groups to contribute to a shared “social fabric.”
“On Groups, we believe that groups are on balance a positive, depolarizing force,” Cox said. He mentioned neither the supposedly banned right-wing extremist groups that persisted on the platform in the weeks leading up to the November election nor the “Stop the Steal” groups that flourished while trafficking in the lie that the vote had been rigged against Donald Trump.
When the webinar was opened up to questions from employees, one asked whether Facebook was like a tobacco company funding research to show its products were not a problem.
In response, Facebook’s leaders argued that most of the research presented in its internal document was not funded by the company. BuzzFeed News found that at least two of the 13 papers and other writings presented in the memo to employees were done in collaboration with Facebook. The paper’s conclusion also cites David Brooks, a prominent New York Times columnist who has written for Facebook’s corporate site and whose nonprofit project was funded by the company.
Despite dismissing the idea that Facebook contributes to negative polarization, company executives did acknowledge the need for change on the platform. Building on comments made by CEO Mark Zuckerberg during a January earnings call about halting recommendations for certain political content on people’s News Feeds, Cox said the company is evaluating “how people feel about the amount of divisive content in their News Feed.”
John Hegeman, the vice president of News Feed, said the company would be “rebalancing the amount of political content in feed” and would “demote posts that can result in kind of nonconstructive angry reactions.”
Raychoudhury, the company’s vice president of research, echoed that sentiment. Even if executives don’t think Facebook contributes to polarization, she said, the company is developing metrics to understand whether its services “help people to build empathy and trust with one another.”
“Affective polarization is actually counter to our company’s mission,” she said. “It will be harder to bring the world closer together if groups of people hate each other.”