Less than 24 hours before one of the most contentious US presidential elections in recent history, a Facebook employee tried to flag an election conspiracy theory that was spreading on the social network.
A video titled “LEFT PLANS COUP / REVOLUTION NOV 3RD 2020?” had amassed 2,500 concurrent viewers on Facebook Live and was at the time the second recommended result on the platform’s live video product. The employee wanted to know why.
“The 2nd result in Watch Live is a non verified page pushing election conspiracy theories,” the Facebook worker wrote. “Why do we allow non verified live videos about elections if we don’t have a way to check the content is not election misinformation?”
A Facebook product manager eventually removed the video, but only after a few hours, by which point it had been viewed by more than 110,000 people. It was not the only example of manipulative content left up on Facebook Live before, during, and after Election Day.
Internal posts from Facebook employees and BuzzFeed News’ own monitoring of the social network’s livestreaming feature show that Facebook Live was littered with misinformation and hyperpartisan content that was algorithmically recommended to thousands of people during a presidential election. In some cases, the videos were moderated by Facebook, while in others, managers said they had no policy for handling the issue and let them stand.
Facebook has spent months telling the public and its own employees that it was better prepared for this election than in 2016, when it was caught flat-footed as hyperpartisan publishers and Russian state actors used Facebook and Instagram to spread disinformation and sow discord among the US electorate. The company has instituted moratoriums on political ads, created labels to place on misleading posts from the president, and hired thousands of third-party contract moderators — all changes that were praised by a Facebook executive Monday in a self-congratulatory companywide message.
And still the Facebook ecosystem, which includes platforms used by 3 billion people, was misused.
“This will not be the only election misinformation live video today, tomorrow, or in the future weeks,” the employee who flagged the conspiracy theory video wrote. “We should probably focus on what we can do to limit sources of election misinformation instead of focus on this single video.”
A product manager replied, noting that the page should have been “considered low News Ecosystem Quality and should be receiving a harsh demotion.”
“We have a mechanism to enforce on this,” the product manager wrote. “It seems like it wasn’t triggered on this surface specifically, and we’re following up on that.”
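Neither the News Ecosystem Quality scoring nor the demotion mechanism is public, but the exchange sketches its rough shape. The toy Python below is purely illustrative: every name, threshold, and multiplier is an invented stand-in, not Facebook's implementation.

```python
# Hypothetical illustration only: Facebook's News Ecosystem Quality (NEQ)
# scoring is not public, so the names, threshold, and multiplier below
# are invented stand-ins. The general shape: a low publisher-quality
# score multiplies an item's ranking score down on every surface where
# the check actually runs.

NEQ_DEMOTION_THRESHOLD = 0.3  # assumed cutoff for a "low NEQ" page
NEQ_DEMOTION_FACTOR = 0.1     # assumed "harsh demotion" multiplier

def apply_neq_demotion(base_score: float, neq_score: float,
                       surface_enforces_neq: bool) -> float:
    """Demote a video's ranking score if its page scores low on NEQ.

    The employee's complaint maps to surface_enforces_neq=False:
    the mechanism existed, but it "wasn't triggered on this surface."
    """
    if not surface_enforces_neq:
        return base_score  # check skipped entirely on this surface
    if neq_score < NEQ_DEMOTION_THRESHOLD:
        return base_score * NEQ_DEMOTION_FACTOR
    return base_score
```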
The live video suggesting a Democratic coup wasn’t the only election content Facebook employees flagged internally. The day before the election, the same concerned employee discovered the company was promoting a livestream from RT, the Russian state-funded and -controlled broadcaster.
The RT stream, which featured coverage of President Donald Trump’s campaign rallies accompanied by the hashtags #MAGA2020 and #TrumpRally, was the fourth recommended livestream, next to one from right-wing site Breitbart. RT’s video was promoted not just on Facebook’s mobile app, but also on Portal, its video chatting device that’s used by older demographics.
“Every news outlet has live coverage,” the employee wrote. “We put breitbart and RT up but missing many other news outlets there.”
“Fox News, [Washington Post,] and hundreds of pages had live election coverage,” they continued. “How did RT become 4th result?”
While Breitbart is one of Facebook’s trusted news partners, the social network said in June that it would label pages, posts, and ads from some state-controlled media outlets such as RT as “partially or wholly under the editorial control of a state.” At the time, Facebook also announced it would block ads from state-controlled media outlets in the US starting later that summer “out of an abundance of caution.”
Despite the social network’s supposed caution around RT content, it still recommended the network’s livestream the night before Election Day. In 2017, the US Office of the Director of National Intelligence described RT as the Kremlin’s “principal international propaganda outlet.”
“Currently there isn’t a policy to reduce/demote this content or remove it from recommendations but we have pinged the News team for clarity,” the product manager wrote in response to the employee.
Two other workers wrote in to note that they had previously raised the issue, with one saying that they wanted the company to “revisit the policy of *unconnected* recommendation for state controlled media.”
Another manager chimed in to affirm that Facebook does not block “specific state controlled pages” and treats them the same as other pages on the platform. The recommendation feed is “ranked by a model” that weighs factors including viewers, likes, and comments, they said.
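To make that concrete: a feed “ranked by a model” weighing viewers, likes, and comments is, at its simplest, an engagement-weighted sort. The Python below is a minimal sketch under that assumption; the class, weights, and numbers are all hypothetical, and Facebook’s actual model is not public.

```python
from dataclasses import dataclass

@dataclass
class LiveVideo:
    page: str
    concurrent_viewers: int
    likes: int
    comments: int

# Invented weights for illustration; a real system would use a learned
# model over many more features, not a hand-tuned linear score.
W_VIEWERS, W_LIKES, W_COMMENTS = 1.0, 0.5, 0.8

def engagement_score(video: LiveVideo) -> float:
    """Toy linear score over the factors the manager named."""
    return (W_VIEWERS * video.concurrent_viewers
            + W_LIKES * video.likes
            + W_COMMENTS * video.comments)

# Nothing in the score looks at who the page is -- which is how a
# state-controlled broadcaster can outrank mainstream outlets whenever
# its raw engagement numbers are higher. The figures here are made up.
candidates = [
    LiveVideo("RT", 5_000, 1_200, 900),
    LiveVideo("ABC News", 4_000, 800, 300),
]
recommended = sorted(candidates, key=engagement_score, reverse=True)
print([v.page for v in recommended])  # RT ranks first on engagement alone
```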
“‘Should we promote election coverage from russian state controlled media known to dabble in misinformation and which we prominently label if our models rank it high or should we remove it from recommendation surfaces?’ is something we should proactively bring up in policy review given the political context,” the flagging employee wrote.
As of Wednesday, the RT video had been viewed around 38,000 times.
Facebook spokesperson Liz Bourgeois said in a statement that the company blocks ads from state-controlled media entities. “Out of an abundance of caution given the election,” she said, “we’re also temporarily not recommending their content to people.”
She said the policy went into effect on Election Day, “where we assessed there’s the most risk is the post election period.” The company has not determined when the suspension of state-controlled media recommendations will be lifted.
On Election Day and in the hours after polls had closed, Facebook Live continued to recommend hyperpartisan content to thousands of people. On Tuesday morning, the feature ranked a livestream from former Trump campaign manager and alleged fraudster Steve Bannon in its top slot, above election coverage from outlets including ABC and NowThis News. Later that night, the product suggested users tune in to election coverage from Jay Sekulow, a lawyer for Trump.
On Wednesday, as some states continued to tabulate votes from the night before, Facebook Live featured a video titled “TRUMP CLAIMS VICTORY!!” that attracted nearly 7,000 viewers. Later in the day, Facebook Live’s top slot pushed a stream from conservative commentator and Turning Point USA founder Charlie Kirk on Facebook’s desktop platform and on Portal.
As BuzzFeed News previously reported, Kirk has in the past been penalized for sharing misinformation on Instagram — a violation that was later personally escalated for review by Facebook Vice President of Global Public Policy Joel Kaplan. Strikes for misinformation, which are assessed by third-party fact-checkers that partner with Facebook, can lead to the demotion of content on the platform.
Earlier this year, the Washington Post revealed that Kirk’s Turning Point organization benefited from fake Facebook accounts linked to an agency it worked with.
Yet Kirk’s stream was promoted by Facebook Live, amassing an audience of more than 7,000, as he slammed pollsters and those critical of Trump. On desktop, the stream autoplayed for users, featuring a small label noting, “Election officials follow strict rules when it comes to ballot counting, handling, and reporting.”
On Portal, Facebook ranked Kirk’s stream above offerings from ABC News and Fox News. There, it was presented without any fact-checking label.
“This is not about domestic or foreign actors finding loopholes,” said Arisha Hatch, the executive director for the political action committee of Color of Change, a progressive nonprofit that’s pushed Facebook on civil rights issues. “This is about Facebook’s unwillingness to take responsibility for misinformation, hate speech, and wild conspiracy theories on their platform.”
“The American public is beginning to become fed up with their inability to get control of this misinformation, because it continues to polarize the country and place people in harm’s way,” she added.
Katie Notopoulos contributed reporting.