During a contentious presidential election in the US, Facebook quietly stopped recommending that people join online groups dealing with political or social issues.
Mentioned in passing by CEO Mark Zuckerberg during a Senate hearing on Wednesday, the move was confirmed to BuzzFeed News by a Facebook spokesperson. The company declined to say when exactly it implemented the change or when it would end.
“This is a measure we put in place in the lead-up to Election Day,” said Facebook spokesperson Liz Bourgeois, who added that all new groups have been filtered out of the recommendation tool as well. “We will assess when to lift them afterwards, but they are temporary.”
Confirmation of the move, which Facebook did not publicly announce, comes after members of the Senate’s Commerce, Science, and Transportation Committee grilled Zuckerberg about Facebook Groups and the potential for polarization and radicalization within them. Testifying alongside Twitter CEO Jack Dorsey and Google CEO Sundar Pichai about content moderation on their platforms, Facebook’s chief became the main focus of questioning from Massachusetts Sen. Ed Markey, who asked if the company would halt group recommendations on the social network until the results of the US presidential election were certified.
“Senator, we have taken the step of stopping recommendations in groups for all political content or social issue groups as a precaution for this,” Zuckerberg replied.
Facebook’s use of algorithms to automatically identify and recommend similar groups for people to join was intended to boost engagement. Researchers have long warned that these recommendations can push people down a path of radicalization and that groups reinforce like-minded views and abet the spread of misinformation and hate.
More than a billion people are members of groups on Facebook, and the company has pushed users to join them by boosting their prominence in people’s News Feeds. In announcing the company’s new focus on groups in 2017, Zuckerberg said that the social network had built artificial intelligence “to see if we could get better at suggesting groups that will be meaningful to you.”
“And it works!” he wrote in a post titled “Bringing the World Closer Together.” “In the first 6 months, we helped 50% more people join meaningful communities. And there’s a lot more to do here.”
Group recommendations may be harmless for a community of dog enthusiasts, but they can become problematic for groups circulating conspiracy theories or scientific misinformation, according to Claire Wardle, a cofounder of the misinformation research nonprofit First Draft. She said that, based on anecdotal evidence she’s seen, Facebook’s automated group suggestions can drive people down radicalizing “recommendation journeys.”
“If I’m in a [group protesting stay-at-home precautions] in Wisconsin, what other groups am I being recommended? Anti-vax groups? Yellow vest groups?” she said, noting that it was impossible to study on a wide scale because it happens on people’s individual News Feeds.
In May, the Wall Street Journal reported that an internal Facebook researcher found in 2016 that “64% of all extremist group joins are due to our recommendation tools,” including the platform’s “Groups You Should Join” and “Discover” algorithms. “Our recommendation systems grow the problem,” read the researcher’s presentation.
When Michigan Sen. Gary Peters asked about the internal research at Wednesday’s Senate hearing, Zuckerberg said he was “not familiar with that specific study,” even though he had criticized the Journal’s story to employees, according to audio of a recent company-wide meeting obtained by BuzzFeed News. Zuckerberg did note in the hearing, however, that Facebook had taken steps to prevent groups that foster extremism or spread misinformation from appearing in suggested groups.
Despite those changes, organizations that violate Facebook’s own rules have managed to maintain groups on the platform. After Facebook banned right-wing militant groups and pages in August, a watchdog group found dozens of extremist groups and pages that remained active.
Earlier this month, federal and state prosecutors in Michigan charged 14 people in a plot to kidnap and possibly kill Michigan Gov. Gretchen Whitmer. A day after authorities announced the Whitmer plot, which was partly coordinated on Facebook, BuzzFeed News reported that the social network’s recommendation tools continued to suggest users follow pages espousing extremist messages.
It’s unclear how many groups are affected by Facebook’s decision to limit recommendations for political and social issue groups in the run-up to the election. Facebook spokesperson Bourgeois declined to provide further details or say when the temporary change would be lifted.
A test of the Facebook platform showed that while the algorithmically generated suggested-groups feature appeared to have been removed for political groups, group administrators could still manually suggest groups to members. Facebook’s search tool also continued to surface political and social issue groups as normal.
Wardle wondered why Facebook, which had publicized several other election-related tweaks to its platform, including temporary bans on political ads, chose not to publicly announce the change to group recommendations. On Thursday, Instagram, which is owned by Facebook, announced it would temporarily suspend the “Recent” tab on hashtag pages, which gathers recently uploaded content tagged with a given hashtag, “to reduce the real-time spread of potentially harmful content that could pop up around the election.”
“I’m all for all platforms taking stronger steps on these things, but they need to be studying them,” Wardle said, noting that nothing would be learned if Facebook continued with business as usual after the election.
Jane Lytvynenko contributed reporting to this story.