Facebook’s new “Supreme Court” is taking on its biggest case: Donald Trump.
The company’s recent decision to suspend Mr. Trump’s account after he incited a mob was — to put it mildly — contentious. On Thursday, the company asked its independent oversight body to review its decision and make a final call on whether the former president should be allowed back on Facebook and Instagram, which it owns.
Let me explain what this oversight board will do, and some of its benefits and limitations:
An independent arbiter is good, to a point. In 2019, Facebook outlined its plans for a court-like body to reconsider the most high-profile situations in which people think Facebook erred in applying its rules against hate speech, incitement of violence or other abuses.
Many people, including Facebook’s chief executive, Mark Zuckerberg, are uncomfortable with the idea of Facebook having the unquestioned power to silence world leaders and shape online discourse. The oversight board, whose rulings Facebook calls binding, is a measure of independent accountability for the site’s decisions.
The Trump suspension is by far the biggest case for the oversight board, which is made up of outside experts and only recently selected its first cases to review. The ruling will be closely watched and will influence the legitimacy of this new system of Facebook justice.
(For deeper reading, check out this post by Evelyn Douek, a lecturer on Law and S.J.D. candidate at Harvard Law School who studies regulation of online speech.)
Is it time to change policy for world leaders? The oversight board is also being asked to consider a question that goes far beyond Mr. Trump: Should Facebook continue to give world leaders more leeway than the rest of us?
Both Facebook and Twitter let world leaders post hateful or untrue things that would get most of us blocked or our posts deleted. The principle behind this is sound: What world leaders say is a matter of public importance, and the public should be able to see and evaluate their views without a filter.
There are real-world trade-offs, however, when powerful people have a megaphone to blare whatever they want.
In Myanmar, military leaders used Facebook to incite a genocide against the mostly Muslim Rohingya minority. In India, a prominent politician threatened to destroy mosques and called Muslims traitors in his Facebook posts. Iran’s Ayatollah Ali Khamenei has called for the destruction of Israel on Twitter. And on social media sites, Mr. Trump and Philippine President Rodrigo Duterte have alluded to shooting their own citizens.
Those world leaders can and often do say the same things on television or in press statements, but when that happens there are usually opportunities for journalists to provide context and reactions.
Greg Bensinger, a member of the New York Times editorial board, recently argued that the social media companies’ world leader policy is backward. If anything, there should be more rules rather than fewer for world leaders on Facebook and Twitter, he said.
What the oversight body says about this question could reset a crucial global policy.
What about the other billions of people? Each year, Facebook makes billions of decisions about people’s posts, but the oversight board will consider perhaps only dozens of high-profile disputes.
The board won’t help the many millions of people with far less power than Mr. Trump who have their voices silenced because of a decision Facebook made or failed to make.
This includes businesses and people who have their Facebook accounts locked and can’t get anyone at the company to pay attention. A teenager who is harassed on Facebook and quits the site doesn’t have someone to intervene on her behalf. And Rohingya who were slaughtered in their homes can’t appeal to this board.
The board’s decision on Mr. Trump may influence how online forums treat world leaders. But the fact remains that for most Facebook users, the company has the final word on what people can or can’t say. And Facebook faces little accountability for the consequences.