House lawmakers plan to interrogate the chief executives of Google, Facebook and Twitter at a hearing on Thursday about how disinformation spreads across their platforms, an issue for which the companies faced scrutiny during the presidential election and after the Jan. 6 riot at the Capitol.
The hearing, held by the House Energy and Commerce Committee, will be the first time that Mark Zuckerberg of Facebook, Jack Dorsey of Twitter and Sundar Pichai of Google appear before Congress during the Biden administration. President Biden has indicated that he is likely to be tough on the tech industry. That position, coupled with Democratic control of Congress, has raised liberal hopes that Washington will take steps to rein in Big Tech’s power and reach over the next few years.
The hearing will also be the first opportunity since the Jan. 6 Capitol riot for lawmakers to question the three men about the role their companies played in the event. The attack has made the issue of disinformation intensely personal for lawmakers, since those who participated in the riot have been linked to online conspiracy theories like QAnon.
Before the hearing, Democrats signaled in a memo that they were interested in questioning the executives about the Jan. 6 attack, efforts by the right to undermine the results of the 2020 election and misinformation related to the Covid-19 pandemic.
Republicans sent the executives letters this month asking them about their decisions to remove conservative personalities and stories from their platforms, including an October 2020 article in The New York Post about Hunter Biden, the president's son.
Lawmakers have debated whether social media platforms’ business models encourage the spread of hate and disinformation by prioritizing content that will elicit user engagement, often by emphasizing salacious or divisive posts.
Some lawmakers will push for changes to Section 230 of the Communications Decency Act, the 1996 law that shields the platforms from lawsuits over their users' posts, seeking to strip those protections in cases where a company's algorithms amplified certain illegal content. Others believe that the spread of disinformation could be stemmed with stronger antitrust laws, since the platforms are by far the major outlets for communicating publicly online.
“By now it’s painfully clear that neither the market nor public pressure will stop social media companies from elevating disinformation and extremism, so we have no choice but to legislate, and now it’s a question of how best to do it,” said Representative Frank Pallone, the New Jersey Democrat who is chairman of the committee.
The tech executives are expected to play up their efforts to limit misinformation and redirect users to more reliable sources of information. They may also entertain the possibility of more regulation, in an effort to shape increasingly likely legislative changes rather than resist them outright.
Mr. Zuckerberg, for example, is set to say that Facebook supports new rules that would make the Section 230 protections available only to platforms that confirm they are meeting certain standards for taking down some unlawful content, according to his prepared testimony.
“We are committed to keeping people safe on our services and to protecting free expression, and we work hard to set and enforce policies that meet those goals,” Mr. Zuckerberg will say, according to his prepared testimony.
The chief executives of Facebook, Google and Twitter are expected to face tough questions from lawmakers on both sides of the aisle. Democrats have focused on disinformation, especially in the wake of the Capitol riot. Republicans, meanwhile, have already questioned the companies about their decisions to remove conservative personalities and stories from their platforms.
New York Times reporters have covered many of the examples that could come up. Here are the facts to know about them:
How a Stabbing in Israel Echoes Through the Fight Over Online Speech
After his son was stabbed to death in Israel by a member of the militant group Hamas in 2016, Stuart Force decided that Facebook was partly to blame for the death, because the algorithms that power the social network helped spread Hamas’s content. He joined relatives of other terror victims in suing the company, arguing that its algorithms aided the crimes by regularly amplifying posts that encouraged terrorist attacks. Arguments about the algorithms’ power have reverberated in Washington.
What Is Section 230? Legal Shield for Websites Is Targeted by Lawmakers
Section 230 of the Communications Decency Act has helped Facebook, YouTube, Twitter and countless other internet companies flourish. But Section 230's liability protection also extends to fringe sites known for hosting hate speech, anti-Semitic content and racist tropes. As scrutiny of big technology companies has intensified in Washington over a wide variety of issues, including how they handle the spread of disinformation or police hate speech, Section 230 has drawn renewed attention.
Facebook Dials Down the Politics for Users
After inflaming political discourse around the globe, Facebook is trying to turn down the temperature. The social network started changing its algorithm to reduce the political content in users’ news feeds. Facebook previewed the change earlier this year when Mark Zuckerberg, the chief executive, said the company was experimenting with ways to tamp down divisive political debates among users. “One of the top pieces of feedback we’re hearing from our community right now is that people don’t want politics and fighting to take over their experience on our services,” he said.
From Voter Fraud to Vaccine Lies: Misinformation Peddlers Shift Gears
As the Electoral College affirmed Joseph R. Biden Jr.'s election, voter fraud misinformation subsided. But peddlers of online falsehoods ramped up lies about the Covid-19 vaccines. Representative Marjorie Taylor Greene, Republican of Georgia, and far-right websites like ZeroHedge have begun pushing false vaccine narratives, researchers said. Their efforts have been amplified by a robust network of anti-vaccination activists like Robert F. Kennedy Jr. on platforms including Facebook, YouTube and Twitter.
In Pulling Trump’s Megaphone, Twitter and Facebook Show Where Power Now Lies
In the end, two billionaires from California did what legions of politicians, prosecutors and power brokers had tried and failed to do for years: They pulled the plug on President Trump. Journalists and historians will spend years unpacking the improvisational nature of the bans, and scrutinizing why they arrived just as Mr. Trump was losing his power, and Democrats were poised to take control of Congress and the White House. The bans have also turned up the heat on a free-speech debate that has been simmering for years.