Facebook and Twitter took unusual steps Wednesday to limit readership of an article by the New York Post about alleged emails from Democratic presidential nominee Joe Biden’s son, one of the rare occasions they have sanctioned a traditional media outlet.
The social media giants took that action before verifying the contents of the article, in which President Donald Trump’s personal attorney Rudy Giuliani and his former top adviser Stephen Bannon claimed to have obtained and leaked a trove of private materials from Hunter Biden. The leaked documents suggested that at one point Hunter Biden gave a Ukrainian executive the “opportunity” to meet his father, the former vice president. The Biden campaign said Joe Biden’s schedule indicated no such meeting took place.
Facebook preemptively limited the spread of the story while sending it to third-party fact-checkers, a step the company said it has taken on various occasions, though it is not the standard process. Twitter allowed the story to surge to the No. 3 trending topic in the U.S. before marking the link as “potentially unsafe” and blocking it. It also temporarily locked the accounts of White House press secretary Kayleigh McEnany and the New York Post, adding notices to their tweets saying they violated Twitter’s rules prohibiting the publication of hacked materials. Trump’s campaign account was also temporarily locked.
The moves prompted an outcry from Trump, Republicans and right-leaning publications, which repeated claims of politically motivated censorship by Silicon Valley giants.
“So terrible that Facebook and Twitter took down the story of ‘Smoking Gun’ emails related to Sleepy Joe Biden and his son, Hunter, in the @NYPost,” Trump wrote on Twitter.
But Wednesday’s actions were the result of a year of scenario-planning exercises by the tech companies for the 2020 election, including the possibility of a “hack-and-leak” operation involving unverified emails that could help swing the vote.
Four years after Russian operatives exploited tech giants' services during a presidential contest, the companies' swift and aggressive steps in responding to the unverified story, and their divergent responses, are a real-time case study in their ability to protect the integrity of an election that has been marred by domestic disinformation and misleading accounts. That activity has included misinformation about Biden’s health, the dying wish of the late Supreme Court Justice Ruth Bader Ginsburg, and the validity of mail-in ballots, much of it spread by Trump and his supporters.
Some of the dozens of scenarios Facebook and Twitter prepared for were along the lines of the 2016 campaign, when Russia-tied WikiLeaks dumped the emails of Hillary Clinton’s campaign chair John Podesta. Many news organizations at the time covered the email dump without first sufficiently exploring WikiLeaks’ political motivations and Russia ties.
Earlier in the day, Facebook spokesman Andy Stone tweeted that the company was “reducing” the story’s distribution while it was checked by independent fact-checkers. He pointed to a year-old policy on the company’s website, which says that if the company has “signals” that a piece of content is false, its distribution can be reduced pending fact-checker review, part of an effort to take “faster action” to stop viral misinformation. Stone declined to comment on the signals the company used in this case, but said similar steps had been taken on several occasions, though not always publicized.
Google appeared to take a middle-of-the-road approach in its curation of information about the story. A search for “Hunter Biden” revealed the link to the original New York Post story, as well as links to stories and comments rebutting it or questioning its provenance.
Google did not respond to a request for comment.
Brandon Borrman, Twitter’s vice president of global communications, pointed to the company’s hacked materials policy, which says, “We don’t permit the use of our services to directly distribute content obtained through hacking that contains private information.” He said the company had blocked links before under the policy, but did not specify when.
As backlash against the companies continued throughout the day, Twitter CEO Jack Dorsey tweeted late Wednesday that the company’s initial decision to block the links without explanation was “unacceptable.” The company’s Twitter Safety account tweeted that the blocked articles included images containing personal and private information, in violation of its rules.
The story surged on Twitter earlier Wednesday, becoming the No. 3 trending topic in the U.S., thanks to viral tweets by McEnany, right-wing outlets such as One America News, Donald Trump Jr. and, later, the Trump campaign. Twitter also wrote its own summary of the story to contextualize the trend.
Several hours after publication, Twitter blocked access to the link to the original story, displaying a warning that said, “This link may be unsafe.”
Trump campaign spokesman Tim Murtaugh called the temporary lock on the campaign’s Twitter account election interference.
“For Twitter to lock the main account of the campaign of the President of the United States is a breathtaking level of political meddling and nothing short of an attempt to rig the election,” he said in a statement. “Joe Biden’s Silicon Valley pals are aggressively blocking negative news stories about their guy and preventing voters from accessing important information. This is like something from communist China or Cuba, not the United States of America.”
On Facebook, the limitations meant that by Wednesday afternoon there were fewer than 4,000 clicks, likes and shares of the original New York Post story on the platform, and roughly 214,000 overall. That would be considered modest traffic for a major news story.
Facebook’s move immediately provoked the ire of Republicans, including Donald Trump Jr. and Sen. Josh Hawley, R-Mo., who said he sent a letter asking Facebook to explain its decision to “censor” the story.
“The seemingly selective nature of this public intervention suggests partiality on the part of Facebook,” he wrote. He also sent a letter to Dorsey.
Earlier this month, Facebook took action on a false story, spread by the Trump campaign, Fox News and a tweet by a New York Post reporter, claiming that Biden was wearing an earpiece at the presidential debates. The company also limited the spread of a false story claiming that far-left activists started the wildfires in the Pacific Northwest this summer.
Determined to avoid a repeat of 2016, technology companies began planning for the 2020 elections almost immediately following the 2018 midterms.
Facebook has considered at least 70 possible situations it may need to respond to in the weeks before and after the election. Last summer it hosted more than a dozen formal “tabletop sessions,” essentially drills or planning discussions. This summer, the company held simulations covering a hack-and-leak operation; late-breaking foreign interference, or overblown claims of foreign interference that could undermine trust in the election even if untrue; and potential delays in calling races because of the increase in mail-in voting, said spokeswoman Liz Bourgeois.
Twitter has held roughly a dozen such exercises since March, including the same scenarios as Facebook, as well as attempts to manipulate its trending topics feature and coordinated online voter suppression campaigns, said spokesman Nicholas Pacilio.
Based on these exercises, the companies have plans to block language from candidates prematurely declaring victory or disputing the results of the race.
Trump previously has refused to commit to a peaceful transfer of power should he lose the election.
Facebook will link such calls to the official results, according to Reuters, while Twitter will label any premature claims and automatically direct people to an election page with either announcements from state election officials or a public projection from at least two authoritative, national news outlets that make independent election calls. Both companies have also banned calls for and intimations of violence at the polls.
Facebook has taken steps to limit political ads in the week leading up to Election Day and the week after.
The companies said they are in close contact with election officials to field warnings about potential problems erupting on social media.
The Washington Post’s Isaac Stanley-Becker contributed to this report.