It’s not just the Russians anymore as Iranians and others turn up disinformation efforts ahead of 2020 vote

A recent tweet from Alicia Hernan - whose Twitter account described her as a wife, mother and lover of peace - did not mince words about her feelings for President Donald Trump: “That stupid moron doesn’t get that by creating bad guys, spewing hate filled words and creating fear of ‘others’, his message is spreading to fanatics around the world. Or maybe he does.”

That March 16 tweet, directed to a Hawaii congressman, was not the work of an American voter venting her frustration. The account, "@AliciaHernan3," was what disinformation researchers call a "sock puppet" - a type of fictitious online persona perfected by Russians when they were seeking to influence the 2016 presidential election.

But it was Iranians, not Russians, who created @AliciaHernan3, complete with a picture of a blonde woman with large, round-framed glasses and a turtleneck sweater. It was one of more than 7,000 phony accounts from Iran that Twitter has shut down this year alone.

And Iran is far from the only nation that has, within its borders, substantial capacity to wage Russian-style influence operations in the United States ahead of next year's election. That means American voters are likely to be targeted in the coming campaign season by more foreign disinformation than ever before, say those studying such operations.

Former Special Counsel Robert Mueller III echoed the consensus of independent researchers in his congressional testimony Wednesday, saying of Russian online political interference, "It wasn't a single attempt. They're doing it as we sit here, and they expect to do it the next campaign." He then added that "many more countries" had developed similar capabilities, based in part on the Russian playbook.

A short list of countries that host online influence operations with a history of meddling across borders includes Saudi Arabia, Israel, China, the United Arab Emirates and Venezuela, researchers say.

They say it's often not clear exactly who runs these operations - whether it's the governments themselves or some other actors - but they typically echo the talking points of the ruling powers and back their geopolitical goals through tweets, posts and online videos. Operations in all of these countries, meanwhile, have the means and potentially the motives to seek to influence an American election shaping up as among the most hotly contested in decades.

The influence operations in these countries, however, do not all share Russia's demonstrated preference for Trump and other Republicans. The Iranians, for example, typically oppose Trump in their disinformation messaging, criticizing his decision to pull the United States out of the 2015 nuclear deal with Iran and administration policy on other issues, including Israel and the civil wars in Yemen and Syria, research shows.

"Multiple foreign actors have demonstrated an ability and willingness to leverage these kinds of influence operations in pursuit of their geopolitical goals," said Lee Foster, head of the intelligence team investigating information operations for FireEye, a cybersecurity firm based in California. "We risk the U.S. information space becoming a free-for-all for foreign interference if, as a society, we fail to get an effective grasp on this problem."

Researchers for FireEye and other firms have reported suspected Iranian disinformation on most major social media platforms - Facebook, Instagram, YouTube, Google+ and others - and on standalone websites as well. In May, FireEye also alleged that U.S. news sites may have been tricked into publishing letters to the editor penned by Iranian operatives.

The firm's analysis spotted a number of instances where letters in newspapers in Virginia and Texas shared characteristics with accounts on Twitter believed to be part of an Iran-based disinformation network. FireEye also catalogued fictitious Twitter personas used by Iranians, including a Harvard student, a Michigan bodybuilder and an Iranian-American woman from Seattle.

Some Iranian Twitter accounts, FireEye found, even sought to impersonate U.S. political candidates, including a California Republican who ultimately lost the general election for Congress. That account tweeted about the confirmation hearing for U.S. Supreme Court Justice Brett Kavanaugh and a British royal wedding before beginning to promote Iranian interests, including tweets condemning the Saudis' killing of Washington Post contributing columnist Jamal Khashoggi.

Some Iranian disinformation accounts, including some affiliated with state-controlled news operations, date back several years, but they have grown steadily more sophisticated. Twitter, Facebook and Google all have identified and taken offline accounts from Iran over the past year for engaging in coordinated, deceptive behavior.

"As part of our public archive of information operations, we have disclosed thousands of accounts and millions of Tweets originating in Iran that we have proactively removed," said Yoel Roth, Twitter's head of site integrity. "Every year is an election year on Twitter, and we will be applying all of our global learnings to protect and enhance conversations around the 2020 election."

The Iranian tactics differ somewhat from those of the Russians, who through the Internet Research Agency in St. Petersburg infiltrated the online conversations of a wide range of U.S. political groupings - immigration opponents, African Americans, veterans, evangelical Christians, environmentalists - with a range of messages, attuned to the way those communities already were speaking to each other on major online platforms.

The Iranian operations detected so far tend to lack that complexity, with messaging typically taking a single side of an issue in line with government policy goals - countering Israel, for example - rather than playing multiple sides.

But there are clear signs of shifting tactics in the accounts identified by Twitter, Facebook and other companies so far. What's known, researchers say, may represent only a small part of much larger operations that remain undetected.

"The Iranian operations were a wake-up call to remind us that the Russians were not the only ones doing information operations," said Camille Francois, chief innovation officer for Graphika, a network analysis firm based in New York that studies online disinformation.

Graphika found that among one set of 1,666 Iranian accounts taken down by Twitter in June, about one in four tweets was in English. Trump was mentioned more than 1,400 times - almost always in critical ways - with this anti-Trump tweeting peaking in early 2017, in the months around his inauguration.

Researchers say that both the U.S. government and social media companies have grown more aggressive in battling online disinformation in the aftermath of the 2016 presidential election.

Cooperation between the FBI and Silicon Valley has improved markedly. U.S. Cyber Command blocked internet access to Russian disinformation teams during the congressional midterm vote in November 2018, scrambling operations. Some researchers express hope that this rising aggressiveness may thwart - or at least deter - some foreign-based influence operations from meddling in future U.S. elections.

All of the major social media companies also have established teams devoted to combating disinformation, typically by identifying and shutting down networks of fictitious, foreign-based accounts on an increasingly large scale.

This shift has been dramatic since 2016, when the companies saw foreign threats mainly in terms of traditional cybersecurity - hacks and bugs - as opposed to influence operations conducted by foreign adversaries with substantial resources. The Russian disinformation campaign in 2016 spent more than $1 million a month, Mueller reported in an indictment last year against the Internet Research Agency.

As the companies crack down, the tactics of disinformation teams rapidly shift to improve operational security and more effectively evade detection. FireEye, for example, was able to identify some apparently fake Iranian accounts last year because contact numbers for supposed American Twitter users had the +98 country code from Iran, a tactical mistake operatives are unlikely to make again.

But among independent researchers and some lawmakers, significant skepticism remains about whether enough has been done to prepare for the threat in 2020.

"In 2016, Russia used bots and fake accounts to launch an unprecedented social media campaign designed to influence the results of our presidential election," said Sen. Mark Warner of Virginia, the top Democrat on the Senate Intelligence Committee. "That playbook is out in the open now, and you can bet that unless the platform companies get their acts together, we're going to see more and more foreign-based actors using it to wreak havoc in our democratic process."

The nations hosting significant disinformation capabilities typically deployed them first against domestic audiences, shaping public perceptions in line with regime propaganda. The next step often was working regionally, infiltrating online conversations in neighboring countries, as Russia did in Ukraine in 2014 as it annexed Crimea and fomented unrest elsewhere in the country.

Disinformation teams in Iran initially developed their tactics while manipulating domestic political conversation before gradually expanding operations to include more languages, more themes and foreign targets.

Human rights lawyer Simin Kargar, of Harvard's Berkman Klein Center for Internet & Society, said Iran for years has harassed journalists, political dissidents and artists in its internal disinformation campaigns. She has watched as Iran increasingly deployed such tactics against foreign targets.

"I wouldn't be surprised if the Iranians weren't trying to expand their operations for the coming election, especially with the rising tensions between Iran and the United States," said Kargar. "They would be far more savvy by 2020. I wouldn't be surprised if they weren't just trying to harness as much division as possible."

Disinformation teams in Saudi Arabia have worked both internally and against other Gulf states, including in the nation's struggle with rival Qatar, said researcher Mark Owen Jones, an assistant professor of Middle East Studies at Hamad bin Khalifa University in Doha, the capital of Qatar. He said tactics in Saudi Arabia typically involve both sock puppets and automated accounts, called "bots," echoing official government propaganda, including things said or tweeted by Trump.

Jones recently detailed in a series of tweets an apparent information operation emanating from Saudi Arabia following a visit to the White House this month by Tamim bin Hamad Al Thani, the emir of Qatar. Jones found a single tweet, "The Prince of Qatar a supporter of terrorism, should not be in the White House but be at Guantanamo," had been posted up to 800 times an hour over several days, from 2,582 unique accounts. The tweets mostly were directed at U.S.-based targets, including Trump, the CIA and Secretary of State Mike Pompeo, and at several news organizations, including Fox News, Reuters and The Washington Post.

"There's still this pro-Trump message coming from Saudi Twitter, and I don't think that's likely to change," Jones said. "They view Trump's re-election as key to their own survival."

The Saudi embassy in Washington did not respond immediately to a request for comment.

This trajectory from nationally focused to internationally focused disinformation campaigns raises longer-term worries about which other nations may have disinformation teams honing their skills on domestic audiences with an eye toward eventual use against foreign targets, including in the United States. In addition to those with known foreign disinformation capabilities, numerous nations - Turkey, Egypt, the Philippines, Qatar, Mexico and others - now use such tactics mainly to influence domestic politics but could turn their attention to foreign targets.

In a related trend, online mercenaries have begun offering information operations as a commercial service. Facebook shut down 265 accounts from an Israeli company, Archimedes Group, in May for seeking to manipulate elections through social media targeting voters in Latin America, Africa and Southeast Asia. The company said on its website that it would “use every tool and take every advantage available in order to change reality according to our client’s wishes.”
