SAN FRANCISCO - Front-line health workers in the United States began receiving coronavirus vaccines this week. But on social media, false theories about the vaccines’ dangers and conspiracy theories about the government’s plans for them are multiplying.
This week, researchers at the misinformation-research group Zignal Labs found that false narratives claiming that the vaccine contains tracking microchips and that the government will make vaccines mandatory continue to circulate. In some cases, the disinformation was pushed by right-leaning figures and news sites, as well as by dubious websites and followers of the baseless QAnon conspiracy theory.
Social media companies have been working all year to remove false and misleading information about the pandemic from their sites. Companies including Facebook and Twitter say they are redoubling their efforts with new rules related to vaccines this month, adding new labels and trying to point people to legitimate news.
Still, the sheer size of social media companies’ user bases - plus their long-standing commitments to free speech - makes the misinformation difficult to police. Many of these rumors are spread by people who believe they are being helpful and may not realize they are passing along questionable claims.
Here’s what you need to know about misinformation spreading on social media, how to spot it and what you can do to combat it.
Q: What is the vaccine misinformation circulating online?
A: Anti-vaccine groups have long used social media to spread misleading information about vaccines, and social media companies have tried in recent years to stop it.
Pinterest was one of the first social media companies to act, blocking searches for “vaccines” on its site in 2017. In 2019, Facebook stopped recommending groups that spread hoaxes about vaccines, and Twitter launched a prompt that directs people to official sources of information when they search for vaccine-related terms.
None of that has stopped false claims about the dangers of vaccination - claims long debunked by medical professionals - from thriving online.
Renee DiResta, technical research manager at the Stanford Internet Observatory and a longtime researcher of communities opposed to vaccination, said that no single coronavirus anti-vaccine narrative has taken widespread hold. Nonetheless, misleading stories that percolate less widely can still have an insidious influence. She pointed out that most of the narratives circulating about the vaccine are recycled versions of messages anti-vaccine groups have put out for years.
“They are inserting the word ‘covid’ into the usual canards,” she said. “They recognize the potential audience is much, much larger - it’s not just new parents searching for info. It is everyone.”
The coronavirus vaccines are so new that even accurate information can spread in a way that could be misunderstood or distorted without the proper context.
This week, a story line about people who took the Pfizer vaccine developing Bell’s palsy, a condition that temporarily paralyzes muscles in the face, exploded in popularity, according to Zignal.
While the original news article was accurate, the fact-checking group PolitiFact said the Bell’s palsy story has become exaggerated and distorted. Scientists have said that the number of people who developed Bell’s palsy - 4 in a group of 22,000 - is consistent with the background rate of the condition in the general population, and may have nothing to do with the vaccine, according to PolitiFact. The Food and Drug Administration is monitoring the issue.
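As a rough, back-of-the-envelope illustration of that reasoning - and it is only a sketch, using an assumed background incidence of 15 to 30 cases per 100,000 people per year, a commonly cited range rather than a figure from the trial - the observed rate works out to about 18 per 100,000:

```python
# Back-of-the-envelope check of the Bell's palsy numbers (illustrative only).
# Assumption: a background incidence of roughly 15-30 cases per 100,000
# people per year, a commonly cited range -- not a figure from the trial.

vaccine_group = 22_000   # approximate size of the Pfizer trial vaccine arm
observed_cases = 4       # Bell's palsy cases reported in that group

rate_per_100k = observed_cases / vaccine_group * 100_000
print(f"Observed rate: about {rate_per_100k:.0f} per 100,000")  # ~18

background_low, background_high = 15, 30  # assumed annual background range
print(f"Assumed background: {background_low}-{background_high} per 100,000 per year")
# ~18 per 100,000 falls inside the assumed background range, which is the
# rough sense in which fact-checkers called the numbers "consistent."
```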
Q: What are social media sites doing about it?
A: Social media companies including Twitter, Facebook, YouTube and TikTok have had policies in place to restrict misinformation about the coronavirus all year, and most are updating their guidelines to encompass the vaccine.
Twitter announced Wednesday that it will require users to remove tweets that spread false information about the vaccines, including baseless claims that the vaccine is not necessary because the coronavirus is not real or serious. It will also put a warning label on tweets containing disputed or unsubstantiated rumors about the vaccines.
Researchers say misinformation about vaccines can be hard to police because much of it relies on opinion and personal beliefs.
“The platforms cannot control people’s opinions,” said Clemson University social media researcher Darren Linvill. “They can’t stop someone from saying ‘I’m not going to take the vaccine because I don’t think it’s safe.’ And it’s those thoughts and opinions that have as much of an effect on online communities as actual fake news or actual disinformation.”
Also this week, the popular short-form video app TikTok said it will start directing people to official health sources when they search for vaccine information. It will also tag videos that mention the vaccine with a link for people to learn more from official sources.
Facebook said this month that it will remove posts about the vaccine that contain information that has been debunked by public-health experts, including “false claims that coronavirus vaccines contain microchips, or anything else that isn’t on the official vaccine ingredient list.” Earlier this year, it also banned anti-vaccination ads.
YouTube also says it will remove claims about the vaccines that “contradict expert consensus from local health authorities or the World Health Organization.” It has removed more than 700,000 videos containing medical misinformation about the coronavirus, its parent company Google said in a blog post last week.
Q: Why is it difficult to stop these claims from spreading?
A: Misinformation is infamously hard to quell because the speed at which posts travel on social media lets claims spread widely before companies can react. And companies haven’t always agreed on a response, making matters worse.
Neil Johnson, a physics professor at George Washington University who maps the spread of misinformation online, said he thinks of it as if an entire neighborhood has a pest problem. If one neighbor cracks down on the pests, they just move to another yard - or, in this case, another social media site.
“They just go to the neighbor, regroup, and come back,” he said.
Still, it’s heartening that the companies seem to be taking action fairly early this time, said Samuel Woolley, a professor and director of a propaganda-research team at the University of Texas at Austin, even if there’s no hope of eradicating the misinformation entirely.
“I think that we’ve experienced disinformation in greater amounts and in more specified contexts than ever before in 2020, and the social media companies have been slowly but surely ramping up response to this,” he said.
Social media users can also help stop the spread, Woolley said, by sharing only information from official health agencies and well-known, credible news outlets, and by reporting posts that break the rules. It’s worth the time to look up claims with Snopes, the Associated Press or other fact-checking organizations before sharing, he said.
Social media companies rely in part on user reports to help moderators find offending posts.
Even if you are trying to debunk a claim in the hope of pointing out a falsehood to friends and family, experts warn against sharing the original false post. Social media companies reward engagement of any kind, said Joan Donovan, research director at Harvard Kennedy School’s Shorenstein Center on Media, Politics and Public Policy.
“Be cognizant of the role you play,” Donovan cautioned.
Q: What else should I know about the vaccines?
A: The Washington Post has been tracking vaccine developments. The first vaccine, created by Pfizer and BioNTech, was cleared by U.S. regulators last week and health-care workers began receiving doses on Monday. A second vaccine, developed by Moderna, is expected to be authorized this week.
No serious safety concerns have been reported in either vaccine trial. Health officials are working to get enough people vaccinated so the country can reach herd immunity, which occurs when enough people are immune to stop the spread of the disease.
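Epidemiologists often estimate that threshold with a simple formula: if each infected person would otherwise infect R0 others, spread stalls once more than 1 - 1/R0 of the population is immune. The snippet below is a rough illustration; the R0 values of roughly 2 to 3 are commonly cited early estimates, assumed here rather than drawn from this article:

```python
# Rough illustration of the herd immunity threshold: 1 - 1/R0.
# Assumption: R0 values of roughly 2-3, commonly cited early estimates
# for the coronavirus -- not figures from this article.

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune to halt spread."""
    return 1 - 1 / r0

for r0 in (2.0, 2.5, 3.0):
    print(f"R0 = {r0}: about {herd_immunity_threshold(r0):.0%} immune needed")
# R0 = 2.0 -> ~50%, R0 = 2.5 -> ~60%, R0 = 3.0 -> ~67%
```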
- - -
The Washington Post’s Elizabeth Dwoskin contributed to this report.