By Brian Stelter and Katie Pellico, CNN
Last week, even as it came under fire from the White House over its role in spreading anti-vaccine misinformation, Facebook hadn’t taken the simple step of blocking the #VaccinesKill hashtag on its platform.
Now the hashtag is hidden on the platform, locked behind a message that says Facebook is “keeping our community safe.”
The change happened hours after CNN Business asked Facebook why the page full of anti-vaccination falsehoods was easy to find. If this sounds familiar, it’s because almost the exact same thing happened with Facebook-owned Instagram two years ago, during one of the company’s previous efforts to tell people that, seriously, it really was doing a great job of moderating anti-vaccine content.
It’s yet another example of the whack-a-mole moderation that plays out across social media. Reporters or other users notice content that clearly violates a platform’s policies; they ask why it is being permitted; the platform whacks it away; and then the cycle repeats.
The existence of the #VaccinesKill content was noticed by CNN last weekend, after President Joe Biden accused Facebook of “killing people” by letting lies spread on its platforms.
Biden later walked that back and focused his ire on individuals and organizations who use Facebook to spread disinformation.
It remains quite hard to get a handle on the scope of the problem. Many of the so-called “disinformation dozen” that Biden criticized, who were identified in a report by the Center for Countering Digital Hate as super-spreaders of anti-vaccine propaganda, have been banned in some way from one or another of Facebook’s platforms or have gone quiet. Some of the “dozen” have learned how to post in ways that create less risk that Facebook will take action against them.
But in different corners of the never-ending website, there are egregious violations of the Facebook policies that are meant to curb misinformation about the Covid-19 pandemic.
A review of the #VaccinesKill hashtag page on Saturday showed posts from ordinary users with fear-mongering messages about “vaccines literally eating people’s brains” and shadowy forces launching a “population reduction plan.” Other posts warned people against “injecting this software into your system” and said “if you love your children then don’t let them get the jab!”
The hashtag page was not particularly active, but it was clear that some users wanted their Facebook friends to latch onto #VaccinesKill rhetoric, and were using the hashtag accordingly.
Some users attached videos from Fox host Tucker Carlson and InfoWars host Alex Jones. In another case, a user shared an anti-vaccination article from a website that pretended to be an authoritative news source. Some of the posts were accompanied by a Facebook label that pointed people to accurate information about vaccines.
Similar posts full of misinformation were seen on the #VaccinesKill hashtag on Instagram in 2019.
Back then, before the Covid-19 pandemic, CNN Business wrote about the harmful content on Instagram, and Instagram responded by blocking the #VaccinesKill hashtag. It still is blocked there.
But the hashtag remained active on Facebook. “Our process to determine whether a hashtag violates our policies takes several factors into account, including the percentage of content using the hashtag that violates,” a Facebook spokesperson said in response to questions.
“We began blocking the #vaccineskill hashtag on Instagram in 2019 because there was a substantial portion of content with the hashtag that violated our policies. At the time, Facebook content with the hashtag did not reach our threshold to block the hashtag,” the spokesperson said. “Now, the #vaccineskill hashtag on Facebook violates our policies against misinformation about COVID-19 and vaccines and we’ve blocked it from search.”
Facebook said it also removes individual posts with the hashtag “that violate Community Standards when we become aware of them.”
™ & © 2021 Cable News Network, Inc., a WarnerMedia Company. All rights reserved.