For years, misinformation was tolerated on platforms such as Facebook. Since the beginning of the coronavirus crisis, the rules have been tightened somewhat. But many distributors of misleading content are now so well established that they still reach their followers.
The US vaccination campaign is in full swing. At the same time, many social media users are being flooded with messages designed to undermine trust in the vaccines being used. The operators of the large platforms insist they are taking decisive action against such propaganda. Yet a real solution to the problem seems almost out of reach – partly because, for commercial reasons, the internet companies let things slide for a long time.
Digital networks such as Facebook, Instagram and Twitter have served as effective mouthpieces for anti-vaccination activists for years. Under growing public pressure, the Silicon Valley companies have now introduced tools such as fact checks and warning labels. A small share of the false information in circulation is identified and, where necessary, blocked – but only a small share.
The short message service Twitter announced this month that it would remove dangerous falsehoods about the coronavirus vaccines – much as it has done with conspiracy theories and other misleading claims connected to the pandemic. In total, 8,400 such tweets have been deleted since April 2020.
Vaccination opponents sidestep the hurdles
But in the view of critics, this is far too little given the vast amount of fake news that reaches millions of users via Twitter every day. “While they remain inactive, lives are lost,” says Imran Ahmed, head of the Center for Countering Digital Hate, which campaigns against hatred and fake news on the internet. In December, Ahmed and his colleagues found that distributors of “anti-vaccination propaganda” on the major social media platforms had a combined 59 million followers.
Because of the great social importance of vaccinating the population against the coronavirus, the problem has come into sharper focus. But when what was previously tolerated is suddenly restricted, accusations of censorship often follow. At the same time, vaccination opponents are increasingly finding ways to circumvent the new hurdles.
“It’s a difficult situation because we’ve let it go for so long,” says Jeanine Guidry, who studies the interaction between social media and health information at Virginia Commonwealth University. “People who use social media have been able to share pretty much whatever they want for almost a decade.”
Unsubstantiated claims
One of the Facebook pages that the organization NewsGuard has identified as a “superspreader” is called “The Truth About Cancer”. It has more than a million followers and has been making unfounded claims for years – for example, that vaccines could cause autism or brain damage in children. Recently, however, posts have no longer been published directly on the Facebook page. Instead, users are asked to subscribe to a newsletter or visit an associated website.
According to its own statements, Facebook is now taking “aggressive steps to combat false information”. “Millions of posts about Covid-19 and vaccines” have been removed since the beginning of the pandemic, the company said in a statement. In addition, 167 million posts have been given warning labels. The company has also banned ads aimed at discouraging people from getting vaccinated.
The video platform YouTube says it has deleted more than 30,000 videos since announcing a ban on false claims about coronavirus vaccinations in October. Since February 2020, more than 800,000 videos containing dangerous or misleading information about the virus have been removed, says YouTube spokeswoman Elena Hernandez.
WHO falls on deaf ears
Before the pandemic began, however, the social media companies had done little to counter the spread of misinformation, says Andy Pattison, manager for digital solutions at the World Health Organization. During a measles outbreak in the northwestern United States in 2019, he urged the platforms to consider tightening their rules – to no avail.
The scale of the coronavirus crisis has at least set things in motion. Pattison now meets weekly with representatives of Facebook, Twitter and YouTube to discuss trends and possible measures. “The really frustrating thing about misinformation about vaccines is that it has been around for years,” says the WHO expert.
And even with the stricter rules, the actors behind it are difficult to stop. Many simply adapt their strategies. To evade automatic blocking, they deliberately use misspellings – instead of “vaccine”, for example, “vackseen” or “v@x”. Some pages have switched to subtler forms of persuasion, spreading images or so-called memes instead.
Misinformation as an obstacle
A survey by AP-NORC in February showed that around a third of Americans do not want to be vaccinated, or probably do not. “Vaccine skepticism and misinformation could be a major obstacle to vaccinating a sufficient proportion of the population to end the crisis,” says Vanderbilt University psychologist Lisa Fazio.
Some experts also see a root of the problem in the internet companies’ business models, because their advertising revenue depends heavily on the total number of users – even when some of those users share questionable content. Ahmed, the head of the Center for Countering Digital Hate, complains of a “seamless mixing of misinformation and information”, by which the platforms “continue to let things slide”.