YouTube will remove videos that contain false information about coronavirus vaccines. The platform will ban any related content that contradicts guidance from health authorities or the World Health Organization (WHO), Reuters reported. YouTube is expanding its rules to prevent the spread of misinformation and conspiracy theories about the coronavirus pandemic.
For example, moderators will delete videos claiming that the vaccine kills, causes infertility, or that microchips will be implanted in people along with the vaccination.
“A COVID-19 vaccine may be (developed) very soon, so we want to make sure we have the right rules in place to remove misinformation about such a vaccine from the platform,” a YouTube spokesperson told The Guardian.
The platform says it is already removing videos that deny coronavirus transmission, promote unscientific treatments, discourage people from seeking medical care, or dispute health authorities’ recommendations on isolation or physical distancing. Since the beginning of February, YouTube has removed over 200,000 videos containing misinformation about COVID-19.
YouTube follows the social network Facebook, which announced on Tuesday that it will begin rejecting advertisements intended to discourage people from getting vaccinated. However, a number of commentators have pointed out that lies about vaccination and other medical misinformation spread widely through groups and pages that are not covered by the new ban.
Facebook has previously begun reducing the visibility of disinformation pages and does not allow ads containing hoaxes flagged by the WHO or the US Centers for Disease Control and Prevention.
Disinformation is often disseminated by users themselves
Hoaxes and misinformation about the coronavirus also spread frequently through social networks such as Facebook and Twitter. Although the Internet giants are trying to combat this, their work is made harder by the users themselves.
Users often spread fabricated news simply because they believe what they read on the Internet. It is not unusual for such posts to reach tens or hundreds of thousands of shares. Unwittingly, these users help the creators of disinformation earn a tidy sum of money.
Stories with this kind of global reach are a gold mine for disinformation creators. They exploit people’s fear and panic to drive them to their websites, where they profit from the advertising displayed, sometimes earning several million dollars.