YouTube has removed more than 30,000 videos containing misinformation about the coronavirus vaccine, the company said Thursday.
The videos “included claims about COVID-19 vaccinations that contradict local health authorities or the World Health Organization,” said Elena Hernandez, a YouTube spokesperson. “Overall, since February 2020, we have removed over 800,000 videos related to dangerous or misleading coronavirus information.”
YouTube’s policy against Covid vaccine misinformation dates to October of last year, when the company announced that false claims, such as that the vaccine is lethal or will lead to microchip tracking, would be removed.
YouTube has come under increasing scrutiny for the way its recommendation engine can lead unsuspecting users down extremist rabbit holes and spread misleading claims. Earlier this month, CEO Susan Wojcicki said former President Donald Trump’s account, which was suspended for incitement, will eventually be restored.
YouTube has removed more than 30,000 misleading Covid-19 vaccination videos in the past five months, it said.
A YouTube spokeswoman said the videos contradicted vaccine information from the World Health Organization (WHO) or health authorities such as the NHS.
In October, it banned vaccine misinformation in a bid to clamp down on attempts to discredit the jabs.
It added that in the past year, it had removed more than 800,000 videos for coronavirus misinformation.
That figure covers not only vaccine misinformation but wider "medically unsubstantiated" claims about the virus.
The ban covers false claims that the vaccine kills people, causes infertility, or contains a secret microchip that will be implanted into recipients.
In the early stages of the pandemic, YouTube was home to many conspiracy theories about the disease and even false claims of non-existent "cures".
Despite its ban on such content, finding and deleting it remains a struggle for YouTube and other social platforms.
Analysis by Marianna Spring, disinformation and social media reporter:
A universal criticism of social media sites throughout the pandemic has been the slow speed at which they have acted against harmful disinformation.
In recent months, attention has turned to how much they have allowed falsehoods about the vaccine to proliferate on their platforms.
YouTube has generally been ahead of the curve when it comes to introducing policies to tackle this.
But I have investigated the impact of anti-vaccine content online for a number of months, and there is no doubt that it had already scared people off the jab long before this announcement.
To make matters worse, the minority of committed activists who spread harmful anti-vaccine content online are using increasingly sophisticated tactics that pose problems for a video platform such as YouTube.
My recent investigation for BBC Panorama revealed how videos featuring people who brandish medical credentials to promote false vaccine claims are very effective at playing on pre-existing concerns.
These kinds of videos have thrived on platforms such as YouTube - and a number of the main culprits still use YouTube to cultivate an audience of hundreds of thousands of subscribers.