Facebook has said it will start removing false claims about new coronavirus vaccines after it was revealed jabs could be rolled out as soon as next week.
The social media site will remove disinformation – including claims vaccines contain microchips or anything else not on the official ingredient list – but warned it will “not be able to start enforcing these policies overnight”.
In October, the company announced it would ban ads that discourage people from getting vaccines; that ban will now also cover the new Covid-19 vaccines.
A Facebook spokesperson said: “We are applying our policy to remove misinformation about the virus that could lead to imminent physical harm.
“This could include false claims about the safety, efficacy, ingredients or side effects of the vaccines.
“For example, we will remove false claims that Covid-19 vaccines contain microchips or anything else that isn’t on the official vaccine ingredient list.
“We will also remove conspiracy theories about Covid-19 vaccines that we know today are false, like specific populations are being used without their consent to test the vaccine’s safety.”
After Wednesday’s Pfizer/BioNTech vaccine announcement, disinformation surged online, with widely debunked claims from opponents of vaccination posted across various social media platforms.
Full Fact, an independent fact-checking charity, has been working with Facebook to tackle disinformation.
Speaking on Wednesday, editor Tom Phillips told the PA news agency: “We have seen a lot of the internet platforms take stricter measures against vaccine misinformation and I think that is the correct approach. Could some of them go further? Yes, possibly.
“But at the same time, it is important to remember the importance of free speech. It’s not illegitimate to have questions or worries about the vaccine and it’s important that we don’t just react by trying to suppress those questions. We allow people to ask the questions, get good quality answers and make up their minds based on good quality information.”
The site said it will continue to update the list of claims it removes regularly, based on current guidance from public health authorities.
Between March and October, Facebook and Instagram removed 12 million pieces of misinformation related to Covid-19 and put warning labels on a further 167 million pieces of content. In April alone, warning labels were applied to about 50 million pieces of content, and 95% of people who saw a label did not click past it to view the content.
The spokesperson added: “We have directed over two billion people globally to authoritative information from public health authorities such as the WHO (World Health Organisation) and in the UK the NHS, and we will continue to help people stay informed about these vaccines by promoting authoritative sources of information through Facebook’s Covid-19 Information Centre.”
Caroline Dinenage, the Minister for Digital and Culture, said: “Quack pseudo-science and conspiracy theories spread through malice or ignorance on social media could put British lives at risk and I’m glad to see Facebook acknowledging the severity of this challenge.
“We’ll be closely monitoring the success of these measures and have stepped up our co-operation with Facebook, Twitter, Google and other platforms to remove dangerous disinformation on their sites and promote the truth about vaccines during this crucial time.”
Additional reporting by the Press Association