TikTok will add information labels to posts about vaccines as UK threatens fines for fake news
- TikTok is cracking down on vaccine misinformation, introducing a new content screen for potentially harmful videos and clarifying its rules worldwide.
- The announcement comes the same day the UK government warned of a "new age of accountability" for social media.
- Social media platforms operating in the UK could be fined 10% of global turnover for not following the rules.
- "It is more important than ever to ensure that misinformation that could harm wider public safety is not allowed to proliferate online," said Kevin Morgan of TikTok.
TikTok has announced a suite of global changes to tackle vaccine misinformation, clarify the app's rules, and introduce a new feature to shield users from questionable but permissible content.
In three announcements, the company said that from 21 December it will roll out information banners on any video posted to TikTok that mentions vaccines, directing users to verifiable sources of information.
Business Insider has previously reported on the rise of vaccine skeptics on social media, and the UK government's concerns about the trend.
As well as the vaccine information banner, TikTok will direct anyone searching for vaccine information on the app to a dedicated in-app hub filled with reputable information from sources including the World Health Organization. The vaccine hub will be accessible from 17 December.
"We know it is more important than ever to ensure that misinformation that could harm wider public safety is not allowed to proliferate online," said Kevin Morgan, TikTok's head of product and process in Europe.
Tech firms face fines of 10% of global turnover for misinformation
TikTok announced the news on the same day that the UK Department for Digital, Culture, Media and Sport unveiled new proposals for its Online Harms Bill, which would enshrine requirements for big tech firms to protect users or face punitive fines.
The proposals include requirements to limit illegal content such as child sex abuse and terrorist material.
One of the proposals in the government's Online Harms Bill addresses vaccine disinformation and misinformation, an area DCMS specifically highlighted: the bill would require social media platforms to enforce "clear terms and conditions which explicitly state how they will handle content which is legal but could cause significant physical or psychological harm to adults."
Companies that do not follow those rules could be fined up to 10% of their annual global turnover, or £18 million (R361 million) - whichever is higher.
Ruth Smeeth, CEO of Index on Censorship, warned the DCMS proposals could limit free speech. "These proposals could establish a different legal framework for speech online as opposed to down the pub, and could have a significant chilling effect on our right to free expression," she said. "After all, no one is going to be fined for deleting too much content."
TikTok's vaccine hub goes live on the same day Theo Bertram, the company's director of government relations and public policy in Europe, speaks in front of the House of Commons Digital, Culture, Media and Sport Committee.
TikTok also boosts support around mental health concerns
At the same time, TikTok has rewritten some of its community guidelines, the rules by which people operate on the app. The company has taken the advice of mental health experts to improve policies on self-harm, suicide and eating disorders.
The change in eating disorder policy comes a day before TikTok appears in front of lawmakers for the Commons Women and Equalities committee about body image.
In the next week, TikTok will also roll out new features to direct people struggling with mental health concerns to support. "Now, if someone searches for terms like 'selfharm' or 'hatemyself' they'll see evidence-based actions they can take," said Cormac Keenan, head of trust and safety at TikTok in Europe. "As before, access to Samaritans is available for emergency support."
Following negative headlines about potentially dangerous challenges proliferating among TikTok's younger users, the app has also bolstered its policy to limit, label or remove content promoting dangerous challenges or acts. "We encourage people to be creative and have fun, but not at the expense of an individual's safety, or the safety of others," said Keenan.
The app is also introducing an entirely new feature, an opt-in viewing screen, which will obscure potentially harmful or contentious videos that do not break TikTok's rules, requiring users to confirm they are willing to see potentially challenging content.
The opt-in screen will appear on top of a video and give users the option to watch it or skip it without seeing it. The screen could appear on some educational videos that feature images of surgery, or other potentially violent content that is not banned by the app.