YouTube has, for the first time, revealed the scale of the task it faces in removing videos that violate its terms.
The Google-owned platform pulled down 8.3 million videos between October and December 2017, more than half of them flagged as spam or sexual content.
In a blog detailing the work it's doing to enforce its community guidelines, YouTube said the removed videos represented a "fraction of a percent of YouTube’s total views" during the final three months of last year.
The company added that 6.7 million of the problem videos were rooted out by machines rather than humans. The rest were identified by trusted flaggers, users, non-governmental organisations, and government agencies.
"Automated flagging enables us to act more quickly and accurately to enforce our policies," YouTube said, adding that since introducing the system, more than half of videos removed for violent extremism had fewer than 10 views.
YouTube is under huge pressure to stay on top of inappropriate content, having been criticised for carrying videos that promote terrorism and child abuse. Inappropriate material has also surfaced on YouTube Kids, including conspiracy-theory videos aimed at children.