YouTube announced on Monday that it took down more than 8 million videos that violated its guidelines over a span of three months, between October and December. The company said most of the videos were spam or adult content that people had tried to upload.

"This regular update will help show the progress we're making in removing violative content from our platform," Yourube said in their blog post.

Of those 8 million videos, 6.7 million were flagged by YouTube's algorithms rather than by humans, and three-quarters of them didn't receive any views before being taken down.

YouTube has more than a billion users (though it's unclear how many of them are unique, since many people have multiple accounts), and roughly a billion hours of video are watched on the platform every day.

To address the issue of problematic content, Google, YouTube's parent company, has pledged to hire 10,000 new employees by the end of the year.

"YouTube and Google are facing increasing pressures to step up their screening and flagging efforts around inappropriate content. Transparency is key on this hot button issue and these quarterly blog posts are a sign that Google is aggressively focused on this area over the coming years," head of technology research at GBH Insights, Daniel Ives, said.