While the number sounds huge, 113 million is just 1% of the total videos published on the platform during that three-month period, The Verge reported.
The most common policy violation leading to removals was minor safety, which accounted for nearly 44% of the content taken down. Other reasons included nudity and illegal activities.

The platform's automated systems are removing more and more videos: over 95% of the videos taken down in those three months were flagged by the company's AI and human review teams rather than reported by a user.

“We expanded our capacity to iterate rapidly on our systems given the fast-changing nature of misinformation, especially during a crisis or event (e.g. the war in Ukraine or an election),” the report said.