Facebook released its monthly compliance report under the new IT rules on Friday. The social media giant removed 31.83 million content pieces between August 1 and August 31. The number is slightly lower than the 33.3 million removed in the previous report, though that figure covered a 45-day period.

Across its ten categories, Facebook actioned 25.9 million posts reported as spam, 2.6 million posts with violent and graphic content, and 2 million posts containing adult nudity and sexual activity.

Instagram

During the same period, Facebook’s subsidiary Instagram actioned 2.2 million posts, including 885,700 pieces of violent and graphic content, 577,000 suicide and self-injury posts, and 462,400 posts with adult nudity and sexual activity. Content was actioned across nine categories here, excluding spam.

“Given that such violations are also highly adversarial, country-level data may be less reliable. For example, bad actors may often try to avoid detection by our systems by masking the country they are coming from. While our enforcement systems are global and will try to account for such behaviour, this makes it very difficult to attribute and report the accounts or content by producer country (where the person who posted content was located),” the company stated in the report.

It added, “Given the global nature of our platforms where content posted in one country may be viewed almost anywhere across the world, other ways to attribute the country of content removed in a technically feasible and repeatable manner, become almost meaningless. So these estimates should be understood as directional best estimates of the metrics.”
