Following a series of suicides and murders that were streamed live or hosted on Facebook for hours before being taken down, Mark Zuckerberg has announced that the company will add 3,000 moderators to its global community operations team over the next year. That will bring the department to a total of 7,500 people, dedicated to reviewing “the millions of reports we get every week, and improving the process for doing it quickly.”
Zuckerberg wrote that these reviewers will “help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation,” and that the social network will continue working with law enforcement and local community groups who “are in the best position to help someone if they need it.”
In addition, Facebook will make it simpler for members to report problems and will speed up the process its reviewers use to determine which posts violate community standards. The company previously made its suicide-prevention tools available to all users and developed an AI system to identify potentially suicidal people.
One of the biggest criticisms of Facebook in the recent incidents has been its slow response to problematic content on its video platform. Zuckerberg appears to acknowledge that issue in his post, stating: “If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner, whether that’s responding quickly when someone needs help or taking a post down.”
He added: “In addition to investing in more people, we’re also building better tools to keep our community safe. We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer.”