Facebook Inc will add 3,000 people to its community operations team around the world to review the millions of reports it receives every week and to speed up the review process. Facebook currently has around 4,500 people on the team that moderates content.
Facebook is stepping up its efforts to remove inappropriate, violent and life-threatening material, including videos of murders and suicides, extremist propaganda and hate speech. The goal of expanding the team is to respond more quickly when someone needs help or when a post should be taken down.
For years now, Facebook has outsourced content moderation to staffing companies whose workers spend all day looking for offensive or illegal content. Facebook declined to comment on where the 3,000 new workers will be based or whether they will be employees or contractors.
In his post, Mark Zuckerberg added:
We’re also building better tools to keep our community safe. We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer.
Previously, Facebook carried out a massive operation against spam and fake accounts, a move the company says has been well received by users.
Facebook had 1.94 billion monthly active users as of the end of March, up 17 percent from a year earlier. Daily active users were 1.28 billion, on average, for the month of March.