To curb violent content on its video platform, Facebook plans to hire an additional 3,000 reviewers to filter content. These reviewers will help remove violent material amid growing criticism; the social media giant has been accused of allowing its platform to be used to promote violence and hateful activities.
“If we’re going to build a safe community, we need to respond quickly,” chief executive Mark Zuckerberg said on his Facebook page. Recently, a 20-year-old Thai man broadcast a live video of himself killing his baby daughter before committing suicide.
Facebook removed the footage within an hour of the attack, and Zuckerberg acknowledged that the company should prevent such content from spreading.
“We’re working to make these videos easier to report so we can take the right action sooner – whether that’s responding quickly when someone needs help or taking a post down,” Zuckerberg said.
The 3,000 new reviewers, to be added over the next year, will expand Facebook’s community operations team. The additional staff will help remove content such as hate speech and child exploitation, which Facebook does not allow on its platform. The company has also been blamed for reacting slowly to online violence.
Zuckerberg said Facebook is working to better identify violent and inappropriate content. The company also recently stepped up its security to counter efforts by governments and others to spread misinformation or manipulate discussions for political reasons.
Facebook has also rolled out a new tool to combat revenge porn across its platforms, including Messenger and Instagram.