Facebook will ramp up efforts to curb coordinated activity by real users tied to real-world harms, such as promoting vaccine misinformation and organizing violence, the company said Thursday.
The new policy is an attempt to address a gap in the platform’s enforcement against real individuals who band together to repeatedly violate the platform’s standards. The plan is based on Facebook’s existing efforts to scrub its platform of fake accounts.
“From a security perspective, our goal is to borrow from the cybersecurity world and build an in-depth approach here, where we have multiple layers to catch violating activity that can cause harm to people on our platform,” Nathaniel Gleicher, Facebook’s head of security policy, said Thursday in a call with reporters.
Facebook will take a range of actions against violating accounts, from reducing the reach of their content to disabling the accounts entirely.
The new policies build on Facebook’s work cracking down on coordinated inauthentic behavior, such as the tactics employed by Russia’s Internet Research Agency in 2016. Facebook has since removed thousands of fake pages and inauthentic accounts tied to political operations in nations including Egypt and Iran, as well as hundreds of accounts linked to a conservative organization.