Meta revealed on Thursday that it had recently taken down a network of accounts linked to the Proud Boys, the violent extremist group, after it found members logging back into Facebook and Instagram. The company says it has removed about 480 Proud Boys accounts, pages, groups and events through a strategy it calls "strategic network disruption," which essentially neutralizes a banned group's web of accounts in a targeted, simultaneous sweep.
By using this tactic, Meta says it is able to act effectively against dangerous organizations such as hate and terror groups that seek to maintain a foothold on the platform, reducing the chances that these accounts will coordinate and re-emerge.
"While there is no silver bullet here, our approach is having an impact on these dangerous organizations and we can see adversaries try harder to hide their affiliation and change their tactics," Meta Head of Counter-Terrorism Policy Dina Hussain wrote on Twitter. "We will continue to be vigilant and share our findings."
In addition to this specific targeted enforcement, Meta says it has also removed 750 other Proud Boys-related accounts, groups, pages and events during its normal moderation efforts in 2022 so far. Some of that activity involved Proud Boys members directing Facebook users to other platforms where the organization is not banned, though Meta declined to name those services.
Facebook banned the Proud Boys in October 2018, following Twitter's decision to do the same that August, designating the group as a dangerous hate organization under the platform's rules. Ahead of the ban, TechCrunch explored how the Proud Boys used Facebook as a key recruitment hub, operating a national network of well-organized chapters that expanded its ranks through Facebook groups and algorithmic recommendations.
While the Proud Boys were once out and proud on Facebook, their efforts to rebuild a presence there are now far more covert. That includes disguising members' affiliations, promoting fake groups, and pushing more benign content that avoids overtly extremist messaging.
Meta doesn't always disclose the actions it takes against extremists and hate groups, especially when those actions are part of ongoing efforts. On Twitter, Hussain explained that the company chose to share its recent enforcement against the Proud Boys to "highlight the hostile mutations we're seeing" among banned groups making steady efforts to claw their way back onto the platform.
Meta's approach to extremism has evolved significantly since the online heyday of the Proud Boys, QAnon conspiracists, and the countless violent anti-government militias that once organized openly on Facebook and Instagram. Meta is now applying lessons learned through its more traditional, long-standing counterterrorism efforts as well as its more recently developed strategies for what it calls "coordinated inauthentic behavior," meaning influence operations such as disinformation and propaganda campaigns, which are often tied to authoritarian governments.
The violent far-right organization, known for inciting street fights in left-leaning American cities during the Trump era, is now at the center of the investigation into the Jan. 6 Capitol attack. This June, the Department of Justice charged five members, including former Proud Boys leader Henry "Enrique" Tarrio, with seditious conspiracy for their alleged role in planning and participating in the attack.