Facebook can now remove accounts for interacting with violating content

Facebook ▪ Community Guidelines ▪ September 20, 2025

By Saumyaa Naidu, a third-party contributor

Facebook updated the human exploitation provision in its Community Guidelines to allow the removal of accounts based on their interactions with violating accounts, content, and groups. Under the policy, violating accounts are those posting content that “recruits people for, facilitates, or exploits people through human trafficking”. Facebook will treat interactions with such accounts as “behavioral signals” to detect and take action on violating accounts.

This section of the policy no longer specifies removing content that offers jobs in areas flagged as high-risk for labor exploitation by law enforcement and local NGOs. However, such removal is still listed among the standards that require additional information and/or context to enforce.

With this change, Facebook treats interactions with violating accounts on the platform as signals of harmful activity and may not require additional information to take action. This could result in overreaching enforcement and the unwarranted removal of accounts based on association rather than any direct violation.