Facebook has revised its policies on hate speech to cut down on a growing number of offensive and insensitive comments shared on its social network.
The changes include a review and update of the guidelines used by the User Operations team, which investigates and acts on violations of the Community Standards policy, along with refreshed training for that team. Facebook will also seek feedback from legal experts and representatives of groups that have historically faced discrimination.
Facebook will also be cracking down on “cruel or insensitive” content, holding the creators of such content to greater accountability than before. It said a new feature, which requires the authors of such content to link their authentic identity to it, has been in testing for several months now.
Facebook will also open a more formal line of communication with opponents of hate speech, inviting members of various communities to participate so that content believed to violate the site's policies can be flagged and investigated.
Finally, Facebook plans to conduct research on the effects of online hate speech and plans to continue to work with the Anti-Defamation League's Anti-Cyberhate working group to better understand how to balance removal of offensive material with maintenance of free speech.
The move follows complaints made primarily by Women, Action and the Media and the Everyday Sexism Project over images and content inciting gender-based violence and hate, but Facebook also acknowledged complaints made by representatives of the Jewish, Muslim and LGBT communities.