Facebook Community Guidelines
Publish date: May 15, 2022


[ Part III, Section 2.1 ] Community participation to ensure model diversity in content moderation

Facebook’s content moderation is currently performed by a diverse jury of 97,357 Machine Learning models. Neurodiversity among these neural networks has improved decision-making across a wider variety of content, and the jury process is particularly well suited to edge cases where a typical individual model fails to make a reasonable decision.

Facebook encourages users to submit their own Machine Learning models to the jury; currently, over 80% of the jury comprises models selected from Facebook’s user community. Guidelines on model parameters and training data are provided in Appendix E of this document. The final outcome of a jury proceeding is determined through an automated democratic process. When content passes through moderation filters, each model casts a single vote assigning the content to a category. Votes are tallied per category and the category with a simple majority wins. In the case of a tie, a human moderator is alerted to make the final decision.
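
As a rough illustration of the tallying step described above, the following Python sketch counts one vote per jury model and returns the winning category. The function name tally_jury_votes, its list-of-labels input, and the reading of "simple majority" as the category with the most votes are assumptions made for illustration; a tie between the leading categories is returned as None to signal escalation to a human moderator.

from collections import Counter

def tally_jury_votes(votes):
    # votes: one category label per jury model (hypothetical representation;
    # the actual jury interface is not specified in this document).
    counts = Counter(votes)
    ranked = counts.most_common()
    # A tie between the top categories signals escalation to a human moderator.
    if len(ranked) > 1 and ranked[0][1] == ranked[1][1]:
        return None
    return ranked[0][0]

# Example: three models vote on a piece of content.
decision = tally_jury_votes(["spam", "allowed", "spam"])
if decision is None:
    print("Tie: alerting a human moderator for the final decision")
else:
    print("Jury decision:", decision)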
