Facebook’s Technocratic Reports Hide Its Failures on Abuse

These reports obscure a torrent of hate speech and other toxic content

Jada Gomez
Published in Momentum
Sep 1, 2020

“In other words, there’s too much toxic content for Facebook to ever really see it all (hello, scale) so Facebook has a formula for determining the amount based on random sampling. Abstractly, perhaps everyone understands this, but to think about it another way: There’s so much sewage on the platform that the company must continually guess how much of it people are actually seeing.” — Chris Gilliard