Facebook’s Technocratic Reports Hide Its Failures on Abuse
These reports obscure a torrent of hate speech and other toxic content
“In other words, there’s too much toxic content for Facebook to ever really see it all (hello, scale) so Facebook has a formula for determining the amount based on random sampling. Abstractly, perhaps everyone understands this, but to think about it another way: There’s so much sewage on the platform that the company must continually guess how much of it people are actually seeing.” — Chris Gilliard
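To make the sampling idea concrete, here is a minimal sketch (emphatically not Facebook's actual pipeline, whose details are not public) of how a "prevalence" figure can be estimated: draw a random sample of content views, label each sampled view as violating or not, and report the sample proportion with a confidence interval. The view list, sample size, and labeling function below are all hypothetical.

```python
import math
import random

def estimate_prevalence(views, sample_size, is_violating, z=1.96):
    """Estimate the share of views that landed on violating content
    from a simple random sample, with a normal-approximation
    confidence interval (z=1.96 gives roughly a 95% interval).

    views        -- list of content IDs, one entry per view
    sample_size  -- number of views to sample and label
    is_violating -- function mapping a content ID to True/False
                    (in practice, a human reviewer or classifier)
    """
    sample = random.sample(views, sample_size)
    hits = sum(1 for view in sample if is_violating(view))
    p = hits / sample_size
    margin = z * math.sqrt(p * (1 - p) / sample_size)
    return p, max(0.0, p - margin), min(1.0, p + margin)

# Hypothetical example: one million views, 0.5% of which hit toxic content.
views = ["toxic"] * 5_000 + ["ok"] * 995_000
p, lo, hi = estimate_prevalence(views, 10_000, lambda v: v == "toxic")
print(f"estimated prevalence: {p:.4%} (95% CI {lo:.4%} to {hi:.4%})")
```

The point of the sketch is the one Gilliard makes: the reported number is a statistical estimate built from a sample, not a count of everything users actually saw.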