Facebook's policies for moderating hate speech were exposed Wednesday in what appears to be the most in-depth investigation into how and why these decisions are made, ProPublica reported.
One of the most cringeworthy parts shows a slide asking, "Which of the below subsets do we protect?" The options listed are female drivers, black children, and white men. The section for white men shows a picture of the Backstreet Boys.
The answer? White men.
The policies, according to ProPublica, instructed moderators to delete hate speech against white men because they fall under a so-called "protected category," while the other two groups in the above slide were in "subset categories," and therefore attacks on them were allowed.
Looks bad -- quite bad.
Facebook's moderation policies form a complicated system, in which these "protected categories" are based on race, sex, gender identity, religious affiliation, national origin, ethnicity, sexual orientation, and serious disability/disease, according to ProPublica. Meanwhile, black children don't count as a protected category because Facebook does not protect age, and female drivers don't count because Facebook does not protect occupation.
According to Facebook, its policies aren't perfect.
“The policies do not always lead to perfect outcomes,” Monika Bickert, head of global policy management at Facebook, told ProPublica. “That is the reality of having policies that apply to a global community where people around the world are going to have very different ideas about what is OK to share.”
That's similar to the excuse Facebook put forth in a blog post Tuesday as part of its "Hard Questions" series. When asked about the ProPublica report, Facebook pointed Mashable to that blog post.
The good news is that Facebook is trying to improve. That includes being more transparent about its practices, though only after reports like ProPublica's and The Guardian's recent "Facebook Files" series.
At least Facebook has come a long way. ProPublica revealed that back in 2008, when the social network was four years old, its censorship rulebook was only a single page, with one glaring catch-all rule:
"At the bottom of the page it said, ‘Take down anything else that makes you feel uncomfortable,’" Dave Willner, who joined Facebook's content team in 2008, told ProPublica.
Willner then worked to create a 15,000-word rulebook, which is still partly in use at the company today. And yet there remain many problematic areas in how the network polices itself. The Guardian's Facebook Files revealed numerous issues, like how Facebook allows bullying on the site, along with other gray areas.