Here's a study that backs up what many of us have already experienced on YouTube.
The streaming video company's recommendation algorithm can sometimes send you on an hours-long video binge so captivating that you never notice the time passing. But according to a study from the software nonprofit Mozilla Foundation, trusting the algorithm means you're actually more likely to be shown videos featuring sexualized content and false claims than content matched to your personal interests.
In a study with more than 37,000 volunteers, Mozilla found that 71 percent of the videos participants flagged as objectionable had been recommended to them by YouTube's algorithm. The volunteers used a browser extension to track their YouTube usage over 10 months, and when they flagged a video as problematic, the extension recorded whether they had come across it via YouTube's recommendations or on their own.
The study called these problematic videos "YouTube Regrets," a label for any regrettable viewing experience had via YouTube. Such Regrets included videos "championing pseudo-science, promoting 9/11 conspiracies, showcasing mistreated animals, [and] encouraging white supremacy." One girl's parents told Mozilla that their 10-year-old daughter fell down a rabbit hole of extreme dieting videos while searching for dance content, leading her to restrict her own eating habits.

What causes these videos to get recommended is their ability to go viral. If videos with potentially harmful content manage to accrue thousands or millions of views, the recommendation algorithm may circulate them to users rather than focusing on those users' personal interests.
YouTube removed 200 videos flagged through the study, and a spokesperson told the Wall Street Journal that "the company has reduced recommendations of content it defines as harmful to below 1% of videos viewed." The spokesperson also said that YouTube has launched 30 changes over the past year to address the issue, and that its automated system now detects and removes 94 percent of videos that violate YouTube's policies before they reach 10 views.
While it's easy to agree on removing videos featuring violence or racism, YouTube faces the same misinformation policing struggles as many other social media sites. It previously removed QAnon conspiracies that it deemed capable of causing real-world harm, but plenty of similar-minded videos slip through the cracks by arguing free speech or claiming entertainment purposes only.
YouTube also declines to make public any information about how exactly the recommendation algorithm works, calling it proprietary. Because of this, it's impossible for us as consumers to know whether the company is really doing all it can to combat such videos circulating via the algorithm.
While 30 changes over the past year is an admirable effort, if YouTube really wants to eliminate harmful videos on its platform, letting its users plainly see those efforts would be a good first step toward meaningful action.