YouTube, the Great Radicalizer - The New York Times


77 bookmarks. First posted by tfinin march 2018.


It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.
youtube 
5 weeks ago by pcta-ptech
The videos it recommends seem to get more and more extreme.
GuillaumeChaslot  YouTube  YouTube-economy  attention-economy  extremism 
10 weeks ago by amprekord
It found that YouTube often “fed far-right or far-left videos to users who watched relatively mainstream news sources,” and that such extremist tendencies were evident with a wide variety of material. Combine this finding with other research showing that during the 2016 campaign, fake news, which tends toward the outrageous, included much more pro-Trump than pro-Clinton content, and YouTube’s tendency toward the incendiary seems evident.

YouTube has recently come under fire for recommending videos promoting the conspiracy theory that the outspoken survivors of the school shooting in Parkland, Fla., are “crisis actors” masquerading as victims. Jonathan Albright, a researcher at Columbia, recently “seeded” a YouTube account with a search for “crisis actor” and found that following the “up next” recommendations led to a network of some 9,000 videos promoting that and related conspiracy theories, including the claim that the 2012 school shooting in Newtown, Conn., was a hoax.

There is no reason to let a company make so much money while potentially helping to radicalize billions of people, reaping the financial benefits while asking society to bear so many of the costs.
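Albright’s method is essentially a breadth-first crawl of the “up next” graph. A minimal sketch of such a crawl in Python, assuming a hypothetical get_up_next() helper (YouTube exposes no stable public “up next” endpoint, so researchers typically scrape the watch page):

    # Sketch of an Albright-style "up next" crawl. get_up_next() is a
    # hypothetical helper; YouTube has no public "up next" API, so in
    # practice researchers scrape the watch page.
    from collections import deque

    def get_up_next(video_id):
        """Hypothetical: return the 'up next' video IDs for video_id."""
        raise NotImplementedError("scrape or query recommendations here")

    def crawl_up_next(seed_ids, max_videos=9000, fanout=5):
        """Breadth-first walk over the recommendation graph from seeds."""
        seen = set(seed_ids)
        edges = []                      # (source, recommended) pairs
        queue = deque(seed_ids)
        while queue and len(seen) < max_videos:
            vid = queue.popleft()
            for rec in get_up_next(vid)[:fanout]:
                edges.append((vid, rec))
                if rec not in seen:     # enqueue unvisited videos only
                    seen.add(rec)
                    queue.append(rec)
        return seen, edges              # nodes and edges of the network

The node and edge lists are what let a researcher map a “network of some 9,000 videos” rather than a single chain.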
12 weeks ago by sechilds
"What keeps people glued to YouTube? Its algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with — or to incendiary content in general."
youtube  politics  culture  advertising  tech 
march 2018 by evilsofa
" Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century."
googlization  technology  culture 
march 2018 by jbushnell
Opinion | YouTube, the Great Radicalizer via Instapaper http://nyti.ms/2GAxjVd
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an…
instapaper 
march 2018 by patrick
YouTube pushes people toward more extreme content of whatever they're consuming, and this contributes to political radicalization.
youtube  radicalization 
march 2018 by gunsch
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations.

Soon I noticed something peculiar. YouTube started to recommend and “autoplay” videos for me that featured white supremacist rants, Holocaust denials and other disturbing content.

Since I was not in the habit of watching extreme right-wing fare on YouTube, I was curious whether this was an exclusively right-wing phenomenon. So I created another YouTube account and started watching videos of Hillary Clinton and Bernie Sanders, letting YouTube’s recommender algorithm take me wherever it would.

Before long, I was being directed to videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11. As with the Trump videos, YouTube was recommending content that was more and more extreme than the mainstream political fare I had started with.
march 2018 by jbenton
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an…
from instapaper
march 2018 by michaelfox
Recommendation engines deemed dangerous. What about radicalisation, for example? Domestic terror, etc.
ads  ai  radicalisation 
march 2018 by traggett
"YouTube, the Great Radicalizer"
from twitter
march 2018 by peterjblack
the food metaphor comes up again. you could probably switch 'youtube' with 'facebook', 'instagram', 'twitter' and have more or less the same article.
via:popular  social-media 
march 2018 by mozzarella
via Pocket, YouTube, the Great Radicalizer
ifttt  pocket 
march 2018 by snehavii
This is not, by any means, a surprise. But the implications are terrifying. What the hell do we do about this?
march 2018 by miss_s_b
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations. Soon I noticed something peculiar. via Pocket
IFTTT  Pocket 
march 2018 by hansdorsch
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an…
from instapaper
march 2018 by johnrclark
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations. Soon I noticed something peculiar.
article 
march 2018 by mud
YouTube could be one of the most powerful radicalizing tools of this century, given its billion users and algorithms that recommend ever more extreme videos
march 2018 by joeo10
"It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes."
youtube  recsys  google  advertising 
march 2018 by arsyed
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations. Soon I noticed something peculiar.
Archive  Pocket  feedly  ifttt 
march 2018 by brokenrhino
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an…
from instapaper
march 2018 by anglepoised
YouTube may well be operating as a giant radicalizing engine through its recommendation algorithm—leading people down a rabbit hole of misinformation, hoaxes…
from instapaper
march 2018 by davegullett
Google’s cheap and sturdy Chromebook laptops, which now make up more than 50 percent of the pre-college laptop education market in the United States, typically come loaded with ready access to YouTube // you mean, like anything with a web browser. but the point is strong. pure #algokitsch
algokitsch 
march 2018 by yorksranter
As with the Trump videos, YouTube was recommending content that was more and more extreme than the mainstream political fare I had started with.

Intrigued, I experimented with nonpolitical topics. The same basic pattern emerged. Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons.
march 2018 by leolaporte
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an…
from instapaper
march 2018 by AramZS
Optimisation algorithms… /via @charlesarthur. Thinks: learning analytics / personalisation / optimisation…
optimisation  youtube  culture  algorithmics  radicalisation 
march 2018 by psychemedia
"It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes."
video  society  software  psychology 
march 2018 by kevan
Opinion | YouTube, the Great Radicalizer via Instapaper http://ift.tt/2GcrjBO
IFTTT  Instapaper  Archive  Article 
march 2018 by TypingPixels
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an…
from instapaper
march 2018 by mikerugnetta
YouTube, the Great Radicalizer
from twitter
march 2018 by hawaii
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations. Soon I noticed something peculiar. via Pocket
pocket  favorites 
march 2018 by bschlagel
"It is also possible that YouTube’s recommender algorithm has a bias toward inflammatory content. In the run-up to the 2016 election, Mr. Chaslot created a program to keep track of YouTube’s most recommended videos as well as its patterns of recommendations. He discovered that whether you started with a pro-Clinton or pro-Trump video on YouTube, you were many times more likely to end up with a pro-Trump video recommended.

Combine this finding with other research showing that during the 2016 campaign, fake news, which tends toward the outrageous, included much more pro-Trump than pro-Clinton content, and YouTube’s tendency toward the incendiary seems evident."
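Chaslot’s tracker amounts to following recommendation chains from seed videos and counting what turns up; he later published results of this kind at algotransparency.org. A minimal sketch of the counting idea in Python (not his actual code), reusing the hypothetical get_up_next() helper from the crawl sketch above:

    # Follow recommendation chains from seed videos and count how often
    # each video is recommended. Not Chaslot's actual code; get_up_next()
    # is a hypothetical helper, as in the crawl sketch above.
    from collections import Counter

    def get_up_next(video_id):
        """Hypothetical: return the 'up next' video IDs for video_id."""
        raise NotImplementedError

    def most_recommended(seed_ids, depth=4, branch=3):
        """Count video occurrences along recommendation chains."""
        counts = Counter()
        frontier = list(seed_ids)
        for _ in range(depth):
            next_frontier = []
            for vid in frontier:
                recs = get_up_next(vid)[:branch]
                counts.update(recs)         # tally every recommendation
                next_frontier.extend(recs)  # and follow it one level deeper
            frontier = next_frontier
        return counts.most_common(20)       # the most-amplified videos

Run it once with pro-Clinton seeds and once with pro-Trump seeds; Chaslot’s finding was that the top of both lists pointed the same way.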
recommendation  algorithms  google  youtube  politics 
march 2018 by ssam
It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.
youtube  culture  politics  radicalization  crazy  machine-learning  google  zeynep-tufekci 
march 2018 by jm
Zeynep Tufekci watched some Trump videos on YouTube in 2016, and found it recommended more and more right-wing content:
Since I was not in the habit of watching extreme right-wing fare on YouTube, I was curious whether this was an exclusively right-wing phenomenon. So I created another YouTube account and started watching videos of Hillary Clinton and Bernie Sanders, letting YouTube’s recommender algorithm take me wherever it would.

Before long, I was being directed to videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11. As with the Trump videos, YouTube was recommending content that was more and more extreme than the mainstream political fare I had started with.

Intrigued, I experimented with nonpolitical topics. The same basic pattern emerged. Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons.

It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.

This is not because a cabal of YouTube engineers is plotting to drive the world off a cliff. A more likely explanation has to do with the nexus of artificial intelligence and Google’s business model. (YouTube is owned by Google.) For all its lofty rhetoric, Google is an advertising broker, selling our attention to companies that will pay for it. The longer people stay on YouTube, the more money Google makes.

What keeps people glued to YouTube? Its algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with — or to incendiary content in general.


She compares it to how we feast on fatty foods - driven by our evolutionary instincts, which lead us astray when such foods aren't rare but instead are plentiful.

The question now is, will YouTube accept this, and fix it?
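The mechanism in the quoted passage is easy to caricature in code. Here is a toy Python simulation under the article’s one behavioral assumption, that somewhat-more-intense content holds attention longer; the model and numbers are invented for illustration and are not YouTube’s:

    # A recommender that greedily maximizes predicted watch time. The one
    # behavioral assumption, taken from the article, is that viewers watch
    # longer when content is a bit more intense than what they have now.
    import random

    random.seed(0)
    catalog = [random.random() for _ in range(1000)]  # each video = an "intensity"

    def predicted_watch_time(intensity, current):
        # Invented toy model: baseline watch time plus a bonus for
        # exceeding the viewer's current intensity level.
        return 1.0 + max(0.0, intensity - current)

    current = 0.1                     # viewer starts with mild content
    for step in range(5):
        candidates = random.sample(catalog, 20)
        best = max(candidates, key=lambda c: predicted_watch_time(c, current))
        current = best                # the pick becomes the new baseline
        print(f"step {step}: recommended intensity = {current:.2f}")

Each pick raises the baseline the next recommendation has to beat, so a watch-time maximizer ratchets toward the extreme with no engineer intending it.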
youtube  culture  radicalisation  extremism 
march 2018 by charlesarthur
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an…
from instapaper
march 2018 by nsfmc
YouTube may well be operating as a giant radicalizing engine through its recommendation algorithm—leading people down a rabbit hole of misinformation, hoaxes…
from instapaper
march 2018 by mathewi
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an…
from instapaper
march 2018 by hustwj
This state of affairs is unacceptable but not inevitable. There is no reason to let a company make so much money while potentially helping to radicalize billions of people, reaping the financial benefits while asking society to bear so many of the costs.
march 2018 by hakan
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations. Soon I noticed something peculiar. via Pocket
IFTTT  Pocket 
march 2018 by schmitz
RT : This is all so broken, unbelievable.
from twitter
march 2018 by diaeter
What we are witnessing is the computational exploitation of a natural human desire: to look “behind the curtain,” to dig deeper into something that engages us. As we click and click, we are carried along by the exciting sensation of uncovering more secrets and deeper truths. YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.
google  marketing  ethics  business 
march 2018 by jasonsamuels
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an…
from instapaper
march 2018 by yudha87
What we are witnessing is the computational exploitation of a natural human desire: to look “behind the curtain,” to dig deeper into something that engages us. As we click and click, we are carried along by the exciting sensation of uncovering more secrets and deeper truths. YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.
fakenews  conspiracy  politics  media  culture 
march 2018 by craniac
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an…
from instapaper
march 2018 by indirect
"At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations.

Soon I noticed something peculiar. YouTube started to recommend and “autoplay” videos for me that featured white supremacist rants, Holocaust denials and other disturbing content"
politics 
march 2018 by Zero_Dogg
Favorite tweet:

YouTube may well be operating as a giant radicalizing engine through its recommendation algorithm—leading people down a rabbit hole of misinformation, hoaxes and incendiary content. My latest for the New York Times on one of the most overlooked issues. https://t.co/nWIBXEr71P pic.twitter.com/IslV5jshcL

— zeynep tufekci (@zeynep) March 10, 2018
TwitterFav 
march 2018 by justinsincl
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an…
from instapaper
march 2018 by kohlmannj
“It seems as if you are never ‘hard core’ enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes.… Its algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with — or to incendiary content in general.”
media  politics  culture  nyt 
march 2018 by syskill
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an…
from instapaper
march 2018 by spinnerin