FilterBubble   518


Digitaler Tribalismus und Fake News | ctrl+verlust
In this investigation we will show how fake news actually spreads to the left and right of the political spectrum, and how social structure formation online can lead to hermetic groups. We will also show what the theory of the "filter bubble" can and cannot explain. We will offer a completely new perspective on the right-wing internet scene, one that may help in understanding how hatred online comes together, hardens, grows and organises itself.
fakenews  longform  media  society  psychology  filterbubble  research 
13 days ago by SimonHurtz
YouTubes Algorithmen sorgen dafür, dass AfD-Fans unter sich bleiben - Motherboard
A data analysis published today on Motherboard by two communication scientists shows that filter bubbles exist on YouTube too, and how close the NPD and the AfD are to each other on the video platform.
youtube  algorithms  filterbubble  study  data 
26 days ago by SimonHurtz
When It Comes To Politics, The Internet Is Closing Our Minds | All Debates | Debates | IQ2US Debates
Does the internet poison politics? Is the rise of social media really broadening our world views, or narrowing them?
watchthis  filterbubble  filter  bubble 
7 weeks ago by jccalhoun
Facebook Figured Out My Family Secrets, And It Won't Tell Me How
Rebecca Porter and I were strangers, as far as I knew. Facebook, however, thought we might be connected. Her name popped up this summer on my list of “People You May Know,” the social network’s roster of potential new online friends for me.
filterbubble  analytics  data  facebook  privacy  bigdata  Unread  3PW  genealogy  recommendations  surveillance 
7 weeks ago by jccalhoun
The Filter Bubble - On The Media - WNYC
Does the internet allow users to limit their interaction to like-minded people, or does access to the World Wide Web ...
filterbubble  filter  bubble 
7 weeks ago by jccalhoun
John Lanchester reviews ‘The Attention Merchants’ by Tim Wu, ‘Chaos Monkeys’ by Antonio García Martínez and ‘Move Fast and Break Things’ by Jonathan Taplin · LRB 17 August 2017
"What this means is that even more than it is in the advertising business, Facebook is in the surveillance business. Facebook, in fact, is the biggest surveillance-based enterprise in the history of mankind. It knows far, far more about you than the most intrusive government has ever known about its citizens. It’s amazing that people haven’t really understood this about the company. I’ve spent time thinking about Facebook, and the thing I keep coming back to is that its users don’t realise what it is the company does. What Facebook does is watch you, and then use what it knows about you and your behaviour to sell ads. I’m not sure there has ever been a more complete disconnect between what a company says it does – ‘connect’, ‘build communities’ – and the commercial reality. Note that the company’s knowledge about its users isn’t used merely to target ads but to shape the flow of news to them. Since there is so much content posted on the site, the algorithms used to filter and direct that content are the thing that determines what you see: people think their news feed is largely to do with their friends and interests, and it sort of is, with the crucial proviso that it is their friends and interests as mediated by the commercial interests of Facebook. Your eyes are directed towards the place where they are most valuable for Facebook."

"Here in the rich world, the focus is more on monetisation, and it’s in this area that I have to admit something which is probably already apparent. I am scared of Facebook. The company’s ambition, its ruthlessness, and its lack of a moral compass scare me. It goes back to that moment of its creation, Zuckerberg at his keyboard after a few drinks creating a website to compare people’s appearance, not for any real reason other than that he was able to do it. That’s the crucial thing about Facebook, the main thing which isn’t understood about its motivation: it does things because it can. Zuckerberg knows how to do something, and other people don’t, so he does it. Motivation of that type doesn’t work in the Hollywood version of life, so Aaron Sorkin had to give Zuck a motive to do with social aspiration and rejection. But that’s wrong, completely wrong. He isn’t motivated by that kind of garden-variety psychology. He does this because he can, and justifications about ‘connection’ and ‘community’ are ex post facto rationalisations. The drive is simpler and more basic. That’s why the impulse to growth has been so fundamental to the company, which is in many respects more like a virus than it is like a business. Grow and multiply and monetise. Why? There is no why. Because.

Automation and artificial intelligence are going to have a big impact in all kinds of worlds. These technologies are new and real and they are coming soon. Facebook is deeply interested in these trends. We don’t know where this is going, we don’t know what the social costs and consequences will be, we don’t know what will be the next area of life to be hollowed out, the next business model to be destroyed, the next company to go the way of Polaroid or the next business to go the way of journalism or the next set of tools and techniques to become available to the people who used Facebook to manipulate the elections of 2016. We just don’t know what’s next, but we know it’s likely to be consequential, and that a big part will be played by the world’s biggest social network. On the evidence of Facebook’s actions so far, it’s impossible to face this prospect without unease."
Facebook  socialMedia  ZuckerbergMark  attention  business  psychology  ThielPeter  mimeticDesire  GiraudRene  filterBubble  identity  fakeNews  misinformation  Russia  TrumpDonald  advertising  surveillance  surveillanceCapitalism  businessModels  targeting  personalData  monetisation  tracking  Experian  creditCards  algorithms  auctions  Google  monopoly  duopoly  manipulation  emotion  happiness  mentalHealth  dctagged  dc:creator=LanchesterJohn  LRB 
8 weeks ago by petej
Inside The Partisan Fight For Your News Feed
How ideologues, opportunists, growth hackers, and internet marketers built a massive new universe of partisan news on the web and on Facebook.
facebook  politics  usa  filterbubble  media  data 
10 weeks ago by SimonHurtz
Using social media appears to diversify your news diet, not narrow it
Despite widespread fears that social media and other forms of algorithmically filtered services (like search) lead to filter bubbles, we know surprisingly little about what effect social media have on people’s news diets.

Data from the 2017 Reuters Institute Digital News Report can help address this. Contrary to conventional wisdom, our analysis shows that social media use is clearly associated with incidental exposure to additional sources of news that people otherwise wouldn’t use — and with more politically diverse news diets.

This matters because distributed discovery — where people find and access news via third parties, like social media, search engines, and increasingly messaging apps — is becoming a more and more important part of how people use media.

The fear of filter bubbles and the end of incidental exposure
The role social media plays varies by context and by user. For some highly engaged news lovers, it may be seen as an alternative way of accessing news that allows them to sidestep traditional brands, or as a convenient way of accessing news from multiple sources in one place.

Importantly, however, most people do not consume news online in this way. For them, the Internet — and social media in particular — is just as likely to be a means of passing the time, staying in touch with friends and family, or a source of entertainment.

Some scholars have worried that, in media environments that offer unprecedented choice, people uninterested in news will simply consume something else, with the effect of lowering knowledge, civic engagement, and political participation amongst the population as a whole.

Even for those who are interested enough to pay attention to news on social media, self-selection and ever-more responsive algorithmic selection could combine to trap people inside “filter bubbles,” where they only ever see things they like or agree with, from sources they have used in the past. The central fear, as Eli Pariser has put it, is that “news-filtering algorithms narrow what we know.”

This, at least, is the theory. These ideas, however, largely fail to take account of the potential for incidental exposure to news on social media: situations where people come across news while using media for other, non-news-related purposes. In the 20th century, incidental exposure was relatively common, as people purchased newspapers to read the non-news content, or left their televisions on between their favorite programs, and in the process, came across news without actively seeking it out. At the beginning of the 21st century, it was hard to see how this could be replicated online, leading people to conclude that incidental exposure would wane. Even as social media reintroduced this potential — by supplementing people’s active choices (accessing specific websites) with algorithmic filtering automatically offering up a range of content when people accessed a site or app — the concern was that their underlying logic would have a limiting effect on exposure by giving people more of what they already used and less of other things.

Our evidence, however, suggests that the opposite is happening on social media, at least for now. (The algorithms, of course, continually change.)

Incidental exposure to news on social media

To assess whether distributed discovery leads to filter bubbles or more diverse news diets, we focus on social media, the most important and widely used form of off-site discovery and consumption when it comes to news.

Using data from the 2017 Reuters Institute Digital News Report, we divided survey respondents into three non-overlapping groups. One group consists of those who say they intentionally use social media for news. We call them news users. Another group consists of those who do not use social media at all, the non-users. Importantly, there is a large middle group who do use social media, but who in the survey say they do not intentionally use it for news. Those we call the incidentally exposed, because they might come across news while they use social media for other purposes.

If we compare the number of online news sources used on average in the last week by people within each of these three groups — across the U.K., Germany, and the U.S., three very different media markets — we can see that the incidentally exposed report using more sources of news than people who do not use social media at all. The results are in Figure 1. In the U.S., for example, non-users of social media use on average 1.80 online news sources a week. But this figure rises to 3.29 for those who use social media for purposes other than news, and again to 5.16 for people who intentionally use social media for news. These differences remain statistically significant after controlling for a range of demographic and news attitude variables. (We focus on social media here but have found similar results for other forms of algorithmic filtering like search engines and news aggregators.)
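A minimal sketch of how this grouping and comparison could be reproduced on survey microdata, assuming hypothetical column names and toy values (none of these fields are taken from the Digital News Report itself):

```python
import pandas as pd

# Hypothetical survey extract: one row per respondent. Column names and values
# are illustrative assumptions, not fields from the actual dataset.
df = pd.DataFrame({
    "uses_social_media":    [True, True, False, True, False, True],
    "uses_social_for_news": [True, False, False, False, False, True],
    "online_news_sources":  [6, 3, 2, 4, 1, 5],  # distinct online sources used last week
})

def classify(row):
    """Assign each respondent to one of the three non-overlapping groups."""
    if not row["uses_social_media"]:
        return "non-user"
    if row["uses_social_for_news"]:
        return "news user"
    return "incidentally exposed"

df["group"] = df.apply(classify, axis=1)

# Average number of online news sources per group (cf. Figure 1).
print(df.groupby("group")["online_news_sources"].mean())
```

The reported figures additionally control for demographics and news attitudes; the sketch only shows the raw group comparison.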



Are social media users exposed to more of the same, or to more diverse content?

More sources does not necessarily mean more diverse. Consuming news from three right-wing sources arguably constitutes a less diverse news diet than from one left-wing and one right-wing source.

But the average numbers of sources reported in Figure 1 are important to keep in mind. For most ordinary people, incidental exposure to news on social media is associated with a step from using only about one (in the U.K. and Germany) or two (in the U.S.) online news sources per week to an average of about two (in the U.K. and Germany) or three (in the U.S.). When dealing with such low numbers, it is likely that any increase in the number of sources will lead to more diverse consumption. Using two right-wing sources is arguably more diverse than using only one.

We can go one step further, however, and measure whether social media users — and especially those incidentally exposed to news while using social media for other purposes — do in fact report using more politically diverse sources of news. We do this by assessing the partisan leanings of different news sources and in turn using this measure to calculate the political diversity of people’s news diets.

In each country, we divide news sources into those with a mostly left-leaning audience and those with a mostly right-leaning audience (with the midpoint being the average position on the left-right spectrum amongst the population as a whole). When we do this for the 15 most popular news sources in each country, we can visualize it as in Figure 2. In the U.S., 43 percent of Huffington Post news users self-identify on the left, compared to just 10 percent on the right, meaning that the news audience for The Huffington Post is to the left of the population as a whole. Conversely, just 9 percent of Fox News online users are left-leaning, and 48 percent are right-leaning. This way, we can use the partisan composition of an outlet’s audience as a proxy for its political leaning.
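The proxy can be expressed as a small rule: an outlet whose audience skews further left than the population as a whole is classified as left-leaning, and vice versa. A sketch using the two U.S. examples quoted above; the population split and the simple skew rule are illustrative assumptions, not the report's exact procedure:

```python
# Share of each outlet's online audience that self-identifies as left or right,
# using the two U.S. examples quoted in the text.
audience = {
    "Huffington Post": {"left": 0.43, "right": 0.10},
    "Fox News":        {"left": 0.09, "right": 0.48},
}

# Illustrative left/right split among the population as a whole (assumed values).
population = {"left": 0.30, "right": 0.30}

def audience_lean(outlet):
    """Classify an outlet by whether its audience skews further left or right
    than the population midpoint."""
    a = audience[outlet]
    skew = (a["left"] - a["right"]) - (population["left"] - population["right"])
    return "left-leaning audience" if skew > 0 else "right-leaning audience"

for outlet in audience:
    print(outlet, "->", audience_lean(outlet))
```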


Incidental exposure across the left/right divide

With these partisan leanings of individual outlets in mind, we can look at our three groups of social media users (news users, those incidentally exposed to news on social media, and the non-users) and determine the proportion within each group who say they use at least one source from both sides of the political spectrum (i.e. from both sides of the “midpoint within country”). The results are in Figure 3.

Two things are immediately striking. First, the majority in most countries and in most groups do not use sources from across the political spectrum. Second, both social media news users and those incidentally exposed to news on social media not only (a) consume news from more sources but also (b) have more politically diverse online news diets than those who do not use social media at all. In the U.S., just 20 percent of those who do not use social media consume news from online brands with both left-leaning and right-leaning audiences. Few people, when left to their own devices, opt for a politically diverse news diet. However, the figure rises to 37 percent for those incidentally exposed to news on social media, as they see news links posted by people with different views and different patterns of news consumption. And 44 percent of those who use social media for news end up using sources from both the left and the right, more than double the figure for non-users. We see the same pattern in both Germany and the U.K. Again, these differences remain significant after we control for other factors.
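A sketch of this cross-cutting measure, assuming each respondent's reported outlets have already been labelled with the audience-lean proxy above; the data and field names are made up for illustration:

```python
import pandas as pd

# Hypothetical respondents: group membership plus the audience lean of each
# online news outlet they reported using in the last week.
respondents = pd.DataFrame({
    "group": ["non-user", "incidentally exposed", "news user", "news user"],
    "outlet_leans": [
        {"right"},
        {"left", "right"},
        {"left", "right"},
        {"left"},
    ],
})

# A diet is cross-cutting if it includes at least one outlet with a left-leaning
# audience AND at least one with a right-leaning audience.
respondents["cross_cutting"] = respondents["outlet_leans"].apply(
    lambda leans: {"left", "right"} <= leans
)

# Share of each group with a cross-cutting news diet (cf. Figure 3).
print(respondents.groupby("group")["cross_cutting"].mean())
```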

The future of distributed discovery and filter bubbles

We have focused here on whether social media use leads to narrow filter bubbles or whether algorithmic filtering in its current forms drives greater diversity through distributed discovery. We have shown that social media use is consistently associated with more, and more diverse, news diets, and that the difference is clear even for the incidentally exposed, those who use social media for other purposes and come across news while doing so. Preliminary analysis of other forms of algorithmic filtering like search engines and news aggregators indicates similar results.

These findings underline that the services offered by powerful platform companies like Facebook and Google, despite what critics fear, may in fact currently contribute to more diverse news diets, rather than narrow filter bubbles. Whether they will still do so after the next algorithm update only they know.
Filterbubble  DasGeileNeueInternet  db  media 
10 weeks ago by walt74
Facebook: Was ich in der rechten Filterblase lernte - Digital - Süddeutsche.de
At the end of 2015, our author created a second Facebook profile. "Tim" opened the door to a parallel world for him, one that at times made him doubt his own convictions.
facebook  filterbubble  simon  afd 
july 2017 by SimonHurtz
