rvenkat + gafa   26

YouTube, the Great Radicalizer - The New York Times
--With all due respect to Zeynep, implicit in her argument is a denial of agency among a society's denizens. I've noticed that she gives too much credit to the idea of a fair-minded, thoughtful, reasonable, ethical good citizen, and to the idea that most of the blame lies with private and public institutions. I remain very suspicious of this idea, and it continues to irk me a little bit.

The propensity to radicalize, reinforced by recommendation algorithms, is the problem. I suspect any non-zero propensity to polarize would be sufficient to radicalize certain individuals, no matter how carefully we design such algorithms. The question is whether one can demonstrate that such algorithms polarize even _normal_ denizens. Yes, unregulated artificial intelligence gone wild is a problem, but so is extant natural stupidity.
polarization  radicalization  algorithms  ethics  networked_public_sphere  platform_studies  GAFA  zeynep.tufekci 
march 2018 by rvenkat
Platform Capitalism | Critical Theory | Continental Philosophy | General Philosophy | Subjects | Wiley
What unites Google and Facebook, Apple and Microsoft, Siemens and GE, Uber and Airbnb? Across a wide range of sectors, these firms are transforming themselves into platforms: businesses that provide the hardware and software foundation for others to operate on. This transformation signals a major shift in how capitalist firms operate and how they interact with the rest of the economy: the emergence of ‘platform capitalism’.

This book critically examines these new business forms, tracing their genesis from the long downturn of the 1970s to the boom and bust of the 1990s and the aftershocks of the 2008 crisis. It shows how the fundamental foundations of the economy are rapidly being carved up among a small number of monopolistic platforms, and how the platform introduces new tendencies within capitalism that pose significant challenges to any vision of a post-capitalist future. This book will be essential reading for anyone who wants to understand how the most powerful tech companies of our time are transforming the global economy.

also this
https://en.wikipedia.org/wiki/Inventing_the_Future:_Postcapitalism_and_a_World_Without_Work

--The Marxism and critical theory tags are tentative, as the author belongs to the *speculative realism* school, which allegedly moves beyond the anti-realist tendencies of the past
GAFA  capitalism  market_failures  monopoly  critique  book  critical_theory  marxism  ? 
january 2018 by rvenkat
Surveillance Intermediaries by Alan Z. Rozenshtein :: SSRN
Apple’s 2016 fight against a court order commanding it to help the FBI unlock the iPhone of one of the San Bernardino terrorists exemplifies how central the question of regulating government surveillance has become in American politics and law. But scholarly attempts to answer this question have suffered from a serious omission: scholars have ignored how government surveillance is checked by “surveillance intermediaries,” the companies like Apple, Google, and Facebook that dominate digital communications and data storage, and on whose cooperation government surveillance relies. This Article fills this gap in the scholarly literature, providing the first comprehensive analysis of how surveillance intermediaries constrain the surveillance executive. In so doing, it enhances our conceptual understanding of, and thus our ability to improve, the institutional design of government surveillance.

Surveillance intermediaries have the financial and ideological incentives to resist government requests for user data. Their techniques of resistance are: proceduralism and litigiousness that reject voluntary cooperation in favor of minimal compliance and aggressive litigation; technological unilateralism that designs products and services to make surveillance harder; and policy mobilization that rallies legislative and public opinion to limit surveillance. Surveillance intermediaries also enhance the “surveillance separation of powers”; they make the surveillance executive more subject to inter-branch constraints from Congress and the courts, and to intra-branch constraints from foreign-relations and economics agencies as well as the surveillance executive’s own surveillance-limiting components.

The normative implications of this descriptive account are important and cross-cutting. Surveillance intermediaries can both improve and worsen the “surveillance frontier”: the set of tradeoffs — between public safety, privacy, and economic growth — from which we choose surveillance policy. And while intermediaries enhance surveillance self-government when they mobilize public opinion and strengthen the surveillance separation of powers, they undermine it when their unilateral technological changes prevent the government from exercising its lawful surveillance authorities.
surveillance  big_data  privacy  algorithms  ethics  law  civil_rights  GAFA 
october 2017 by rvenkat
UW ADINT: Advertising as Surveillance
Targeted advertising is at the heart of the largest technology companies today, and is becoming increasingly precise. Simultaneously, users generate more and more personal data that is shared with advertisers as more and more of daily life becomes intertwined with networked technology. There are many studies about how users are tracked and what kinds of data are gathered. The sheer scale and precision of individual data that is collected can be concerning. However, in the broader public debate about these practices this concern is often tempered by the understanding that all this potentially sensitive data is only accessed by large corporations; these corporations are profit-motivated and could be held to account for misusing the personal data they have collected. In this work we examine the capability of a different actor -- an individual with a modest budget -- to access the data collected by the advertising ecosystem. Specifically, we find that an individual can use the targeted advertising system to conduct physical and digital surveillance on targets that use smartphone apps with ads.

--an over-dramatized version here
https://www.wired.com/story/track-location-with-mobile-ads-1000-dollars-study
computaional_advertising  surveillance  data  privacy  technology  GAFA 
october 2017 by rvenkat
Free Speech in the Algorithmic Society: Big Data, Private Governance, and New School Speech Regulation by Jack M. Balkin :: SSRN
We have now moved from the early days of the Internet to the Algorithmic Society. The Algorithmic Society features the use of algorithms, artificial intelligence agents, and Big Data to govern populations. It also features digital infrastructure companies, large multi-national social media platforms, and search engines that sit between traditional nation states and ordinary individuals, and serve as special-purpose governors of speech.

The Algorithmic Society presents two central problems for freedom of expression. First, Big Data allows new forms of manipulation and control, which private companies will attempt to legitimate and insulate from regulation by invoking free speech principles. Here First Amendment arguments will likely be employed to forestall digital privacy guarantees and prevent consumer protection regulation. Second, privately owned digital infrastructure companies and online platforms govern speech much as nation states once did. Here the First Amendment, as normally construed, is simply inadequate to protect the practical ability to speak.

The first part of the essay describes how to regulate online businesses that employ Big Data and algorithmic decision making consistent with free speech principles. Some of these businesses are "information fiduciaries" toward their end-users; they must exercise duties of good faith and non-manipulation. Other businesses who are not information fiduciaries have a duty not to engage in "algorithmic nuisance": they may not externalize the costs of their analysis and use of Big Data onto innocent third parties.

The second part of the essay turns to the emerging pluralist model of online speech regulation. This pluralist model contrasts with the traditional dyadic model in which nation states regulated the speech of their citizens.

In the pluralist model, territorial governments continue to regulate speech directly. But they also attempt to coerce or co-opt owners of digital infrastructure to regulate the speech of others. This is "new school" speech regulation. Digital infrastructure owners, and especially social media companies, now act as private governors of speech communities, creating and enforcing various rules and norms of the communities they govern. Finally, end users, civil society organizations, hackers, and other private actors repeatedly put pressure on digital infrastructure companies to regulate speech in certain ways and not to regulate it in others. This triangular tug of war -- rather than the traditional dyadic model of states regulating the speech of private parties -- characterizes the practical ability to speak in the algorithmic society.

The essay uses the examples of the right to be forgotten and the problem of fake news to illustrate the emerging pluralist model -- and new school speech regulation -- in action.

As private governance becomes central to freedom of speech, both end-users and nation states put pressure on private governance. Nation states attempt to co-opt private companies into becoming bureaucracies for the enforcement of hate speech regulation and new doctrines like the right to be forgotten. Conversely, end users increasingly demand procedural guarantees, due process, transparency, and equal protection from private online companies.

The more that end-users view businesses as governors, or as special-purpose sovereigns, the more end-users will expect -- and demand -- that these companies should conform to the basic obligations of governors towards those they govern. These obligations include procedural fairness in handling complaints and applying sanctions, notice, transparency, reasoned explanations, consistency, and conformity to rule of law values -- the “law” in this case being the publicly stated norms and policies of the company. Digital infrastructure companies, in turn, will find that they must take on new social obligations to meet these growing threats and expectations from nation states and end-users alike.
freedom_of_speech  internet  regulation  governance  administrative_state  big_data  algorithms  privacy  data  artificial_intelligence  machine_learning  ethics  philosophy_of_technology  new_media  social_media  networked_public_sphere  public_sphere  GAFA 
september 2017 by rvenkat

