ayjay + tech   486

Is Email Making Professors Stupid?
One solution is to directly confront the zero-sum trade-off generated by service obligations. Professors have a fixed amount of time; the more that’s dedicated to service, the less that can be dedicated to research and teaching. Instead of ignoring this reality, we should clearly articulate these trade-offs by specifying the exact amount of time a faculty member is expected to devote to service each year. That amount would be negotiated between a professor and a department chair, and professors would be encouraged to enforce the limits of their service budgets.

These budgets would vary depending on the career phase and interests of individual professors. A faculty member actively engaged in research or creating new courses might be responsible for only a handful of service hours per week, while others would have more substantial obligations. Pre-tenure faculty members would presumably have smaller budgets than full professors who no longer need to consider promotion, and so on. The occasional major service commitment, like serving as department chair, would necessitate a large budget, but even in this case, making the trade-off clear is important. If the time required to be department chair is absurd, it’s useful to quantify this absurdity as a stark case for additional administrative support, or to help calibrate the proper compensation in terms of course buyouts or leave.
academentia  tech  from instapaper
3 days ago by ayjay
Tim Harford's technological adjustments
First, I didn’t miss being plugged into Twitter at all. I’ve been ignoring notifications for years — thus missing some of the benefit and much of the aggravation of the platform — but have still been tweeting away out of some strange combination of duty and inertia.

My new plan is to log in for a few hours on Friday, set up some links to my columns and other projects that may interest some people, and log out again. If I ever see a good reason to use the platform more intensively, I’ll be back.

Second, I enjoyed having a more boring phone. With very little on it now but an easily emptied email inbox and the FT app, I pick it up less often and for less time, and am more likely to do something useful with it when I do check it.

I did reinstall Feedly — which I find essential for my job — but will keep an eye on my usage. With no tweets to send, the app has become more useful. I read for the sake of learning rather than for the sake of tweeting.
tech  from instapaper
20 days ago by ayjay
Opinion | Steve Jobs Never Wanted Us to Use Our iPhones Like This
To succeed with this approach, a useful first step is to remove from your smartphone any apps that make money from your attention. This includes social media, addictive games and newsfeeds that clutter your screen with “breaking” notifications. Unless you’re a cable news producer, you don’t need minute-by-minute updates on world events, and your friendships are likely to survive even if you have to wait until you’re sitting at your home computer to log on to Facebook or Instagram. In addition, by eliminating your ability to publish carefully curated images to social media directly from your phone, you can simply be present in a nice moment, free from the obsessive urge to document it.

Turning our attention to professional activities, if your work doesn’t absolutely demand that you be accessible by email when away from your desk, delete the Gmail app or disconnect the built-in email client from your office servers. It’s occasionally convenient to check in when out and about, but this occasional convenience almost always comes at the cost of developing a compulsive urge to monitor your messages constantly. If you’re not sure whether your work requires phone-based email, don’t ask; just delete the apps and wait to see whether it causes a problem — many people unintentionally exaggerate their need to constantly be available.
Technopoly  tech  socialmedia  from instapaper
28 days ago by ayjay
Childhood's End | Edge.org
These new hybrid organizations, although built upon digital computers, are operating as analog computers on a vast, global scale, processing information as continuous functions and treating streams of bits the way vacuum tubes treat streams of electrons, or the way neurons treat information in a brain. Large hybrid analog/digital computer networks, in the form of economies, have existed for a long time, but for most of history the information circulated at the speed of gold and silver and only recently at the speed of light.

We imagine that individuals, or individual algorithms, are still behind the curtain somewhere, in control. We are fooling ourselves. The new gatekeepers, by controlling the flow of information, rule a growing sector of the world. 

What deserves our full attention is not the success of a few companies that have harnessed the powers of hybrid analog/digital computing, but what is happening as these powers escape into the wild and consume the rest of the world.
tech  AI  algorithms 
7 weeks ago by ayjay
The Searcher of Patterns and the Keeper of Things - Los Angeles Review of Books
As if to foreground both projects’ interest in the form information takes, each exists in two states itself: first as a book, with its mechanics of linearity and indexicality contextualized, and then also as a website. In Silver’s case, this involves a born-digital museum containing high-resolution images of the objects discussed in his book; in Pasanek’s, a fully searchable and still-growing database of metaphors, of which his chapters are to be read as samples. These websites illuminate, expand, and feed back into the books they serve, but they also concretize the lessons Silver and Pasanek have to offer about the mechanics of information-gathering and retrieval. If you follow either of the links above, you’ll get the picture, as well as the practical feel of this claim. Silver and Pasanek both study cognition as it has been imagined physically, through metaphor and the early history of extended cognition. But both also insist at the level of media use that it really has always had a physical substratum.
history  tech  mind  from instapaper
9 weeks ago by ayjay
The Open Office and the Spirit of Capitalism - American Affairs Journal
Fittingly, the philosopher Byung-Chul Han classifies the signature affliction of our current age as neuronal violence, as opposed to the “immunological” violence of last century, which took place along clearly demarcated borders. With barriers literally down, the paranoid totalizing of the corporate office space comes to embody the ethos of Foucault’s disciplinary society, with one important twist. What replaces the disciplinary society, Han tells us, is the “achievement society.” Now the question is no longer, “What am I allowed to do?” but “What can I do?” This shift is profound. It takes us from the firmly hierarchical paranoia and conformity of the skyscraper to the depressed, ADHD-afflicted chaos of the open office space. The company man was never allowed to be himself. The unpaid intern, by contrast, must always be performing himself. The result of the now largely dematerialized office is that this very performance of self becomes the office. Han is worth quoting at length on this:

“The society of laboring and achievement is not a free society. It generates new constraints. Ultimately, the dialectic of master and slave does not yield a society where everyone is free and capable of leisure, too. Rather, it leads to a society of work in which the master himself has become a laboring slave. In this society of compulsion, everyone carries a work camp inside. This labor camp is defined by the fact that one is simultaneously prisoner and guard, victim and perpetrator. One exploits oneself. It means that exploitation is possible even without domination.”

The purpose of the open office was always self-exploitation. It exists like some evolutionary link between the confined counting houses of the past and the dematerialized configurations of “the office” yet to come. Tracing the arc of the office’s development through time, and then anticipating its curve beyond, we could do worse than to extrapolate from existing data points like the shared workspace, remote work, and the commodification of daily life into internet content (think here of unboxing videos or the selling of consumer preference data). Jonathan Crary writes in 24/7: Late Capitalism and the Ends of Sleep (Verso, 2013): “As the opportunity for electronic transactions of all kinds becomes omnipresent, there is no vestige of what used to be everyday life beyond the reach of corporate intrusion. An attention economy dissolves the separation between the personal and professional, between entertainment and information, all overridden by a compulsory functionality that is inherently and inescapably 24/7.” What this suggests is that as the office walls come down, so will the temporal and ideological barriers separating work from nonwork. The office of the future, in other words, won’t be a place, but an identity. The office of the future will be your most intimate conceptions of self, somehow put to work.
modernity  tech  capitalism  from instapaper
12 weeks ago by ayjay
Apple designer Jony Ive has explained how ‘teetering towards the absurd’ helped him make the iPhone
"The necessary resolve to find solutions to the problems that stand between a tentative thought and something substantial, that resolve and that focus very often seems in direct conflict with most creative behaviour. Honestly, I can't think of two ways of working, two different ways of being, that are more polar. On one hand to be constantly questioning, loving surprises, consumed with curiosity and yet on the other hand having to be utterly driven and completely focused to solve apparently insurmountable problems, even if those solutions are without precedent or reference. And so, of course, this is where it becomes sort of ironic and teeters towards the utterly absurd.

"You see, in the mode of being unreasonable and resolute, you have to solve hard problems. But solving those problems requires new ideas. And so, we're back to needing ideas and back to having to be open and curious. This is not a shift that occurs once or twice in a multi-year project. I find it happens to me once or twice a day and that frequency of shifting between two such different ways of seeing and thinking is fantastically demanding."
tech  apple  from instapaper
november 2018 by ayjay
Andrew Sullivan: What Happens If Americans Stop Trusting the System?
Yes, I know the urban coasts are where the future nerds and business whizzes want to live (and increasingly do). But that’s part of our problem, is it not? We’re geographically sorting — and the left-behinds are getting more left behind in the middle of the country. And if there’s something we really don’t need in D.C. right now, it’s more of the cognitive elite! The place is crawling with irritating young white people who go to CrossFit, ride those creepy scooters, and never look up from their phones. A company like Amazon could actually have had the clout to bring those types back to the heartland and do some small thing to rebalance the country. Wages would be effectively higher given the lower cost of housing. The Millennial migrants could even help turn Texas blue! The cultural shift and economic boost might even lure a few opioid users toward an ounce of hope and a middle-class salary in Indiana or Ohio or West Virginia.

This matters. As the country becomes increasingly culturally, economically, and socially bifurcated, we need responsible corporate actors to help bridge this gap. Amazon — one of the most trusted brands in America — is perfectly poised to pull this off. Its decisions could be a critical help in keeping this country from the kind of yawning divisions and vast inequalities which are fast hardening into permanent social chasms.
politics  city  tech  from instapaper
november 2018 by ayjay
Desperately Seeking Cities
As a substitute for more concerted city planning, urbanism has had little success in encouraging the diversity it claims to seek. As a cover for the true nature of the neoliberal city, it has been a triumph.

It is beyond question that, in whatever city it chose to grace, Amazon would bring neither the jobs that that city needed, nor the public works that it needed. In his latest variation on the urbanist delusion, written for the Financial Times, the much-pilloried Richard Florida plaintively appealed to Amazon not to “accept any tax or financial incentives,” but rather to pledge to “invest alongside cities to create better jobs, build more affordable housing, and develop better schools, transit, and other badly needed public goods, along with paying its fair share of taxes.” The depths of Florida’s naiveté cannot be overstated. Not only is Amazon categorically unlikely to pledge what he wants (or, even if it did, make even the slightest effort to deliver on such a pledge), but Florida openly expresses his desire to cede all urban political power and every human demand to the whims of the company. In this respect, too, the Amazon HQ2 contest has been clarifying.
economics  city  tech  from instapaper
november 2018 by ayjay
Technology: The Emergence of a Hazardous Concept
When the word technology was introduced into American English in the nineteenth century, it referred to the study of—or treatises in—the mechanic arts. It did not achieve currency as a reference to those arts as such until c. 1900–1930. By the 1840s, however, when Daniel Webster eloquently summarized the remarkable transformation of life attributable to advances in science and what later would come to be called technology, public discourse was filled with evidence of a semantic void that eventually would be filled by the concept—the word—technology. Two kinds of development—one conceptual or ideological, and the other substantive or mechanical—created the need for the new concept. By the end of the century it had become evident that Webster had been seeking to identify—to name—a novel form of human power with far greater efficacy and scope than that previously ascribed to the mechanic or useful arts. The new power is what we call technology.
tech 
october 2018 by ayjay
The Crisis of Intimacy in the Age of Digital Connectivity - Los Angeles Review of Books
The basic contradiction is as simple as it is desperate: the sharing of private experience has never been more widespread while empathy, the ability to recognize the meaning of another’s private experience, has never been more rare. In Philosophical Investigations, Wittgenstein confronted exactly this problem, of the meaning of intimacy and the intimacy of meaning. “The essential thing about private experience is really not that each person possesses his own exemplar, but that nobody knows whether other people also have this or something else,” he wrote. “The assumption would thus be possible — though unverifiable — that one section of mankind had one sensation of red and another section another.” Wittgenstein thought it was unverifiable, but the internet has verified it. Is the dress blue or gold? Do you hear Yanni or Laurel?

The TCP/IP connection promises universality of reference; it does not promise shared sensation. And shared sensation is the essence of intimacy — the conviction that I feel what another or others are feeling, and another or others feel what I’m feeling. It’s the desperate human question: Do you feel what I feel? Is the little tremor in my heart meaningful to others? Wittgenstein posed this pathetically needy, essentially human question in his famous parable of the beetle in the box...
tech  socialmedia  philosophy  psychology  from instapaper
october 2018 by ayjay
Opinion | The Tyranny of Convenience
We need to consciously embrace the inconvenient — not always, but more of the time. Nowadays individuality has come to reside in making at least some inconvenient choices. You need not churn your own butter or hunt your own meat, but if you want to be someone, you cannot allow convenience to be the value that transcends all others. Struggle is not always a problem. Sometimes struggle is a solution. It can be the solution to the question of who you are.

Embracing inconvenience may sound odd, but we already do it without thinking of it as such. As if to mask the issue, we give other names to our inconvenient choices: We call them hobbies, avocations, callings, passions. These are the noninstrumental activities that help to define us. They reward us with character because they involve an encounter with meaningful resistance — with nature’s laws, with the limits of our own bodies — as in carving wood, melding raw ingredients, fixing a broken appliance, writing code, timing waves or facing the point when the runner’s legs and lungs begin to rebel against him.
tech  from instapaper
october 2018 by ayjay
The future’s so bright, I gotta wear blinders | ROUGH TYPE
For much of this year, I’ve been exploring the biases of digital media, trying to trace the pressures that the media exert on us as individuals and as a society. I’m far from done, but it’s clear to me that the biases exist and that at this point they have manifested themselves in unmistakable ways. Not only are we well beyond the beginning, but we can see where we’re heading — and where we’ll continue to head if we don’t consciously adjust our course.

Is there an overarching bias to the advance of communication systems? Technology enthusiasts like Kelly would argue that there is — a bias toward greater freedom, democracy, and social harmony. As a society, we’ve largely embraced this sunny view. Harold Innis had a very different take. “Improvements in communication,” he wrote in The Bias of Communication, “make for increased difficulties of understanding.” He continued: “The large-scale mechanization of knowledge is characterized by imperfect competition and the active creation of monopolies in language which prevent understanding and hasten appeals to force.” Looking over recent events, I sense that Innis may turn out to be the more reliable prophet.
futurism  tech 
october 2018 by ayjay
Tom Vanderbilt Explains Why We Could Predict Self-Driving Cars, But Not Women in the Workplace
Like the hungry person who orders more food at dinner than they will ultimately want—to use an example from Loewenstein and colleagues—forecasters have a tendency to take something that is (in the language of behavioral economics) salient today, and assume that it will play an outsized role in the future. And what is most salient today? It is that which is novel, “disruptive,” and easily fathomed: new technology.

As the theorist Nassim Nicholas Taleb writes in Antifragile, “we notice what varies and changes more than what plays a larger role but doesn’t change. We rely more on water than on cell phones, but because water does not change and cell phones do, we are prone to thinking that cell phones play a larger role than they do.”

The result is that we begin to wonder how life was possible before some technology came along. But as the economist Robert Fogel famously noted, if the railroad had not been invented, we would have done almost as well, in terms of economic output, with ships and canals. Or we assume that modern technology was wonderfully preordained instead of, as it often is, an accident. Instagram began life as a Yelp-style app called Burbn, with photos an afterthought (photos on your phone, is that a thing?). Texting, meanwhile, started out as a diagnostic channel for short test messages—because who would prefer fumbling through tiny alphanumeric buttons to simply talking?
tech 
october 2018 by ayjay
We need a new model for tech journalism
There is currently high-level global debate as to whether the tech giants should be broken up in the public interest. We should also have a debate about whether tech journalism should be broken up for the same reason: We need a new journalism which treats tech the same as every other major vested corporate interest—people who can stand back, apart from the tech industry maelstrom, and try to see the picture from above.

Maybe we should simply scrap the idea of a “tech desk” altogether: The sector needs scrutiny, but since technology now touches every aspect of our society, keeping it siloed from the rest of the newsroom now feels artificial. Let it be covered, extensively, across desks.
tech  journalism  socialmedia  from instapaper
july 2018 by ayjay
Does Facebook Need a Constitution?
Infowars has, among other things, claimed that the Sandy Hook shootings were a staged “false flag” event, that Democrats were planning on launching a civil war on July 4, and that the government is putting chemicals in the water that are turning frogs gay. At the very least, setting “banning” aside, it seems less than ideal to allow a publication like that to represent itself on Facebook as a “News & Media Website.” Similarly, Holocaust deniers are engaged in a specific political project intended to diminish the impact of anti-Semitism and rehabilitate the Nazi state. It’s naïve, at best, to say you can’t “impugn” their intent.

But at the same time, you can understand the company’s anxiety. It’s not just that Facebook is wary of activating the grievance machinery of modern conservatism (though it very obviously is), it’s also that it has a philosophical, institutional allergy to making qualitative judgments about truth and falsehood. And, frankly, shouldn’t it? I’m pretty sure that I don’t want to live in a world where Mark Zuckerberg gets to determine what counts as true and what doesn’t, even if he and I agree about Infowars and the Holocaust. (Especially since he seems to be under the impression that there’s some large portion of Holocaust deniers who are merely misinformed, not actively mendacious.)
tech  news  media 
july 2018 by ayjay
Letter from Shenzhen
Part of the original shanzhai economy began with copying DVDs. Since copied DVDs couldn’t be played by name-brand players (an attempt to control piracy or simply due to DVD quality issues), a whole set of products were created to support the copied DVDs — and from there, a wildly creative ecosystem appeared.

This is the new shanzhai. It’s open-source on hyperspeed — where creators build on each other’s work, co-opt, repurpose, and remix in a decentralized way, creating original products like a cell phone with a compass that points to Mecca (selling well in Islamic countries) and simple cell phones that have modular, replaceable parts which need little equipment to open or repair.

Shanzhai’s past has connotations of knock-off iPhones and fake Louis Vuitton bags. New shanzhai offers a glimpse into the future: its strength is in extreme open-source, which stands in stark contrast to the increasingly proprietary nature of American technology. As startups in the Bay Area scramble to make buckets of money, being in this other Greater Bay Area makes it clear why there’s so much rhetoric about China overtaking the US. It is.
tech  China  from instapaper
july 2018 by ayjay
Offscreen Magazine Interview: Craig Mod
That’s why I try to subvert my weaknesses, to subvert that persona. The easiest way is to turn off the internet. When I go to bed at night, the internet goes off. Phone into airplane mode. It doesn’t come back on until after lunch the next day (at the earliest). The difference in the quality of the day ahead between starting my morning with the internet on versus off is enormous.

If I wake up and touch my phone, I’ve already lost hours. Not because I’m browsing social media for hours, but because the mind has already been agitated, made unquiet, and the context switch back into thoughtfulness can take the whole morning. In other words, the addict part of my brain takes over and contaminates my ability to be contemplative. I lose the grace to dive into other worlds, the worlds of writing or programming or images.
In one of your essays you describe going offline for such a long period as a privilege. Does that mean that in the future going offline will be a luxury that only rich people can afford?

The default expectation today is “always available.” The systems we created are so frictionless that we haven’t noticed how insidiously over-engaged we are. Step by step we’re optimizing ourselves to “maximum” productivity without defining or thinking about “productivity” on a human scale. The digital world abstracts. One could argue most problems contemporary society faces are problems of over-abstraction. As an employer with a global workforce, you have no idea where your employees might be or what they might be doing, so you expect them to answer immediately. The concept of downtime is elusive.

So yes, it’s already a great privilege to be able to say ‘no’ to that system.
tech  socialmedia  from instapaper
march 2018 by ayjay
We Fear What We Can't Control About Uber and Facebook
When I see a new story or criticism about the tech world, I no longer ask whether the tech companies poll as being popular (they do). I instead wonder whether voters feel in control in a world with North Korean nuclear weapons, an erratic American president and algorithms everywhere. They don’t. Haven’t you wondered why articles about robots putting us all out of work are so popular during a time of full employment?

We are about to enter a new meta-narrative for American society, which I call “re-establishing the feeling of control.” Unfortunately, when you pursue the feeling rather than the actual control, you often end up with neither.
tech  socialmedia  from instapaper
march 2018 by ayjay
The Undeath of Cinema - The New Atlantis
Peter Cushing’s spare frame, sharp cheekbones, and long limbs are part of what made him him; they are essential to his Cushing-ness. Creating a convincing facsimile of his living, breathing, moving form after his death should not be undertaken lightly, any more than exhuming his corpse should be. The grave-robbing version is surely more egregious. Yet if it would be wrong to make a puppet of a dead man’s mortal remains, then it is also wrong to make a puppet of a dead man’s imitated form. A simulacrum is fraught with the dignity of the individual it represents.

Dishonoring the remains of the dead is a near-universal, but poorly articulated, taboo. Many people agree that it is wrong without having a metaphysical framework that justifies their belief in the dignity of the human body. But the widespread unease at the CGI Cushing testifies to the power and wisdom of this taboo, however inchoate.

The technology of digitally bringing deceased actors back to the screen runs counter to this humane impulse, this feeling that it is proper to allow the dead to remain buried. Perhaps it is not only technological advances but also the normalization of destructive means of disposing of dead bodies (like cremation) that allowed Industrial Light & Magic to contemplate Frankensteining Peter Cushing. The central violation of at-will digital resurrection is that it wrongs the dead subject by making him into a puppet.
film  tech  from instapaper
march 2018 by ayjay
Wittgenstein as a Philosopher of Technology: Tool Use, Forms of Life, Technique, and a Transcendental Argument
The work of Ludwig Wittgenstein is seldom used by philosophers of technology, let alone in a systematic way, and in general there has been little discussion about the role of language in relation to technology. Conversely, Wittgenstein scholars have paid little attention to technology in the work of Wittgenstein. In this paper we read the Philosophical Investigations and On Certainty in order to explore the relation between language use and technology use, and take some significant steps towards constructing a framework for a Wittgensteinian philosophy of technology. This framework takes on board, and is in line with, insights from postphenomenological and hermeneutic approaches, but moves beyond those approaches by benefiting from Wittgenstein’s insights into the use of tools, technique, and performance, and by offering a transcendental interpretation of games, forms of life, and grammar. Focusing on Wittgenstein’s philosophy of language in the Investigations, we first discuss the relation between language use and technology use, understood as tool use, by drawing on his analogy between language and tools. This suggests a more general theory of technology use, understood as performance. Then we turn to his epistemology and argue that Wittgenstein’s understanding of language use can be embedded within a more general theory about technology use understood as tool use and technique, since language-in-use is always already a skilled and embodied technological practice. Finally, we propose a transcendental interpretation of games, forms of life, and grammar, which also gives us a transcendental way of looking at technique, technological practice, and performance.
philosophy  tech 
february 2018 by ayjay
How technology is designed to bring out the worst in us
Technology feels disempowering because we haven’t built it around an honest view of human nature. The reason we called our new project the Center for Humane Technology is it starts with a view of ourselves.

Silicon Valley is reckoning with having had a bad philosophical operating system. People in tech will say, “You told me, when I asked you what you wanted, that you wanted to go to the gym. That’s what you said. But then I handed you a box of doughnuts and you went for the doughnuts, so that must be what you really wanted.” The Facebook folks, that’s literally what they think. We offer people this other stuff, but then they always go for the outrage, or the autoplaying video, and that must be people’s most true preference.

If you ask someone, “What’s your dream?” that’s not a meaningless signal. A psychotherapist going through an interview process with someone is accessing parts of them that screens never do. I think the [traffic] metrics have created this whole illusion that what people are doing is what people want, when it’s really just what works in the moment, in that situation.
tech  Technopoly  ethics  person  from instapaper
february 2018 by ayjay
Orbital Operations | Warren Ellis
I’m an edge case.  I want an untangled web. I want everything I do to copy back to a single place, so I have one searchable log for each day’s thoughts, images, notes and activities.  This is apparently Weird and Hermetic if not Hermitic.

I am building my monastery walls in preparation for the Collapse and the Dark Ages, damnit. Stop enabling networked lightbulbs and give me the tools to survive your zombie planet.
tech 
february 2018 by ayjay
Why We Forget Most of the Books We Read - The Atlantic
“Memory generally has a very intrinsic limitation,” says Faria Sana, an assistant professor of psychology at Athabasca University, in Canada. “It’s essentially a bottleneck.”

The “forgetting curve,” as it’s called, is steepest during the first 24 hours after you learn something. Exactly how much you forget, percentage-wise, varies, but unless you review the material, much of it slips down the drain after the first day, with more to follow in the days after, leaving you with a fraction of what you took in.

Presumably, memory has always been like this. But Jared Horvath, a research fellow at the University of Melbourne, says that the way people now consume information and entertainment has changed what type of memory we value—and it’s not the kind that helps you hold onto the plot of a movie you saw six months ago.
memory  tech 
january 2018 by ayjay
Medieval Robots
The fourth chapter illustrates the imaginative movement from understanding automata within a framework of natural philosophy to one that included mechanics from the twelfth to the mid-fifteenth century, through an examination of automata, drawn from textual examples, that guard or memorialize the dead. Automata in these settings demonstrate the ways the boundaries between nature and art, between verisimilitude and fraud, and between life and death were contested and negotiated. This chapter opens with twelfth-century literary examples and then moves into a case study of Hector’s tomb in three fictional accounts of the story of Troy, documenting the way that increasingly mechanical explanations of Hector’s preserved corpse replace magical explanations, and coincide with a heightened emphasis on technical skill. The final example, from John Lydgate’s Troy Book (ca. 1420), anticipates Hobbes’s characterizations of the artificial life of mechanical things and the mechanical nature of the body, as in it Hector’s body is kept artificially alive by a complicated system of tubes and wires that replace his nerves and blood vessels.

The growing emphasis on technical skill and fine technology found in Lydgate’s version of Hector’s preserved corpse reflects the development of complex machinery in the fourteenth century, and the more widespread appearance of mechanical marvels at princely courts in Europe in the fourteenth and fifteenth centuries. Beginning with the notional automata in the notebook of Villard de Honnecourt in the mid-thirteenth century, mechanical automata became more common, albeit still the province of the very wealthy. The fifth chapter probes the consequences of the diffusion of mechanical knowledge in the form of mechanical marvels. I argue that the human and animal automata created for the public display of majesty at the courts of Artois and Burgundy, Richard II of England, and the Valois are central to understanding the reappearance of mechanistic thinking, as well as for the technological developments that allowed for the creation of increasingly complex machines.
medieval  robots  tech 
january 2018 by ayjay
Estonia, the Digital Republic
Estonian folklore includes a creature known as the kratt: an assembly of random objects that the Devil will bring to life for you, in exchange for a drop of blood offered at the conjunction of five roads. The Devil gives the kratt a soul, making it the slave of its creator.

“Each and every Estonian, even children, understands this character,” Kaevats said. His office now speaks of kratt instead of robots and algorithms, and has been using the word to define a new, important nuance in Estonian law. “Basically, a kratt is a robot with representative rights,” he explained. “The idea that an algorithm can buy and sell services on your behalf is a conceptual upgrade.” In the U.S., where we lack such a distinction, it’s a matter of dispute whether, for instance, Facebook is responsible for algorithmic sales to Russian forces of misinformation. #KrattLaw—Estonia’s digital shorthand for a new category of legal entity comprising A.I., algorithms, and robots—will make it possible to hold accountable whoever gave a drop of blood.
tech  Technopoly  from instapaper
january 2018 by ayjay
Some 2018 Predictions
Industrialization of conversation. We have not come to terms with how the digitalization of conversation allows for its industrialization. And how its industrialization allows for manipulation that is more massive and immediate than what we’ve previously seen in the conversational space. We need to develop tools and norms to protect conversation from industrialization. And we desperately need to stop conceptualizing the discourse space on the web as a bunch of individual actors expressing an emergent will.
tech 
december 2017 by ayjay
Living well in the technosocial world – a review of Shannon Vallor’s Technology and the Virtues
The “technosocial” world in which we live is one wherein our technologies cannot be safely fenced off; instead, our changing technologies are “embedded in co-evolving social practices, values, and institutions” (5). Yet, even in the midst of the “technosocial,” our ability to discern where we are going, or where we even are now, is quite deficient. As Vallor notes, we are beset by “growing technosocial blindness,” a condition she calls “acute technosocial opacity,” which makes it “increasingly difficult to identify, seek, and secure the ultimate goal of ethics—a life worth choosing; a life lived well” (6). Our “acute technosocial opacity” keeps us from recognizing that when we choose to use certain technologies we may be choosing to go along with these technologies’ vision of “a life lived well” instead of our own. Alas, the vision of the “life lived well” by many of these technologies is simply a life that supplies an endless stream of data to be processed and sold to advertisers; it can be profoundly antihumanistic and relentlessly capitalistic. Indeed, many of the habits that technologies seem to encourage and celebrate are the opposite of virtues: they are vices.
tech  ethics  from instapaper
december 2017 by ayjay
The technologist's responsibilities and social change
Weiser's Principles of Inventing Socially Dangerous Technology:

1. Build it as safe as you can, and build into it all the safeguards to personal values that you can imagine.

2. Tell the world at large that you are doing something dangerous.

Principle 1 ensures that, as an engineer, you have demonstrated to all concerned that it is possible to construct your system with appropriate safeguards. Unfortunately, it is rarely possible to build them in such a way that they cannot be removed. And so, as Doheny-Farina clearly questions, what prevents "organizations less enlightened" than PARC from removing those safeguards?

Principle 2 comes to the rescue then by providing a basis for informed discussion and action by anyone. Most engineers will defend as strongly as possible the value of their work and leave it to others to find fault. But that is not enough if one is doing something that one knows has possibly dangerous consequences. The responsible engineer in this case must pro-actively begin debate about how the technology should be used. For one thing, he or she may learn of more things to be applied under Principle 1. And for another, informed people are less likely to let safeguards be removed.

In practice, as we learned, Principle 2 sometimes becomes Principle 2-A: Cooperate with overblown and distorted media stories about your work. I have heard NBC Nightly News describe Xerox PARC as being at the forefront of "big brother technology," as though the world were sprinkled with surveillance laboratories, but we happened to be the best. And so that leads to Principle 2-B: Better that people are too scared about what you are doing than they not find out at all until it's too late. As we write (and say) in the computer biz: "sigh."

Principle 2 is far from a guarantee that evil will not be done. But I know of no way to provide such a guarantee for any technology. Refusing to work on such technology is the approach of the ostrich. However, I am an optimist. I think that people will eventually figure out how to use technology for their benefit, including, if necessary, passing laws or establishing social conventions to avoid its worst dangers. It should be every engineer's role to provide as much information as possible in the debate leading to these new laws and conventions.
tech 
december 2017 by ayjay
Warren Ellis interviews Adam Greenfield
1) Networked digital communications technologies worm their way into everything - or, at least, are often forced into things. I commented in the newsletter last week that Facebook alone is viscous and invasive, and apparently some people have a longstanding urge to outsource the operation of their front doors and pet crates to networked services. Is there a strategy of refusal of a deeply & chaotically networked world that doesn't look like the Amish?

I think one can certainly opt out of much contemporary technology, sure, in the limited sense of clearing it from your personal sphere and living space. You can, if you wish, forego a smartphone. You can surely do without personal biometric monitors and “smart speakers” and a profile on Tinder. Plenty of people do, and so far as I can tell they don’t suffer overmuch as a result of this choice.

But there are two circumstances presented to us by networked technologies that you can’t opt out of quite so easily, if at all. The first is being an operand, an object of the networked data-collection and -processing techniques that are now brought to bear on you by various institutional actors both public and private, and which will increasingly determine the shape of your life chances and the choices that are available to you. And the second is living in a world where the great majority of the other human beings you interact with have chosen to embrace these technologies, to a more or less conscious degree, and have had their subjectivities altered by exposure to them.

The first circumstance means that nothing short of a Kaczynskian retreat from public life will stop various kinds of actors from attempting to gather information about you, correlating that information with other information already at hand, building models of your personality and psychic state, or using those models to project and anticipate your future behavior — and what is more, any such retreat will necessarily come at the cost of meaningful participation in the contemporary economy. The second circumstance means that any social interaction whatsoever with people who haven't undertaken that kind of retreat will henceforth come at the cost of inviting the network into your life, albeit indirectly. So while you yourself may somehow be able to absent yourself from networked visibility, all of your interlocutors will be people whose tastes, preferences, capabilities and desires have all been inflected by their long-term immersion in the networked condition.

All of this is just a very longwinded way of saying no: no, at this point in time, there is no meaningful gesture of refusal available to the overwhelming majority of us. If we want to create spaces in which refusal is possible, we have to do the work of organizing, articulating our grievances and our desires, and bringing those spaces into being.

2) I'm always interested in production. How long did it take you to write this book, and what did a usual day's work look like? (Context: some people write in the mornings, some people do 500 words and walk away, some people just hammer at the thing day and night, etc)

There are two ways of answering this question, both true. The first way is to say that writing the book took me six months, because that's about how long it took me to produce a manuscript once I knew what the book was really about. But that's only part of the story, because it neglects the eight years of slow gestation and consolidation that got me to that point.

There are surviving passages in the book that date to 2007 — not many of them of any great length, because of the rapidly-evolving nature of what I'm writing about. But certainly turns of phrase here and there, and even one or two extended arguments. I think sensitive readers will pick up on this. They'll twig that the book as published is an unruly accretion of different chunks of thought that were developed in different places and times, rather than anything that proceeded smoothly and elegantly from a clear thesis to a well-structured argument.

And of course you'll grasp right away what that implies, which is that there was never any such thing as a “usual day's work.” There were stretches where I'd be disciplined and Writerly and put my ass in the chair for hours a day every day, whether that chair was in the Rose Reading Room or the window of a Starbucks in Ebisu. But more often I'd wake up in the middle of the night to email myself a single, fully-formed sentence, and that'd be it for the day...or the week. There were long arcs where I'd read and study and develop a considered line of thought about some topic that never even made it into the final manuscript, and there were days where I'd brew a big pot of coffee and power through most of a section in one sitting.

It's kind of amazing to me that I was ever able to suture all of that together into a coherent work. And, for that matter, I might not have, except for not wanting to disappoint my partner, my editor and the readers of my earlier books.

3) I try to use the term "machine learning" rather than "artificial intelligence," because the latter term seems weighted with implications of self-reflecting consciousness, sentience, which seems to me to remain a pipe dream. Am I wrong?

I don't mind the word "intelligence," but I don't think we even understand what our own intelligence is, and I'm not at all sure we'd recognize an intelligence that took any particularly different form from those we're used to. Here's where reading science fiction like Peter Watts's Blindsight is helpful, or Other Minds from the factual side of things, because they remind us how much more varied the possibilities are than we generally imagine — how resplendent the forms acuity might take, and how little they might correspond with our own notions of capability or consciousness or subjectivity. What Haldane said in 1927 still seems sound to me, and if anything only improved by the contemporary spin on his choice of words: the universe is not only queerer than we suppose, it's queerer than we can suppose.
tech  writing  from notes
december 2017 by ayjay
Network Neutrality Can't Fix the Internet
It’s true that one set of giant internet companies, like Comcast and Verizon, can’t currently mess with what people read, watch, and explore online. But another faction of giant internet companies can and do exert that power and control. Google, Facebook, Apple, Amazon, Netflix, and others manage access to most of the content created and delivered via broadband and wireless networks. Google appears to handle over 63 percent of searches, and it is projected to control 80 percent of the search ad market by 2019. Facebook exerts enormous control over access to news online, and its unmanaged ad network appears to have torn democracy asunder.

Net-neutrality telecommunications policy might benefit the public by providing impartial access to online services. But even so, Big Tech’s stranglehold on those services puts the lie to the underlying freedom and openness those services ultimately offer. When it comes to ISPs, a more effective solution would involve local-loop unbundling—requiring telcos to lease last-mile connections to competitors. Even if that worked and a thousand broadband providers bloomed, the internet would still operate in fundamentally the same way. All the internet Davids might not have to pay for placement with the telco giants, but they must do so to the tech Goliaths.
tech 
november 2017 by ayjay
The Ethics of Technological Mediation | L.M. Sacasas
Verbeek comments on some of the advantages of virtue ethics. To begin with, virtue ethics does not ask, “What am I to do?” Rather, it asks, in Verbeek’s formulation, “What is the good life?” We might also add a related question that virtue ethics raises: “What sort of person do I want to be?” This is a question that Verbeek also considers, taking his cues from the later work of Michel Foucault.

The question of the good life, Verbeek adds,
does not depart from a separation of subject and object but from the interwoven character of both. A good life, after all, is shaped not only on the basis of human decisions but also on the basis of the world in which it plays itself out (de Vries 1999). The way we live is determined not only by moral decision making but also by manifold practices that connect us to the material world in which we live. This makes ethics not a matter of isolated subjects but, rather, of connections between humans and the world in which they live.

Virtue ethics, with its concern for habits, practices, and communities of moral formation, illuminates the various ways technologies impinge upon our moral lives. For example, a technologically mediated action that, taken on its own and in isolation, may be judged morally right or indifferent may appear in a different light when considered as one instance of a habit-forming practice that shapes our disposition and character.
tech  ethics 
november 2017 by ayjay
Hubert Dreyfus - Highway Bridges and Feasts: Heidegger and Borgmann on How to Affirm Technology
This resistance to technological practices on the behalf of focal practices is the primary solution Borgmann gives to saving ourselves from technological devastation. Borgmann cannot find anything more positive in technology--other than indulging in good running shoes and a Big Mac every now and then--because he sees technology as the highest form of subjectivity. It may fragment our identities, but it maintains us as desiring beings not world disclosers. In contrast, since Heidegger sees technology as disaggregating our identities into a contingently built up collection of skills, technological things solicit certain skills without requiring that we take ourselves as having one kind of identity or another. This absence may make our mode of being as world disclosers invisible to us. This would be what Heidegger calls the greatest danger. But this absence allows us to become sensitive to the various identities we may have when we are engaged in disclosing the different worlds focused by different kinds of things. As such disclosers we may even respond to technological things as revealing one kind of world among others. Hence, Heidegger's view of technology allows him to find a positive relation to it, but only so long as we maintain skills for disclosing other kinds of local worlds. Freeing us from having a total fixed identity so that we may experience ourselves as multiple identities disclosing multiple worlds is what Heidegger calls technology's saving power. [...]

Heidegger's thinking until 1955, when he wrote "The Question Concerning Technology," was like Borgmann's current thinking in that for him preserving things was compatible with awaiting a single God. Heidegger said as early as 1946 that the divinities were traces of the lost godhead. But Heidegger came to think that there was an essential antagonism between a unified understanding of being and local worlds. Of course, he always realized that there would be an antagonism between the style set up by a cultural paradigm and things that could only be brought out in their ownness in a style different from the dominant cultural style. Such things would inevitably be dispersed to the margins of the culture. There, as Borgmann so well sees, they will shine in contrast to the dominant style but will have to resist being considered irrelevant or even wicked. But, if there is a single understanding of being, even those things that come into their own in the dominant cultural style will be inhibited as things. Already in his "Thing" essay Heidegger goes out of his way to point out that, even though the original meaning of 'thing' in German is a gathering to discuss a matter of concern to the community, in the case of the thing thinging, the gathering in question must be self-contained. The focal occasion must determine which community concerns are relevant rather than the reverse.
tech  philosophy  textpatterns  from instapaper
september 2017 by ayjay
Amazon.com: The Real-Town Murders eBook: Adam Roberts: Kindle Store
'In the eighteenth century the really expensive things were food and clothes. But we soon found ways of undermining the scarcity of both those things, and both became trivially cheap. In the late twentieth and early twenty-first century the really expensive thing was housing, because people insisted on preferring large detached properties and there wasn’t enough space for everyone to have one. But now, people can live in as much or as little space, as much or as little luxury, as they desire – in the Shine. All they need for a real-world base is a cupboard. So what does that leave us?’

‘I’m confident,’ said Alma, ‘that this disquisition is going somewhere.’

‘There’s a price to be paid for living in the Shine,’ said Pu. ‘It is that you must open yourself. You render yourself easy to track, easy to surveil, easy to monitor and therefore easy to control. People in the Shine don’t care, because they’re too caught up in their various actualised fantasies. But the people who do the surveilling do care, because it’s the grounds of their power, and once you get a taste for it, power is something you never get enough of.’
surveillance  SF  power  tech  textpatterns 
september 2017 by ayjay
LRB · John Lanchester · You Are the Product: It Zucks!
Google and Facebook have both been walking this line from the beginning. Their styles of doing so are different. An internet entrepreneur I know has had dealings with both companies. ‘YouTube knows they have lots of dirty things going on and are keen to try and do some good to alleviate it,’ he told me. I asked what he meant by ‘dirty’. ‘Terrorist and extremist content, stolen content, copyright violations. That kind of thing. But Google in my experience knows that there are ambiguities, moral doubts, around some of what they do, and at least they try to think about it. Facebook just doesn’t care. When you’re in a room with them you can tell. They’re’ – he took a moment to find the right word – ‘scuzzy’. [...]

The view of human nature implied by these ideas is pretty dark. If all people want to do is go and look at other people so that they can compare themselves to them and copy what they want – if that is the final, deepest truth about humanity and its motivations – then Facebook doesn’t really have to take too much trouble over humanity’s welfare, since all the bad things that happen to us are things we are doing to ourselves. For all the corporate uplift of its mission statement, Facebook is a company whose essential premise is misanthropic. It is perhaps for that reason that Facebook, more than any other company of its size, has a thread of malignity running through its story. The high-profile, tabloid version of this has come in the form of incidents such as the live-streaming of rapes, suicides, murders and cop-killings. But this is one of the areas where Facebook seems to me relatively blameless. People live-stream these terrible things over the site because it has the biggest audience; if Snapchat or Periscope were bigger, they’d be doing it there instead.

In many other areas, however, the site is far from blameless. The highest-profile recent criticisms of the company stem from its role in Trump’s election. There are two components to this, one of them implicit in the nature of the site, which has an inherent tendency to fragment and atomise its users into like-minded groups. The mission to ‘connect’ turns out to mean, in practice, connect with people who agree with you. We can’t prove just how dangerous these ‘filter bubbles’ are to our societies, but it seems clear that they are having a severe impact on our increasingly fragmented polity. Our conception of ‘we’ is becoming narrower. [...]

What this means is that even more than it is in the advertising business, Facebook is in the surveillance business. Facebook, in fact, is the biggest surveillance-based enterprise in the history of mankind. It knows far, far more about you than the most intrusive government has ever known about its citizens. It’s amazing that people haven’t really understood this about the company. I’ve spent time thinking about Facebook, and the thing I keep coming back to is that its users don’t realise what it is the company does. What Facebook does is watch you, and then use what it knows about you and your behaviour to sell ads. I’m not sure there has ever been a more complete disconnect between what a company says it does – ‘connect’, ‘build communities’ – and the commercial reality. Note that the company’s knowledge about its users isn’t used merely to target ads but to shape the flow of news to them. Since there is so much content posted on the site, the algorithms used to filter and direct that content are the thing that determines what you see: people think their news feed is largely to do with their friends and interests, and it sort of is, with the crucial proviso that it is their friends and interests as mediated by the commercial interests of Facebook. Your eyes are directed towards the place where they are most valuable for Facebook.
tech  Facebook  textpatterns  from instapaper
september 2017 by ayjay
engscisoc / Acceleration of Tranquility
Were we gods, we might be able to live well without rest and contemplation, but we are not and we cannot. Whereas our physical capacities are limited, those of the machine are virtually unlimited. As the capabilities of the machine are extended, we can use it--we imagine--to supplement our own in ways that will not strain our humanity. Had we no appetite or sin, this might be true, but our desires tend to lead us to excess, and as the digital revolution has quickly progressed, we have not had time to develop the protocols, manners, discipline, and ethics adequate for protecting us from our newly augmented powers.
tech  anthropocene  from instapaper
august 2017 by ayjay
Fundamental Materials Research and the Course of Human Civilization - Nicola Spaldin
But this silicon revolution will soon be forced to come to an end as we start to run into fundamental physical limits, set by the size of the individual atoms that make up the silicon material. And this means that the steady march towards faster, smaller, lighter products with more and more functionality can't continue within our existing framework. Now, while this might not seem so disastrous (certainly the controls on my smartphone are already smaller than I can see without my reading glasses), it is in fact a profound problem for society: As living standards improve in emerging regions and the “internet of things” becomes more widespread, worldwide use of microelectronics is expanding more rapidly than ever before, so that by most projections more than half of the world’s energy will be consumed by information technologies within a couple of decades. And this is not sustainable. So, we need to take the step beyond the silicon age, we need to develop an entirely new device paradigm, and to do this we need a new material. Without a new material, we are stuck with our existing concepts for information technology and we have an energy bottleneck in human progress. And fundamental research in Materials Science – very likely with a complete change in direction – underpins the invention of this material. [...]

So, what next? Well, like many others in the Materials Physics community, I’m working to understand the so-called strong correlations between electrons in solids. Why, if one electron somewhere in a material rearranges a little bit, this explicitly and profoundly affects all of the other electrons. This research is very fundamental and might never lead to anything useful. Even in that case I would argue that it is worthwhile: Exposing the profound beauty of interacting electrons is comparable to imaging the complexity of our galaxy, the satisfaction of finding a new elementary particle at CERN, or the joy of listening to the Tonhalle Orchestra play a Brahms symphony; all activities which as a society we find worthwhile to invest in. On the other hand, understanding strong electron correlations could be the first step towards making a room-temperature superconductor, a material that conducts electricity without any resistance, under everyday conditions. Such a material would revolutionize energy production, transmission and storage: Imagine power grids that don’t lose energy, portable MRI machines, cheap and widespread “Maglev” trains and paradigm shifts in computing technologies. A room-temperature superconductor would be utterly geopolitically transformative. Then I would bet that the next era of human civilization would be named after this as-yet undiscovered material.
science  tech  engineering 
august 2017 by ayjay
That Google memo about women in tech wasn't wrong
And yet, you still have to ask whether shamestorming Damore and getting him sacked was really the best way to convince him — or anyone else — that he's mistaken. Did anyone's understanding of the complex quandaries of gender diversity advance? If there were guys at Google wondering whether the women around them really deserved their jobs, did anyone wake up the morning after Damore's firing with the revelation: "Good God, how could I have been so blind?" No, I suspect those guys are now thinking: "You see? Women can't handle math or logic."

The mob reaction did prove that women indeed have some power in tech. But the power to fire people is not why most people get into engineering. Good engineers want to make things. The conversation around Damore's memo hasn't made the world a better place, as they say in Silicon Valley. It has just made a lot of people angry.
tech  gender  sociology  from instapaper
august 2017 by ayjay
rhetoric in the late age of the internet
As much as the needed response is not a technological fix, it also is not not a technological fix. We simply need, for one thing, a better understanding of our digital media-ecological rhetorical situation. That’s something rhetoricians can provide, and while I wouldn’t say it’s the biggest piece of the puzzle, there’s still plenty of work to do. The question the late age of the internet poses is what will follow. That is, what follows on the social media communities and digital marketplaces that typify our daily engagement with the web and represent the globe’s most visited websites? The web began in the nineties as a fantasy about escaping the real world, as a place where we would have separate second lives and form parallel virtual communities. And the social web that followed in the next decade largely built on that fantasy by making it more accessible. But we can’t really think about the web that way. The digital world is not a separate world, as if it ever really was. We need a new web, one that supplants the social web as the social web supplanted web 1.0, one that recognizes the rhetorical-material stakes differently.
tech  technique  Technopoly  socialmedia  from instapaper
august 2017 by ayjay
Implementing Webmentions
As social media sites gained traction, those communities moved away from blog commenting systems. Instead of reacting to a post underneath the post, most people will now react with a URL someplace else. That might be a tweet, a Reddit post, a Facebook emission, basically anywhere that combines an audience with the ability to comment on a URL.

Whether you think that’s a good thing or not isn’t really worth debating – it’s just the way it is now, things change, no big deal. However, something valuable that has been lost is the ability to see others’ reactions when viewing a post. Comments from others can add so much to a post, and that overview is lost when the comments exist elsewhere.
tech  socialmedia  from instapaper
august 2017 by ayjay
Notes From An Emergency
Given this scary state of the world, with ecological collapse just over the horizon, and a population sharpening its pitchforks, an important question is how this globalized, unaccountable tech industry sees its goals. What does it want? What will all the profits be invested in?

What is the plan?

The honest answer is: rocket ships and immortality.

I wish I was kidding.

The best minds in Silicon Valley are preoccupied with a science fiction future they consider it their manifest destiny to build. Jeff Bezos and Elon Musk are racing each other to Mars. Musk gets most of the press, but Bezos now sells $1B in Amazon stock a year to fund Blue Origin. Investors have put over $8 billion into space companies over the past five years, as part of a push to export our problems here on Earth into the rest of the Solar System.
politics  tech  internet 
june 2017 by ayjay
Philosophers have a new job: coaching Silicon Valley executives to question everything
Still, practical philosophers like Taggart insist philosophical inquiry is the essence of an executive’s job. Philosophy, unlike other fields, offers no assumptions, just relentless inquiry. By subjecting every belief to critical reflection, Taggart’s clients start down a path of inquiry that can lead to genuine understanding, better business decisions, and, eventually, happiness. But that only happens after a painful period of reflection, which will often involve abandoning the deceptive stories we tell ourselves.

“Philosophers arrive on the scene at the moment when bullshit can no longer be tolerated,” says Taggart. “We articulate that bullshit and stop it from happening. And there’s just a whole lot of bullshit in business today.” He cites the rise of growth hackers, programming “ninjas,” and thought leaders whose job identities are invented or incoherent.
philosophy  solutionism  tech  from instapaper
april 2017 by ayjay
Netflix's biggest competitor? Sleep
Sometimes, tech firms have a different view of their competition than everyone else because of the sheer scale on which they operate. Google may think Google+ is a Facebook competitor, for instance; but Facebook thinks its competitors are video games and TV. You aren’t going to leave Facebook for another social network, and it knows that, so its job is to maximise the amount of time you actually spend using it. For that, it needs to be more compelling than all the other things you could be doing with your time.
tech  socialmedia  from instapaper
april 2017 by ayjay
My life with Oliver Sacks: ‘He was the most unusual person I had ever known’ | Books | The Guardian
Not long after I moved to New York, Michael Jackson died. O had no idea who Michael Jackson was. “What is Michael Jackson?” he asked me the day after the news – not who but what – which seemed both a very odd and a very apt way of putting it, given how much the brilliant singer had transmuted from a human into an alien being. O often said he had no knowledge of popular culture after 1955, and this was not an exaggeration. He did not know popular music, rarely watched anything on TV but the news, did not enjoy contemporary fiction, and had zero interest in celebrities or fame (including his own). He didn’t possess a computer, had never used email or texted; he wrote with a fountain pen. This wasn’t pretentiousness; he wasn’t proud of it; indeed, this feeling of “not being with it” contributed to his extreme shyness. But there was no denying that his tastes, his habits, his ways – all were irreversibly, fixedly, not of our time.

“Do I seem like I am from another century?” he would sometimes ask me, almost poignantly. “Do I seem like I am from another age?”

“You do, yes, you do.”
tech  bloggable 
april 2017 by ayjay
Weaponized Narrative Is the New Battlespace - Defense One
Weaponized narrative seeks to undermine an opponent’s civilization, identity, and will by generating complexity, confusion, and political and social schisms. It can be used tactically, as part of explicit military or geopolitical conflict; or strategically, as a way to reduce, neutralize, and defeat a civilization, state, or organization. Done well, it limits or even eliminates the need for armed force to achieve political and military aims.

The efforts to muscle into the affairs of the American presidency, Brexit, the Ukraine, the Baltics, and NATO reflect a shift to a “post-factual” political and cultural environment that is vulnerable to weaponized narrative. [...]

In the hands of professionals, the powerful emotions of anger and fear can be used to control adversaries, limit their options, and disrupt their functional capabilities. This is a unique form of soft power. In such campaigns, facts are not necessary because – contrary to the old memes of the Enlightenment – truth does not necessarily prevail. It can be overwhelmed with constantly repeated and replenished falsehood. Especially powerful are falsehoods or simplifications that the target cohort has been primed to believe by the underlying narratives with which they are also being supplied.
politics  tech  bloggable 
march 2017 by ayjay
Against Everything | George Scialabba
Illich proposed “a new kind of modern tool kit”—not devised by planners but worked out through a kind of society-wide consultation that he called “politics,” undoubtedly recognizing that it bore no relation to what currently goes by that name. The purpose of this process was to frame a conception of the good life that would “serve as a framework for evaluating man’s relation to his tools.” Essential to any feasible conception, Illich assumed, was identifying a “natural scale” for life’s main dimensions. “When an enterprise [or an institution] grows beyond a certain point on this scale, it first frustrates the end for which it was originally designed, and then rapidly becomes a threat to society itself.”

A livable society, Illich argued, must rest on an “ethic of austerity.” Of course, he didn’t mean by “austerity” the deprivation imposed by central bankers for the sake of “financial stability” and rentier profits. Nor, though he rejected affluence as an ideal, did he mean asceticism. He meant “limits on the amount of instrumented [i.e., technical or institutional] power that anyone may claim, both for his own satisfaction and in the service of others.” Instead of global mass society, he envisioned “many distinct cultures . . . each modern and each emphasizing the dispersed use of modern tools.”
tech  sociology 
march 2017 by ayjay
Living without money: what I learned | Environment | The Guardian
Yet our activism today has become as tame and timid as our neatly trimmed gardens. The worlds of political, social and ecological campaigning can no longer continue with activism-as-usual. It is simply not working. None of this is a criticism of the determined people who participate in these movements for change, and I am not suggesting that there are no success stories. But if you step back and honestly look at the state of our ecological and social landscapes, all the indicators of health are on a steep decline. To have some chance of returning these landscapes to vitality, our political landscape needs rewilding.

It is a terrifying, yet exciting, time to be alive. We can turn the biggest crises of our age into something that gives our lives a renewed sense of meaning and purpose. But to do so, I believe we have to upgrade the three r’s of the climate change generation from “reduce, reuse, recycle” to something more befitting of the crises unfolding before us: “resist, revolt, rewild”.

Now is the time to be bold. We need to stop the onslaught of the machine into the natural world using every means that is effective, or before we know it we will have witnessed the devastation and loss of all the beauty that still remains. If we allow that to happen, we shall deserve our fate. Instead, if we fight back then we may earn ourselves a future that, at this dark hour before the dawn, we cannot even imagine yet.
tech  ecology  economics  money 
february 2017 by ayjay
Technology destroys people and places. I’m rejecting it | Mark Boyle | Opinion | The Guardian
Living without complex technology has its own difficulties, especially for people like me who were never initiated into those ways. But already I much prefer it. Instead of making a living to pay bills, I make living my life. Contrary to expectation, my biggest issue is not being bored, but how to do all the things I’d love to do. Of course hand-washing your clothes can be a pain sometimes, but that minor inconvenience is hardly worth destroying the natural world over.

Well-intentioned friends often try to convince me to go off-grid, but in using batteries, electrical cables and photovoltaic panels (as I once did), I would still be connected, by a peculiar sort of invisible cable, to the global network of quarries, factories, courtrooms, mines, financial institutions, bureaucracies, armies, transport networks and workers needed to produce such things. They also ask me to stay on social media to speak out about the technology issue, but I say I’m denouncing complex technology simply by renouncing it. My culture made a Faustian pact, on my behalf, with those devilish tyrants Speed, Numbers, Homogeneity, Efficiency and Schedules, and now I’m telling the devil I want my soul back.

My life has its fair share of irony, and it can look hypocritical. Despite originally writing these words (a technology) with a pencil (a technology) in a hand-crafted cabin (a technology), the irony of this being an online blog is not lost on me. That is my compromise for now, for if you want to contribute to a healthier society, compromise can be a healthy thing if you know your boundaries. Being a hypocrite is always my highest ideal, as it means I’ve set higher standards for myself to strive for than I’m achieving at any one moment.
tech  textpatterns 
february 2017 by ayjay
Infernal Machine Collective Manifesto: On the Occasion of the Inauguration | The Infernal Machine
During the age of high technology the academic study of media developed its own high towers and professional enclaves: communications; radio, film, and television; cinema.  It also included courses from journalism, speech communication, economics, business, and literature. Each operated on its own frequency. Technology studies, meanwhile, built an edifice (rather plain and drab at first, until a Gothic renovation by a Frenchman, Bruno Latour, with a penchant for networks, actants, and jokes). If the age of high technology yielded a change in the categories, such that agency was distributed and binaries upended (a “general cyborg condition,” as Donna Haraway put it), then what does the fast-advancing Digital Era call for? What philosophy will grasp this history?

A chorus on the Left decries the “fading of fact,” as though we had not attached media and rhetoric to the disappearance of fact for half a century—or since Plato. How can our self-proclaimed sophisticates have failed to see this continent of intellectual energy emerging outside their media, yet on the platforms those media share? How can those trained to think of Enlightenment as having the darkest of sides, a necessary backlash in its very heart, be so naively surprised by this predictable development?
textpatterns  tech  media 
january 2017 by ayjay
Equipping people to stay ahead of technological change | The Economist
WHEN education fails to keep pace with technology, the result is inequality. Without the skills to stay useful as innovations arrive, workers suffer—and if enough of them fall behind, society starts to fall apart. That fundamental insight seized reformers in the Industrial Revolution, heralding state-funded universal schooling. Later, automation in factories and offices called forth a surge in college graduates. The combination of education and innovation, spread over decades, led to a remarkable flowering of prosperity.
Today robotics and artificial intelligence call for another education revolution. This time, however, working lives are so lengthy and so fast-changing that simply cramming more schooling in at the start is not enough. People must also be able to acquire new skills throughout their careers.
Unfortunately, as our special report in this issue sets out, the lifelong learning that exists today mainly benefits high achievers—and is therefore more likely to exacerbate inequality than diminish it. If 21st-century economies are not to create a massive underclass, policymakers urgently need to work out how to help all their citizens learn while they earn. So far, their ambition has fallen pitifully short.
economics  tech 
january 2017 by ayjay
Democracy is in crisis, but blaming fake news is not the answer | Evgeny Morozov
The big threat facing western societies today is not so much the emergence of illiberal democracy abroad as the persistence of immature democracy at home. This immaturity, exhibited almost daily by the elites, manifests itself in two types of denial: the denial of the economic origins of most of today’s problems; and the denial of the profound corruption of professional expertise.

The first type manifests itself whenever phenomena like Brexit or Donald Trump’s electoral success are ascribed primarily to cultural factors such as racism or voter ignorance. The second type denies that the immense frustration many people feel towards existing institutions stems not from their not knowing the whole truth about how they operate but, rather, from knowing it all too well. [...]

The problem is not fake news but the speed and ease of its dissemination, and it exists primarily because today’s digital capitalism makes it extremely profitable – look at Google and Facebook – to produce and circulate false but click-worthy narratives.

To recast the fake news crisis this way, however, would require the establishment to transcend one of their denials and dabble in the political economy of communications. And who wants to acknowledge that, for the past 30 years, it has been the political parties of the centre-left and centre-right that touted the genius of Silicon Valley, privatised telecommunications and adopted a rather lax attitude to antitrust enforcement?
democracy  tech  socialmedia  from instapaper
january 2017 by ayjay
The Watchers
Benkler advocates systems that allow personal data to remain in the hands of consumers—minimizing the privacy risks posed by governments, corporations, and hackers because personal information is not concentrated in a single place. (The technical term is “distributed network ecosystems based on open-source software.”) “Relying on a small number of high-end companies to provide security creates a single point of failure for hundreds of millions,” he says, referring to the 2014 theft of Yahoo user accounts. “If all those…people had decentralized email storage at home, and sign-on credentials that were not valid for diverse critical sites, collecting [that information] would be much harder.”

“It’s a challenge to get people to adopt safe habits,” he admits, “but it’s not impossible. You have to change users’ culture, and you have to design secure systems that are under the control of end users, not single companies.” The iPhone, secured with a PIN or a fingerprint, is an example of such encrypted, secure-by-default systems. Such devices aren’t hard to build—but, he says pointedly, “It’s hard to do so [within] a business model that depends on spying on your customers so you can sell them to advertisers.”
security  tech  from instapaper
january 2017 by ayjay
Welcome to the Future Nauseous - Venkatesh Rao
My new explanation is this: we live in a continuous state of manufactured normalcy. There are mechanisms that operate — a mix of natural, emergent and designed — that work to prevent us from realizing that the future is actually happening as we speak. To really understand the world and how it is evolving, you need to break through this manufactured normalcy field. Unfortunately, that leads, as we will see, to a kind of existential nausea. [...]

So what about elements of the future that arrive relatively successfully for everybody, like cellphones? Here, the idea I called the Milo Criterion kicks in: successful products are precisely those that do not attempt to move user experiences significantly, even if the underlying technology has shifted radically. In fact the whole point of user experience design is to manufacture the necessary normalcy for a product to succeed and get integrated into the Field. In this sense user experience design is reductive with respect to technological potential.

So for this bucket of experiencing the future, what we get is a Darwinian weeding out of those manifestations of the future that break the continuity of technological experience. So things like Google Wave fail. Just because something is technically feasible does not mean it can psychologically normalized into the Field.
futurism  tech  from instapaper
december 2016 by ayjay
Dancing the flip-flop | Robin Sloan
Let’s start with a definition, and then I’ll show you some examples.

the flip-flop (n.) the process of pushing a work of art or craft from the physical world to the digital world and back again—maybe more than once

That’s pretty abstract. Here’s an example recipe:

1) Carve a statue out of stone. PHYSICAL

2) Digitize your statue with a 3D scanner. DIGITAL

3) Make some edits. Shrink it down. Add wings. STILL DIGITAL

4) Print the edited sculpture in plastic with a 3D printer. PHYSICAL AGAIN

It’s step three above that is most crucial to the flip-flop, because that’s where it becomes clear you aren’t aiming for fidelity in these transitions from physical to digital and back.

When you do the flip-flop, you achieve effects that aren’t possible in physical or digital space alone. You also achieve effects that are less predictable. Weird things happen in the borderlands.
art  tech 
november 2016 by ayjay
Why the high-tech ideas of ‘Bucky’ Fuller are back in vogue – Samanth Subramanian | Aeon Essays
Fuller’s career seemed to evolve in reaction to failure. That winter in 1927, he’d been determined to drown himself after his business venture went under. Then, when he failed to commit suicide, he became an inventor. And when his inventions – the house, the car, the bathroom – proved stillborn, he became a futurist. In the 1960s, he stopped building things he thought the world required and started forecasting those requirements instead.

As a result, Fuller’s attentions started to span the entire planet. If, earlier, he was boiling down his ideas into the design of a showerhead, now he was scaling them up into abstract global systems: international power grids; simulation games that played for world peace; supply chains for metals, minerals and other planetary resources. (Admittedly, the technology for a global information network didn’t exist at the time. Fuller would have been thrilled by the internet, and its capacity to trace and accelerate the allocation of goods.) This kind of ‘comprehensivist’ approach, Fuller’s clunker of a term, was the only means ‘to make the world work for 100 per cent of humanity, in the shortest possible time, through spontaneous cooperation, without ecological offence or the disadvantage of anyone’. [...]

Fuller’s advocacy of technology as a salve for the wounds of modernity found a fierce critic in the sociologist Lewis Mumford, who longed for a more organic humanism. The two men proposed such contrasting versions of the future that Horizon magazine wondered, in 1968: ‘Which guide to the Promised Land? Fuller or Mumford?’ Mumford deplored the sterility of the sort of future that techno-faddists wanted for the human race. In an acid passage from 1956 that might have been aimed squarely at Fuller and his bubble-domed cities, Mumford wrote:
If the goal of human history is a uniform type of man, reproducing at a uniform rate, in a uniform environment, kept at a constant temperature, pressure and humidity, like a uniformly lifeless existence, with his uniform physical needs satisfied by uniform goods… most of the problems of human development would disappear. Only one problem would remain: why should anyone, even a computer, bother to keep this kind of creature alive?
futurism  tech  from instapaper
november 2016 by ayjay
What Will Break People’s Addictions to Their Phones?
While some blame our collective tech addiction on personal failings, like weak willpower, Harris points a finger at the software itself. That itch to glance at our phone is a natural reaction to apps and websites engineered to get us scrolling as frequently as possible. The attention economy, which showers profits on companies that seize our focus, has kicked off what Harris calls a “race to the bottom of the brain stem.” “You could say that it’s my responsibility” to exert self-control when it comes to digital usage, he explains, “but that’s not acknowledging that there’s a thousand people on the other side of the screen whose job is to break down whatever responsibility I can maintain.” In short, we’ve lost control of our relationship with technology because technology has become better at controlling us.

Under the auspices of Time Well Spent, Harris is leading a movement to change the fundamentals of software design. He is rallying product designers to adopt a “Hippocratic oath” for software that, he explains, would check the practice of “exposing people’s psychological vulnerabilities” and restore “agency” to users. “There needs to be new ratings, new criteria, new design standards, new certification standards,” he says. “There is a way to design based not on addiction.”
tech  socialmedia  addiction  from instapaper
october 2016 by ayjay
Politics Is Upstream of AI
Technology is developed by humans inside institutions governed by states. To understand the influence of politics on AI, it is necessary to imagine the relationship between states and AI projects. Typical discussions of AI focus on the relationship between the AI and the researchers, but I believe we should be examining the entire stack that creates the AI—which includes the state.

It’s likely that AI projects will be funded by states, and even if they are not, the state will be unable to stay away if the projects show any signs of progress. States cannot afford to leave potentially powerful technology on the table, especially when there are security implications. They have a mandate to monitor any dangerous technologies, and they want to stay at the cutting edge of technology themselves.

Try thinking like any of the major superpower states, and you will get a better sense of what that state’s incentives are. If other states are developing advanced technologies, then states have to participate in an arms race. An arms race is a pessimistic subject, but my analysis suggests that states have already formed opinions about the game theory of future technologies. That ship has sailed.
politics  tech  AI  NRx 
september 2016 by ayjay
The Dwindling Promise of Social Media
I care about most of the things my readers do — corporate encroachment of education, a desire for “free-range” education, emergence, creativity, what-not. But at the root of all of it for me is a simple dream I had that we all shared, that we could use technology to make ourselves smarter and better people. And it seemed for a while like we were heading there, until the current interests took over and turned technology into Skinner boxes for advert agencies.

I’ll tell you the truth. I don’t even give so much a crap about all Google’s data mining and analytics. I’d deal with it, if Google could just get that one fricking cardiologist a Google Now message that says “Hey, update: Opioids are addictive.” But Google Now is not going to do that, because the dream of Google is not the dream of Engelbart or Kay. Those inventors wanted a world where we became better people, better doctors, better citizens, better architects. Google Now doesn’t give a crap about any of that. Google Now doesn’t want to make you a better doctor or a more compassionate human. It just wants to get AI down enough that it can sell you a Starbucks on your morning commute. And eventually, maybe opioids for your back pain too.

Because it’s all just data, right?

Good job everyone. Welcome to the future.
tech 
august 2016 by ayjay
Remembering the Office of the Future: The Origins of Word Processing and Office Automation
Coinage of word processing is usually attributed to Ulrich Steinhilper, a German IBM typewriter sales executive. In his memoir, Steinhilper wrote that he devised the concept in the mid-1950s and promoted it for many years within IBM’s Office Products Division. He submitted the diagram shown in Figure 1 to IBM’s internal suggestion program, receiving just 25 Deutsche Marks and a reply that the idea was “too complicated to explain.” According to Steinhilper, the term finally caught on after he used it in a 1966 speech to senior Office Products Division managers gathered at the Miami meeting of the Hundred Percent Club of successful IBM salespeople, where he lobbied, unsuccessfully, for Word Processing as a new name for the entire Office Products Division. In 1971, once the concept finally gained traction, Steinhilper was awarded an Outstanding Achievement Award and a trip around the world for having authored and promoted it. It had particular appeal to typewriter salespeople within IBM as a linguistic means of putting the Office Products Division (formerly the Electric Typewriter Division) on a more equal basis with the mighty Data Processing Division. The word processing concept cast the two groups as responsible for different, but equally important-sounding, kinds of business processing.
tech  wordprocessing  writing 
august 2016 by ayjay
Gabriel (-Honoré) Marcel (Stanford Encyclopedia of Philosophy)
“I should like to start,” Marcel says, “with a sort of global and intuitive characterization of the man in whom the sense of the ontological—the sense of being, is lacking, or, to speak more correctly, the man who has lost awareness of this sense” (Marcel 1995, p. 9). This person, the one who has lost awareness of the sense of the ontological, the one whose capacity to wonder has atrophied to the extent of becoming a vestigial trait, is an example of the influence of the misapplication of the idea of function. Marcel uses the example of a subway token distributor. This person has a job that is mindless, repetitive, and monotonous. The same function can be, and often is, completed by automated machines. All day this person takes bills from commuters and returns a token and some change, repeating the same process with the same denominations of currency, over and over. The other people with whom she interacts engage her in only the most superficial and distant manner. In most cases, they do not speak to her and they do not make eye contact. In fact, the only distinction the commuters make between such a person and the automatic, mechanical token dispenser down the hall is to note which “machine” has the shorter line. The way in which these commuters interact with this subway employee is clearly superficial and less than desirable. However, Marcel's point is more subtle.

What can the inner reality of such a person be like? What began as tedious work slowly becomes infuriating in its monotony, but eventually passes into a necessity that is accepted with indifference, until even the sense of dissatisfaction with the pure functionalism of the task is lost. The unfortunate truth is that such a person may come to see herself, at first unconsciously, as merely an amalgamation of the functions she performs. There is the function of dispensing tokens at work, the function of spouse and parent at home, the function of voting as a citizen of a given country, etc. Her life operates on a series of “time-tables” that indicate when certain functions—such as the yearly maintenance trip to the doctor, or the yearly vacation to rest and recuperate—are to be exercised. In this person the sense of wonder and the exigence for the transcendent may slowly begin to wither and die. In the most extreme cases, a person who has come to identify herself with her functions ceases to even have any intuition that the world is broken.

A corollary of the functionalism of the modern broken world is its highly technical nature. Marcel characterizes a world such as ours—in which everything and everyone becomes viewed in terms of function, and in which all questions are approached with technique—as one that is dominated by its “technics.” This is evident in the dependence on technology, the immediate deferral to the technological as the answer to any problem, and the tendency to think of technical reasoning as the only mode of access to the truth. However, it is clear that there are some “problems” that cannot be addressed with technique, and this is disquieting for persons who have come to rely on technics. While technology undoubtedly has its proper place and use, the deification of technology leads to despair when we realize the ultimate inefficacy of technics regarding important existential questions. It is precisely this misapplication of the idea of function and the dependence on technics that leads to the despair that is so prevalent in the broken world. Obviously, we cannot turn back the clock with regard to technological progress, and Marcel acknowledges that technology is not necessarily detrimental to the life of the spirit; nevertheless, it often is, because: “does not the invasion of our life by techniques today tend to substitute satisfaction at a material level for spiritual joy, dissatisfaction at a material level for spiritual disquiet?” (Marcel 1985, p. 57).
philosophy  tech  self  THM 
july 2016 by ayjay
About 1999.io
1999.io is easy for writers to get started with, is completely customizable by designers, and can be extended by programmers through easy APIs and full access to the server code.
blogging  software  tech 
july 2016 by ayjay
Faculty Spotlight: Erik Hurst | Becker Friedman Institute
In this strand of my research, I’m almost flipping that theory on its head by asking if it is possible that technology can also affect labor supply. In our culture, where we are constantly connected to technology, activities like playing Xbox, browsing social media, and Snapchatting with friends raise the attractiveness of leisure time. And so it goes that if leisure time is more enjoyable, and as prices for these technologies continue to drop, people may be less willing to work at any given wage. This explanation may help us understand why we see steep declines in employment while wages remain steady – a trend that has been puzzling economists.

Right now, I’m gathering facts about the possible mechanisms at play, beginning with a hard look at time-use by young men with less than a four-year degree. In the 2000s, employment rates for this group dropped sharply – more than in any other group. We have determined that, in general, they are not going back to school or switching careers, so what are they doing with their time? The hours that they are not working have been replaced almost one for one with leisure time. Seventy-five percent of this new leisure time falls into one category: video games. The average low-skilled, unemployed man in this group plays video games an average of 12, and sometimes upwards of 30 hours per week. This change marks a relatively major shift that makes me question its effect on their attachment to the labor market.

To answer that question, I researched what fraction of these unemployed gamers from 2000 were also idle the previous year. A staggering 22% – almost one quarter – of unemployed young men did not work the previous year either. These individuals are living with parents or relatives, and happiness surveys actually indicate that they are quite content compared to their peers, making it hard to argue that some sort of constraint, like being miserable because they can’t find a job, is causing them to play video games. The obvious problem with this lifestyle occurs as they age and haven’t accumulated any skills or experience. As a 30- or 40-year-old man getting married and needing to provide for a family, job options are extremely limited. This older group of lower-educated men seems to be much less happy than their cohorts.
economics  games  tech 
july 2016 by ayjay
my first commencement speech | Abler.
I won’t cheapen this day by offering you a simple victory narrative. If only, IF ONLY the doors of the world were entirely made of wood and steel. If only it were so simple—to make the world better, just using atoms and bits.

Think about the doors of the immaterial kind: the portals, the thresholds, the entry points to human flourishing that are only open to some, and sealed shut for others. These are doors whose pushing open and pulling closed are social, political, interpersonal mechanisms—mechanisms that no amount of physics alone can sway.

In other words: to find yourself equipped as an engineer in the physical, technical sense—to be able to intervene and even dismantle the doors of the tangible, built world—is still to find yourself an ordinary citizen with a much harder set of questions to engage. How do we share this planet? How do we talk to each other, people unlike ourselves? How do we grapple with the legacies of history? How do we build not only the future we can construct, but the just and sustainable future we want to live in, one that includes all of us? To pry open and build these kinds of entrances, you will use your engineering, yes, but you’ll need so much more than that. You’ll need wisdom, and you’ll have to look for it and recognize it far outside of technology.
tech  engineering  ethics 
july 2016 by ayjay
Instagram and the Fantasy of Mastery
Style annuls the impersonal. This is what separates style from a look, because looks, hammered out by filters, presets, and templates—in short, by techniques—depend on unanimity: between a fast, evocative image that conjures up other, more established images (drip paintings and blotched monochromes; the color and light of contemporary Hollywood action movies; the “haziness” of certain films from the 1970s, often achieved by “flashing” or exposing film stock prior to processing) and a viewer on whom nothing is ever lost. Looks, to the extent they have any connection to the idea of tradition, treat the history of images as a history of changing qualities of resolution. Each technological feat—analog to digital, standard definition to high definition—­becomes an endorsement of newer classes of “sharper” images, each with its own reproducible artifacts and flaws.
art  socialmedia  tech  from instapaper
july 2016 by ayjay
Erratum to: Book Symposium on Peter Paul Verbeek’s Moralizing Technology: Understanding and Designing the Morality of Things. Chicago: University of Chicago Press, 2011 | SpringerLink
With mediation theory functioning as the main engine of philosophical analysis, Moralizing Technology raises the following questions:

• How should the moral significance of technology be conceptualized? Are the intellectual resources found in mainstream meta-ethics and engineering ethics sufficient for answering this question? What is the most justified way to go beyond the commonplace instrumentalist perspective, which restricts the moral status of technologies to the causal role they play in realizing and impeding human moral intentions?

• What conception of subjectivity is appropriate for understanding who human beings are when they inhabit a lifeworld of ubiquitous technological mediation? Does such a subject possess sufficient autonomy to qualify as a moral agent? Or, is the concept of “moral agency” in a need of rethinking so as to better accord with the phenomenological facts captured by mediation theory analyses of technological use?

• Should technologies be recognized as a new category of moral agents?

• Is moral reason giving a sufficient response to the fundamental problems posed by technology? Or, is the conception of the philosopher as the preeminent producer of archive friendly texts outdated and in need of replacement by a materialist ethics of social design?

• How can mediation theory be applied to the emerging fields of ambient intelligence and persuasive technology?

• Do structures of intentionality exist that fall beyond the scope covered by mediation theory? If so, what is their significance?
tech  bloggable 
july 2016 by ayjay
The Unexotic Underclass | The MIT Entrepreneurship Review
The space that caters to my demographic – the cushy 20 and 30-something urbanites – is oversaturated. It’s not rocket science: people build what they know. Cosmopolitan, well-educated young men and women in America’s big cities are rushing into startups and building for other cosmopolitan well-educated young men and women in big cities. If you need to plan a trip, book a last-minute hotel room, get your nails done, find a date, get laid, get an expert shave, hail a cab, buy clothing, borrow clothing, customize clothing, and share the photos instantly, you have Hipmunk, HotelTonight, Manicube, OKCupid, Grindr, Harry’s, Uber, StyleSeek, Rent the Runway, eshakti/Proper Cloth and Instagram respectively to help you. These companies are good, with solid brains behind them, good teams and good funding.

But there are only so many suit customisation, makeup sampling, music streaming, social eating, discount shopping, experience curating companies that the market can bear. If you’re itching to start something new, why chase the nth iteration of a company already serving the young, privileged, liberal jetsetter? If you’re an investor, why revisit the same space as everyone else? There is life, believe me, outside of NY, Cambridge, Chicago, Atlanta, Austin, L.A. and San Fran.

It’s where the unexotic underclass lives.  It’s called America.
poverty  tech 
june 2016 by ayjay
My Wine Accumulation | Hazlitt
Is it then fair to say that, in this second decade of the new millennium, between those two poles is the screen? Each spring, my Instagram feed blooms bright pink, images of rosé and the seductive allure of life’s greatest pleasure—day drinking—melding with crostini and flushed limbs, splayed out across tables or lawns. Scrolling through it on a warm Sunday evening, you can start to piece together the collective yearning of a generation. Food and drink in particular have become these loci of desire because they are so easy to perform, universal, but infinitely capable of eliciting want. Yes, we are always hungry, but we never stop craving the feeling of craving itself, seeking out the things that spur us toward desire. Perhaps it’s how temporary food and drink are: As Edward Lee put it in a Mind of a Chef episode (titled “Impermanence,” of course), the gourmet is a simple, accessible metaphor for mortality. A generation was told that theirs will be the first to not exceed the wealth of its parents: artfully arranging food and, just before it is gone forever, taking a picture of it, is a futile and entirely understandable attempt to hold on to pleasure.

Wine is the goal of a life well lived, but also a way to temporarily forget you aren’t there yet.
food  melancholy  consumption  tech  from instapaper
june 2016 by ayjay
'Ulysses' and the Lie of Technological Progress
The relatively unknown world of electronic literature, of which Twittering Rocks is a humble example, has been hard-hit by infrastructural decay. Works produced for HyperCard, Apple’s once-popular programming tool, have long since ceased to be viewable on modern computers. That includes works by Douglas Adams, Michael Crichton, and others published under the Voyager Expanded Books shingle in the early 1990s. Likewise, works created for the early hypertext authoring system Storyspace, including Michael Joyce’s afternoon, a story and Shelley Jackson’s Patchwork Girl cannot easily be experienced as they were originally conceived. And works like Natalie Bookchin’s The Intruder, created with Shockwave, once as ubiquitous as Flash, also can’t run anymore. If you’ve never heard of or experienced any of these works, that’s sort of the point. One of the features that allowed Ulysses to become canon—hardly the only one, but one nevertheless—was its ability to be read by human eyeballs in 2016 as much as in 1986, 1956, or 1926....

It turns out that the horror of seeing our culture break and decay and disappear is also part of the delight of that culture. Today, everything everyone does is captured, stored, retrievable with a few taps on a magic, universal remote. This prospect is so terrifying that our European friends have been pursuing a “right to be forgotten,” lest Google’s index unfairly rule peoples’ futures. But at the same time, all the software and devices that do all that recording and storage also cease to function quickly—almost immediately, from the vantage point of historical time. The right to be remembered is no less at risk.

Carnivals embrace bacchanalia, the resignation to the pleasures of the flesh from those of the mind. But among those pleasures is the pleasure of destruction, of neglect, of abandonment. Bloomsday is the most contemporary of holidays, because it puts the lie to the conceit of contemporary life: that we move ever-forward through progress, amassing knowledge and innovation and adeptness. How quickly forgotten is the fundamental lesson of modernism—that entropy rules, but that we can still simulate order, even if just for a time, by reassembling shrapnel plucked from the atmosphere. Persistence is always vulgar before it is honorable. And that’s what Bloomsday is really for. It celebrates the ultimate technological advancement no matter the period: not discovery or innovation, but the warm, drunk rumble of the conversation between progress and decay.
elit  tech  reading  obsolescence 
june 2016 by ayjay
Smartphones Won’t Make Your Kids Dumb. We Think. — How We Get To Next
One approach that has been shown to help under-threes learn better is to build tools that use “nudge technologies” geared at the parents. This could be text messages or emails that remind parents to sing or talk with their baby, to help both parents and child disengage from technology and apply learnings to the real world. Children’s tablet maker LeapFrog does something similar with its LeapPad devices. Parents receive emails about what their child has learned from the touchscreen, along with ideas of how they could apply this new knowledge away from the screen.

“The extent to which parents are tied up with these devices in ways that disrupt the interactions with the child has potential for a far bigger impact,” says Heather Kirkorian, who heads up the Cognitive Development & Media Lab at the University of Wisconsin-Madison. “If I’m on the floor with a child but checking my phone every five minutes, what message does that send?” How much parents play with and talk to their kids is a very powerful predictor of how the kids will develop, she adds.
parenting  tech  from instapaper
june 2016 by ayjay
Technological unemployment, then and now
The fears about automation’s job-killing potential that boiled up in the 1950s didn’t pan out. That’s one reason why “smart economists” — no, it’s not an oxymoron — became so convinced that technological unemployment, as a broad rather than a local phenomenon, was mythical. But yesterday’s automation is not today’s automation. What if a new wave of computer-generated automation, rather than putting more money into the hands of masses of consumers, ended up concentrating that wealth, in the form of greater profits, into the hands of a rather small group of plutocrats who own and control the means of automation? And what if automation’s reach extended so far into the human skill set that the range of jobs immune to automation was no longer sufficient to absorb displaced workers? There may not be a “lump of labor,” but we may discover that there is a “lump of skills.”

Henry Ford increased the hourly wage of workers beyond what was economically necessary because he knew that the workers would use the money to buy Ford cars. He saw that he had an interest in broadening prosperity. It seems telling that it has now become popular among the Silicon Valley elite to argue that the government should step in and start paying people a universal basic income. With a universal basic income, even the unemployed would still be able to afford their smartphone data plans.
tech  automation 
june 2016 by ayjay
The tyranny of transparency
The critics of the Eastern Bloc regimes had a point. In an essay from 1984, the Czech writer Milan Kundera reflected on the secret police’s publishing of private conversations between two leaders of the Prague Spring, Jan Prochazka and Professor Vaclav Cerny:
‘For the police it was an audacious, unprecedented act. And, surprisingly, it nearly succeeded; instantly Prochazka was discredited: because in private, a person says all sorts of things, slurs friends, uses coarse language, acts silly, tells dirty jokes, repeats himself, makes a companion laugh by shocking him with outrageous talk, floats heretical ideas he’d never admit in public, and so forth. Of course, we all act like Prochazka, in private we badmouth our friends and use coarse language; that we act different in private than in public is everyone’s most conspicuous experience, it is the very ground of the life of the individual; curiously, this obvious fact remains unconscious, unacknowledged, forever obscured by lyrical dreams of the transparent glass house. It is rarely understood to be the value one must defend beyond all others. Thus only gradually did people realise (though their rage was all the greater) that the real scandal was not Prochazka’s daring talk but the rape of his life; they realised (as if by electric shock) that private and public are two essentially different worlds and that respect for that difference is the indispensable condition, the sine qua non, for a man to live free.’

But today, too many are in thrall to the ‘lyrical dreams of the transparent glass house’. ‘Transparency’ has become the sine qua non for a man to live correctly, to conform to the correct opinions, the correct views, the correct conduct. It has become a panacea for politicians battling public cynicism, and a virtue-signalling buzzword for business. Transparency is no longer a threat, as it was in the imaginings of Orwell or the life of Kundera: it is an aspiration.
surveillance  tech 
april 2016 by ayjay
How Cell Phones, Computers, Gaming and Social Media Are Changing Our Brains | Second Nature
In Chapter Six, ‘The Story of Alpha’, she describes the brain science of alpha and beta waves as these relate to addiction, learning, creativity, and socialization. She shows how the cycle of arousal and reward commonly applied to addiction relates to the use of all digital media; how the use of i-media deregulates and hijacks alpha wave activity into the narrow domain of the software, and offers seductive and repetitive rewards.  She describes how the gaming industry uses current neurological research deliberately to create addictive game designs. She compares the gains in learning from unstructured, unmediated play to the highly structured learning of games and programmed learning; the games and programmed learning come up short in all contexts not structured to the game or program. She also shows how game makers abuse the emerging neuroscience to market specious claims of improved skills and learning to a gullible and uncritical audience of gamers, parents, and educators.

In short, digital media can undermine reward and social development in other areas such as early and late childhood education, emotional intelligence, and sexuality if it comes to dominate the user’s life.
socialmedia  tech  addiction 
april 2016 by ayjay
Will we compile? | ROUGH TYPE
Computers can’t choose our goals for us, Wolfram correctly observes. “Goals are a human construct.” Determining our purposes will remain a human activity, beyond the reach of automation. But will it really matter? If we are required to formulate our goals in a language a machine can understand, is not the machine determining, or at least circumscribing, our purposes? Can you assume another’s language without also assuming its system of meaning and its system of being? The question isn’t a new one. “I must create a system, or be enslaved by another man’s,” wrote William Blake two hundred years ago. Poets and other thoughtful persons have always struggled to express themselves, to formulate and fulfill their purposes, within and against the constraints of language. Up to now, the struggle has been with a language that evolved to express human purposes—to express human being. The ontological crisis changes, and deepens, when we are required to express ourselves in a language developed to suit the workings of a computer. Suddenly, we face a new question: Is a compilable life worth living?
ethics  tech 
march 2016 by ayjay
Anthony T. Grafton | The Importance of Being Printed
Eisenstein wishes to emphasize how radical the break was between the age of scribes and that of printers. To do so she minimizes the extent to which any text could circulate in stable form before mechanical means of reproduction became available. She suggests that almost no reader in any age of manuscripts could have access to a large number of texts. She both argues and implies that the scribal book trade was a casual and ill-organized affair; she clearly holds that no single scribe could produce any large number of books. She relies heavily on De la Mare's pioneering demonstration that Vespasiano da Bisticci, the most famous Florentine manuscript dealer, operated on a far smaller scale than traditional accounts suggest. And she tends to downplay evidence that lay literacy was increasing rapidly even before printing was invented.

I cannot feel that Eisenstein has done justice to the available evidence. She talks a great deal about Vespasiano's backwardness, but not at all about that well-organized and productive scribe Diebold Lauber, who was innovative enough to issue written broadsides listing and advertising his wares. She says very little about the effects of the new educational institutions that popped up like mushrooms in many parts of Europe during the period 1350 to 1500, which must have had a sizeable impact on the level of literacy among members of the lay elite: for example, the ten German universities, all with law faculties, that were founded between 1365 and 1472. And though she criticizes Kristeller for suggesting that a work preserved in three copies "attained a certain diffusion" (211), she says nothing at all about the well-known studies by Soudek and Schucan, both inspired by Kristeller.
print  history  tech 
march 2016 by ayjay
Why Digital Maps Are Inaccurate in China | Travel + Leisure
It is, in fact, illegal for foreign individuals or organizations to make maps in China without official permission. As stated in the “Surveying and Mapping Law of the People’s Republic of China,” for example, mapping—even casually documenting “the shapes, sizes, space positions, attributes, etc. of man-made surface installations”—is considered a protected activity for reasons of national defense and “progress of the society.” Those who do receive permission must introduce a geographic offset into their products, a kind of preordained cartographic drift. An entire world of spatial glitches is thus deliberately introduced into the resulting map. The central problem is that most digital maps today rely upon a set of coordinates known as the World Geodetic System 1984, or WGS-84; the U.S. National Geospatial-Intelligence Agency describes it as “the reference frame upon which all geospatial-intelligence is based.” However, as software engineer Dan Dascalescu writes in a Stack Exchange post, digital mapping products in China instead use something called “the GCJ-02 datum.” As he points out, an apparently random algorithmic offset “causes WGS-84 coordinates, such as those coming from a regular GPS chip, to be plotted incorrectly on GCJ-02 maps.” GCJ-02 data are also somewhat oddly known as “Mars Coordinates,” as if describing the geography of another planet. Translations back and forth between these coordinate systems—to bring China back to Earth, so to speak—are easy enough to find online, but they are also rather intimidating to non-specialists.
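The "translations back and forth" the excerpt mentions work because the GCJ-02 offset, while opaque, is deterministic: given the forward WGS-84 → GCJ-02 transform, the reverse direction can be recovered by fixed-point iteration. A minimal sketch of that trick follows; the function names are my own, and the toy `wgs_to_gcj` offset here merely stands in for the real sine-series algorithm (the actual published constants are not reproduced):

```python
import math

def wgs_to_gcj(lat, lng):
    # Toy stand-in for the GCJ-02 forward transform: a smooth,
    # position-dependent drift of a few hundred meters, the same
    # general shape as the real offset (NOT the actual algorithm).
    dlat = 0.003 * math.sin(lng * math.pi / 30.0)
    dlng = 0.005 * math.cos(lat * math.pi / 30.0)
    return lat + dlat, lng + dlng

def gcj_to_wgs(lat, lng, iterations=10):
    # Invert the forward transform by fixed-point iteration: because
    # the offset varies slowly with position, re-estimating it from
    # the current guess converges very quickly.
    wgs_lat, wgs_lng = lat, lng
    for _ in range(iterations):
        glat, glng = wgs_to_gcj(wgs_lat, wgs_lng)
        wgs_lat -= glat - lat
        wgs_lng -= glng - lng
    return wgs_lat, wgs_lng

# Round-trip a point near Beijing: the recovered WGS-84 coordinates
# should match the originals to well under a meter.
lat, lng = 39.9042, 116.4074
glat, glng = wgs_to_gcj(lat, lng)
back = gcj_to_wgs(glat, glng)
print(abs(back[0] - lat) < 1e-7, abs(back[1] - lng) < 1e-7)
```

Open-source converters for the real GCJ-02 datum use exactly this iterative-inversion structure; only the forward offset function differs.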
maps  tech 
march 2016 by ayjay
Tips and Myths About Extending Smartphone Battery Life - The New York Times
We teamed up with the Wirecutter, a product recommendations website, to run an array of tests to determine best and worst practices for preserving battery life on smartphones. For those who still need extra juice, the Wirecutter also picked some external battery products.

The results showed that some conventional beliefs about extending battery life — like turning off Wi-Fi or shutting down all your phone’s apps — produced negligible or even harmful results. The Wirecutter also found plenty of helpful practices to get more use out of your battery, like playing music stored directly on the device (instead of streaming it) or tweaking email configurations.

The Wirecutter tested a range of recent Apple and Android smartphones with the latest operating systems in tightly controlled environments. Your phone’s results will vary depending on the phone model, cellular carrier, location and other factors, but the general results should hold. Here are eight tips and seven myths busted by our findings.
tech  smartphones 
february 2016 by ayjay
The Unbearable Lightness of Web Pages
So here’s how this will play out. After I reimplement my other web sites using this system, I’ll be making book editions of those sites available for free to anyone who cares to request a copy. This is a win for both sides. The reader gets a permanent, well-designed copy of something they have enjoyed. And I get the satisfaction of knowing those books are out there, being held in various places by people who (at least somewhat) care about them. The point is not to make money selling books, but to sow the writing as far as it can go—even knowing it has very narrow appeal and no market value. It’s a way to write purely for the joy of it, and still know that the work will easily outlive me. My writing will not depend for its continuance on someone maintaining backups and complicated hosting arrangements.

So much for my own stuff. But my larger, longer-shot hope is that others will start thinking about doing this too: building and making use of tools that integrate print publishing with web publishing. This goes back to the concern I talked about earlier. I don’t want the writing on small, independent sites to disappear. Not only do I want it not to disappear, there’s a good deal of it that I would like to have on my bookshelf. Some of those authors and those sites are gone already. More are leaving all the time.
tech  print  internet 
february 2016 by ayjay