Theorizing Global Infrastructure – global-infrastructures – Medium
This reading list examines (i) the design, construction, and maintenance of global infrastructure, (ii) the relationship between planning, global infrastructure, and politics across local, national, and trans-national scales, and (iii) the integration of these networks into spatially-proximate, if not globally-aligned, regional economies. The seminar will also further (iv) new, comparative methodologies required for globally-oriented research concerned with juxtaposing infrastructural phenomena and patterns such as the free zone that, while originating in the global South, are today materializing in the global North.
The first half of this reading list theorizes global infrastructure from its origins in the 19th century through to its 21st century evolution, focusing on the politics that underlie global infrastructure. The second half of the reading list examines global infrastructure as it impacts cities and regions directly. Empirical case studies are presented of i) the World Bank Group and McKinsey & Company’s respective efforts to finance and build global infrastructure in the global South and North, and ii) the importance of global infrastructure to Manchester, England’s re-emergence in the globalized economy as the heart of the United Kingdom’s ‘Northern Powerhouse’. Finally, the reading list concludes with a speculative look beyond the global to the inter-planetary infrastructure involved in colonizing Mars.
infrastructure  logistics  standards 
18 hours ago
On the dark history of intelligence as domination | Aeon Essays
As I was growing up in England in the latter half of the 20th century, the concept of intelligence loomed large. It was aspired to, debated and – most important of all – measured. At the age of 11, tens of thousands of us all around the country were ushered into desk-lined halls to take an IQ test known as the 11-Plus. The results of those few short hours would determine who would go to grammar school, to be prepared for university and the professions; who was destined for technical school and thence skilled work; and who would head to secondary modern school, to be drilled in the basics then sent out to a life of low-status manual labour.

The idea that intelligence could be quantified, like blood pressure or shoe size, was barely a century old when I took the test that would decide my place in the world. But the notion that intelligence could determine one’s station in life was already much older. It runs like a red thread through Western thought, from the philosophy of Plato to the policies of UK prime minister Theresa May....

As well as determining what a person can do, their intelligence – or putative lack of it – has been used to decide what others can do to them. Throughout Western history, those deemed less intelligent have, as a consequence of that judgment, been colonised, enslaved, sterilised and murdered (and indeed eaten, if we include non-human animals in our reckoning)....

It’s an old, indeed an ancient, story. But the problem has taken an interesting 21st-century twist with the rise of Artificial Intelligence (AI). In recent years, the progress being made in AI research has picked up significantly, and many experts believe that these breakthroughs will soon lead to more. Pundits are by turn terrified and excited, sprinkling their Twitter feeds with Terminator references. To understand why we care and what we fear, we must understand intelligence as a political concept – and, in particular, its long history as a rationale for domination.....

The term ‘intelligence’ itself has never been popular with English-language philosophers. Nor does it have a direct translation into German or ancient Greek, two of the other great languages in the Western philosophical tradition. But that doesn’t mean philosophers weren’t interested in it. Indeed, they were obsessed with it, or more precisely a part of it: reason or rationality. The term ‘intelligence’ managed to eclipse its more old-fashioned relative in popular and political discourse only with the rise of the relatively new-fangled discipline of psychology, which claimed intelligence for itself. ...

Plato concluded, in The Republic, that the ideal ruler is ‘the philosopher king’, as only a philosopher can work out the proper order of things. And so he launched the idea that the cleverest should rule over the rest – an intellectual meritocracy....

Aristotle was always the more practical, taxonomic kind of thinker. He took the notion of the primacy of reason and used it to establish what he believed was a natural social hierarchy. In his book The Politics, he explains: ‘[T]hat some should rule and others be ruled is a thing not only necessary, but expedient; from the hour of their birth, some are marked out for subjection, others for rule.’ What marks the ruler is their possession of ‘the rational element’. ...

So at the dawn of Western philosophy, we have intelligence identified with the European, educated, male human. It becomes an argument for his right to dominate women, the lower classes, uncivilised peoples and non-human animals. ....

Rather than challenging the hierarchy of intelligence as such, many critics have focused on attacking the systems that allow white, male elites to rise to the top. The 11-Plus exam that I took is an interesting, deeply equivocal example of one such system. It was intended to identify bright young things from all classes and creeds. But, in reality, those who passed came disproportionately from the better-resourced, white middle classes, whose members found themselves thereby reaffirmed in their position and advantages.

So when we reflect upon how the idea of intelligence has been used to justify privilege and domination throughout more than 2,000 years of history, is it any wonder that the imminent prospect of super-smart robots fills us with dread?... If we’ve absorbed the idea that the more intelligent can colonise the less intelligent as of right, then it’s natural that we’d fear enslavement by our super-smart creations. If we justify our own positions of power and prosperity by virtue of our intellect, it’s understandable that we see superior AI as an existential threat. ....

This narrative of privilege might explain why, as the New York-based scholar and technologist Kate Crawford has noted, the fear of rogue AI seems predominant among Western white men. Other groups have endured a long history of domination by self-appointed superiors, and are still fighting against real oppressors. White men, on the other hand, are used to being at the top of the pecking order. They have most to lose if new entities arrive that excel in exactly those areas that have been used to justify male superiority....

It’s interesting to speculate about how we’d view the rise of AI if we had a different view of intelligence. Plato believed that philosophers would need to be cajoled into becoming kings, since they naturally prefer contemplation to mastery over men. Other traditions, especially those from the East, see the intelligent person as one who scorns the trappings of power as mere vanity, and who removes him or herself from the trivialities and tribulations of quotidian affairs.

Imagine if such views were widespread: if we all thought that the most intelligent people were not those who claimed the right to rule, but those who went to meditate in remote places, to free themselves of worldly desires; or if the cleverest of all were those who returned to spread peace and enlightenment. Would we still fear robots smarter than ourselves?
intelligence  artificial_intelligence  epistemology  psychology 
yesterday
We’re Already Building New Cities – HOTHOUSE – Medium
In 2000, only 8,000 people lived in these Central Florida flatlands. By 2015, the population had exploded to 157,000, a city larger than Charleston, South Carolina or Kansas City, Kansas. Of those, 100,000 had joined the community since 2010.

The Villages has taken the concept of an age-restricted retirement home to an industrial scale. As a community, it is organized into 32 Neighborhood Centers, 17 Village Centers, and eight Regional Centers. It also operates 39 golf courses, its own television news channel, and a robust events calendar (51 events happening on the day I write this)....

The poorest zip code in the United States isn’t in rural West Virginia or Chicago’s South Side. It’s nestled in the northern suburbs of New York City: 10950, or the Hasidic community of Kiryas Joel.

While The Villages is one of the oldest communities in the United States, KJ is the youngest — its median resident is 13 years old. Kiryas Joel is unique in that much of its growth comes not from immigration, but from the fecundity of its residents; the median household has six children. One resident was able to name 2,000 living descendants when she passed away at 93.
Poverty is endemic in Kiryas Joel. More than 40% of the community is on food stamps, and 62% of all families in KJ live below the poverty line, many on various forms of government assistance. The community’s isolation is deep-rooted: most residents speak Yiddish at home, and 46% speak English “not well” or “not at all”....

So what do these new cities have in common, and what lessons might a modern city-builder take from their development?
1. They each focus on a very specific audience.
2. They are practical, not utopian.
3. They’re both from the right side of the aisle.
4. Local political domination was an early goal with social cohesion and low employment as weapons.
5. They moved fast and didn’t over-engineer it.
urban_planning  new_cities 
2 days ago
How blockchains could change the world | McKinsey & Company
the blockchain—an open-source distributed database using state-of-the-art cryptography—may facilitate collaboration and tracking of all kinds of transactions and interactions... believes the technology could offer genuine privacy protection and “a platform for truth and trust.”...

What if there were a second generation of the Internet that enabled the true, peer-to-peer exchange of value? We don’t have that now. If I’m going to send some money to somebody else, I have to go through an intermediary—a powerful bank, a credit-card company—or I need a government to authenticate who I am and who you are. What if we could do that peer to peer? What if there was a protocol—call it the trust protocol—that enabled us to do transactions, to do commerce, to exchange money, without a powerful third party?...

The blockchain is basically a distributed database. Think of a giant, global spreadsheet that runs on millions and millions of computers. It’s distributed. It’s open source, so anyone can change the underlying code, and they can see what’s going on. It’s truly peer to peer; it doesn’t require powerful intermediaries to authenticate or to settle transactions.

It uses state-of-the-art cryptography, so if we have a global, distributed database that can record the fact that we’ve done this transaction, what else could it record? Well, it could record any structured information, not just who paid whom but also who married whom or who owns what land or what light bought power from what power source. In the case of the Internet of Things, we’re going to need a blockchain-settlement system underneath. Banks won’t be able to settle trillions of real-time transactions between things....

An immutable, unhackable distributed database of digital assets. This is a platform for truth and it’s a platform for trust. ... permission-less systems. We can do transactions and satisfy each other’s economic needs without knowing who the other party is and independent from central authorities.

...the idea of a distributed database where trust is established through mass collaboration and clever code rather than through a powerful institution that does the authentication and the settlement.

...I send you the $20, and these miners, to make a long story short, go about authenticating that the transaction occurred.

...For me to hack that and try and send the same money to somebody else, or for me to come in and try and take your $20 worth of Bitcoins, is not practically possible because I’d have to hack that ten-minute block. That’s why it’s called blockchain, and that block is linked to the previous block, and the previous block—ergo, chain. This blockchain is running across countless numbers of computers. I would have to commit fraud in the light of the most powerful computing resource in the world, not just for that ten-minute block but for the entire history of commerce, on a distributed platform. This is not practically feasible....
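The tamper-resistance described above comes from each block embedding the hash of the previous one. A minimal sketch (my illustration, not anything from the interview; the record fields and function names are invented for clarity, and real blockchains add proof-of-work and peer consensus on top of this):

```python
import hashlib
import json

GENESIS_PREV = "0" * 64  # placeholder "previous hash" for the first block

def block_hash(record, prev_hash):
    # Hash the record together with the previous block's hash,
    # so the hash commits to the entire history behind it.
    payload = json.dumps({"record": record, "prev_hash": prev_hash},
                         sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, record):
    prev = chain[-1]["hash"] if chain else GENESIS_PREV
    chain.append({"record": record,
                  "prev_hash": prev,
                  "hash": block_hash(record, prev)})
    return chain

def is_valid(chain):
    # A chain is valid only if every block's stored hash matches its
    # contents AND points at the hash of the block before it.
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else GENESIS_PREV
        if block["prev_hash"] != expected_prev:
            return False
        if block["hash"] != block_hash(block["record"], block["prev_hash"]):
            return False
    return True

chain = []
append_block(chain, {"from": "Alice", "to": "Bob", "amount": 20})
append_block(chain, {"from": "Bob", "to": "Carol", "amount": 5})
assert is_valid(chain)

# Tampering with an early record invalidates every block after it:
chain[0]["record"]["amount"] = 2000
assert not is_valid(chain)
```

Rewriting one transaction would therefore require recomputing every subsequent block's hash faster than the rest of the network extends the chain, which is the "not practically feasible" point made above.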

You pick any industry, and this technology holds huge potential to disrupt it, creating a more prosperous world where people get to participate in the value that they create. The music industry, for example, is a disaster, at least from the point of view of the musicians. They used to have most of the value taken by the big labels. Then, along came the technology companies, which took a whole bunch of value, and the songwriters and musicians are left with crumbs at the end. What if the new music industry was a distributed app on the blockchain, where I, as a songwriter, could post my song onto the blockchain with a smart contract specifying how it is to be used?...

There are showstoppers such as the energy that’s consumed to do this, which is massive. Another showstopper is that this technology is going to be the platform for a lot of smart agents that are going to displace a lot of humans from jobs. Maybe this whole new platform is the ultimate job-killer.

The biggest problems, though, have to do with governance. Any controversy that you read about today is going to revolve around these governance issues. This new community is in its infancy. Unlike the Internet, which has a sophisticated governance ecosystem, the whole world of blockchain and digital currencies is the Wild West.

It’s a place of recklessness and chaos and calamity. This could kill it if we don’t find the leadership to come together and to create the equivalent organizations that we have for governance of the Internet. We have the Internet Engineering Task Force, which creates standards for the Net. We have the Internet Governance Forum, which creates policies for governments. We have the W3C, which creates standards for the Web. There’s the Internet Society; that’s an advocacy group. There’s the Internet Corporation for Assigned Names and Numbers (ICANN), an operational network that just delivers the domain names. There’s a structure and a process to figure out things. Right now, there’s a big debate that continues about the block size. We need a bigger block size to be able to handle all of the transactions that will be arising....

Imagine a world where foreign aid didn’t get consumed in the bureaucracy but went directly to the beneficiary under a smart contract? Rather than a $60 billion car-service aggregation, why couldn’t we have a distributed app on the blockchain that manages all these vehicles and handles everything from reputation to payments? ...

Imagine each of us having our own identity in a black box on the blockchain. When you go to do a transaction, it gives away a shred of information required to do that transaction and it collects data. You get to keep your data and monetize it if you want, or not. This could be the foundation of a whole new era whereby our basic right to privacy is protected, because identity is the foundation of freedom and it needs to be managed responsibly.
economy  blockchain  database  distribution  computing  networks  governance 
2 days ago
The Philosopher of Doomsday - The New Yorker
true artificial intelligence, if it is realized, might pose a danger that exceeds every previous threat from technology—even nuclear weapons—and that if its development is not managed carefully humanity risks engineering its own extinction. Central to this concern is the prospect of an “intelligence explosion,” a speculative event in which an A.I. gains the ability to improve itself, and in short order exceeds the intellectual potential of the human brain by many orders of magnitude....

transhumanist, joining a fractious quasi-utopian movement united by the expectation that accelerating advances in technology will result in drastic changes—social, economic, and, most strikingly, biological—which could converge at a moment of epochal transformation known as the Singularity...

Perhaps because the field of A.I. has recently made striking advances—with everyday technology seeming, more and more, to exhibit something like intelligent reasoning—the book has struck a nerve. Bostrom’s supporters compare it to “Silent Spring.” In moral philosophy, Peter Singer and Derek Parfit have received it as a work of importance, and distinguished physicists such as Stephen Hawking have echoed its warning. Within the high caste of Silicon Valley, Bostrom has acquired the status of a sage. ...

Bostrom’s sole responsibility at Oxford is to direct an organization called the Future of Humanity Institute, which he founded ten years ago, with financial support from James Martin, a futurist and tech millionaire. Bostrom runs the institute as a kind of philosophical radar station: a bunker sending out navigational pulses into the haze of possible futures. ...

The term “extropy,” coined in 1967, is generally used to describe life’s capacity to reverse the spread of entropy across space and time. Extropianism is a libertarian strain of transhumanism that seeks “to direct human evolution,” hoping to eliminate disease, suffering, even death; the means might be genetic modification, or as yet uninvented nanotechnology, or perhaps dispensing with the body entirely and uploading minds into supercomputers....

Bostrom had little interest in conventional philosophy—not least because he expected that superintelligent minds, whether biologically enhanced or digital, would make it obsolete. ...

“You must seize the biochemical processes in your body in order to vanquish, by and by, illness and senescence. In time, you will discover ways to move your mind to more durable media.” He tends to see the mind as immaculate code, the body as inefficient hardware—able to accommodate limited hacks but probably destined for replacement....

The view of the future from Bostrom’s office can be divided into three grand panoramas. In one, humanity experiences an evolutionary leap—either assisted by technology or by merging into it and becoming software—to achieve a sublime condition that Bostrom calls “posthumanity.” Death is overcome, mental experience expands beyond recognition, and our descendants colonize the universe. In another panorama, humanity becomes extinct or experiences a disaster so great that it is unable to recover. Between these extremes, Bostrom envisions scenarios that resemble the status quo—people living as they do now, forever mired in the “human era.”...

he uses arithmetical sketches to illustrate this point. Imagining one of his utopian scenarios—trillions of digital minds thriving across the cosmos—he reasons that, if there is even a one-per-cent chance of this happening, the expected value of reducing an existential threat by a billionth of a billionth of one per cent would be worth a hundred billion times the value of a billion present-day lives. Put more simply: he believes that his work could dwarf the moral importance of anything else.

Bostrom introduced the philosophical concept of "existential risk" in 2002... In recent years, new organizations have been founded almost annually to help reduce it—among them the Centre for the Study of Existential Risk, affiliated with Cambridge University, and the Future of Life Institute, which has ties to the Massachusetts Institute of Technology. All of them face a key problem: Homo sapiens, since its emergence two hundred thousand years ago, has proved to be remarkably resilient, and figuring out what might imperil its existence is not obvious. Climate change is likely to cause vast environmental and economic damage—but it does not seem impossible to survive.... Bostrom dates the first scientific analysis of existential risk to the Manhattan Project: in 1942, Robert Oppenheimer became concerned that an atomic detonation of sufficient power could cause the entire atmosphere to ignite. A subsequent study concluded that the scenario was “unreasonable,” given the limitations of the weapons then in development.... The answers must be fraught with ambiguity, because they can be derived only by predicting the effects of technologies that exist mostly as theories or, even more indirectly, by using abstract reasoning....

Many of those Earth-like planets are thought to be far, far older than ours. One that was recently discovered, called Kepler 452b, is as much as one and a half billion years older. Bostrom asks: If life had formed there on a time scale resembling our own, what would it look like? What kind of technological progress could a civilization achieve with a head start of hundreds of millions of years?...

The field of artificial intelligence was born in a fit of scientific optimism, in 1955, when a small group of researchers—three mathematicians and an I.B.M. programmer—drew up a proposal for a project at Dartmouth. “An attempt will be made to find how to make machines use language, form abstractions and concepts, solve kinds of problems now reserved for humans, and improve themselves.”...

Norbert Wiener, the father of cybernetics, argued that it would be difficult to manage powerful computers, or even to accurately predict their behavior. “Complete subservience and complete intelligence do not go together,” he said. Envisioning Sorcerer’s Apprentice scenarios, he predicted, “The future will be an ever more demanding struggle against the limitations of our intelligence, not a comfortable hammock in which we can lie down to be waited upon by our robot slaves.”...

The scientists at Dartmouth recognized that success required answers to fundamental questions: What is intelligence? What is the mind? By 1965, the field had experimented with several models of problem solving: some were based on formal logic; some used heuristic reasoning; some, called “neural networks,” were inspired by the brain. With each, the scientists’ work indicated that A.I. systems could find their own solutions to problems. One algorithm proved numerous theorems in the classic text “Principia Mathematica,” and in one instance it did so more elegantly than the authors. A program designed to play checkers learned to beat its programmer. And yet, despite the great promise in these experiments, the challenges to creating an A.I. were forbidding. Programs that performed well in the laboratory were useless in everyday situations...

The research fell into the first of several “A.I. winters.” As Bostrom notes in his book, “Among academics and their funders, ‘A.I.’ became an unwanted epithet.” Eventually, the researchers started to question the goal of building a mind altogether. Why not try instead to divide the problem into pieces? They began to limit their interests to specific cognitive functions: vision, say, or speech. ...

Unexpectedly, by dismissing its founding goals, the field of A.I. created space for outsiders to imagine more freely what the technology might look like. ... In 2005, an organization called the Singularity Institute for Artificial Intelligence began to operate out of Silicon Valley; its primary founder, a former member of the Extropian discussion group, published a stream of literature on the dangers of A.I. That same year, the futurist and inventor Ray Kurzweil wrote “The Singularity Is Near"...

In 2007, the Association for the Advancement of Artificial Intelligence—the most prominent professional organization for A.I. researchers—elected Eric Horvitz, a scientist from Microsoft, as its president. Until then, it had given virtually no attention to the ethical and social implications of the research, but Horvitz was open to the big questions....

Horvitz organized a meeting at the Asilomar Conference Grounds, in California, a place chosen for its symbolic value: biologists had gathered there in 1975 to discuss the hazards of their research in the age of modern genetics. He divided the researchers into groups. One studied short-term ramifications, like the possible use of A.I. to commit crimes; another considered long-term consequences. Mostly, there was skepticism about the intelligence-explosion idea, which assumed answers to many unresolved questions. No one fully understands what intelligence is, let alone how it might evolve in a machine. Can it grow as Good imagined, gaining I.Q. points like a rocketing stock price? If so, what would its upper limit be? And would its increase be merely a function of optimized software design, without the difficult process of acquiring knowledge through experience? Can software fundamentally rewrite itself without risking crippling breakdowns?...

In people, intelligence is inseparable from consciousness, emotional and social awareness, the complex interaction of mind and body. An A.I. need not have any such attributes. Bostrom believes that machine intelligences—no matter how flexible in their tactics—will likely be rigidly fixated on their ultimate goals. How, then, to create a machine that respects the nuances of social cues? That adheres to ethical norms, even at the expense of its goals? ...

Bostrom worries that solving the “control problem”—insuring that a superintelligent machine does what … [more]
artificial_intelligence  transhumanism  posthumanism  machine_vision  deep_learning  neural_nets 
2 days ago
Ask the Stone to Say - Triple Canopy
Time is discerned by the shadow of an object that stands somewhere between the earth and the sun. What are you looking at when you check your phone to see the time and date, except for the artificial order imposed on reality? This order, composed of innumerable interlocking standards, works really well; but it also benefits certain systems of communication and exchange while marginalizing or suppressing many kinds of interaction and experience. Before the creation of time zones, one town would be a few minutes ahead of a neighboring town; a sense of place was tied to a sense of time. What’s lost with the synchronization of nearly every place in the world is the ability to experience any one place apart from any other....

Britain delayed its acceptance of the Gregorian calendar until 1752, and so correspondence with the rest of Europe required dates listed in the Old Style and the New Style. Because of this tardiness, when the calendar was finally implemented the British had to delete eleven days. The public was dismayed. In a print made at the time by painter William Hogarth, you can see a placard marked with the slogan “Give Us Our Eleven Days”...

More than one hundred years later, as the standardization movement took off and “public time” subordinated the timekeepers and church bells of municipalities—never mind the rising and setting of the sun, all clocks were coordinated with London’s Greenwich Observatory—protests erupted in the United States, with its suspicion of federalism, not to mention internationalism. As railroad tracks were extended and connected, and time zones drawn accordingly, Boston’s Evening Transcript blared, “Let us keep our own noon.” ...

“Listening to a bell conjures up a space that is by nature slow, prone to conserve what lies within it, and redolent of a world in which walking was the chief mode of locomotion,” Corbin writes. “Such a sound is attuned to the quiet tread of a peasant.” With the French Revolution, the use of bells for religious purposes was banned and many were transformed into cannonballs and coins; the qualitative time of localities was supplanted by the quantitative time of the secular state. Thanks to the pealing of parish bells, villagers knew not just where but who they were. After the bells were silenced, regional identities eroded and the rootless, alienated urban proletariat came into being....

Isn’t it strange that the days of the week have names but the hours of the day are just referred to by numbers? I think we should name our hours. Or the Catholic Church does, via the Roman Breviary, which defined the time of day by the prayer to be uttered: Matins, Lauds, Prime, Terce, Sext, None, Vespers, Compline. ...

The Greeks had two understandings of time. One, chronos, a term we still have today, considers time quantitatively, as sequential. The other, kairos, considers time qualitatively, as opportune moments, as indeterminate. You may feel that something is happening outside of chronological time...

I want to turn iPhones into astrolabes that track the movement of the sun. I want to temper atomic clocks with decans...
temporality  time  technics 
2 days ago
Ask the Stone to Say - Triple Canopy
The Egyptian model divided the sun’s route into thirty-six sections, which were marked by stars—also symbols—called decans; the duration between decans varied. Moments in time were defined by whichever celestial event was happening. The title bestowed on Egyptian priests who attended to the zodiac literally translates as “who is in units of time”; it’s typically translated as “astronomer” but might better be understood as “calendarist” or “timekeeper.” To keep time was to watch the sky...

I want to believe that we can each have our own time—or that we can purposefully fall out of sync with the time of clocks and calendars. ....
temporality  technics  time 
2 days ago
Homestead's 'cybrary' will be part library, part entertainment, part tech lab | Miami Herald
Landmark Entertainment Group — the company responsible for the Spider-Man and Jurassic Park rides at Universal Orlando and Caesar’s Palace in Las Vegas — has partnered with the city of Homestead to create the world’s first “Cybrary,” or cyber library.

“We are redefining what the library is,” said George Gretsas, Homestead’s city manager. “When you think about bettering this thing called a library, which has been around since before 300 B.C., do you turn to the library scientists — the librarians — to create a fresh and new thing, or do you turn to people who have expertise in the areas of entertainment and attraction?”

Homestead did the latter.

The Cybrary was designed to break every stereotype — no shushing, no boredom. It will have old-fashioned books but also much more. Think e-books, librarians in unique costumes and a verbose robot welcoming you to the building.

“It’s like, why can’t Mary Poppins be your Cybrarian? What if children weren’t hushed but rather encouraged and inspired to really want to read, to learn, to explore new places to really engage?” said Tony Christopher, Landmark’s founder, CEO and president. “We are brainstorming ways to gamify the library experience and make kids — and adults — actually want to take a trip to the library.”
libraries  stupid  gamification  games  innovation 
3 days ago
History in a Time of Crisis
...what if anything can historians offer? What are historians good for? I’ll focus here on three particular knacks: disrupting inevitabilities, digging out lost alternatives, and widening the horizons of empathy....

Even as historians can dethrone legitimating myths, they can set themselves to the imaginative work of historical re-creation. Authoritarians manufacture convenient pasts that justify their power, but they also build, toward this end, rigidly forward-facing timelines that do away with history altogether, issuing new calendrical systems, Year Zeros, and days that "changed everything."...

In the space opened by unraveled inevitabilities, historians have a key role to play in identifying alternative paths. We can and should be, among other things, the archaeologists of roads not taken....

This kind of exploration can be hazardous. History can easily become a quarry from which only select minerals are extracted, leaving large, treacherous holes. And there is, along with the condescension, the enormous narcissism of posterity, a tendency to fabricate ancestors that make our own existence a matter of happy destiny. Even as we struggle against inescapability, we must not limit our search to only those ancestors whose descendants we care to be. (When it comes to sought-after forebears, freedom fighters, resourceful survivors, colorful rogues, and free thinkers — "ahead of their time" — are among the usual suspects.)
historiography  activism  history 
3 days ago
The once and future library | MIT News
And who gets to choose what we preserve? How do we ensure equity and inclusion and a multi-perspective cultural history? A pitfall to avoid in collections is sidelining certain contributions, or arguing that books not in use should be stored off campus. People who have been marginalized in certain disciplines may continue to be overlooked if their work is off site. We want to avoid just housing the greatest hits in each discipline. We want to include other perspectives that enrich the view of the subject. It’s a self-fulfilling prophecy that if it’s off site, it will get less use.
libraries  preservation  storage  off_site_storage  collections 
4 days ago
The Web Stalker | Net Art Anthology
The Web Stalker was an artist-made browser that challenged the emerging conventions of the new medium of the web. Released at a time when Netscape Navigator and Microsoft Internet Explorer competed for dominance, it critiqued these commercial browsers for encouraging passive, restrictive modes of browsing.

The radical interface of The Web Stalker reimagined web browsing as an engagement with the structure of the web itself. It ignored images and formatting, instead allowing users to move freely among online texts while highlighting the connections among them.

The Web Stalker offered a provocation to artists working with the net, suggesting that to fight back against its emerging corporate monoculture, they must look beyond HTML, and consider other aspects of its infrastructure.

...I/O/D applied its critique of software culture to the web browser in issue four. At that time, the so-called Browser War between Netscape and Microsoft was in full swing. The two companies' browsers competed with one another for market dominance by rendering pages slightly differently.

In I/O/D's analysis, both browsers shared a similar ideology, positioning the user as a passive consumer. For example, they shared an emphasis on the display of individual pages, a metaphor inherited from print publishing that downplayed the interactivity and interconnectivity of the web. The metaphor also made the web friendly for advertising strategies such as the banner and the splash page.

Both browsers employed metaphors of travel, which Fuller argued were “designed to suggest to the user that they are not in fact sitting in front of a computer calling up files, but hurtling round an earth embedded into a gigantic trademark ‘N’ or ‘e’ with the power of some voracious cosmological force.” The Web Stalker challenged these assumptions by offering a new kind of interface for browsing....

The Map function will parse an HTML document and diagram all of its links using lines and circles. Sites with more incoming links are shown as brighter circles. Thus, this function emphasized the links among pages, rather than the coherence of any individual page.
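The Map function’s logic — extract every anchor, count incoming links per URL, scale brightness to the in-link count — can be sketched in a few lines. This is a hypothetical illustration, not The Web Stalker’s actual implementation:

```python
from html.parser import HTMLParser
from collections import Counter

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def link_brightness(documents):
    """Map each linked URL to a 0.0-1.0 'brightness'.

    `documents` maps a page URL to its raw HTML. URLs with more
    incoming links get brighter values, as in The Web Stalker's Map,
    which drew sites with more in-links as brighter circles.
    """
    in_links = Counter()
    for html in documents.values():
        parser = LinkExtractor()
        parser.feed(html)
        in_links.update(parser.links)
    peak = max(in_links.values(), default=1)
    return {url: count / peak for url, count in in_links.items()}
```

The emphasis falls entirely on the link structure: the pages’ text, images, and formatting never enter the computation at all.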
browsing  software  net_art  links  infrastructure 
6 days ago
Crawl, Map, Link, Read, Copy, Repeat | Rhizome
The storage capacity of a floppy disk weighed in at a massive 1.4 megabytes in the 1990s. Can you imagine what to do with that much power? In 1994, trying to answer that question, Simon Pope, Colin Green, and I started to create an "interactive multimedia" publication that would fit onto a high-density floppy. We called ourselves, and our publication, I/O/D, which stood for a few things that we would make up on the fly without being fixed to any of them. We gave copies away for free, by post and at events...

By the third issue of I/O/D we had introduced a supplement to the "Finder" element of the Macintosh operating system that would dredge random samples from any text files on the machine and put them in the speech bubbles of moveable figures drawn by comic artist Paquito Bolino. Partially by accident, it turned a story by the writer Ronald Sukenick into something like a virus that would rename a few files on any computer it was loaded onto with fragments from his text. ...

Artists were responding to its development, often rightly working with incoherence to test the too-ready assumptions that defined the internet as a medium of communication early on. However, we felt that there was still an implied acceptance of the aesthetic norms of the browser; for instance, that the browser was based primarily on the design for paper, emphasizing the single page as a coherent unit, rather than the connections amongst files.

The web was based on a structure of links. The patterns of connection of those links revealed the "native" power structure of the web. Today, people speak of certain websites operating as "gravity wells," where links to the outside are absent. Most such sites have links generated by scripts that refer to aggregates of content from databases, making them distinct from the hand-coded HTML documents of the 1990s, but the incipient tendency for large-scale sites to become hermetic and self-referential was already there as the flow of users and attention was beginning to become a valuable commodity in itself. ...

This tendency to silo the web page and contain the flow of users also affected the design of browsers. At that time, the battle among a small number of companies to determine the standards of the web and reap the rewards, imaginatively known as the “Browser Wars,” was in full swing. Fault lines appeared between websites that were optimized for the unique features of Netscape and those best viewed in Internet Explorer, with the companies developing each program always edging towards breaking the web by introducing novel features to undermine their competitor and get greater market share.

Our approach was to try to develop another order of interaction, one that was not content with what it might be presented with, that would try and look behind the assembling of smooth surfaces and into the plumbing. We were also interested in doing so by reconnecting to the imperatives of Constructivism, moving across art, design, and everyday life by making an object for direct use. The aim of the fourth issue of I/O/D, The Web Stalker, was to create a way of interfacing with the web that foregrounded some of the qualities of the network sublimated by other software. We wanted to develop an approach that would privilege fast access to information, and the ability to look ahead of the structures that were presented to users as well as to map the idiomatic structures of sites. We wanted to embed critical operations in software, but by forcing critical ideas to become productive rather than simply being aloof and knowing....

To these ends, I/O/D 4: The Web Stalker was a new kind of web browser that decomposed websites into separate sets of entities. The texts of the site were treated as the primary resource, but were stripped of most of their formatting. Links from one file to another were mapped in a network diagram, which allowed users to visualize their path through the clusters, skeins, and aporias of files. This Map built dynamically as a Crawler function gradually moved through the network. We saw the logical structure of websites, established by the links in and between them, as another key resource, and we wanted the software to act in a modular manner, with users calling up functions, each with their own separate window, only when they needed them.
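The Crawler/Map loop described above amounts to a breadth-first traversal that grows the link graph one page at a time. A hypothetical reconstruction (not I/O/D’s code), in which `fetch_links` stands in for downloading a page and parsing its anchors:

```python
from collections import deque

def crawl_map(start_url, fetch_links, limit=50):
    """Breadth-first crawl that builds a link graph incrementally,
    the way The Web Stalker's Crawler gradually fed its Map.

    `fetch_links(url)` is assumed to return the URLs linked from `url`.
    Returns {url: set_of_outgoing_links} for every page visited,
    stopping after `limit` pages.
    """
    graph = {}
    queue = deque([start_url])
    while queue and len(graph) < limit:
        url = queue.popleft()
        if url in graph:
            continue  # already visited
        graph[url] = set(fetch_links(url))
        for target in graph[url]:
            if target not in graph:
                queue.append(target)
    return graph
```

Treating each function as a separate, user-invoked window, as the text describes, would then be a matter of interface design layered over a structure like this one.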
textual_form  epistemology  browsing  interfaces  infrastructure  links  software  net_art 
6 days ago
Did Media Literacy Backfire?
Understanding what sources to trust is a basic tenet of media literacy education. When educators encourage students to focus on sourcing quality information, they encourage them to critically ask who is publishing the content. Is the venue a respected outlet? What biases might the author have? The underlying assumption in all of this is that there’s universal agreement that major news outlets like the New York Times, scientific journal publications, and experts with advanced degrees are all highly trustworthy.

Think about how this might play out in communities where the “liberal media” is viewed with disdain as an untrustworthy source of information…or in those where science is seen as contradicting the knowledge of religious people…or where degrees are viewed as a weapon of the elite to justify oppression of working people. Needless to say, not everyone agrees on what makes a trusted source.

Students are also encouraged to reflect on economic and political incentives that might bias reporting. Follow the money, they are told. Now watch what happens when they are given a list of names of major power players in the East Coast news media whose names are all clearly Jewish. Welcome to an opening for anti-Semitic ideology....

We’ve been telling young people that they are the smartest snowflakes in the world. From the self-esteem movement in the 1980s to the normative logic of contemporary parenting, young people are told that they are lovable and capable and that they should trust their gut to make wise decisions. This sets them up for another great American ideal: personal responsibility.

...every individual is supposed to understand finance so well that they can effectively manage their own retirement funds. And every individual is expected to understand their health risks well enough to make their own decisions about insurance. To take away the power of individuals to control their own destiny is viewed as anti-American by so much of this country. You are your own master.

Children are indoctrinated into this cultural logic early, even as their parents restrict their mobility and limit their access to social situations. But when it comes to information, they are taught that they are the sole proprietors of knowledge. All they have to do is “do the research” for themselves and they will know better than anyone what is real.

Combine this with a deep distrust of media sources. If the media is reporting on something, and you don’t trust the media, then it is your responsibility to question their authority, to doubt the information you are being given. ...

For decades, civil rights leaders have been arguing for the importance of respecting experience over expertise, highlighting the need to hear the voices of people of color who are so often ignored by experts. This message has taken hold more broadly, particularly among lower and middle class whites who feel as though they are ignored by the establishment. Whites also want their experiences to be recognized, and they too have been pushing for the need to understand and respect the experiences of “the common man.” They see “liberal” “urban” “coastal” news outlets as antithetical to their interests because they quote from experts, use cleaned-up pundits to debate issues, and turn everyday people (e.g., “red sweater guy”) into spectacles for mass enjoyment....

Why trust experts when you have at your fingertips a crowd of knowledgeable people who may have had the same experience as you and can help you out?...

Since the election, everyone has been obsessed with fake news, as experts blame “stupid” people for not understanding what is “real.” The solutionism around this has been condescending at best. More experts are needed to label fake content. More media literacy is needed to teach people how not to be duped. And if we just push Facebook to curb the spread of fake news, all will be solved....

People believe in information that confirms their priors. In fact, if you present them with data that contradicts their beliefs, they will double down on their beliefs rather than integrate the new knowledge into their understanding. This is why first impressions matter. It’s also why asking Facebook to show content that contradicts people’s views will not only increase their hatred of Facebook but increase polarization among the network. And it’s precisely why so many liberals spread “fake news” stories in ways that reinforce their belief that Trump supporters are stupid and backwards....

Addressing so-called fake news is going to require a lot more than labeling. It’s going to require a cultural change about how we make sense of information, whom we trust, and how we understand our own role in grappling with information. Quick and easy solutions may make the controversy go away, but they won’t address the underlying problems....

As a huge proponent for media literacy for over a decade, I’m struggling with the ways in which I missed the mark. The reality is that my assumptions and beliefs do not align with most Americans. Because of my privilege as a scholar, I get to see how expert knowledge and information is produced and have a deep respect for the strengths and limitations of scientific inquiry. Surrounded by journalists and people working to distribute information, I get to see how incentives shape information production and dissemination and the fault lines of that process. I believe that information intermediaries are important, that honed expertise matters, and that no one can ever be fully informed. As a result, I have long believed that we have to outsource certain matters and to trust others to do right by us as individuals and society as a whole. This is what it means to live in a democracy, but, more importantly, it’s what it means to live in a society....

In the United States, we’re moving towards tribalism, and we’re undoing the social fabric of our country through polarization, distrust, and self-segregation. And whether we like it or not, our culture of doubt and critique, experience over expertise, and personal responsibility is pushing us further down this path.

Media literacy asks people to raise questions and be wary of information that they’re receiving. People are. Unfortunately, that’s exactly why we’re talking past one another....

The path forward is hazy. We need to enable people to hear different perspectives and make sense of a very complicated — and in many ways, overwhelming — information landscape. We cannot fall back on standard educational approaches because the societal context has shifted. We also cannot simply assume that information intermediaries can fix the problem for us, whether they be traditional news media or social media. We need to get creative and build the social infrastructure necessary for people to meaningfully and substantively engage across existing structural lines.
media_literacy  credibility  attribution  citation  fake_news  epistemology  pedagogy  civic_engagement  social_infrastructure 
6 days ago
You Will Be Assessed and Found Mediocre - The Chronicle of Higher Education
But most assessment in higher education is not about safety or pleasure. Its power-hungry spirit is more like: "What have you, lowly instructors, failed to do perfectly? Here are the complicated, badly designed surveys you must conduct. Next, record your failures in our precise but impenetrable format, with charts and graphs, so that your mediocrity is fully documented." You will also be compared with your peers. You will all be found below average.

Maybe this all started with grading, introduced at Yale in 1785. Maybe it goes back much further, to Socrates’s day, when teachers were thought to corrupt the youth of Athens. Or Roman times, when there was a term for teacher hatred: odium magistrorum....

But assessment, as ordered by higher-education boards, has a different spirit. It’s "accountability." That means blame.

Typically, a governing board, a cabal of legislators, or a group of senior administrators decides that they need more information about "what’s going on in the classroom." They devise "instruments" to measure whatever can be numbered. Grades can be aggregated, factored, normed. Time spent on classroom activities can be monitored. Student evaluations can be parsed and excerpted to measure "student satisfaction" and punish nonconformists.
academic  assessment  grading 
7 days ago
CyberCity allows government hackers to train for attacks - The Washington Post
CyberCity has all the makings of a regular town. There’s a bank, a hospital and a power plant. A train station operates near a water tower. The coffee shop offers free WiFi.

But only certain people can get in: government hackers preparing for battles in cyberspace.

The town is a virtual place that exists only on computer networks run by a New Jersey-based security firm working under contract with the U.S. Air Force. Computers simulate communications and operations, including e-mail, heating systems, a railroad and an online social networking site, dubbed FaceSpace.

Think of it as something like the mock desert towns that were constructed at military facilities to help American soldiers train for the war in Iraq. But here, the soldier-hackers from the Air Force and other branches of the military will practice attacking and defending the computers and networks that run the theoretical town. In one scenario, they will attempt to take control of a speeding train containing weapons of mass destruction.

To those who participate in the practice missions, the digital activity will look and feel real. The “city” will have more than 15,000 “people” who have e-mail accounts, work passwords and bank deposits. The power plant has employees. The hospital has patients. The coffee shop customers will come and go, using the insecure WiFi system, just as in real life.

To reinforce the real-world consequences of cyberattacks, CyberCity will have a tabletop scale model of the town, including an electric train, a water tower and a miniature traffic light that will show when they have been attacked....

CyberCity is one of hundreds of virtual environments — often known as cyber ranges or test beds — launched in recent years by military, corporate and academic researchers to confront the mind-bending security challenges posed by cyberspace, where millions of attacks or intrusions occur every day.

Some small ranges study the effects of malicious software and viruses. Some hope to emulate the Internet itself and become scientific instruments of sorts, akin to mountaintop telescopes or particle accelerators, that will enable researchers to seek out the elusive fundamentals of cyberspace. The most ambitious of these, the National Cyber Range, was developed by the Defense Advanced Research Projects Agency. It has cost about $130 million since 2008. The agency said seven large-scale experiments have been conducted by Pentagon researchers.
media_city  simulation  games  security  hacking 
7 days ago
Faultlines, black holes and glaciers: mapping uncharted territories | Science | The Guardian
A naval architect turned explorer, Siggi navigates by scanning aerial photos and uploading them into a plotter, the ship’s electronic navigation system. Sometimes he uses satellite images, sometimes shots taken by Danish geologists from an open-cockpit plane in the 1930s, on one of the only comprehensive surveys of the coast. Siggi sails by comparing what he sees on the shore to these rough outlines. “Of course, then you don’t have any soundings,” he says, referring to charts of ocean depths that sailors normally rely on to navigate and avoid running aground. “I’ve had some close calls.” Over the years, he has got better at reading the landscape to look for clues. He looks for river mouths, for example, where silt deposits might create shallow places to anchor, so that icebergs will go to ground before they crush the boat. In the age of GPS and Google Maps, it’s rare to meet someone who still entrusts his life to such analogue navigation.

Even when Siggi is retracing his own steps, the landscape of the Forbidden Coast is constantly changing. “Where the glaciers have disappeared,” he explains, pointing at washes of green on a creased, hand-drawn chart, “a peninsula turns out to be an island. It was actually sea where you thought there was land.” To account for this, he often trades notes with local hunters, who are similarly adept at reading the coast. “Their language is very descriptive,” Siggi explains. “So all the names of places mean something.” Although locations may have official Danish names, they are often ignored....

Until a century ago, Greenlandic hunters would cut maps out of driftwood. “The wooden part would be the fjord, so it would be a mirror image,” Siggi says. “Holes would be islands. Compared to a paper map, it was actually quite accurate.” These driftwood sculptures were first recorded by a Danish expedition in the 1880s, along with bas-relief versions of fjords, carefully grooved and bevelled to represent headland depths....

As a source of information, a map is always a way of groping through the darkness of the unknown. But locating yourself in space has never been cartography’s sole function: like these driftwood pieces, maps inevitably chart how cultures perceive not only their landscapes but their lives.

“Everything we do is some kind of spatial interaction with objects or ourselves,” says John Hessler, a specialist in geographic information systems at the Library of Congress in Washington DC. “A map is a way to reduce this huge complexity of our everyday world.” For the last few decades, Hessler has been conducting research in the library’s map collection – the largest in the world – in stacks the length of football fields. “Geographic information systems have revolutionised everything,” he says.

Explorers have long filled in our understanding of the world, using and then discarding the sextant, the compass, MapQuest. “The project of mapping the Earth properly is to some extent complete,” Hessler says. But while there are no longer dragons fleshing out far-flung places, a surprising number of spaces are still uncharted – and the locations we have discovered to explore have only expanded. “Where we were just trying to accurately map terrestrial space,” Hessler says, we have moved into a “metaphor for how we live. We’re mapping things that don’t have a physical existence, like internet data and the neural connections in our heads.”

From mapping the dark between stars to the patterns of disease outbreaks, who is making maps today, and what those maps are used for, says a lot about the modern world. “Now anything can be mapped,” says Hessler. “It’s the wild west. We are in the great age of cartography, and we’re still just finding out what its powers are.”...

She hopes that mapping where neutrinos come from will lead to the discovery of new black holes, and possibly explain what physical processes take place inside them. Because the majority of neutrinos were created around 14bn years ago, shortly after the birth of the universe, this might help answer a fairly fundamental question: what are the conditions that create energy?

“The only way to study something you can’t go to or touch is to look at it in many different ways,” Kurahashi Neilson says. “The funny thing is, if you map the universe in optical light – what humans see – or gamma rays, or radio rays, our universe doesn’t look the same. That’s the beauty of this. You create a map of the same thing in different light, and when you compare them, you understand the universe better.”...

Whether on the Forbidden Coast or tracking neutrinos at the South Pole, this curiosity – to compare, to see something no one has seen before – is a fairly basic human compulsion. That’s why Robert Becker – a radio astronomer who has recently retired from the University of California, Davis – got into physics. When he started studying astronomy, the only map of the entire sky was a simple contour map, like the ones used for hiking. In the 1990s, Becker decided to conduct a Very Large Array radio survey – using radio waves to map the sky in much greater detail – finding scores of new phenomena.

In most other areas of science, a question leads to an experiment that tests a hypothesis. In astronomy, you cannot conduct experiments. “We can’t build new stars,” Becker explains. “So we do survey maps.” The goal is to create a catalogue of the sky, which is essentially a record of all the ongoing experiments in space. ...

If you could somehow drain the seas, scientists predict you would see not sea monsters but a few volcanoes sprouting from an immense, flat floor: hundreds of thousands of hills covered by millennia of falling sediment. Because of these cloaking deposits, developing a better map of the ocean could shed light on the distant past. “It’s one of the most complete records of history on Earth,” says Alan Mix, an oceanographer at Oregon State University. “All of history accumulates in layers on the ocean floor.” The problem is that this wealth of information lies submerged just out of reach. Because satellites cannot read through water, mapping the sea has been much more difficult than mapping land.

“The joke,” Mix says, “is that we know more about the back side of the moon than the bottom of the ocean.” In the meantime, we work with best guesses. On Google Earth, for example, the sea floor appears to be mapped, displaying mountain ranges and submerged islands, but these shapes are actually based on inferred data. “It’s an interpreted map,” Mix explains. Because a mountain on the bottom of the ocean has a lot of mass, its gravity pulls on the water around it, causing a dip in the surface that a satellite can observe. “But it’s like looking through a bad pair of glasses,” Mix says. “To really know what’s going on below the surface, scientists must still send out an expedition.”... Since then, Ballard’s idea of deploying remote-controlled robots closer to the bottom of the sea has become standard practice. But the ocean is huge and submersibles can only travel so far. Even today, only about 17% of the ocean has been mapped with sonar, meaning that a ship or submersible has physically driven back and forth over the ocean floor in a grid, like mowing a lawn....

But if the sea floor has certain morphological characteristics, the country’s territory can be extended beyond that 200 nautical-mile limit, into an area called the extended continental shelf. As the rush to claim the Arctic begins – Russia has symbolically staked its claim to recently discovered oil reserves by planting a titanium flag in the bottom of the Arctic Ocean – maps such as this will be a crucial part of the manoeuvring....

More than a century and a half later in Haiti, MSF doctors could not even do that. Though everyone being treated in Haitian clinics was asked where they were from, the information proved confounding, since none of the informal neighbourhoods and slums in Haiti were adequately mapped. Doctors lacked the ability to connect the place names with geographical coordinates. “It was effectively being recorded in random syllables,” Gayton says. Though staff tried to record cases in a spreadsheet, without locations, doctors could not tell if cases were adjacent to one another or on opposite sides of the city, making it difficult to trace or stop the sources of infection. “We couldn’t do our job,” says Pete Masters, the Missing Maps project coordinator at MSF. “We didn’t have the evidence to take the best action.”

At the peak of the outbreak, Gayton was wandering through the hallway of a clinic and spotted a colleague, Maya Allan, crouched on a windowsill with a laptop. “She was trying to place pins [of cholera cases] on Google Earth by hand,” Gayton says. Frustrated, he thought there had to be a better way. So he called Google, which was “like calling the Batcave”.

A few days later, Google software engineer Pablo Mayrgundter flew to Port-au-Prince, bringing with him Google Earth programs and map data downloaded on to hard drives so he could work in the field without the internet. He trained Haitians how to use GPS units, then sent them into neighbourhoods to get latitude and longitude coordinates for Haitian place names. Google’s engineers were aided by a group called the Humanitarian OpenStreetMap (HOT) team – “Earthquake nerds, looking at the TV, looking at the street map of Port-au-Prince, and realising there’s nothing there,” Masters says. After the earthquake, the group coordinated with members of the Haitian diaspora to map Haiti’s slums and identify local landmarks for the first time. Within 72 hours of the earthquake, search-and-rescue teams were using their maps. Together, Google and HOT worked to geolocate all of the information they had gathered and to write a script to import the case records. … [more]
cartography  mapping  gaps  epistemology 
10 days ago
The Field Guide to Fences – Next City
In their urban habitat, fences create borders both physical and abstract, defining boundaries between public and private, and occasionally creating strange in-between spaces. Sometimes fences make good neighbors but often, there is collateral damage. They go beyond delineating space to creating barriers. These barriers — geographic and psychogeographic — affect how we navigate through the city, understand our neighbors, and determine (and perhaps undermine) our sense of security.

Bushwick’s ubiquitous fences are in part a vestige of the powerful anti-urban forces that shook the city in the 1970s. The blackout of 1977, followed by fires set by landlords to collect insurance, and a callous policy of planned divestment, destroyed much of the fabric of the neighborhood, with repercussions that continue to this day. Walls of steel-wire mesh and wrought iron rose as markers of anxiety and fear. Cyclone fences cut jagged borders between public and private spaces; between protected spaces and the urban unknown.

Over the past 10 years, the crime rate has dropped in Bushwick and throughout the city. Yet fences continue to multiply. New typologies join the ranks.

These urban border walls are symptoms of our larger political climate, with its ignominious distrust of the other and push toward privatization.

So, how can we rethink the ways in which we inhabit our streets, engage with our neighbors and support safety through positive reinforcement? Can we begin to welcome each other into these interstitial areas, and break down cultural divisions through inhabiting the zone where streets and building meet? Can we create socially and ecologically productive street landscapes from chain link and wrought iron? The NYC Parks Department recently began the Parks Without Borders Initiative, a program aimed at “making parks more open, welcoming and beautiful by improving entrances, edges and park-adjacent space” through a community design process....

who benefits from these very physical elements of social control? What are we fencing off, what are we containing? Is Trump keeping us out, or are we keeping him in? This working list of fence transformations, while modest, proposes a way to loosen, in the words of author Mike Davis, the ecology of fear, at the individual building scale. We need the city, now more than ever, to be a place that embodies hospitable architecture, that is a generous and unpredictable space, one that rewards curiosity and openness.
security  fences  borders  typology  field_guide  risk 
10 days ago
Hidden in Plain Sight: The Steganographic Image | u n t h i n k i n g . p h o t o g r a p h y
a steganographic inscription is neither a depth nor the plain surface but somewhere in between. In contemporary images made of data it refers to how the image can be coded as more than is seen, but also more than the image should do. The steganographic digital image can be executed; it includes instructions for the computer to perform. Photographs as part of a longer history of communication media are one particular way of saying more than meets the eye, but this image also connects to histories of secret communication from the early modern period, to more recent discussions in security culture, as well as fiction such as William Gibson’s novel Pattern Recognition (2003). Were J.G. Ballard’s 1950s billboard mysteries one sort of cryptographic puzzle that hid a message in plain visual sight?
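The notion of an image that “can be executed,” carrying data beyond what is seen, can be made concrete with a toy least-significant-bit (LSB) sketch. This is a minimal illustration under stated assumptions, not a real steganography tool: the `pixels` buffer stands in for raw image bytes, and actual schemes are far subtler than flipping low-order bits.

```python
def embed(pixels, payload):
    """Hide `payload` bytes in the least-significant bits of `pixels`.

    A toy LSB scheme: each payload bit replaces the lowest bit of one
    pixel byte, leaving the carrier visually indistinguishable.
    """
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("payload too large for cover image")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # clear the low bit, then set it
    return out


def extract(pixels, n_bytes):
    """Recover `n_bytes` hidden bytes from the pixel LSBs."""
    data = bytearray()
    for j in range(n_bytes):
        byte = 0
        for i in range(8):
            byte |= (pixels[j * 8 + i] & 1) << i
        data.append(byte)
    return bytes(data)
```

Hiding a two-byte message in a 200-byte cover alters at most sixteen low-order bits; executable steganography, as discussed above, hides instructions rather than text, but the carrier principle is the same.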
images  photographs  code  virus  hacking  operative_image 
10 days ago
10 Must-Listen Architecture and Design Podcasts for the Holiday Break - Curbed
Is there a form of media that complements city living better than podcasting? Whether it keeps commuters company during a morning train ride or provides a soundtrack for a stroll through city streets, the ubiquity and portability of podcasting can make a favorite show seem like a constant companion. In a post-Serial world, when Marc Maron gets the opportunity to interview the President in his garage, there are more shows than ever. We looked back over recent episodes and broadcasts and picked out some of our favorite architecture and design podcasts of the year, ideal listening during long trips, airport delays or simply free time over the upcoming holiday break.
media_architecture  audio  podcasts  radio 
11 days ago
First 3 Chapters of Theory and Craft of Digital Preservation for Comment | Trevor Owens
Interdisciplinary dialog about digital preservation often breaks down when an individual begins to protest “but that’s not preservation.” Preservation means a lot of different things in different contexts. Each of those contexts has a history. Those histories are tied up in the changing nature of the mediums and objects for which each conception of preservation and conservation was developed. All too often, discussions of digital preservation start by contrasting digital media to analog media. This contrast forces a series of false dichotomies. Understanding a bit about the divergent lineages of preservation helps to establish the range of competing notions at play in defining what is and isn’t preservation.

Building on work in media archeology, this chapter establishes that digital media and digital information should not be understood as a rupture with an analog past. Instead, digital media should be understood as part of a continual process of remediation embedded in the development of a range of new mediums which afford distinct communication and preservation potential. Understanding these contexts and meanings of preservation establishes a vocabulary to articulate what aspects of an object must persist into the future for a given preservation intent.

To this end, this chapter provides an overview of many of these lineages. This includes: the culture of scribes and the manuscript tradition; the bureaucracy and the development of archival theory for arranging archives and publishing records; the differences between taxidermy and insect collecting in natural history collections and living collections like butterfly gardens and zoos; the development of historic preservation of the built environment; the advent of recorded sound technology and the development of oral history; and the development of photography, microfilming and preservation reformatting. Each episode and tradition offers a mental model to consider deploying for different contexts in digital preservation.
preservation  conservation  materiality  material_texts  architecture  objects  digital_preservation 
11 days ago
Your Private Browsing History Alone Can Give Away Your Identity - The Atlantic
Companies that compile user profiles generally do so pseudonymously: They may know a lot of demographic details about you, but they don’t usually connect your behavior to your individual identity. But a group of researchers at Stanford and Princeton developed a system that can connect your profile to your name and identity, just by examining your browsing history.

When the team tested the technique on 400 real people who submitted their browsing history, they were able to correctly pick out the volunteers’ Twitter profiles nearly three-quarters of the time.

Here’s how the de-anonymization system works: The researchers figured that a person is more likely to click a link that was shared on social media by a friend—or a friend of a friend—than any other random link on the internet. (Their model controls for the baseline popularity of each website.) With that in mind, and the details of an anonymous person’s browser history in hand, the researchers can compute the probability that any one Twitter user created that browsing history. People’s basic tendency to follow links they come across on Twitter unmasks them—and it usually takes less than a minute....
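The scoring idea described above can be sketched as a likelihood-ratio computation. This is a toy illustration of the intuition, not the Stanford–Princeton team’s actual model: the `boost` factor, the candidate feed sets, and the baseline popularities are all assumed inputs invented for the example.

```python
import math

def deanonymize(history, candidate_feeds, baseline_popularity, boost=10.0):
    """Score each candidate user against an anonymous browsing history.

    Assumes a link is `boost` times more likely to be visited when it
    appeared in a candidate's social feed; each candidate's score is the
    log-likelihood ratio of their feed explaining the history versus the
    baseline popularity of each link.
    """
    scores = {}
    for user, feed in candidate_feeds.items():
        score = 0.0
        for link in history:
            p_base = baseline_popularity.get(link, 1e-6)
            p_user = min(1.0, p_base * boost) if link in feed else p_base
            score += math.log(p_user / p_base)
        scores[user] = score
    # The highest-scoring candidate is the best guess for the history's owner.
    return max(scores, key=scores.get), scores
```

A history dominated by links that appear in only one candidate’s feed quickly pulls that candidate’s score ahead of all others, mirroring the researchers’ observation that distinctive feed links unmask users in under a minute.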

That means that maintaining privacy while using Twitter is impossible without opting out of the social network’s trademark feature: its public, free-for-all nature. The alternative—keeping your online comings and goings from being cataloged—is a long shot.

Browser features like Safari’s private browsing or Chrome’s incognito mode—with its sneaky-looking fedora-and-glasses branding—aren’t real defenses against de-anonymization. Once “incognito” or “private” windows are closed, they delete the trail of history left on the browser itself, but they don’t prevent trackers, internet service providers, or certainly spy agencies from eavesdropping on traffic.

Using Tor, on the other hand—a program that anonymizes internet browsing by bouncing traffic randomly across a network of servers—would probably deter all but the most dogged spies. “We speculate that this attack can only be carried out against Tor users by well-resourced organizations on high-value targets,” Shukla wrote. “Think cyber-espionage, government intelligence, and the like.”
privacy  anonymity 
12 days ago
How New York City Gets Its Electricity - The New York Times
Your household power may have been generated by Niagara Falls, or by a natural-gas-fired plant on a barge floating off the Brooklyn shore. But the kilowatt-hour produced down the block probably costs more than the one produced at the Canadian border.

Moreover, a surprising portion of the system is idle except for the hottest days of the year, when already bottlenecked transmission lines into the New York City area reach their physical limit.

“We have a system which is energy-inefficient because it was never designed to be efficient,” said Richard L. Kauffman, the state’s so-called energy czar, who is leading its plans to reimagine the power grid.

It’s like a mainframe computer in the age of cloud computing, Mr. Kauffman added, and with climate change, the state has to “rethink that basic architecture.”...

A standard part of the electric arsenal is generators called “peakers,” which are needed to keep the grid reliable but might run only a few days a year. New York City has about 16 such plants, mostly around the waterfront, which spring into action on the hottest days of the year or if transmission lines or power plants upstate malfunction. Some sit on barges, and all are designed to switch on quickly. The trade-off for the rapid response is usually higher costs and carbon emissions.

As a result, customers pay for plants and wires that “a lot of the time are hardly used,” said Mr. Kauffman, the energy czar.

The entire system was designed to meet demand extremes and handle the worst-case situation....

What, exactly, am I paying for each month?

A complete understanding of your Con Ed bill practically requires a Ph.D., but there are three main parts:

SUPPLY About a third to a half (depending on use) reflects how much your provider paid for the electricity on wholesale markets administered by Nyiso. Like that of all commodities, the price fluctuates with demand. Electricity tends to be cheaper at night and more expensive in the summer. Other factors affect prices, such as weather conditions, fuel costs, the cost to operate a plant and where it is.

TRANSMISSION AND DELIVERY You are also paying for maintenance and upgrades to the wires and substations.

TAXES AND FEES About 30 percent of your bill is made up of taxes and fees, according to Con Ed, including property taxes, sales tax, a special tax for utilities and a fee that finances the state’s clean energy programs and innovations.
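The three-part breakdown above can be turned into a rough worked example. The rates below are hypothetical placeholders, not Con Ed’s actual tariffs; the only figure taken from the article is the roughly 30 percent tax-and-fee share of the bill.

```python
def estimate_bill(kwh, supply_rate, delivery_rate, tax_share=0.30):
    """Sketch of the three-part bill structure: supply, delivery, taxes/fees.

    `supply_rate` and `delivery_rate` are illustrative $/kWh figures;
    taxes and fees are modeled as a flat `tax_share` of the final bill,
    per the ~30 percent figure quoted by Con Ed.
    """
    supply = kwh * supply_rate
    delivery = kwh * delivery_rate
    subtotal = supply + delivery
    # Gross up the subtotal so taxes end up as tax_share of the final total.
    taxes = subtotal * tax_share / (1 - tax_share)
    return {"supply": supply, "delivery": delivery,
            "taxes_and_fees": taxes, "total": subtotal + taxes}
```

At 300 kWh and a hypothetical $0.10/kWh for each of supply and delivery, the $60 subtotal grosses up to about $85.71, of which exactly 30 percent is taxes and fees.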

How much utilities can charge for supply and delivery is determined by the Public Service Commission, a board appointed by the governor to regulate utilities, which takes into account positions held by consumer, environmental and industry groups, government agencies and the utilities.
energy  infrastructure 
13 days ago
Data Selfie _ About
Data Selfie explores our relationship to the online data we leave behind as a result of media consumption and social networks. In the modern age, almost everyone has an online representation of oneself and we are constantly and actively sharing information publicly on various social media platforms. At the same time we are under constant surveillance by social media companies and “share” information unconsciously. How do our data profiles, the ones we actively create, compare to the profiles made by the machines at Facebook, Google and Co. – the profiles we never get to see, but unconsciously create?

Why does Facebook need your phone number or your phone contacts or the data from your WhatsApp account? Why does Facebook track the time you spend looking at the posts in your news feed? Is the sole purpose of this data gathering to serve us more relevant ads? Is there something else afoot?

Data Selfie is an application that aims to provide a personal perspective on data mining, predictive analytics and our online data identity – including inferred information from our consumption. In our data society, algorithms and Big Data are increasingly defining our lives. Therefore, it is important – especially for those who are indifferent to this issue – to be aware of the power and influence your own data has on you.
big_data  surveillance  privacy  algorithms  self_tracking  quantified_self 
13 days ago
How to weather the Trump administration: Head to the library - LA Times
If, as he claims, our new president really wants to invest in infrastructure, then America will need to build more than just roads and bridges. If Donald Trump is as smart as he insists he is, then he can prove it by strengthening our intellectual infrastructure.

...librarians may be the only first responders holding the line between America and a raging national pandemic of absolutism. More desperately than ever, we need our libraries now, and all three of their traditional pillars: 1) education, 2) good reading and 3) the convivial refuge of a place apart. In other words, libraries may be the last coal we have left to blow on....

Libraries aren’t perfect, but they’re evolving. Things always get delicate when they redefine themselves as “more than just books” because to some of us “just books” will always sound like “just oxygen.” But, in addition to their sacred role as an ark for endangered book culture, libraries already offer most of the services that society can’t or won’t otherwise provide. They’re career counselors, homeless shelters and Internet cafes, stopgap solutions to way too many of society’s problems.

If government doesn’t want to confront these ills, we should at least stand ready to help the one institution that’s addressing them already — and the new president could demonstrate that willingness by moving his inauguration.
libraries  infrastructure 
14 days ago
Harvard Design Magazine: Storage Flows: Logistics as Urban Choreography
Up until now, whether at home or in the city, we have thought of storage as the collection and shelving of items in designated places where they accumulate until needed.

But in the post-Fordist city of today, where inventories are expertly synchronized across vast territorial scales in ever-decreasing timeframes, this practice of accumulation is inverted. In the logisticalization of contemporary supply chains, shelf life is planned to be as brief as possible—storage does not accumulate in one place; rather, it flows.

The concept of storage as a vector of flow has been prevalent as long as there has been trade. The transfer of goods from their point of production to consumers in a safe, profitable, and timely fashion across regions and continents has shaped urban centers in both ancient and modern times. Technological innovation, in the form of faster distribution networks and scientific management systems (coach to steam ship to railroads to the assembly line), has also had an enormous impact on the flow and stowing of goods. However, since the 1970s a new species of exchange networks has evolved, one that accelerates flow beyond anything we have witnessed previously: logistics.

Characterized as time-space technologies that supervise and expedite production routines and global supply chains, logistics are increasingly controlled by a series of large corporate actors. In this context, storage is defined as both the stuff and the procedures of new material and data flows of digital commerce: the supply of commodities from producers to retailers to customers in online retail (Amazon); mobility infrastructure to manage the shipping and distribution of goods (DHL, FedEx, UPS); social media and entertainment services (Facebook, Netflix, Redbox); and communication software that sources, maps, and reserves objects and spaces in sharing networks (Airbnb, Uber, Zipcar). Here, storage is not understood as putting things away for safekeeping or as a depository of artifacts gathering dust; rather, it is a dynamic, temporal system...

All is aided by abstract, rational procedures performed by algorithms, scanners, programmable robots, and other info-industrial technologies from smart tags and hand-held tracking devices to radio-frequency identification (RFID) systems. These procedures have all descended from the bar code and just-in-time (JIT) management procedures that together transformed inventory management and shipping beginning in the mid-1970s.

...To fully comprehend contemporary mechanisms of flow, we need to explore the manner in which logistics shrewdly appropriates other external networks and spaces as a means to enhance its supply chain operations. For example, many logistical networks hijack familiar forms of urban infrastructure to further conquer the spatiotemporal gap between supply and demand. Piggybacking on other systems to optimize flow by collapsing supply and distribution into one seamless system has many implications for the city, changing how distribution typologies appear in the urban landscape and thus the landscape itself....

While the global shipping network FedEx invests heavily in its own facilities, its real-life support remains the infrastructure of the city. FedEx has approximately 40,000 drop boxes in building lobbies, supermarkets, airports, and street corners in the United States alone. FedEx locates its sorting facilities at airports (375 to be exact) and its fleet of 48,000 trucks is a familiar sight on highways. ...

Amazon lockers ...

we might all have come around to the realization that storage flows in the commercial realm are nothing more than the complete manifestation of capitalism. The more goods and services corporate logistical networks can supply, the more we will likely demand. Despite the convenience of current storage flows in the city, their purpose is not really to make life better, but to propel our desire to want more stuff, quicker. After all, storage flows are nothing more than money flows in disguise.
storage  flows  logistics  infrastructure 
14 days ago
Donald Trump and the Uses of the Past | New Republic
What relevance does such an archive have nowadays? They say that if you’re asked why you like history in a university interview, the only thing you should never say is because we can learn from the mistakes of the past. History is a methodology, a way of seeing things—not a cautionary tale.

But we seem to be living through a rupture. As the president pretends the traditional separation of the judiciary, executive, and legislative branches of government does not exist, the most basic lessons from history—by which I mean literal history lessons we all should have learned at primary school—seem to need re-teaching....

Our country also has subtler needs. When Donald Trump was elected, an artist named Matthew Chavez began a project in the Union Square subway underpass. He sat at a desk covered in colorful sticky notes and pens and invited travelers to write down their feelings and then stick them on the wall. Chavez discouraged his contributors from expressions of raw anger, and encouraged messages of love and solidarity. He called the project Subway Therapy.

Alan Balicki and his team had one and a half hours on January 23 to take the project down from the wall, after the Society partnered with Chavez, New York Governor Andrew Cuomo, and the Metropolitan Transportation Authority to place part of the work in its archive. Somewhat sadly, Balicki explained that he made the decision to break the original sequence of notes—he just did not have time to preserve them as they were. One of the PhDs with us observed that nobody owned the sequence or had declared that there was a wrong or right way to arrange the notes. She thought the project was beautiful as is, lumped pell-mell into gray archival boxes. The new display is called Messages for the President-Elect...

This is the sort of thing that the New-York Historical Society saves: flotsam, jetsam, things left behind. The curators follow closely in the wake of the city’s human activity, collecting the materials left behind by protests and vigils and attacks. The museum treats these items with a reverence rarely seen in any part of our culture. ...

What does the past mean for a society like this, under a government like this? For this administration, American history functions as a backdrop for racist fantasy—the greatness that will be made again—not as a foundation of government....

It’s tough to deal with nihilism, because everything multiplied by zero makes zero. But conservation is an action that expresses tenderness toward and assigns value to things that otherwise would signify nothing; a Post-it note, a grubby old letter, a drawing. Conservators treat the material world with a deeply erudite form of respect, which is their profession. It’s an honorable way to relate to the physical world around us. Honor has a politics.
archives  memory  history 
14 days ago
Listening to Bodies and Materialities - Listening Across Disciplines
Like the first network event, this meeting, too, aimed to facilitate knowledge sharing to provoke novel interactions, enabling the key and core participants, as well as a participatory audience, not only to break down barriers between disciplines, but also to set the terms for doing so between universities, research, pedagogy, industry and the public. The meeting invited exchange and debate to enable a ‘shared enquiry’ that produces shared and shareable outcomes.

The particular emphasis on materialities and bodies focused a cross disciplinary listening on the aim to understand the world as a material and social sphere, whose components and interactions can be heard as well as seen. It involved anthropology, forensics, history, art, music and neurology as well as technology and medical sciences to explore how new knowledge might be created, applied and communicated through sound.

The roundtable consisted of network members, core members, as well as a specially invited participating audience involving doctoral students, post-doctoral researchers, UAL and SoU staff as well as members of the general public.

The aim was to engage the group in the differing methods, channels, tools, and objectives of listening practices across the differing academic or professional fields, in order to discuss and query processes, technologies, tools, and aims. The presentation and discussions of such a variety of academic, artistic and professional contexts and objectives of listening provided a platform for comparison, exchange, re-evaluation and inspiration, and initiated a debate on the legitimacy of the heard as an artistic and scientific material, data and outcome, and on what knowledge it might provide.

The first meeting brought a focus on language, and emphasised the need for a shared terminology and discourse, and it foregrounded the question of consensus or ambiguity. In the second meeting we continued to pursue these questions and persisted with the effort of building a glossary of terms and a resource of key texts and materials that might serve this endeavour.

Among the other questions brought forward from the first event were:

What is sound to different professions, tasks, and disciplines?
How do different disciplines listen to and record sound?
What do different professions, academic researchers, and others hear?
How is the listened-to evaluated, communicated, and applied?
How can listening be taught and shared?
listening  hearing  methodology 
15 days ago
Infrastructures of Empire and Resistance
Calais highlights a specifically carceral politics of the present: the underside of a globalization anchored in movement, connection, and mobility, seemingly on display at Heathrow. But beyond this important and well-worn geography lesson, the coupling of controversies puts a spotlight on the infrastructures that organize these power geometries. These paired events and the crisis they together announce are not only about the chaos created by uneven global development and the resulting spatial mismatch for daily survival that pushes so many into exile. It is not only about the kinds of transnational circulations the UK will welcome — business travelers and tourists, versus the gates that greet asylum seekers, especially black and brown ones. This is a profoundly material crisis anchored in infrastructure. Together, Calais and Heathrow remind us that today’s gateways apparently require very large and complex gates — that relations of power and of force rely on socio-technical systems that are themselves increasingly the object of struggle...

Infrastructure connects a range of political conflicts which might otherwise seem disparate and discrete: crises surrounding the rights of refugees and the provision of asylum in a world of thickening borders; crises of indigenous peoples’ lands and sovereignty in the face of transnational extractive industries; crises regarding local livelihoods in an economy organized through speed and flexibility in trade across vast distances; crises of water infrastructure in Black and Indigenous communities; crises of police and carceral violence that breed profound distrust in the core institutions of the state for communities of color. At the center of these struggles are the systems engineered to order social and natural worlds. Struggles over infrastructure are hardly new, but they are perhaps more ubiquitous, as the world becomes increasingly financialized, securitized, and logistical.

Infrastructures are the collectively constructed systems that also build and sustain human life. “We” build infrastructure, and it builds “us.” Infrastructure exceeds its most obvious forms — the pipes, roadways and rail that often monopolize our imaginaries. Social infrastructures are also built, material, and lasting. Even intimacy is increasingly understood as infrastructural. When they work, infrastructures bring us food, water, power, resources, consumer goods, information, security, and connections to loved ones. But the infrastructures that distribute the necessities of life are themselves unevenly distributed, and they can inhibit as well as enable connection. The story of infrastructure is also one of disconnection, containment, and dispossession. ... Infrastructure may entrench injustice in systems that seem technical rather than political (when they are in fact technopolitical), and thus can serve to naturalize those relations. And infrastructure does not simply reflect existing inequality, but may engineer and entrench new forms....

The injustice of infrastructure is not only about lack — for instance when clean water infrastructures do not reach northern indigenous communities on the James Bay or urban Black communities on the Great Lakes, or when public transit infrastructures do not reach racialized neighborhoods which are increasingly pushed to the fringes of gentrifying cities. Sometimes there is too much infrastructure: the security and carceral infrastructures that produce the over policing of Black and Indigenous people, or the highway infrastructures that urban renewal drove through Black communities that led James Baldwin to deem them infrastructures of “negro removal.” This capacity to both contain and connect is a persistent feature of infrastructure. ...

In colonial and settler colonial contexts, infrastructure is often the means of dispossession, and the material force that implants colonial economies and socialities. Infrastructures thus highlight the issue of competing and overlapping jurisdiction — matters of both time and space....

Infrastructure is by definition future oriented; it is assembled in the service of worlds to come. Infrastructure demands a focus on what underpins and enables formations of power and the material organization of everyday life. Visions, ideas, and analyses are important, but the future must be built, and “concretized” in ways that sustain sociality. A focus on infrastructure insists that we ask how power works, in its most mundane and practical ways. And such a focus heeds the insights of feminist thought on the centrality of social reproduction and its gendered and racialized labors to the reproduction or transformation of the social order.

What might it mean to ground citizenship in the material architectures and social relations of alternative infrastructure, instead of the gate/ways of corporations and nation states? Could repairing infrastructure be a means of repairing political life more broadly? Lauren Berlant has recently argued that “the repair or replacement of broken infrastructure is… necessary for any form of sociality to extend itself,” but she is interested “in how that extension can be non-reproductive, generating a form from within brokenness beyond the exigencies of the current crisis, and alternatively to it too.”... Infrastructure enables all manner of things, and it can foster transformation as well as reproduction. In contrast to top-down infrastructure, communities, movements, networks and nations assemble creative alternatives that respond to needs and desires for a different future as they help bring them into being....

Indeed, as I write this reflection, more than 450 US churches are responding to Trump’s promised expansion of border infrastructure by declaring that they will act as a “Trump-era underground railroad” for undocumented immigrants. In doing so they embrace some of the most striking features of prior fugitive infrastructures; they are assembled to do different things, for different people, and according to different systems of value. In doing all this, they offer a different orientation to space, time and legality....

Infrastructures implicate us in collective life and death. The promise of repair — of fixing infrastructures — is precisely in recognizing the concrete reproduction of historical violence in the everyday. It lies in seeing the persistence of (settler) colonial and racial capitalist systems of sustaining and ordering the social in our present — in roads, or pipelines, or policing systems — and of seeing the operation of power not just in social interactions or economic relations, but in the particular material ordering that infrastructure brings. Most importantly, repairing infrastructure demands investment in its fugitive forms. It demands that we look not only to the violence but to the alternative worlds that are always already in the making, and that offer us glimpses at infrastructures for an inspiring future, and cues for how to begin building.
infrastructure  crisis  injustice 
17 days ago
The Term ‘Digital Divide’ Doesn’t Work Anymore
Rather than the usual binary online/offline statistics, Experian divided U.K. users into three categories. There were the Day-to-Day Doers, whose usage is “defined by practicality and less about must-have gadgets” and who account for 52 percent of the population. There were the Digital Devotees, who spend the “most time using the Internet” and make up 32.4 percent. Finally, there were the remaining Digital Dawdlers, the 16 percent who have been “left behind.” What’s interesting about the categories isn’t that they exist — any conversation about the latest meme with friends and family can attest to differing levels of online engagement — but that the concept rarely makes its way into conversations about the digital divide.
“Traditionally, the way the digital divide has been portrayed has definitely been a binary,” says Crystle Martin, a postdoctoral researcher at University of California–Irvine who specializes in studying digital literacy. “It’s been viewed, if you give people access to technology, they will be able to be online and able to access all the things available. But it actually doesn’t turn out to be true.”
What the categories in the Experian results mean is that questions regarding the digital divide have progressed and moved to a more complicated next iteration. Simple “yes or no” questions no longer suffice. The questions now must also address access (does the person have a home computer or are they smartphone-dependent?) and speed (do they have dial-up or broadband?). These factors aren’t simply ancillary; they are integral.
digital_divide  Internet  access  infrastructure 
17 days ago
The FCC is stopping 9 companies from providing federally subsidized Internet to the poor - The Washington Post
Regulators are telling nine companies they won't be allowed to participate in a federal program meant to help them provide affordable Internet access to low-income consumers — weeks after those companies had been given the green light.

The move, announced Friday by FCC Chairman Ajit Pai, reverses a decision by his Democratic predecessor, Tom Wheeler, and undercuts the companies' ability to provide low-cost Internet access to poorer Americans. In a statement, Pai called the initial decisions a form of “midnight regulation.”
digital_divide  infrastructure  access  smart_cities  digital_equity 
17 days ago
Scholars Talk Writing: Advice From an Editor - The Chronicle of Higher Education
Trade books deliver on a different level — that of enlightenment and entertainment — than scholarly works, which favor argument and endeavor to carve out a philosophical place for themselves in the scholarly universe. Every book has to have a driving argument, but in a trade book that argument has to be subsumed into and framed by narrative as well as the human drama at the core of narrative....

In the end, however, it is the writing on which everything hinges. I try to gauge as quickly as I can (all editors have their ways and means) whether an academic writer shows some (or even any) capacity for creativity, a willingness to be experimental and push limits. Even if writers balk at some forms of authorial omniscience — giving voice and even words to people beyond what is purely and demonstrably archival — they need to court it and risk it.

...teaching — done well, the toughest profession there is, in my view — encourages oratorical habits that can sometimes be killing in a trade book: repetition, overemphasis, an artificial and archly rhetorical relationship with the audience that is based on pedagogical ploys ("To be sure …," and "True, …," and "Now let us …"). They tend to come from a sense of height and detachment. Teachers tend to talk down; they lecture. They are accustomed to being heard, and to make a point stick hard they reiterate, cranking up the volume of their thought and expression.

All of this falls flat in a trade book: pomposity, over-mastery, interpretative strategies mainly intended to wow students and impress upon them how far they have to go, how far less clever and wise they are than the lecturer. Worst sin of all in writing: repetition. It immediately earns a reader’s mistrust.

...the overuse of quotation deadens. Block quotes should be limited to the Gettysburg Address or a Keats ode — something containing rich ores of meaning to be mined over and over, rather than just filled space. Quote only that which cannot be paraphrased; offer the jewel of a quote and jettison the setting. You can’t tell the story through quotation and citation. Readers want it in your voice....

I highlight plenty — no comment, merely highlight — as a means of pointing out a few things, such as, say, how often "But" is being used to lead a sentence or even a paragraph, or the sudden proliferation of "is" clauses. But if the writer is still doing them after a few hundred pages, my comments can become more clipped: "You’ve said this." ...

To my mind, however, such use of "But" should be reserved for something genuinely dramatic, an abrupt turn of events. "But it was not to be." I don’t object to its use, but to its overuse. When I pointed out what I felt were too many sentence-leading (and paragraph-leading) "buts" in his manuscript to an author, he replied that he thought "But" conferred momentum. It kept things moving. My sense is the opposite: It stops the reader. Use it too much and the reader feels jerked around. You wanted a pet peeve. There’s one.
editing  writing 
17 days ago
Advice for graduate students on presentation skills (essay)
the goal is to pique interest so they want to have a conversation with you after the presentation and then read your paper later. When you choose the content to share, start with what they know now and what it is possible to explain to them in the time allotted.
A tip I always gave to my students was to cover less information in more depth, rather than trying to cover too much without enough depth. Speakers who try to cover too much information ultimately end up speeding through part of the presentation to get through the content they prepared, and as a result, they lose audience interest.
advising  presentations  UMS 
17 days ago
MisinfoCon: Trust, Verification, Fact Checking & Beyond
MisinfoCon is a community of people focused on the challenge of misinformation and what can be done to address it. The gathering seeks to strengthen the trustworthiness of information across the entire news ecosystem: journalism, platform, community, verification, fact checking and reader experience.

Bringing together participants from different backgrounds to lead discussions and develop and test product prototypes, our goal is to connect leaders and develop actionable steps on how the various sectors can work together.

We will convene ambassadors from technology platforms, news organizations, groups in the fact checking and verification space, as well as experts in social sciences, media literacy, policy, advocacy, cyber security and more. 

This is a challenge that requires a big tent solution. Software developers, designers, librarians, academics, and actual, honest-to-goodness “real people” who are impacted by misinformation are all welcome.
media_literacy  data_literacy  propaganda  fake_news 
18 days ago
The Architecture of the Overlap – BLDGBLOG
One of my favorite museums, Sir John Soane’s Museum in London, has teamed up with ScanLAB Projects for a new, 3D introduction to the Soane’s collections.
archives  museums  john_soane  scanning 
20 days ago
Square-Mile Street Network Visualization - Geoff Boeing
The heart of Allan Jacobs’ classic book on street-level urban form and design, Great Streets, features dozens of hand-drawn figure-ground diagrams in the style of Nolli maps. Each depicts one square mile of a city’s street network. Drawing these cities at the same scale provides a revealing spatial objectivity in visually comparing their street networks and urban forms.

We can recreate these visualizations automatically with Python and the OSMnx package, which I developed as part of my dissertation. With OSMnx we can download a street network from OpenStreetMap for anywhere in the world in just one line of code. Here are the square-mile diagrams of Portland, San Francisco, Irvine, and Rome created and plotted automatically by OSMnx:
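The one-line workflow Boeing describes can be sketched as follows. This is a hedged sketch, not Boeing’s own script: the OSMnx package is real, but its keyword arguments have changed across releases, and the Portland coordinates below are illustrative. The download is wrapped in a function so nothing touches the network on import.

```python
def square_mile_plot(center_point, filename):
    """Plot a Great Streets-style square-mile figure-ground diagram.

    center_point: (lat, lon) tuple for the center of the square mile.
    filename: path for the saved image.
    """
    # Imported inside the function so this sketch stays importable
    # even where osmnx is not installed.
    import osmnx as ox

    # One square mile = a bounding box extending ~805 m (half a mile)
    # in each direction from the center point.
    G = ox.graph_from_point(center_point, dist=805, dist_type="bbox",
                            network_type="drive")

    # Draw edges only, black on white, in the style of a Nolli-map
    # figure-ground diagram.
    return ox.plot_graph(G, node_size=0, edge_color="black", bgcolor="white",
                         save=True, filepath=filename, show=False)

# Example (requires network access to OpenStreetMap):
# square_mile_plot((45.517, -122.673), "portland.png")  # downtown Portland
```

Because every diagram is drawn from the same 805-meter bounding box, the resulting images share a scale, which is what makes the visual comparison across cities meaningful.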
mapping  cartography  scale  urban_form  streets 
20 days ago
Spatial Agency: About
Spatial Agency is a project that presents a new way of looking at how buildings and space can be produced. Moving away from architecture's traditional focus on the look and making of buildings, Spatial Agency proposes a much more expansive field of opportunities in which architects and non-architects can operate. It suggests other ways of doing architecture.

In the spirit of Cedric Price the project started with the belief that a building is not necessarily the best solution to a spatial problem. The project attempts to uncover a second history of architecture, one that moves sharply away from the figure of the architect as individual hero, and replaces it with a much more collaborative approach in which agents act with, and on behalf of, others.
architecture  design  expanded_field 
21 days ago
Close Calls — Real Life
With rose-tinted glasses on, part of me appreciates the memory of slow communication, the letters, the anticipation. Knowing what I do about surveillance possibilities, and the role of corporations in that surveillance, slower forms of communication seem even more attractive: they allow me to own the information that is sent to me, and to know who has access to the information I send. But for people I’ve spoken to, as long as the instant communication factor remains, nobody really cares who else can see those messages, or where that data lives. In the future, what will these new forms of digital literacies result in? More power to the corporations, zero ability to know who has access to my personal updates? Given the huge benefits that these new technologies afford, especially to immigrant and diaspora communities for whom communication is literally life-changing, it seems unlikely that we’d ever give them up for issues of control and agency that are far harder to perceive or understand than what we gain back.

For people who are apart from their loved ones, social media and messaging apps provide us with a place to be together in a way that wasn’t possible in the past. They provide an antidote to the physical separation of immigration, a way of making those sacrifices a little less so than they otherwise might be. At a time with more global migration than ever before, these technologies are becoming ever more important to maintaining family ties. For these communities, communication technologies and social media are not isolating, they’re uniting.
immigration  visibility  globalization  migration  geography  social_media  identity 
21 days ago
Ideo Says The Future Of Design Is Circular | Co.Exist | ideas + impact
When Schiphol Airport in Amsterdam replaced its lighting, it didn't pay for the bulbs. Instead, the airport pays for light as a service—and Philips, which designed the system, is responsible for recycling or reusing anything that breaks.

It's an example of the growth of circular design. Designers are traditionally part of the linear economy—creating products from raw materials that would eventually end up in a landfill. But they're beginning to consider the entire system and design products with materials that can be used in closed loops.
recycling  sustainability  design  as_a_service 
21 days ago
One Dataset, Visualized 25 Ways | FlowingData
“Let the data speak.” It’s a common saying for chart design. The premise — strip out the bits that don’t help patterns in your data emerge — is fine, but people often misinterpret the mantra to mean that they should make a stripped down chart and let the data take it from there.

You have to guide the conversation though. You must help the data focus and get to the point. Otherwise, it just ends up rambling about what it had for breakfast this morning and how the coffee wasn’t hot enough.

To show you what I mean, I present you with twenty-five charts below, all based on the same dataset: life expectancy by country, from the World Health Organization, spanning 2000 to 2015. Each chart provides a different focus and interpretation.
data_visualization 
21 days ago
Studying the materiality of media archives in the age of digitization: Forensics, infrastructures and ecologies | Lischer-Katz | First Monday
Since the early 2000s, a growing number of scholars in information studies, media and communication studies and related fields have begun to radically reconceptualize the materiality of digital media and infrastructures. Information infrastructures — the fiber optic cables, network switches, and servers — all exist somewhere on earth, frustrating the modernist urge to separate information from its material support, to dislocate it from place, time and context. Instead, information is shown to still always require some form of material support, only now it is increasingly moved to off-site storage in data warehouses. Ignoring these large-scale infrastructures is now seen as increasingly risky: not only does the rhetoric of de-materialization risk concealing the political economies that shape and sustain information infrastructures, including world intellectual property regimes (Vaidhyanathan, 2006) and digital rights management technologies (Gillespie, 2007), and the shaping of scholarly knowledge production (Manoff, 2013; 2006), but it also conceals the ecological toll imposed by computing technology, its carbon footprint and toxic materials, which pose a threat to both human civilization and its vast archives of recorded knowledge (Cubitt, et al., 2011; Davis, 2015). Storing data requires considerable energy resources, both for the electricity that runs the servers and their peripherals and for cooling and dehumidifying the ambient air around them. This extends the needs of earlier analog archival collections of film and videotape, which are not necessarily replaced by digital storage but will still have to be maintained alongside it. Starosielski (2014) suggests an ongoing linkage between storing archival analog media collections and the ecological impact of the necessary heating and cooling systems...

For media archives that are digitizing existing tape-based analog originals, the movement from analog to digital will require an enormous amount of data storage capacity that will need to be refreshed and migrated several times each decade. The amount of energy and mineral resources required to operate data centers and produce new storage media for the long-term preservation of media content is quite high. It has become widely accepted in the video preservation community that digitization is the only way to ensure long-term preservation of analog video content. It is estimated that globally, there are roughly 400 million analog video and audiotapes that are at risk of soon becoming unplayable due to decay or the unavailability of playback equipment. Linda Tadic (2016) has estimated that storing the data from all those digitized tapes would require 14.6 exabytes [2] of storage space, and with the need to store multiple copies to ensure long-term preservation, that number doubles or triples. Tadic (2016) also points out that those 400 million tapes will need to go somewhere once they have been digitized, either to landfills or to recycling plants, as the equipment used to play back tapes becomes increasingly scarce and tapes become virtually unplayable artifacts. The ecological impact of digitization and the long-term storage of digitized media collections will likely be significant....
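A quick back-of-the-envelope check (my arithmetic, not a figure from the article) shows what Tadic’s 14.6-exabyte estimate implies per tape, and how the total grows with redundant preservation copies:

```python
# Sanity-checking the scale of Tadic's (2016) estimate.
tapes = 400_000_000          # at-risk analog video/audio tapes worldwide
total_bytes = 14.6e18        # 14.6 exabytes for a single copy of everything

# Implied average data volume per digitized tape, in gigabytes.
per_tape_gb = total_bytes / tapes / 1e9
print(f"{per_tape_gb:.1f} GB per tape")

# Preservation practice typically keeps multiple copies; with three,
# the storage requirement triples, as the article notes.
copies = 3
print(f"{total_bytes * copies / 1e18:.1f} EB with {copies} copies")
```

The result, tens of gigabytes per tape, is plausible for an hour-scale tape digitized at preservation quality, which is why the aggregate figure lands in the exabyte range.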

How does our understanding of media archives change when we start to look to the infrastructural systems, the granular materialities of codes and their magnetic inscriptions, and their entanglement with these global ecologies, networks of global distribution and ways of knowing the world? Answering this question involves opening up the “black box” [4] of the archive in order to fully understand the multiple scales, temporalities and logics that shape media archives, and are typically hidden from analysis within the archive’s opacity. Without opening the “black box” of the archive and tracing its infrastructural entanglements with complex micro and macro systems and processes, we risk losing our critical awareness of the epistemological, institutional and/or ecological factors that shape the ontology and epistemology of archival media.... the archive itself is only one institution among many embedded in networks of standards, specifications, protocols, and other entities that construct the conditions for the encoding, storage, transmission and display of digital media....

Thus, to return to the case of the Jeremy Blake collection, we can see a tension between the aesthetic expression and regimes of forensic knowing that make these collections palatable for institutional ingestion. Given the authoritative, law-enforcing roots of digital forensic techniques, further unraveling of these ethical tensions seems quite necessary to understanding the long-term impact on archival ethics and professional conduct. We might wonder if these legal techniques might someday help to unravel the mysterious double-suicide of Blake and his wife, which consequently led to the posthumous deposit of these materials at the Fales Library. While the tools of digital forensics are clearly useful for preserving authenticity and integrity in digital records, there are still many unexamined ethical questions around the epistemological assumptions and power relations that sustain them, which deserve further critical inquiry....

A critical approach to the materiality of media archives must also investigate the preservation infrastructures and standards that support archival practice.

Infrastructures increasingly shape the patterns of media distribution and how media appear, as well as shaping the practices of archivists working with media and the construction of legitimized institutional knowledge in preservation institutions. By their nature, social institutions work to stabilize and reproduce particular practices and forms of knowledge. In a sense, institutions are social infrastructures in themselves. Technical infrastructures are intertwined with the social infrastructures of institutions, oftentimes mediated by standards, protocols, documents and artifacts that bind social and technical aspects of infrastructure....

Digital archivists are likely cognizant of the infrastructures they depend on for access to digital collections, but their awareness needs to be extended beyond the pragmatics of preservation to include the larger political economic and ecological phenomena within which these infrastructures are enmeshed. Standards are increasingly understood to be particularly important tools in sustaining infrastructures, and they circulate around the globe and ensure that infrastructures can function by promoting uniformity across different institutions....

Digitization standards are particularly important, as more and more institutions are adopting digitization as a strategy for access and long-term preservation. Conway (2010) suggests that “in the age of Google, nondigital content does not exist, and digital content with no impact is unlikely to survive” [22]. Digitization is providing access to the archives of the future, and standards are effectively shaping how collections will appear to future generations.

Standards also play an important role in the circulation of digitized media. They help establish common formats for distribution and access, and for long-term archiving (which could be seen as a sort of temporal distribution to a future, indeterminate time, when it is hoped that the content can be decoded and displayed). Standards produce control and uniformity across “cultures, time and geography” (Timmermans and Epstein, 2010); they can be used to exclude and marginalize individuals and organizations who choose not to adopt the standard, or who do not fit the standard. ...

The long-term goal of the Library of Congress is to entirely digitize these collections and make them available for the “life of the republic plus 4,000 years,” according to Senior Systems Administrator at the Library of Congress NAVCC, James Snyder (2010). Digitizing the millions of media objects in the collection will likely generate exabytes (millions of terabytes) of data by 2017 (Snyder, 2011). Because of the nature of preserving digital information, constant copying and checking of this huge collection is required to keep it accessible over time. As the Library of Congress Web site explains, “the digital archive is based on the concept of continual migration and verification. Migration to progressively higher density storage — meaning progressively greater storage capacity — will continue indefinitely into the future” (U.S. Library of Congress, 2007). With such ambitious goals of preservation set, the Library is attempting to establish a media archive for the end of time, one that can survive format obsolescence, digital decay and nuclear war, in order to repopulate the earth with America’s media patrimony (Lischer-Katz, 2013)....

The construction of this archival facility within a decommissioned Cold War bunker, coupled with the implementation of large scale, industrial-grade tape and spinning disc storage dramatizes the relationship between military funding and information research. It reminds us that the Internet, too, had its origins as a mechanism for nuclear survival. Materialism makes us reconsider the meaning of sites, architectural forms, and specific infrastructural formations, in the context of large-scale social and political forces. We might consider the tension between the Library’s ecologically-friendly forest reclamation project on top of the construction site, and the fact that the facility draws large quantities of electrical resources to run servers, data tape warehouses, and the HVAC systems to keep its servers and analog materials cool and dry. This tension is also played out in the Library’s motion picture film lab, where they have spent years trying to filter their wastewater to reach pollution levels acceptable to the local community water standards (based on discussions with staff at the facility carried out by the author in 2011). Thinking about … [more]
digital_archives  digitization  preservation  tape  media_archaeology  waste  forensics  infrastructure  standards  labor  materiality  material_texts 
22 days ago
Analysis: Synthesis - e-flux Architecture - e-flux
Armed with biotechnology techniques—notably, faster and cheaper methods for DNA sequencing and synthesis—this new breed of life scientists treats biological media as a substrate for manufacture, raw material that can be manipulated using engineering principles borrowed from their various home disciplines. Sequencing and synthesis allow synthetic biologists to traffic between physical molecules of nucleic acid (DNA and RNA) and dematerialized genetic sequences scrolling across computer screens. Sequencing means “reading” the strings of four nucleotide bases whose sequence constitutes DNA and RNA to compose a digital genetic “code” made up entirely of letters that stand in for the molecule (A for the nucleotide adenine, C for cytosine, G for guanine, T for thymine). Synthesis does the reverse: using elaborate genomic techniques, researchers can physically build material nucleic acid macromolecules to order on the basis of desired genetic codes.
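The letter-for-molecule encoding described above is easy to illustrate in code. The snippet below is my illustration, not from the essay: it treats a DNA sequence as a string over {A, C, G, T} and computes the reverse-complement strand, the string-level analogue of the base pairing that sequencing “reads” and synthesis “writes.”

```python
# Watson-Crick base pairing: A pairs with T, C pairs with G.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def reverse_complement(seq: str) -> str:
    """Return the complementary strand of a DNA sequence, read 5' to 3'.

    The complementary strand runs antiparallel, so we reverse the input
    and substitute each base with its pair.
    """
    return "".join(COMPLEMENT[base] for base in reversed(seq))

print(reverse_complement("GATTACA"))  # TGTAATC
```

This trivial reversibility at the level of symbols is exactly what lets synthetic biologists “traffic” between digital sequence files and physical macromolecules: the hard part is the chemistry of synthesis, not the encoding.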

Two synthetic biologists define their field as follows:
Synthetic biologists seek to assemble components that are not natural (therefore synthetic) to generate chemical systems that support Darwinian evolution (therefore biological). By carrying out the assembly in a synthetic way, these scientists hope to understand non-synthetic biology, that is, “natural” biology.1
In their equation of making with understanding, of synthesis with analysis, making life is not an end in itself but rather a technique for probing life’s margins. Making new life-forms also requires that researchers query seemingly commonsense terms like “natural” and “unnatural,” “biological” and “synthetic.”

...potential commercial applications: clean energy, bioweapons, and cheap drug synthesis. Such work often bears practical resemblances to biotechnology, synthetic chemistry, chemical engineering, and pharmaceuticals research. ...

The difference between synthetic biologists’ impulse and earlier examples of biological experimentation is that they do not make living things in the service of discovery science or experimental research alone. Rather, making is also an end in itself. Newly built biotic things serve as answers to biological questions that might otherwise have remained unasked. They are tools with which synthetic biologists theorize what life is.... In 2010 synthetic biologists at the J. Craig Venter Institute (JCVI) synthesized a “minimal” organism, a single-celled, independently living entity that maintains, JCVI researchers posit, the least genetic material necessary to sustain life....

The Artificial Life of the 1990s and the synthetic biology of the early 2000s have much in common, not the least of which being researchers’ explicit efforts to build new instantiations of something they call “life.” Nonetheless, Artificial Life was premised on abstracting life by simulating it in computer software. Artificial Life researchers treated life as if it were a universal formal category transcending substance, material, or medium. Something different motivates synthetic biologists’ work. Most notably, they do not posit that life is a property separable from biological matter. Neither is their project mimetic: rather than imitate life, they construct new living kinds. ....

Many synthetic biologists quote Richard Feynman, who scribbled on his Caltech blackboard just before his death: “What I cannot create, I do not understand.”... Making has operated hand in glove with knowing since seventeenth-century Baconian mechanical philosophy dispensed with natural philosophy to experiment on the natural world. Experimentalists and artisans joined theoria to practica, and contemplation served instrumentation. Yet the ubiquity of “maker’s knowledge” and artisanship in modern science has since largely been forgotten, especially in the mid- to late twentieth century, when scientific disciplines were divided into “pure” and “applied” research. Synthetic biology is the latest instantiation of a centuries-long debate as to whether nature may be known through artifice....

Can genomes be “refactored” and streamlined to function like software code? Yes, synthetic biologists answer, because we have generated just such a bacteriophage. Can a living thing be fragmented into parts, and from a library of parts, can an organism be assembled? Yes, they say, because we have made standardized biological parts. What is the minimal system that is viable and free-living? The one we ourselves have made, they respond. Can species be defined beyond the continuous unspooling of biological generations? Yes, because we can revive species already extinct. Synthetic biologists make new living things in order better to understand how life works. Yet making recursively loops theory: the new living things biologists make function as “persuasive objects” that materialize theories of what synthetic biologists seek to understand about life. In short, the biological features, theories, and limits that synthetic biologists fasten upon are circularly determined by their own experimental tactics, which they then identify with the things they have made....

Biology has always been, since its inception and by definition, an inquiry into what life is. Michel Foucault claimed that “life itself” is a category that “did not exist” prior to the end of the eighteenth century: “Life does not constitute an obvious threshold beyond which entirely new forms of knowledge are required. It is a category of classification, relative, like all the other categories, to the criteria one adopts.”8 That is, biology as a discipline was warranted by a classificatory decision: carving up the world into the organic and the inorganic, differentiating between the vital and the lifeless, and insisting that the living world demanded a science of its own....

synthetic biologists build new living things, and in so doing they retroactively define what counts as “life” to accord with the living things they manufacture and account to be living. The organisms conceived by these mechanical and electrical engineers-cum-biologists, then, are altogether different from the creatures built by biotechnologists: while some are made to serve discrete pharmaceutical or agricultural functions, many of them are made as a way of theorizing the biological. Rather than being the common denominator of all living things, “life” has (once again) become a problem of ontological limits and discontinuities. As such, analyses of life are newly simultaneous with and enabled by synthesized instantiations of it....

synthesis and analysis are joined philosophical modes of reasoning underwritten by these paired technologies. Making stuff—synthesis—has become a mode of analysis, a way of theorizing the biological.... When technical and epistemic knowledge of life converge, the objects of synthetic biology function as persuasive objects. They convince synthetic biologists that life is marked by the qualities—technical, substantive, and social—that they ascribe to it. ...

These brave new organisms grow, mutate, metabolize, divide, and senesce, yet they also speak eloquently of their times, of nature and artifice, of analysis and synthesis, of life and its limits.
synthetic_biology  genetics  DNA  life  ontology  making  methodology  epistemology  experimentation 
22 days ago
BBC Radio 4 - Sound Architecture: The Spaces That Speak
Building design and city planning are dominated by the visual. But a new science, called aural architecture, has emerged to explore the relationship between design, acoustics and the human experience. Every space has its own unique soundscape, created by a combination of the overall design, the materials used in construction and the way that space is used by humans.

Until very recently, few architects ever gave much thought to what effect that soundscape might have on the people inhabiting the space, be they office workers, school pupils, teachers or shoppers. This has resulted in railway stations where train announcements are unintelligible, restaurants where you have to shout to be heard and open-plan schools in which teaching is all but impossible. More recently, research has shown that a poor aural experience can have a considerable negative effect on how we feel and behave, even at a subconscious level.
sound_space  acoustics 
23 days ago
Van Alen Archive
We have staged architectural design competitions since our founding in 1894. Our design archive is a rich window to a century’s worth of programs that have nurtured generations of architects and urban thinkers. Click around below to view a selection of past programs, or all of our 2,000+ archived objects.
archives  architecture  public_space  urban_design  design_competition 
24 days ago
The Human Toll of Protecting the Internet from the Worst of Humanity - The New Yorker
content moderation. Even technology that seems to exist only as data on a server rests on tedious and potentially dangerous human labor. Although algorithms and artificial intelligence have helped streamline the process of moderation, most technology companies that host user-generated content employ moderators like Soto to screen video, text, and images, to see if they violate company guidelines. But the labor of content moderators is pretty much invisible, since it manifests not in flashy new features or viral videos but in a lack of filth and abuse. Often, the moderators are workers in developing countries, like the Philippines or India, or low-paid contractors in the United States. ...

Silicon Valley’s optimistic brand does not fit well with frank discussions of beheading videos and child-molestation images. Social-media companies are also not eager to highlight the extent to which they set limits on our expression in the digital age—think of the recurring censorship controversies involving deleted Facebook pages and Twitter accounts. ...

Tech companies like to envision themselves as neutral platforms, efficiently governed by code and logic. But users want these companies to be flexible and responsive to their needs. They want something more than terms of service handed down from policy teams, or canned responses to a reported abuse, which then disappears into a bureaucratic maze. They want a human relationship with the services that play such an important role in their lives. That will never be possible if those services dehumanize the workers that protect them.
censorship  content_moderation  digital_labor 
25 days ago
New York City Gets A Dashboard | Co.Design | business + design
At the highest levels of city government, analysts and senior staff members must make decisions based on data from 70 different agencies and groups and millions of inhabitants, fragmented further by school districts, police precincts, and other divisions. While every agency might have its own tool to parse numbers, there wasn’t a tool to generate broad, location-based insights for the people at the top. "If you think about the target users of these tools, they have very little time. They need to get answers fast," Gonzalez says. "And they need to see what they need to see—and nothing more—to be able to make decisions."

Today, the N.Y.C. Mayor’s Office officially launched its own dashboard meant to do just that, built by Vizzuality, where Gonzalez is CTO, and Carto, a platform for location data tools. It’s designed to make sense of the bottomless data silos that a city as big, complex, and connected as New York generates every hour. While many cities employ various data visualization tools to track their metrics, this one is unique in the scope of data it pulls in real-time, and in the way it's designed—to be a user-friendly tool for key actors within the government on a day-to-day, or even hour-to-hour, basis.

The dashboard pulls data from across the city, mapping it geographically to give decision-makers a bird’s-eye view of hundreds of "indicators" from across city organizations, which vary wildly across agency and can involve anything from robberies to traffic fatalities. For example, an indicator for the Housing Authority might be "average time to resolve elevator outages," while for the NYPD one might be "major felony crimes," or "average length of stay for single adults in shelter" for the Department of Homeless Services.
dashboard  big_data  smart_cities 
27 days ago
Artists are Salvaging Train Stations' Analog Departure Boards | Atlas Obscura
But these boards can just as easily be programmed to broadcast more chilling messages. Boston’s board, auctioned on eBay, went to the artist George Sanchez Calderon, who paid $350 for it (and much more to ship it to Miami). He used it in a 2009 work, “Family of Man”:
signs  lettering  typography  flip-boards  text_art 
29 days ago
Data Streams – The New Inquiry
we both hit on these singular images about the limits of knowing. For you it was that grainy capture of video. For me, back when I was writing for the New Inquiry in 2014, I was fixated on this image, which had been redacted, that simply says, “What can we tell?” and it’s effectively a blank space. It has that double meaning that you and I have both found very intriguing: the idea that they could tell and see everything, but also that there are domains where they can tell and see nothing. There are these hard limits that are reached in the epistemology of “Collect it all” where we reach a breakdown of meaning, a profusion and granularization of information to the point of being incomprehensible, of being in an ocean of potential interpretations and predictions. Once correlations become infinite, it’s difficult for them to remain moored in any kind of sense of the real. And it’s interesting how, for both of us, that presents a counter-narrative to the current discourse of the all-seeing, all-knowing state apparatus. That apparatus is actually struggling with its own profusion of data and prediction. We know that there are these black holes, these sort of moments of irrationality, and moments of information collapse....

But what got me through were these moments of humor. It’s very dark humor, but in the archive there are so many moments of this type. Some of the slides in particular are written in this kind of hyper-masculinist, hyper-competitive tone that I began to personalize as “the SIGINT Bro.” There’s this “SIGINT bro” voice that would be like, “Yeah, we got ‘em! We can track anybody! Pwn all the networks!” But it has an insecure underbelly, a form of approval-seeking, “Did we do good? Did we get it right?”

...the almost emoji-like type of cartoon-like figures that are being used, all these images of Weatherby, for example, and magic, that abound in the initial unconscious of the archive. That’s rather funny.

...IBM’s terrorism scoring project, which I have spoken about elsewhere. I know we are both interested in how this type of prediction is a microcosm of a much wider propensity to score humans as part of a super-pattern.

...I’m really fascinated by quantifying social interaction and this idea of abstracting every kind of social interaction by citizens or human beings into just a single number; this could be a threat score, it could be a credit score, it could be an artist ranking score, which is something I’m subjected to all the time. For example, there was an amazing text about ranking participation in jihadi forums, but the most interesting example I found recently was the Chinese sincerity social score. I’m sure you heard about it, right? This is a sort of citizen “super score,” which cross-references credit data and financial interactions, not only in terms of quantity or turnover, but also in terms of quality, meaning that the exact purchases are looked into. In the words of the developer, someone who buys diapers will get more credit points than someone who spends money on video games because the first person is supposed to be socially “more reliable.”

...The correlations Admiral was using were things like if you use exclamation marks or if you use words like “always” and “never,” it indicates that you have a rash personality and that you will be a bad driver. So if you happen to be someone who uses emojis and exclamation marks, you will be paying more to insure your car. ...
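A toy sketch of how this kind of surface-feature scoring could work — entirely hypothetical, not Admiral’s actual model; the word list, point values, and surcharge rate are invented for illustration:

```python
import re

# The "rash personality" markers the excerpt mentions (hypothetical list).
ABSOLUTE_WORDS = {"always", "never"}

def rash_score(post: str) -> int:
    """One point per exclamation mark or absolute word -- the kind of
    crude textual correlation described, reduced to a single number."""
    words = re.findall(r"[a-z']+", post.lower())
    return post.count("!") + sum(w in ABSOLUTE_WORDS for w in words)

def premium(base: float, post: str) -> float:
    # Each point adds a hypothetical 5% surcharge to the base premium.
    return base * (1 + 0.05 * rash_score(post))

print(premium(100.0, "I always drive safely!"))  # 110.0
```

The brittleness is visible in the code itself: the score has no access to meaning, only to surface marks, which is exactly the critique the conversation develops.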

As for the IBM terrorist credit score, there are two aspects that really stay with me. One is the fact that it’s being tested and deployed on a very vulnerable population that has absolutely no awareness that it is actually being used against them. Two, it’s drawing upon these terribly weak correlations from sources like Twitter, such as looking at whether somebody has liked a particular DJ, and conjecturing that this person might be staking out the nightclub where they play. These are, I think, far-flung assumptions about what the human subject does and what our data traces reveal. To use Jasbir Puar’s concept of “the trace body,” the assumptions that go into the making of the trace body have become so attenuated and, in some cases, so ridiculous, that it’s critically important that we question these knowledge claims at every level....

This reminds me of the late 19th century, when there were a lot of scientific efforts being invested in deciphering hysteria, or so-called “women’s mental diseases.” And there were so many criteria identified for pinning down this mysterious disease. I feel we are kind of back in the era of crude psychologisms, trying to attribute social, mental, or social-slash-mental illnesses or deficiencies to frankly absurd and unscientific markers. ... I was thinking of physiognomy, too, because what we now have is a new system called Faception that has been trained on millions of images. It says it can predict somebody’s intelligence and also the likelihood that they will be a criminal based on their face shape. ...

“See everything with Hollerith punch cards.” It’s the most literal example of “seeing like a state” that you can possibly imagine. This is IBM’s history, and it is coming full circle. ... I think that maybe the source of this is a paradigm shift in the methodology. As far as I understand it, statistics have moved from constructing models and trying to test them using empirical data to just using the data and letting the patterns emerge somehow from the data. This is a methodology based on correlation. They keep repeating that correlation replaces causation. But correlation is entirely based on identifying surface patterns, right? The questions–why are they arising? why do they look the way they look?–are secondary now. If something just looks like something else, then it is with a certain probability identified as this “something else,” regardless of whether it is really the “something else” or not. Looking like something has become a sort of identity relation, and this is precisely how racism works....

...not to be tracked at all than to be more precisely tracked. There is a danger that if one tries to argue for more precise recognition or for more realistic training sets, the positive identification rate will actually increase, and I don’t really think that’s a good idea....

...I prefer to be misrecognized in this way than to be precisely targeted and pinpointed from the map of possible identities to sell advertising to or to arrest.

KATE CRAWFORD. There is something fascinating in these errors, the sort of mistakes that still emerge. These systems are imperfect, not just from the level of what they assume about how humans work and how sociality functions, but also about us as individuals. I love those moments of being shown ads that are just so deeply outside of my demographic....

And this paradox–of wanting to be known accurately, but not wanting to be known at all–is driving so much of the debate at the moment. ...

Artificial intelligence is all around us. People are unaware of how often it’s touching their lives. They think it’s this futuristic system that isn’t here yet, but it is absolutely part of the everyday. We are being seen with ever greater resolution, but the systems around us are increasingly disappearing into the background....

Automation is already creating major inequality and also social fragmentation–nativist, semi-fascist, and even fascist movements. The more “intelligent” these programs become, the more social fragmentation will increase, and also polarization....

As people get replaced by systems, one of the few human jobs that seems to remain is security....

HITO STEYERL. Have you seen any example of an AI that was focused on empathy or solidarity? Do you see the idea of comradeship anywhere in there?

KATE CRAWFORD. I go back to the beginning. I go back to Turing and to even the early Turing test machines, like ELIZA. ELIZA is the most simple system there is. She is by no means a real AI and she’s not even adapting in those conversations, but there’s something so simple about having an entity ‘listen’ and just pose your statements back to you as questions. People found it incredibly affecting. Some thought that this could be enough to replace human therapy. But there were these hard limits because ELIZA couldn’t actually empathize, it couldn’t actually understand, and I don’t think we’ve moved as much as we think since then. ELIZA worked as an empathy-producing machine because she was a simple listener.
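The mechanism Crawford describes — posing statements back as questions — really is that simple. A minimal ELIZA-style reflection in the spirit of the original script (not Weizenbaum’s actual code; the pronoun table and phrasing are illustrative):

```python
# Swap first-person words for second-person ones, then reframe
# the user's statement as a question -- the core ELIZA move.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

def reflect(statement: str) -> str:
    words = statement.lower().rstrip(".!").split()
    swapped = [REFLECTIONS.get(w, w) for w in words]
    return "Why do you say " + " ".join(swapped) + "?"

print(reflect("I am unhappy."))  # Why do you say you are unhappy?
```

A dozen lines with no model of meaning at all, yet people found the effect affecting — which is exactly the hard limit the conversation points to.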
big_data  epistemology  blanks  gaps  ambiguity  archives  surveillance  quantified_self  quantification  statistics  artificial_intelligence  security  listening 
4 weeks ago
Engines of Knowledge: The Museum and the Exhibit | The Sociological Imagination The Sociological Imagination
My focus is on what I have termed, after Ian Hacking’s idea, engines of knowledge. This notion of engines includes not just tools and methods but institutions and processes that we have come to take for granted, even, in sociological terms, naturalised. The machine metaphor refers to the capacity of these things, singularly or taken together, to produce new practices, concepts and ideas. Amongst these knowledge factories were the institutional formats of the university, the museum, the library and the hospital and, in organisational formats, the many associations and societies that emerged to formalize, authorize and regulate knowledge development and outputs. In this piece I look at the museum as an archetypal knowledge factory of the Victorian era that formalised, institutionalised and then diversified itself on the basis of a range of earlier prototypes....

The museum is both an ancient idea and a relatively modern institutional form. The original museum in classical thought referred more to a place for philosophical contemplation and discussion, more university than a collection of artifacts or exhibit space. Early modern examples include Ole Worm’s (1588–1654) scientific curiosity collection in Copenhagen or the opening of the Ashmolean art museum in Oxford (1683). The British Museum was established in the 1750s based on Hans Sloane’s collection of curiosities. Diderot proposed a national museum for France in his Encyclopédie in the 1760s. This 18th century developmental phase became an increasingly international phenomenon in the 19th century as the growth of knowledge expanded at a phenomenal rate, and the instruments and methods for knowledge production were increasingly universalised....

The idea that nature and society could not only be captured and inventoried but that they could be scientifically classified, ordered and divided into ever finer sub-domains was central to the structure and operation of museums, and especially science and natural history museums, as we now know them. These ideas of classification, taxonomies and specialization within discrete domains were central features in the rapid development of human knowledge generally and the sciences more specifically, right down to the present day. The rapid growth in museums and in the size and scope of museum collections also fed into the processes of taxonomy, themselves based on the practice of expert judgement rather than quantitative analysis....

Museums have acted as formalizing institutions for a great deal of social and cultural knowledge, and the period in which their expansion accelerated was that of the rise of nationalist state ideology – the singular people with a single language (often the dominant group’s dialect), a national anthem and a flag – so beloved by Europeans and others. To produce this kind of uniformity of identity and processes of identification requires institutions to promulgate the illusion of sameness, to historicise it, and to develop a neat linear narrative arc from the messiness of normal human societies and their complicated histories. For a long period of time the museum, like the school, was a key focus for the articulation of the nation-state mythos. The national museum, for example, is usually located in a nation’s capital and its near neighbours are often institutions with a similarly emblematic role....

Problematic in some of these scenarios were the implicit and even explicit hierarchies that social and material taxonomies tended to produce. One of the areas where this was most complicated and also had major implications was in the area of ethnographic museums which undertook the emerging scientific study of human beings and their societies. The ethnographic museum emerged at a time when anthropology and the other social sciences were in a formative state. In addition, technical developments such as photography, and more dubious constructs such as phrenology and eugenics, emerged and were frequently applied to ethnographic work....

Critical museology, beginning in the 1970s and gaining momentum through the 1980s and 1990s, has revised much of the ‘givenness’ of the traditional institutional museum in its wider social and political role. In addition, the establishment of alternative types of museums and cultural institutions has opened up the authoritative nature of the ‘museum’ in a singular sense to the variety of alternatives that exist in any society, especially for more marginalized groups and their histories. ...

One of the roles of the museum has been to help index both the natural and social worlds. The sheer scale of biological, zoological and geological data emerging from the new sciences was so great during this time that more than mere taxonomies were required. One of the problems in this first information age was the volume of data (artifacts, samples etc.) being collected and the availability of categories with which to meaningfully index them, since so much material was so utterly new. In addition, the diversity of human cultures and their artifacts has also challenged the museum to produce meaningful understandings that do not entirely abstract the knowledge of those groups that actually produced the collected artifacts. ...

To identify those factories, to examine the engines of knowledge they gave rise to and to critically review their tangible and intangible products all help us in critiquing and unpacking the conceptual heuristics we live by. Institutions formalize their authority by circumscribing not just their knowledge products but their right to authorize knowledge within their domains of activity.
epistemology  knowledge_structures  museums  exhibition  classification  collection  organization  institutional_critique 
4 weeks ago
How private contractors are taking over data in the public domain | Reveal
long-standing debate over private companies that are controlling access to government data, documents and laws. He and others are trying to keep in the public domain all sorts of data, documents, regulations and laws that taxpayers pay the government to develop but then often cannot obtain without putting up a fight or paying hundreds or thousands of dollars in fees.

Government agencies, in many instances, have given contractors exclusive rights to the data. The government then removes it from public view online or never posts the data, laws and documents that are considered public information.

Public datasets that state and local governments are handing off to private contractors include court records and judicial opinions; detailed versions of state and local laws and, in some cases, the laws themselves; building codes and standards; and public university graduation records.

Much of the information collected and stored by private data companies such as LexisNexis, Westlaw or CrimeMapping.com is not available to the public without a price. The information that is available often is not searchable, cannot be compared with data from other jurisdictions and cannot be copied unless members of the public pay hundreds or thousands of dollars in subscription fees.

Sometimes, governments pay the companies to put the data into a useful format; other times, they turn over the data, get it back from the company in a useful format and give republishing rights to the company, which can then sell the data, laws and documents to the public.

The bottom line is good for the vendors, which can make millions of dollars from the sale of public information. But the public, who paid for the information to be developed in the first place, often is left on the outside, unable to get to the information as quickly as the private vendor, if they can get it at all, without paying for it....

These governments, lacking sophisticated coders and software experts, have contracted with private companies that translate the raw data into maps and conduct other analyses.

But in a vast number of these deals, the contractor gets to control the flow of information, restrict its duplication and downloading, and repackage and sell it to other clients, such as businesses, that want quick information about crime near their facilities. Or they publish state laws, regulations and building codes – sometimes with commentary – and then sell the records, often becoming the only “public” source of the information.

State and local governments often still are stuck in the digital past. Some departments lack the funding or internal expertise to build an open-source website and look for outside vendors, which then demand some type of exclusive control. Others continue to rely on paper reports that haven’t been digitized and need vendors to put them online and crunch the data.

Still others, eager to make use of sophisticated mapping tools and the reports they can produce, have gone to outside vendors to build data portals and mapping and alert systems. But these deals usually include limits on use by others – imposed by the contractor and agreed to by the government – that restrict the public’s access and right to republish without permission from the vendor....

The restrictions, the vendors say, must be imposed because they have turned the data into a new format – such as a map – and created tools that are copyrighted. Although the data are public, the company can insist that the material can be viewed but not copied, downloaded or in some other way appropriated without the company’s permission or a payment plan. ...

Building codes also are hard for the public to obtain, Public.Resource.Org’s Malamud said. They are developed locally but put into a researchable format by various trade organizations and then sold to anyone seeking to consult them.

“You can’t search them, you can’t print them, you can’t bookmark them, you can’t copy them,” he said. Which means that someone who is trying to consult or comply with the law needs to buy it first, he said.
public  data  government_docs  public_data  urban_intel  smart_cities  big_data  open_data  archives 
4 weeks ago
A Conversation about Ergonomic Futures «DIS Magazine
Is it possible, at some point between now and the bitter end of the universe, that our bodies experience such a degree of evolutionary change that the biological, ontological, and legal criteria of the human come undone—when the human fragments or even ceases to exist?

I took this question to a number of people. I spoke with Shara Bailey, a paleoanthropologist at NYU, who said that we’ll only see drastic change if humankind breaks into isolated groups. Owing to their limited scale, the groups would experience low genetic variation, meaning that over generations, recessive traits would have the possibility of becoming prominent. This phenomenon is sometimes known as “the founder effect.”

I talked with a research fellow at Harvard, who works in the lab of geneticist George Church. In 2014, Church claimed to have identified the genes that should be modified to make the human body survive better in “extra-terrestrial environments”: modifications to give us extra-strong bones, lean muscles, and lower cancer risk.

I also wanted to know how a designer would answer this question, so I called Jonathan Olivares, who wrote the 2011 book, A Taxonomy of Office Chairs. Olivares stressed that we need to understand the cultural and sociopolitical context of a given future era when conceiving of how it might be designed....

It bears stating that there are two temporalities to my question. While I’m curious about our future evolution, I’m also interested in how this thought experiment might prompt discussion about human typologization and normalization in Western science and social science. I thus began to research what we might call the empirical fictions of the last two hundred years: the French social statistician Adolphe Quetelet’s model of “l’homme moyen” (“the average man”) from the mid-nineteenth century; eugenicist Francis Galton’s subsequent composite photographs, including those of “the criminal type”; and the discipline of ergonomics, which emerged in the Taylorist years and purported to increase the efficiency of the worker, then, in the mid-twentieth century, shifted focus to enhancing the comfort of our bodies in the workplace and beyond. ...

The challenge was to give this research material form. I returned to my conversation with Olivares. Ergonomics needs to create body typologies for which to design; while we can all benefit from an ergonomic office chair, for example, there’s an ideal body indexed in its form. What would it mean to exploit this characteristic of the discipline: specifically, to design ergonomic seats for “standard” future bodies, where the qualities of those bodies are indexed in their forms?
body  evolution  furniture  ergonomics 
4 weeks ago
Algorithmic Life - Los Angeles Review of Books
...we are experiencing a comparable moment of semantic and political inadequacy. But there is more: the term is trying to capture new processes of technological change. How we talk about algorithms can be vague and contradictory, but it’s also evocative and revealing. Semantic confusion may in fact signal a threshold moment when it behooves us to revise entrenched assumptions about people and machines.

In what follows, rather than engaging in a taxonomic exercise to norm the usage of the word “algorithm,” I’d like to focus on the omnipresent figure of the algorithm as an object that refracts collective expectations and anxieties. Let’s consider its flexible, ill-defined, and often inconsistent meanings as a resource: a messy map of our increasingly algorithmic life.

As a historian of science, I have been trained to think of algorithms as sets of instructions for solving certain problems — and so as neither glamorous nor threatening. Insert the correct input, follow the instructions, and voilà, the desired output. A typical example would be the mathematical formulas used since antiquity to calculate the position of a celestial body at a given time. In the case of a digital algorithm, the instructions need to be translated into a computer program — they must, in other words, be “mechanizable.” Understood in this way — as mechanizable instructions — algorithms were around long before the dawn of electronic computers. Not only were they implemented in mechanical calculating devices, they were used by humans who behaved in machine-like fashion. Indeed, in the pre-digital world, the very term “computer” referred to a human who performed calculations according to precise instructions — like the 200 women trained at the University of Pennsylvania to perform ballistic calculations during World War II....
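That pre-digital sense of “algorithm” is easy to make concrete. Euclid’s method for the greatest common divisor, for instance, is a finite set of mechanizable instructions that a human “computer” could execute by hand, long before electronic machines:

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeat 'replace the larger number by the
    remainder of dividing it by the smaller' until the remainder is zero.
    A person with pencil and paper can follow these steps exactly."""
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # 21
```

Nothing glamorous or threatening, as the author says: correct input, fixed instructions, determined output.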

Algorithms have thus become agents, which is partly why they give rise to so many suggestive metaphors. Algorithms now do things. They determine important aspects of our social reality. They generate new forms of subjectivity and new social relationships. ....

Academics variously describe them as a new technology, a particular form of decision-making, the incarnation of a new epistemology, the carriers of a new ideology, and even as a veritable modern myth — a way of saying something, a type of speech that naturalizes beliefs and worldviews.

...A clock that can measure seconds and fractions of a second inevitably changes our perception of time. It turns time into something that can be broken down into small units, scientifically measured — and accurately priced. The precision clock helped spawn new temporalities as well as oceanic navigation and the industrial revolution. It was the industrial revolution’s metronome. At the same time, the clock was taken to be the best representation of the world it was shaping: a mechanistic, quantifiable, and predictable world, made up of simple elementary components and mechanical forces.

...Similarly, seeing the workings of the human mind as analogous to the operations of a hefty Cold War electronic computer signals a momentous cognitive and social shift. Historian of science Lorraine Daston describes it as the transition from Enlightenment reason to Cold War rationality, a form of cognition literally black-boxed in shiny-cased machines.... Many sharp minds of the post–World War II era believed that the machine’s algorithmic procedures, free of emotions and bias, could solve all kinds of problems, including the most urgent ones arising from the confrontation between the two superpowers. It did not work out that way. The world was too complicated to be reduced to game theory, and by the late 1960s the most ambitious dreams of automated problem-solving had been dragged into the mud of the Vietnam War.

Clocks and Cold War computers were, I’m suggesting, emblematic artifacts. They shaped how people understood and acted within the world. Clocks and computers also shaped how people understood themselves, and how they imagined their future...

Once we become habituated to infrastructures, we are likely to take them for granted. They become transparent, as it were. But there is something distinctive about the invisibility of algorithms. To an unprecedented degree, they are embedded in the world we inhabit. This has to do with their liminal, elusive materiality. In sociological parlance, we could say that algorithms are easily black-boxed, a term I used above to describe how Cold War rationality disappeared into computers. To black-box a technology is to turn it into a taken-for-granted component of our life — in other words, to make it seem obvious and unproblematic....

algorithms select information and assess relevance in very specific ways, and users then modify their practices in response to the algorithms’ functioning. Indeed, algorithms produce new “calculated publics” by presenting groups back to themselves. Their deployment is accompanied, observes Gillespie, by “the promise of […] objectivity,” whereby “the algorithm is positioned as an assurance of impartiality.” These algorithms play a role traditionally assigned to expert groups touting or channeling what might be termed a traditional editorial logic....

The way algorithms manage information is not simply a mechanized version of that older logic. It is a new logic altogether, an algorithmic logic, which, to quote Gillespie again, “depends on the proceduralized choices of a machine, designed by human operators to automate some proxy of human judgment or unearth patterns across collected social traces.” ...

If we want to understand the impact of these algorithms on public discourse, concludes Gillespie, it is not sufficient to know “how they work.” We need to examine “why [they] are being looked to as a credible knowledge logic” and which political assumptions condition their dissemination and legitimacy. In other words, we need to be aware of the entanglement of the algorithm with its ecology — with the mechanical and human environment within which that particular set of instructions is interpreted and put to work....

the deterministic view of the algorithm — the figure of the algorithm that does things — certainly helps us understand how, as a technological artifact, it can change the world we live in. In this type of speech, the term “algorithm” functions as a synecdoche for software and larger sociotechnical systems. The algorithm-as-doer, however, is also misleading precisely because it hides its larger ecological context; it represents the algorithm as a self-contained mechanism, a tiny portable machine whose inner workings are fixed and whose outcomes are determined. By contrast, an empirical study of algorithms suggests that we can understand their functioning — and their meaning — only by considering the sociotechnical ecologies within which they are embedded....

There is another important reason why the algorithm-as-doer is misleading: it conceals the design process of the algorithm, and therefore the human intentions and material conditions that shaped it.... Consider the example of algorithms that produce and certify information. In exploring their ecology, we can address important questions about authority, trust, and reliability. But what about the logic that shaped their design in the first place? Who decided the criteria to be adopted and their relative weight in the decision-making process? Why were the algorithms designed in one particular way and not another? To answer these questions, we need to see the technical features of an algorithm as the outcome of a process. In other words, we need a historical — indeed genealogical — understanding of the algorithm. ...

Like clocks and the programs of early electronic computers before them, current digital algorithms embody an aspiration to mechanize human thought and action in order to make them more efficient and reliable. This is a familiar and yet also unsettling story, constitutive of our modernity....

The notion of efficiency is always relative to a set of assumptions and goals. Making these assumptions and goals visible is thus a prerequisite for any informed discussion about technological change and its implications....

Social scientists have decried the difficulties inherent in empirically studying algorithms, especially proprietary ones. This problem is normally framed in terms of “secrecy,” a notion that implies strategic concealment. We need, however, a more general concept, like sociologist Jenna Burrell’s “opacity.”... Burrell’s “opacity” refers to the fact that an output of this sort rarely includes a concrete sense of the original dataset, or of how a given classification was crafted. Opacity can be the outcome of a deliberate choice to hide information. Legal scholar Frank Pasquale, a leading critic of algorithmic secrecy, has advocated for new transparency policies, targeting the kind of opacity designed to maintain and exploit an asymmetric distribution of information that benefits powerful social actors.
algorithms  epistemology  black_box  agency  actor_network  genealogy  ideology  opacity 
4 weeks ago
Why Time Flies - WSJ
Isaac Newton imagined time as a sort of cosmic metronome, relentlessly ticking away at a steady pace for all eternity. Albert Einstein envisioned time as fluid, capable of dilating or contracting or even standing still in certain circumstances.

Physics has long since decided the argument in Einstein’s favor. Within psychology, things remain murkier. Most people do believe in some kind of external clock for the universe, independent of human beings; time and tide wait for no man, after all. Yet we simultaneously recognize that time rarely seems so rigid....

He starts with the physics, describing the workings of ultra-precise atomic clocks and lasers that can measure durations as short as a millionth of a billionth of a second. Surprisingly, he argues that even with such high-precision instruments, time remains a social construct. The world’s official time, for instance, is not kept by a single Master Clock. Rather, scientists in different countries have to weigh and average the outputs of dozens of atomic clocks, all ticking along at slightly different rates. In other words, the “true” time is a judgment call—a high-tech version of townsfolk gathering in a village square on a sunny day and deciding: Yep, looks like noon....
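The averaging the review describes can be sketched in a few lines. The clock offsets and weights below are invented for illustration; real timescales like UTC are assembled by far more elaborate statistical procedures, but the judgment-call structure is the same:

```python
# "Official" time as a weighted mean of many clocks ticking at slightly
# different rates. Each reading is a hypothetical offset (in nanoseconds)
# from a nominal second, paired with a weight reflecting that clock's
# assessed stability.
readings = [(-3.0, 0.5), (2.0, 0.3), (1.0, 0.2)]  # (offset_ns, weight)

def ensemble_offset(readings):
    """Weighted average of clock offsets: the ensemble's consensus."""
    total_w = sum(w for _, w in readings)
    return sum(off * w for off, w in readings) / total_w

print(ensemble_offset(readings))  # -0.7
```

No single clock is “right”; the consensus offset is a constructed quantity, which is the book’s point about time as a social artifact.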

The bulk of the book examines biological and psychological time. Our bodies have molecular clocks that run on cycles of roughly 24 hours, which explains why heartbeat, blood pressure, facial-hair growth, urine output and other biological functions all vary according to the time of day. These circadian cycles are so ingrained inside cells that transplanted kidneys sometimes tick away at the donor’s old schedule, forcing the recipient to visit the bathroom at unaccustomed times.
clock  temporality  time 
4 weeks ago
It's Nice That | Designer Darius Ou Dahao visualises links between art and law in this artist’s catalogue
Darius Ou Dahao has designed Voice of Courts, a catalogue that sits alongside an exhibition of the same name. The show and catalogue visualise a collaboration between the artist Jack Tan and the Community Justice Centre (CJC), a charity based at the State Courts and the Family Justice Courts of Singapore.

Currently on show at the Singapore Biennale until February 2017, the project sees Jack explore the “soundscape of the courts by attending hearings, sitting in for legal advice sessions and volunteering in different litigants-in-person help programmes”. His focus was to capture voice and how it was translated in the courts through timbre, tone, echo, cacophony and lyricism, and he has made drawings of the sounds he heard, which were later turned into graphic scores.

The project is part of a residency Jack has undertaken and Darius was brought in to conceptualise and design the catalogue to accompany the work created. “I’ve worked with Jack before on several projects in the last few years and it’s always been a very collaborative process when designing for him,” says Darius.
sound  voice  discourse  legal_system 
4 weeks ago
The Avery Review | Democratic Soundscapes
One person would suddenly fall to the ground, feigning death, while the other drew his/her outline on the street in chalk. The happening was not only a visual spectacle, but it was also a sonic act. Many performers carried handheld FM radio sets, simultaneously issuing a soundtrack of several people crying that was being broadcast across all FM stations and the state-owned Radio Nepal for a full hour, from 5:00 to 6:00 p.m. The sound and silence that emanated across the streets participate in and constitute what I call a democratic soundscape. At once producing and reflecting upon the media of democratic protest—ranging from radio infrastructure to collective processions—Ashmina’s performance urges us to consider the importance of sound in cultivating political subjectivity....

In its popular understanding, democracy is identified with various forms of voicing: from political speeches to the shouts of public protest; from filibusters in the halls of the US Congress to heated debates in cafés, salons, and newspapers around the world. More broadly, such voicing is seen as rhetorically constituting the democratic subject itself, through metaphors such as finding voice, raising voice, and having one’s voice heard.1 These metaphors are usually disembodied, rarely invoked with any reference to the materiality or texture of embodied voices.2 Furthermore, such metaphors of political voice almost always refer to discursive speech or analytic or reasoned discourse. They rarely conjure other forms for political utterance, sound, or even noise—voices shouting, collective chanting, the production of noise for political effect, or, significantly, the active performance of silence. By reviewing Ashmina’s “Happening” and considering it as a democratic soundscape, I ask a simple question: What does democracy sound like?...

Democratic protests around the world deploy other forms of noise to similarly convey their affective discontent. The use of pots, pans, and metal plates to create the noise of protest, for example, is a global phenomenon used in Nepal, Argentina, Quebec, and Turkey. In parts of South America this clanking is deployed as cacerolazo (casserole) protests, called manifs casseroles in Quebec. Jonathan Sterne and Natalie Zemon Davis link the percussive banging on pots and pans in the 2012 Quebec protests against Bill 78 that banned public assembly and certain rights of protest to the charivari protests in early modern and modern France, when noisy demonstrations, typically performed by disguised youth, “call[ed] attention to a breach of community standards in the village or neighborhood.”12 In a kind of reversal of Elaine Scarry’s discussion of everyday items becoming weapons of torture, here the democratic soundscape turns everyday, domestic items into the weapons of protest, effectively bringing the home into the public.

...It turns our attention to those sounds inherent in participatory democracy that depend not upon a single speaker but upon collectivities—often assembled en masse—to make any message heard within the polyphony of perspectives that can constitute ongoing, collaborative deliberation. Such moments are popularly embodied in the “human mic” used at recent protests, like the occupy movements or many anti-Trump protests after the election, when a single person’s speech or chant is spoken in short segments and then repeated by the crowd so that the words can be heard far from the source. The human mic works, then, through the nonmechanical amplification of voice, collective resounding, and in so doing inverts, ironically, familiar modes of electronic mediation associated with modernity and mass experience.21 The human or “people’s mic” emphasizes a collective voice that, as Homay King notes, “shifts away from sovereign, solitary personhood,” in part through a poetics in which the human appropriates the mechanical.22 By “extending the communality of the movement through embodiment,” the human mic enacts the very thing that it seeks to represent or demand.
soundscape  sound_space  voice  media_space  language  noise  democracy 
5 weeks ago
BOMB Magazine — Laura Kurgan by Noah Chasin
We are asking questions about how these methods, maps, and data work, rather than simply assuming they are the most efficient or most accurate tools for telling us the truth about the world, whether scientifically or socially. Bringing architecture, data science, and the humanities together around these questions and testing them through real-world projects with meaningful political and ethical stakes is an ideal matrix for this. ...

In Jumping the Great Firewall (2014), a project dealing with Weibo, China's social media behemoth, we looked at the creation of an activist space in one of the most censored networks in the world. It was more a data-visualization project than a mapping project, but it had a spatial component. Basically, Chinese activists knew that the government was using algorithms to censor their messages, so they would take pictures of their posts that algorithms couldn't automatically detect—sometimes called "Long Weibo"—and send these around. The messages would then spread, and a human censor would have to find them.

At the lab, we did a short two-week project tracking these suppressed posts. First, we just stored all of these image posts, and then we developed an algorithm that could scrape the Weibo feeds to see whether something had been deleted. Afterward, we discarded everything except the messages that had been censored. Our archive preserves all the censored posts in their original form....
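The deletion-tracking idea can be sketched as follows. This is a hypothetical illustration, not the lab's actual code: `fetch_post` stands in for whatever scraper or API call would retrieve a post by ID, and returns `None` when the post is gone.

```python
# Minimal sketch of censorship tracking: snapshot posts, re-check them
# later, and archive only the ones that have disappeared.

def find_censored(snapshot, fetch_post):
    """Return stored copies of posts that can no longer be fetched.

    snapshot   -- dict mapping post_id -> stored copy (e.g. image bytes)
    fetch_post -- callable returning the post, or None if deleted
    """
    censored = {}
    for post_id, stored_copy in snapshot.items():
        if fetch_post(post_id) is None:  # post gone: presume censored
            censored[post_id] = stored_copy
    return censored

# Example with a fake feed in which one post has been deleted:
live = {"a1": "img-bytes-1", "b2": "img-bytes-2"}
snapshot = {"a1": "img-bytes-1", "b2": "img-bytes-2", "c3": "img-bytes-3"}
print(find_censored(snapshot, live.get))  # {'c3': 'img-bytes-3'}
```

Storing everything first and discarding the uncensored posts afterward, as the excerpt describes, is what makes the archive possible: by the time a deletion is detected, the original is already preserved.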

We're interested in how conflicts of all sorts, not just war, make, unmake, and remake urban spaces.... Conflict urbanism is about justice as a general category and about the inequalities—visible and hidden—that structure cities today. With Aleppo we've been looking at the patterns of damage over the scope of the war, but also at the conditions prior to the outbreak of violent conflict that might have set up the patterns we are seeing today. ...

Our approach to understanding the urban-conflict landscape was to create a web-based map of Aleppo that is browsable at a neighborhood level and to which we've added different layers of data. You can zoom in and out to study it. It's like Google Maps, but we designed an interface around an open-source mapping program, Mapbox, and have used open-source geospatial data from OpenStreetMap, which allows us to style the maps in our own way, incorporate multiple datasets, and bring our own concerns and interpretations to them.

For instance, if you open up the map and just look at the OpenStreetMap layer, you'll notice the government-controlled western part of the city is fully mapped and the east is not. The east has a lot of denser, so-called informal neighborhoods, so maybe it's not mapped because the streets are too small and difficult to map. But part of the reason might also be political. We found it impossible to ignore the coincidence that the portions of the city that are less mapped are also the neighborhoods now most damaged....

I know I sometimes sound optimistic about the new commercial imagery landscape, but it's not always that simple for civilians to purchase the imagery in real time. These are geopolitical conversations to which few citizens have access.... you can (sometimes) use the history button to pull up old images, but you never know the specific date or exactly which satellite took the picture. They don't supply that information. Our map allows you to track the images over time....

People often think that layers of a map simply add facts. In reality, each layer is a story about its own dataset. Although they can complement each other, and although they are all showing the same place, they can give the impression of presenting a complete truth, when, in fact, they cannot do that. We are still searching for a mapping language that can harness these divergent datasets in a way that says, "It's not all here, but let's look at this and explore."...

NC When you accumulate data like the UNOSAT or Human Rights Watch material, you're not necessarily acquiring it with a preconceived notion of what you're going to find, correct?

LK That's right. The data and the images are not illustrations—they are research, ways of discovering things about the world. For example, if we had not made this map, we wouldn't have seen clear patterns of damage—the ways in which the attacks are so disproportionately targeting certain neighborhoods in Aleppo. Now you can zoom in and look at this or that neighborhood in particular. So, for instance, we looked at the Sheikh Sa'eed neighborhood, and there's a lot of barrel bomb damage there in a completely residential area. Bomb craters are clearly visible.

Think of data as a navigation device. It's not the Truth. Sometimes, people worry about our interest in satellite images not because of their military origins, although that is sometimes an issue, but because of the apparently compromised epistemology. "Why the overhead view? Are you aspiring to a view from nowhere?"...

when you actually work with these images and the data embedded in them, the truth becomes more complicated. It's obviously not from nowhere. What people think of as being flat, I call multidimensional. Because all satellite images incorporate time, and they also allow you to zoom in. Once you start aligning all the different layers on top of each other, you can navigate your way toward certain things on the ground....

We have been working with Jamon Van Den Hoek, a geographer who has developed a clever strategy for using Landsat imagery—free low-resolution imagery, fifteen meters per pixel—to identify possible locations that are undergoing significant changes. Every two weeks during the war, he generated a "change map" highlighting the pixels that had changed in Aleppo. The patterns and intensities of transformation were apparent. Then we followed those clues and used them to guide a more detailed investigation with high-resolution imagery. They led us to certain areas, like Castello Road and some of the eastern neighborhoods, and we found a lot of destroyed bridges and other damage from aerial attacks, leading us to focus further research on those parts of the city. We're making a wartime map of Aleppo that will become an archive of the city's destruction.
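The pixel-level comparison behind such a change map can be sketched like this. Real Landsat analysis involves co-registration, radiometric calibration, and cloud masking across multiple bands; this sketch, with made-up pixel values, shows only the core idea of flagging pixels that differ beyond a threshold.

```python
# Schematic "change map": compare two co-registered rasters taken weeks
# apart and flag pixels whose values differ by more than a threshold.

def change_map(before, after, threshold=10):
    """Boolean mask (nested lists) of pixels changed by more than threshold."""
    return [[abs(a - b) > threshold for a, b in zip(row_after, row_before)]
            for row_after, row_before in zip(after, before)]

before = [[100, 100], [100, 100]]
after  = [[102, 100], [100, 160]]  # one pixel markedly changed
print(change_map(before, after))  # [[False, False], [False, True]]
```

As in the project described above, the flagged pixels do not themselves prove damage; they only direct attention to where higher-resolution imagery is worth examining.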

...We are trying to guard against doing something in support of the civic opposition that inadvertently gets turned against them and what they're defending. For example, we are making a list of destroyed heritage sites, and we have a layer in the map that locates them. We also have a layer on the map based on a list published by the State Department of 241 prewar cultural sites. Comparing them is powerful. But it's also dangerous, so we've switched off that layer of the map online for now to protect against it becoming an easy list of future targets.
GIS  mapping  data_visualization  cartography  censorship  china  conflict_urbanism  satellite_imagery  epistemology 
5 weeks ago
Citizen Formation Is Not Our Job - The Chronicle of Higher Education
Rather than introducing students to the history and structures of American government, universities today, the report asserts, teach "civic engagement" in courses that focus less on a specific subject matter (the Supreme Court, the organization of Congress, the powers of the president) than "on turning students into activists" who engage in "coordinated social action" designed to further a left, progressive agenda. Under the aegis of this "new civics" students (and faculty) participate in "service learning," that is, learning that gets them into the community where they can gain practical knowledge and participate in the solving of society’s pressing problems. A 1970 report by the Southern Regional Education Board names the goal succinctly: "to give young people … front-line experience with today’s problems so that they will be better equipped to solve them as adult citizens."

What could be wrong with that? What’s wrong, according to the NAS, is, first, that at bottom the project is driven by a progressive-left assumption that America "must be transformed … from an unjust, oppressive society to one that embodies social justice," and second, that in the process the traditional academic project of searching for the truth is displaced by the political project of making the society better, where "better" is defined as a movement away from currently-in-place values and norms. ...In short, learning things and becoming a learned person give way to doing things and becoming a better person; the societal transformation civically engaged students work to accomplish also transforms them: "The New Civics replaces traditional liberal arts education with vocational training for community activists."...

I agree that colleges and universities should teach civic literacy rather than civic advocacy. I agree that while volunteerism is in general a good thing, it is not an academic good thing and those who take it up should not receive academic credit for doing so. I agree that students "should possess a basic understanding of their government" and that colleges and universities should play a part in providing that understanding.
civic_engagement  progressivism  academia  teaching 
5 weeks ago
The Gathering Cloud || J. R. Carpenter
The Gathering Cloud aims to address the environmental impact of so-called 'cloud' computing through the oblique strategy of calling attention to the materiality of the clouds in the sky. Both are commonly perceived to be infinite resources, at once vast and immaterial; both, decidedly, are not. Fragments of text from Luke Howard's classic Essay on the Modifications of Clouds (1803) and other more recent online articles and books on media and the environment are pared down into hypertextual hendecasyllabic verses. These are situated within surreal animated gif collages composed of images materially appropriated from publicly accessible cloud storage services. The cognitive dissonance between the cultural fantasy of cloud storage and the hard facts of its environmental impact is bridged, in part, through the constant evocation of animals: A cumulus cloud weighs one hundred elephants. A USB fish swims through a cloud of cables. Four million cute cat pics are shared each day. A small print iteration of The Gathering Cloud shared through gift, trade, mail art, and small press economies further confuses boundaries between physical and digital, scarcity and waste.
cloud  infrastructure  colonialism  natural_history  net_art 
5 weeks ago
Emma McNally’s Fields, Charts, Soundings Cartographies – SOCKS
Emma McNally‘s work is an artistic cartography of imaginary nodes, network topologies, noise patterns, and musical notations. Traces and scatters shape an imaginary, poetic confluence of scientific advances in genetics, neuroscience, physics, molecular biology, computer systems, and sociology.

From a descriptive text on her Flickr profile:

“In Emma McNally’s work dense layers of carbon on paper create fields which offer themselves up to meaning: planes, vectors, topoi are overlaid, or coexist with swarms, shoals, marks laid out in rhythmic sequence.

The effect is of a continuous flux formed by a congruence of information systems: neural networks, contagion maps, sonar soundings, weather systems, water currents, charts plotting the migratory habits of deep-ocean mammals.

Focusing on rhythm as an expression of the dynamic of forming/unforming, McNally thinks this through graphically by highly charged percussive mark-making. Lines carry force, like the pulse of an ECG or a measure of seismic activity.

Ways in which the ‘matter’ or ‘noise’ of charged marks (unclaimed by frequencies or channels) combine, disperse and recombine into gatherings of static are explored. Passage is forged between differing rhythmic expressions: highly regularised, geometric systems of marks enter into configurations with chaotic swarms and fugitive marks.

Regularised, centralising and defining forces are disrupted, subverted and deterritorialised. The nomadic and fugitive are subject to forces that capture and formalise. Monolithic and viral tendencies mutually infiltrate.

Overall the attempt is made to maintain a state of flow, of passage between these forces where both are in danger of overrunning but are constantly overthrown – with the resulting mutations and proliferations played out.”
networks  information  systems  illustrations  infrastructure  electromagnetic_waves  lines 
5 weeks ago
Fast and Free: New York's Vision for Public Wi-Fi Everywhere - YouTube
Wi-Fi is essential to New York City's strategy to give every resident and business access to affordable, reliable, high-speed broadband service everywhere in the city. Globally, Wi-Fi is the workhorse of the Internet. Currently Wi-Fi carries 60% to 80% of all broadband data traffic on smartphones, laptops, and other mobile devices, far more than cellular networks do. But a new technology is threatening the effectiveness of Wi-Fi – and its ability to create connectivity for all.

On Monday, May 2, New York City sent a letter to the FCC highlighting its concerns about the potential harms that LTE-U will have on WiFi. Read the letter here: https://static.newamerica.org/attachm...

New York City's innovative uses of Wi-Fi to make Internet access available, fast, and affordable for all New Yorkers include:

CityBridge's LinkNYC franchise, which will replace payphones with at least 7,500 free, high-speed Wi-Fi kiosks across the five boroughs

New York City's Economic Development Corporation's RISE : NYC resiliency initiative, which will fund the installation of resilient Wi-Fi networks to serve small businesses in areas impacted by Hurricane Sandy

Support for free public Wi-Fi in Chelsea, Harlem, downtown Manhattan, and downtown Brooklyn, as well as City parks, libraries, and train stations

Free broadband service to more than 21,000 residents of public housing, beginning with the Queensbridge Houses, the largest public housing development in the country

All of that free connectivity, though, may be at risk due to a plan by many cellular carriers like Verizon and T-Mobile to begin offloading data traffic onto the unlicensed frequencies of our public airwaves – on which Wi-Fi depends – to augment the licensed spectrum they currently use. The interference could slow or even shut down public Wi-Fi systems, shrinking access, undermining digital equity, and scrapping hundreds of millions of dollars marked for improving the social, digital, and economic equity of NYC.

Join New America for a conversation on the suite of initiatives that keep city systems and residents connected, and the forces that threaten to block their visions for equitable governance.
wifi  wireless  connectivity  broadband  infrastructure  digital_equity 
5 weeks ago
Calling Bullshit — About
The world is awash in bullshit. Politicians are unconstrained by facts. Science is conducted by press release. So-called higher education often rewards bullshit over analytic thought. Startup culture has elevated bullshit to high art. Advertisers wink conspiratorially and invite us to join them in seeing through all the bullshit, then take advantage of our lowered guard to bombard us with second-order bullshit. The majority of administrative activity, whether in private business or the public sphere, often seems to be little more than a sophisticated exercise in the combinatorial reassembly of bullshit.

We're sick of it. It's time to do something, and as educators, one constructive thing we know how to do is to teach people. So, the aim of this course is to help students navigate the bullshit-rich modern environment by identifying bullshit, seeing through it, and combatting it with effective analysis and argument.

What do we mean, exactly, by the term bullshit? As a first approximation, bullshit is language intended to persuade by impressing and overwhelming a reader or listener, with a blatant disregard for truth and logical coherence.

While bullshit may reach its apogee in the political sphere, this isn't a course on political bullshit. Instead, we will focus on bullshit that comes clad in the trappings of scholarly discourse. Traditionally, such highbrow nonsense has come couched in big words and fancy rhetoric, but more and more we see it presented instead in the guise of big data and fancy algorithms — and these quantitative, statistical, and computational forms of bullshit are those that we will be addressing in the present course.

Of course an advertisement is trying to sell you something, but do you know whether the TED talk you watched last night is also bullshit — and if so, can you explain why? Can you see the problem with the latest New York Times or Washington Post article fawning over some startup's big data analytics? Can you tell when a clinical trial reported in the New England Journal or JAMA is trustworthy, and when it is just a veiled press release for some big pharma company?

Our aim in this course is to teach you how to think critically about the data and models that constitute evidence in the social and natural sciences.
epistemology  syllabus  big_data  pedagogy  methodology  statistics 
5 weeks ago
Covert Cartographics – BLDGBLOG
The collections include state-of-the-art graphic tools for producing maps and other measured cartographic products, as well as the maps themselves. Organized by the decade of its production—including batches from the 1940s, 1950s, and 1960s—“each map is a time capsule of that era’s international issues,” as Allison Meier points out.

“The 1940s include a 1942 map of German dialects,” Meier writes, “and a 1944 map of concentration camps in the country. The 1950s, with innovative photomechanical reproduction and precast lead letters, saw maps on the Korean War and railroad construction in Communist China. The 1960s are punctuated by the Cuban Missile Crisis and Vietnam War, while the 1970s, with increasing map automation, contain charts of the Soviet invasion of Afghanistan, and the Arab oil embargo.”

But it’s the mapping tools themselves that really interest me here.

On one level, these graphic devices are utterly mundane—triangular rulers, ten-point dividers, and interchangeable pen nibs, for example, any of which, on its own, would convey about as much magic as a ballpoint pen. ...there is something hugely compelling for me in glimpsing the actual devices through which a country’s global geopolitical influence was simultaneously mapped and strategized.
mapping  cartography  tools  instruments  methodology 
5 weeks ago
On Anthropolysis - e-flux Architecture - e-flux
As I and others have written, the reason we know that climate change is even happening at the nuanced degrees that we do is because of the measurement capacities of terrestrial, oceanic, atmospheric sensing meta-apparatuses that are at least representative of an industrial-technological system whose appetite is significantly responsible for the changes being measured in the first place.4 This correspondence may be the rule, not the exception, and for the Anthropogeny/Anthropolysis dynamic, a more crucial example is the relationship between oil and deep time. Finding oil was (and is) an impetus for the excavation of Earth, an ongoing project that turns up sedimentary layers of fossils and provides evidence of an old Earth and deep time. If not for the comprehensive disgorging of fossil fuels since the late nineteenth century, we would not have this Anthropocene, and if not for the economic incentive to look below and at rocks in this way, we may not have been confronted with the utter discontinuity between anthropometric time and planetary time. So, even if deep time is one of the ways that we learn to de-link social and phenomenological time from planetary time, its discovery was made possible by an industry that operated on nature with the local conceit that ecological time is subordinate to social time, and now we have the “accidental” fulfillment of that superstition by the Anthropocene's binding of social and geologic time. By pursuing the illusion as if it were true, we discovered, as a by-product, that it was false, but the by-product of doing so is that we made it true.
anthropocene  temporality  deep_time  climate_change  mining 
5 weeks ago
Our Graduates Are Rubes
The pampering of students as customers, the proliferation of faux "universities," grade inflation, and the power reversal between instructor and student are well-documented, much-lamented academic phenomena. These parts, however, make up a far more dangerous whole: a citizenry unprepared for its duties in the public sphere and mired in the confusion that comes from the indifferent and lazy grazing of cable, talk radio, and the web. Worse, citizens are no longer approaching political participation as a civic duty, but instead are engaging in relentless conflict on social media, taking offense at everything while believing anything.

College, in an earlier time, was supposed to be an uncomfortable experience because growth is always a challenge. It was where a student left behind the rote learning of childhood and accepted the anxiety, discomfort, and challenge of complexity that leads to deeper knowledge — hopefully, for a lifetime.

That, sadly, is no longer how higher education is viewed, either by colleges or by students. College today is a client-centered experience. Rather than disabuse students of their intellectual solipsism, the modern university reinforces it. Students can leave the campus without fully accepting that they’ve met anyone more intelligent than they are, either among their peers or their professors (insofar as they even bother to make that distinction)....

Faculty members both in the classroom and on social media report that incidents like that, in which students see themselves as faculty peers or take correction as an insult, are occurring more frequently. Unearned praise and hollow successes build a fragile arrogance in students that can lead them to lash out at the first teacher or employer who dispels that illusion, a habit that carries over into a resistance to believe anything inconvenient or challenging in adulthood.
expertise  higher_education  ego  pedagogy  advising 
5 weeks ago
Envisioning the Fully Integrated Library
Two trends are emerging in the development of academic libraries. On one hand, they are becoming more holistic learning environments, supporting a variety of needs. Students receive help finding information, but they also find help with writing, course tutoring, expertise with specialized technologies, and uniquely designed study spaces. Depending on the circumstances of each institution, the library may become a "lab outside the classroom," a one-stop learning facility, or a student version of faculty centers for teaching and learning.

On the other hand, libraries are becoming sophisticated research centers, supporting the manipulation, analysis, creation, and construction of knowledge. We see this in such diverse initiatives as data curation and visualization, digital humanities, and scholarly communication.

Concerns about library change frequently focus on the conflict that surrounds withdrawing books to make room for other services. These debates miss the point entirely. What is important is not whether the library removes books, but whether, and to what degree, library resources and services are integrated into teaching, learning, and research. Within this context, books, databases, library instruction, and the reference desk all deserve scrutiny....

we see it when the teaching of more-complex information skills is scattered across the curriculum. Or when a librarian participates in a course as an embedded participant. Or when assignments are created in collaboration with librarians in a way that incorporates library resources, technologies, and information-skill development.
pedagogy  research  libraries  academic_libraries 
5 weeks ago
Virus, Coal, and Seed: Subcutaneous Life in the Polar North - Los Angeles Review of Books
Anthrax is not the only ghost haunting the Arctic.
In the Arctic Circle, life seems to keep its own time. If you travel across the Barents Sea from Yamalo-Nenets, you’ll arrive at a Norwegian archipelago called Svalbard. It is an otherworldly place, inhospitable to most life yet starkly and sublimely beautiful. Roughly 2,600 intrepid people, most of them adult men, live here. But you can’t die in Svalbard. No, inhabitants are not immortal. Rather, their life cycles are abridged in mundane ways: Norwegian officials forcibly evict the sick, disabled, and elderly, shipping them back to the Norwegian mainland to end their days. You can’t be born in Svalbard either. The governor orders women in their third trimester to leave. Svalbard is not, as citizens call it, a “life cycle community” — no concessions are made for birth and death, and only able-bodied working adults are welcome....

The link between anthrax in Yamalo-Nenets and life in Svalbard is complicated, but key to understanding both is the climate and the ways in which arctic cold transfigures that which is old....

Platåberget is full of vaults and faults, graves and caves. The mountain is now a place to unearth coal and bury coal miners, to immortalize seeds and resurrect viruses. On Platåberget, viruses that lived and died in the past have lately erupted into the present; ruins of coal mines are persistently in the present; and seeds in the vault are artifacts of the present that are now buried for future disinterral. At the ends of the earth, time seems out of joint. Here in the polar north, viruses, coal, and seeds are geopolitical and climatological relics, telling tales of coal extraction, contested land claims, and crumbling empires. And, in the Arctic, geopolitics is decidedly climatological — punctuated by global war, Cold War, and global warming....

Longyearbyen is named for John Munro Longyear, an American capitalist whose name itself suggests a kind of temporal slackening. Longyear arrived in Svalbard and espied riches in the plentiful Triassic coal seams that marked the land, exposed by glacial gashes. Coal is, of course, dead organic matter. All of that shiny black sediment is the detritus of deciduous forests and puzzlegrass that flourished in a balmier Svalbard 65 to 23 million years ago, their dead tissue inspissated by heat and pressure until latent energy condensed into something combustible....

Even though coal mining in Longyearbyen is largely shuttered, its infrastructure remains scattered across Longyearbyen’s landscape [Figure 2]. The dark skeletal remains of coal tipples, lift systems, and aerial tramway conveyors litter the surrounding mountains, looking much the way Store Norske left them in 1958 — they resist decay because the temperature is too cold for liquid water to rot wood. They are ruins, and will most likely remain so indefinitely....

What will become of life, then, as the climate warms and these glaciers recede — as ecological catastrophe joins geopolitical catastrophe to make this and every other place precarious and unlivable? In 1984, agricultural researchers from a Norwegian university decided to conduct what they termed a “hundred-year experiment.” They gathered a small collection of seeds and stored them underground in Mine 3 on a Platåberget pass just past the Longyearbyen airport. The interior of the coal mine maintains an ambient temperature between -2.5 and -3.5 degrees Celsius, far enough below freezing that, the researchers suspected, seeds would be naturally preserved. Checking one year to the next, the researchers confirmed the seeds’ suspended animation: none have germinated. The Norwegian scientists made a proposal to the UN’s Food and Agriculture Organization (FAO): since there was plenty more room in this mine shaft, now repurposed as a naturally occurring cryobank, other countries might want to pay a small fee in order to archive their own seeds. The UN turned them down, on the grounds that intellectual property disputes might arise if one country stored a significant amount of its national germplasm in another nation’s territory. The mine shuttered in 1996 when its thin coal seam was exhausted. The seeds stored in 1984 are still there....

But in the wake of Hurricane Katrina, Fowler began to wonder whether agricultural diversity could ever truly be secure if cities were so vulnerable to geopolitical and ecological disaster. He and Shands realized that the gene banks were located in places where the best technological infrastructure could be quickly dismantled by political strife or natural disaster — Nigeria, Colombia, Nairobi, Kenya, Nepal. It was then that Fowler recalled the Norwegian scientists whose failed proposal had crossed his desk years earlier at the FAO. Back then, he had nixed the proposal, but now he thought differently: a vault dug into the permafrost beneath Platåberget seemed as safe a place as any, and perhaps safer than most.
The Svalbard Global Seed Vault broke ground soon thereafter....

Gesturing downhill, he points out the air traffic control tower for the Longyearbyen airport, and explains that this location allows the air traffic controllers to keep an eye on the vault and sound an alarm if they notice an intruder.
A thin cement wedge piercing the frozen mountainside at a steep incline, the vault’s Brutalist exterior suggests how deeply it is lodged beneath the earth [Figure 4]. Above the doors and along the roof is an installation of prisms and fiber-optic cables that reflect the midnight sun in the summer and glitter like the aurora borealis during the polar night. It looks like a post-apocalyptic bunker, which, I suppose, is exactly what it is....

The doors slam heavily behind us, and we face a long hallway, really a tube of corrugated metal sloping downward into the mountain. Everything is duplicated: ventilation, backup generators, and pumps. There’s no use for one water-pump, let alone two, in a hole beneath permafrost, but the building’s designers have prepared for a future when the permafrost has thawed. Engineers have planned ahead in other ways as well. For instance, they surveyed the mountain to ensure that the vault is nowhere near a coal seam. Their reasoning was that a century or more from now, when the vault is forgotten, miners may return to this mountain seeking coal seams, only to inadvertently drill into the vault. The engineers also accounted for a 70-meter sea level rise, which is an estimate of what would happen if all the glaciers in the world melted. They compounded that scenario with a tsunami, and then built the vault five stories above the predicted waterline. Engineers calculate that, given the current rate of climate change, the vault would remain below freezing even if the electricity went out for the next two centuries. How long did you build it to last, I ask? Fowler: “Essentially forever.”...

the room into which he escorts me next is wondrous indeed: a stark and cavernous antechamber of raw limestone hollowed into vaulted ceilings and washed in white reinforced concrete, rock rimed in frost. “I really enjoy being here,” Fowler murmurs, and his voice reverberates. The wall opposite the doors through which we entered is gently concave; to our left, two doors are offset, and a third door is on the other side of the parabolic bare wall. Fowler explains that they avoided putting any of the interior chambers directly opposite the door leading to the hallway so that “if someone were to fire a missile down here … it wouldn’t hit the place where the seeds are.” So, too, the wall is concave so that shockwaves — from a ballistic missile or a plane crashing into the mountain, for example — can reflect back toward the entrance instead of propagating deeper into the mountain and injuring the seeds....

Yet here is abundant life: 860,000 different varieties of crops, and 120,000 different strains of rice alone. Seeds are sealed in triple-ply, puncture-resistant vacuum packaging and then loaded into plastic crates, which are stacked on shelves. Looking inside one box, I find ampules of squash and bags of anise. Every major crop in the world is in this room — not just wheat, oats, barley, potatoes, lentils, soybeans, and alfalfa, but also heirloom seeds and forgotten landraces. Boxfuls of foraged grasses are stored cheek-by-jowl alongside sorghum, foxtail millet, bur clover, purple bush-beans, pigeon peas, Kentucky bluegrass, and creeping beggarweed. Every country in the world is represented, as are several countries that no longer exist. Colombia, North Korea, Russia, Taiwan, Ukraine, Switzerland, Nigeria, Germany, Israel, Syria, Zimbabwe, Tajikistan, and Armenia share shelf-space in this pastoral League of Nations. With over 90 million seeds deposited in the bank, India represents the largest crop diversity, nearly three times as much as Mexico, the next most prolific contributor.
On February 26, 2008, the day the seed vault opened, Pakistan and Kenya were first in line to store their seeds. The previous year, the disputed election of Mwai Kibaki in Kenya triggered ethnic violence against Kikuyus. Karachi had catastrophically flooded and was the scene of a bloody suicide bombing, and Benazir Bhutto was assassinated in Rawalpindi. One can speculate that, for Kenya and Pakistan, a cache in the Seed Vault is a way to refuse political and climatological vulnerability — to forecast a future that might, somehow, sustain life....

One shelf of the vault is half empty. Four years into the civil war and humanitarian crisis in Syria, violence barreling northward toward Aleppo jeopardized the Headquarters of the International Center for Agricultural Research in the Dry Areas (ICARDA). Hundreds of thousands of seeds were banked here in Svalbard, including some of the earliest strains of Levantine wheat and durum, which are more than 10 thousand years old. The Syrian gene bank, now relocated to Morocco and Lebanon, recently requested 30 thousand samples from its original … [more]
archives  climate_change  biology  svalbard  mining  death  temporality  ruins  infrastructure 
5 weeks ago
Art In the Age of Obsolescence
There is, however, a rich tradition at The Museum of Modern Art of offsetting this trend through collaborations with academics and researchers. Through this, we are often able to build small-scale research projects that give students incredible real-world experience — and afford museum conservators the sort of research we wish we had more time for. About a year ago, I realized that Lovers was a perfect case study for a course I was teaching at New York University, called Handling Complex Media. The artwork is composed of a veritable cocktail of technologies and media formats: 35mm slides, analog video, robotics, software — you name it. So we pulled Lovers from storage, along with its two-inch-thick folder of documentation, and began the work of understanding just what we had....

Out came LaserDiscs, 35mm slides, speakers, wires, accessories, slide projectors, an eight-foot-tall metal tower containing video projectors with robotics to control which direction they are pointing, two flight cases full of behind-the-scenes control hardware and software, and a hefty folder containing documentation, manuals, installation specifications, and correspondence with the artist and his studio. Our art handlers carefully delivered all of this material to a small viewing room at MoMA’s art storage facility in Queens. Although Lovers calls for a 32 x 32' space for proper installation, this was the best we could do for a basic assessment. After two days of combing through manuals and carefully wiring the various components to one another, we were ready to power on the artwork for the first time in decades....

The class was tasked with understanding and documenting the following: What is the anatomy of the artwork? How does it work? What condition are its various components in? What components are at risk of failure? Where can we source backups and/or replacements for the exact components used by the artist, and if exact replacements are not available, which components have significant aesthetic impact on the work beyond mere behind-the-scenes utility?...

The original LCD video projectors and the behind-the-scenes control hardware needed to be replaced due to their instability and rarity. This meant a full-on re-implementation of the original control and timing hardware and software would be necessary. Additionally, the NYU MIAP students’ research had revealed several gaps in the installation documentation. There were many unknowns regarding the parameters for successful installation, questions we knew we could only answer by working with Shiro and Yoko Takatani, Kyoto-based members of the Dumb Type artist collective and performance group, of which Furuhashi had been a pivotal member. Due to his battle with AIDS, Furuhashi was frequently hospitalized during the creation of Lovers, and Shiro Takatani was responsible for much of the artwork’s technical execution. His input would be critical in our efforts....

Our aim was to replace the at-risk components, translating the work to more stable technologies, while prioritizing two essential tenets of conservation — minimal intervention and reversibility....

At first glance, the hardware connecting the PCs to the robotics was completely incomprehensible. How did it work? There was only one way to find out. We needed to use analytical and diagnostic tools to reverse engineer exactly what the PCs were doing....

Now we had the score, and we knew how to perform it, but that still was not enough; all of this documentation was very scientific and precise, but it didn’t tell us how the work felt. Furthermore, there was still the PC that contained no plainly readable metadata, only an impenetrable binary file. How could we reverse engineer the robotics and the behavior of Furuhashi’s slide projectors and interactive video? Observation and careful documentation were the answer. I proceeded to spend hours upon hours running the original system, carefully watching and listening to the robotics, while also capturing video and audio documentation. In the end, it was this direct observation that allowed us to reverse engineer the basic algorithm....

Once we had completely documented all of Shiro’s knowledge regarding alignment, lighting, and sound, he told us it was time to move on to the refinement and correction of the motion and timing of the robotics. With puzzled looks on our faces, we reminded him of our quantitative proof that we had reproduced the timing and motion of the original control software within a completely imperceptible .0002-second margin of error. Smiling, and ever patient, Takatani, who had stewarded this work for years, explained that the timing of Lovers had been reviewed and refined nearly every time the work was installed. He suggested, therefore, that although we had perfectly reproduced the behavior, timing, and motion of the final snapshot of the artwork as it existed when it was collected, it was now time to continue its active life, and carefully refine the motion as Furuhashi would have wished. ...

Just as the original equipment that controlled Lovers had aged, obsolesced, and become unusable, so will our newly restored solutions. The field of conservation is continually evolving, not merely technologically, but philosophically and ethically. The day may come when our work here seems somehow wrong or misguided, so it is our job as responsible conservators to ensure that we produce the requisite documentation, ensuring that our work is truly reversible.
archives  preservation  digital_preservation  digital_art  emulation  ontology  pedagogy  reverse_engineering  materiality  media_archaeology  methodology  exhibition 
5 weeks ago
Learning to Teach/Teaching to Learn II - Google Docs
With the Spring semester approaching fast, the second edition of the Learning to Teach mini-conference returns January 15, 2017, in New York City. For many educators, January is the perfect time to review the last year, write syllabi, and prepare for new classes. Organized by the School for Poetic Computation in partnership with the Processing Foundation, this day-long conference is an open forum for educators teaching computer programming in creative and artistic contexts. The morning session is a series of talks from experienced educators on approaches to teaching effectively and strategies for assessment and feedback. In the afternoon, participants will be invited to workshop sessions to discuss curriculum development and environments and tools for learning. Together, we will explore the intersection of pedagogy and creative practice, and provide an opportunity to share ideas for another year of teaching ahead.

video: https://www.youtube.com/watch?v=D7-m6NJ90RE
pedagogy  teaching 
5 weeks ago
Iron Mountain's Butler County mine expands to hold data secure | Pittsburgh Post-Gazette
Iron Mountain can show off plenty of these rooms across more than 200 acres of underground space carved into an abandoned limestone mine in Butler County. The facility — famous for its geology and for holding some of the most precious pieces of paper and film in America — lately has been installing large racks of blinking computer servers that stretch as far as the eye can see.

The Boston-based information management company that owns the mine has been advancing deeper into the shafts to serve health care and insurance businesses, financial institutions and tech companies looking for the safest place to store their irreplaceable digital information.

By this spring, Iron Mountain expects another 11 acres of the former mine to be in use by clients storing digital data.

Iron Mountain portrays its mine as optimal for businesses that want the highest level of security at a reasonable price. The security comes in the form of armed guards and metal detectors at the entrance through which all employees and visitors pass....

It also comes with the 20-foot-thick seam of limestone — bound by layers of impermeable shale rock — that could largely withstand any explosion. (Slight imprints from dynamite blasting can still be seen on the walls.)

And security is found in digital defenses: The facility’s computer system is entirely disconnected from the internet, and its computers won’t allow anyone to plug in an external hard drive.

The company touts its client base of highly regulated and sensitive companies that have bought into those assurances. In fact, the federal government uses a significant part of the mine, employing most of the 2,000 workers who enter and leave the facility each day.

When Iron Mountain purchased the data center in 1998, much of the storage was used for paper and film — patents, motion pictures, Social Security applications filed by every resident of the United States, pension records, boxes of business records....
Mr. Hill pointed to a copper/lead door installed for a large insurance company that was concerned about electromagnetic pulses, which can be generated by terrorists or even by the natural environment and could cripple equipment. (In addition to the door, a study conducted later at the mine proved that limestone layers naturally shield such waves.)
preservation  archives  mining  underground 
5 weeks ago
Wayne Barrar photographs renovated mines and industrial sites in his series, “Expanding Subterra.”
Wayne Barrar had long been photographing mines when he started to wonder what became of the mines after they were depleted. As he found while creating his series, “Expanding Subterra,” many are well suited to be transformed into other types of spaces, including offices, libraries, and even paintball fields.

“The major benefits of these sites are their security and their stable, surprisingly dry and mild environment. They are cheap forms of industrial architecture,” he said via email.
mines  storage  photographs  underground 
5 weeks ago
The Digital Life: DNA as Data Storage
On this episode of The Digital Life podcast we discuss how bio-inspired technology is beginning to intersect with information technology in big ways. With the exponential increase of digital data, we face an ongoing problem of information storage. Today most digital information is stored on media that will expire relatively quickly, lasting a few decades at most. Because of this, we require new methods for long-term data storage, and biotech might just have the answer. DNA could be the storage medium of the future: It can last thousands, even potentially tens of thousands of years. And the tech industry has taken notice. For instance, last month Microsoft agreed to purchase millions of strands of synthetic DNA from San Francisco-based Twist Bioscience to encode digital data. Of course we may be years away from a commercial DNA storage product, but the potential for a revolutionary, even disaster-proof medium is there.
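The core idea of encoding digital data in DNA can be sketched in a few lines: since there are four nucleotide bases, each base can carry two bits. This is a toy illustration only — real systems such as the Microsoft/Twist work mentioned above layer error correction, avoid problematic base sequences, and split data across many short synthesized strands.

```python
# Toy sketch of DNA data storage: map each pair of bits to one of the
# four nucleotide bases (A, C, G, T). Illustrative only.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Encode bytes as a string of nucleotide bases (2 bits per base)."""
    bitstring = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bitstring[i:i + 2]]
                   for i in range(0, len(bitstring), 2))

def decode(strand: str) -> bytes:
    """Recover the original bytes from a base string."""
    bitstring = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bitstring[i:i + 8], 2)
                 for i in range(0, len(bitstring), 8))

strand = encode(b"archive")
assert decode(strand) == b"archive"  # lossless round trip
```

At two bits per base, one byte becomes four bases; the round-trip assertion shows why the scheme is lossless in principle, leaving durability and synthesis cost as the real engineering problems.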
storage  archives  DNA  biomedia  preservation 
5 weeks ago
‘Smart Cities’ Will Know Everything About You - WSJ
If the Internet age has taught us anything, it’s that where there is information, there is money to be made. With so much personal information available and countless ways to use it, businesses and authorities will be faced with a number of ethical questions.

In a fully “smart” city, every movement an individual makes can be tracked. The data will reveal where she works, how she commutes, her shopping habits, places she visits and her proximity to other people. You could argue that this sort of tracking already exists via various apps and on social-media platforms, or is held by public-transport companies and e-commerce sites. The difference is that with a smart city this data will be centralized and easy to access. Given the value of this data, it’s conceivable that municipalities or private businesses that pay to create a smart city will seek to recoup their expenses by selling it.

By analyzing this information using data-science techniques, a company could learn not only the day-to-day routine of an individual but also his preferences, behavior and emotional state. Private companies could know more about people than they know about themselves....

What degree of targeting is too specific and violates privacy? Should businesses limit the types of goods or services they offer to certain individuals? Is it ethical for data—on an employee’s eating habits, for instance—to be sold to employers or to insurance companies to help them assess claims? Do individuals own their own personal data once it enters the smart-city system?

With or without stringent controlling legislation, businesses in a smart city will need to craft their own policies and procedures regarding the use of data. A large-scale misuse of personal data could provoke a consumer backlash that could cripple a company’s reputation and lead to monster lawsuits. An additional problem is that businesses won’t know which individuals might welcome the convenience of targeted advertising and which will find it creepy—although data science could solve this equation eventually by predicting where each individual’s privacy line is.
smart_cities  big_data  privacy 
5 weeks ago
India’s Digital ID Rollout Collides With Rickety Reality - WSJ
India’s new digital identification system, years in the making and now being put into widespread use, has yet to deliver the new era of modern efficiency it promised... The system, which relies on fingerprints and eye scans to eventually provide IDs to all 1.25 billion Indians, is also expected to improve the distribution of state food and fuel rations and eventually facilitate daily needs such as banking and buying train tickets... The government began building the system, called Aadhaar, or “foundation,” with great fanfare in 2009, led by a team of pioneering technology entrepreneurs. Since then, almost 90% of India’s population has been enrolled in what is now the world’s largest biometric data set... But the technology is colliding with the rickety reality of India, where many people live off the grid or have fingerprints compromised by manual labor or age....

An Aadhaar ID is intended to be a great convenience, replacing the multitude of paperwork required by banks, merchants and government agencies. The benefits are only just beginning, backers say, as the biometric IDs are linked to programs and services.

But in rural areas, home to hundreds of millions of impoverished Indians dependent on subsidies, the impact of technical disruptions has already been evident.
infrastructure  India  identity  privacy  informal_infrastructure  authentication 
5 weeks ago
The Book As
First, we have the alphabet. Then scrolls. Then the codex. The codex has endured through history since around the 2nd century A.D. By themselves, these codices thrived in areas like religion. Then, around 1450, the printing press came along. This new technology changed the codex forever. Fast-forward to the present day, and the printing press seems tedious, even archaic. So how has the book changed from its inception to our ever-changing digital age? Take a look through each section in the table of contents to see how certain authors/artists have altered the book in incredibly varied ways to complement the diversity of the digital age.
books  book_art 
6 weeks ago
Empathy as Faux Ethics - EPIC
The word “empathy” comes from the German Einfühlung, meaning “in feeling,” and the Greek empatheia, meaning “a passion or state of emotion,” adopted from em, an offshoot of en, or “in,” and pathos, “feeling.” Pathos was originally used in art theory to indicate the idea that appreciation for a piece of art depends on the viewer projecting themselves into the piece.

The meaning of empathy has shifted in design discourse: designers project themselves into the other’s perspective not just to appreciate their views, but also to turn that understanding into design interventions. There is a productive mode to empathy that sets an ethical standard for designers to act on their knowledge—to discover and solve the other’s problems.

This model has several dangers. It sets up a framework in which empathy becomes a way to further separate the ones who design (professionally) from those who do not (I am deliberately avoiding labels such as “designers” and “users”). It assumes that “The Designer” possesses a unique ability to access the psyche of “The Other.” It’s no wonder that design is so often viewed as a self-aggrandizing profession. The model also assumes that the insight acquired by empathizing gives The Designer sufficient understanding to define and resolve The Other’s problems—even the world’s problems....

Despite the limitations of empathy when it is used to distance designers from the subjects of design (we should not forget design’s ability to subjugate), there are applicable use cases, especially around health care and social justice issues. Empathy in commercial design, however, is suspect. Empathy often takes the form of a subtler way of othering in nonprofit and government contexts, but I am picking on commercial design here simply because the ability to care for something not immediately profitable is so foreign to most businesses....

Empathy is applied retroactively to fit a business-centric product into a human-centric frame. It becomes an ethical practice designers use to feel better about the potentiality of making superfluous things that no one actually needs. But no matter how one justifies it, empathy for commercial ends is simply marketing. Does anyone have a real need to be sold things? Sustainable designs will never be reached by empathy alone.

Back to our coffee shop. Here’s a standard solution: Keep the bathroom door locked and require people to ask for a key available only to paying customers. This solves a discrete problem for the coffee shop. After all, how can one business possibly take on an issue like inequality or homelessness? But it does more; it actively ignores the larger, systemic responsibilities the business has to the community. By empathizing with one group of people, we necessarily exclude another...

The crux of human-centered design is that human needs should be considered before business and technological needs. If a design does not meet a defined human need, then its business viability and technical feasibility don’t matter. This human-business-technology model ignores other components of design, such as sustainability, ethics, and egalitarianism. One might argue that these considerations can be wrapped up in the “human” part, but in practice, surface level understandings of empathy tend to dominate over broad definitions that might include more politically infused ideas.

This tendency has to do with emphasizing the individual over the collective, thus reinforcing deep-seated notions of anthropocentrism that run through the history of western epistemology. Empathy does not consider ecological sustainability because human-centricity forecloses ecological thought, as argued by actor-network theory, deep ecology, or, if you want some really fun reading, anti-civilization and anarcho-primitivism.
empathy  design  teaching  pedagogy  ethics  design_process 
6 weeks ago
I finally stepped out of my progressive bubble—and now I understand why people hate “the liberal elite”
For the first time in my life, I was on the outside of the so-called liberal bubble, looking in. And what I saw was not pretty. I watched as many of my highly educated friends and contacts addressed those who disagreed with them with contempt and arrogance, and an offensive air of intellectual superiority.
It was surprising and frustrating to find myself lumped in with political parties and ideologies I do not support. But it also provided some insight into why many liberals seem incapable of talking with those who hold different opinions. (This is, broadly speaking, not just a liberal problem.) In so much of what I read, there was a tone of odious condescension, the idea that we “no” voters were perhaps too simpleminded or too uninformed to really grasp the situation....

I suspect that the sudden popularity of the term populism has led to a similar lack of respect and curiosity for opinions we disapprove of. It may even betray a fundamental belief, inadvertent or explicit, that the populus is somehow lesser—less critical, less acute, and easier to sway.
But it is not. Liberals may be heavily represented in the media, the centers of culture (popular, and otherwise), and in academia. But unless we are able to start learning how to talk to people unlike us, we’ll likely keep losing. It is not the only reason for the current political polarization—but it is one we can all work to address.
politics  elitism  populism 
6 weeks ago
Your brain does not process information and it is not a computer | Aeon Essays
Our shoddy thinking about the brain has deep historical roots, but the invention of computers in the 1940s got us especially confused. For more than half a century now, psychologists, linguists, neuroscientists and other experts on human behaviour have been asserting that the human brain works like a computer....

here is what we are not born with: information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers – design elements that allow digital computers to behave somewhat intelligently. Not only are we not born with such things, we also don’t develop them – ever.

We don’t store words or the rules that tell us how to manipulate them. We don’t create representations of visual stimuli, store them in a short-term memory buffer, and then transfer the representation into a long-term memory device. We don’t retrieve information or images or words from memory registers. Computers do all of these things, but organisms do not.

Computers, quite literally, process information – numbers, letters, words, formulas, images. The information first has to be encoded into a format computers can use, which means patterns of ones and zeroes (‘bits’) organised into small chunks (‘bytes’)....
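The encoding step the essay describes — characters becoming bytes, bytes becoming patterns of ones and zeroes — can be made concrete in a few lines of Python (a minimal illustration of the author's point, not from the essay):

```python
# A word is encoded into bytes, and each byte is a pattern of
# eight bits ('bits' organised into 'bytes', as the essay puts it).
word = "brain"
data = word.encode("utf-8")              # characters -> bytes
bits = [f"{byte:08b}" for byte in data]  # each byte -> eight ones and zeroes

print(list(data))   # the byte values for each character
print(bits)         # the same bytes as bit patterns
```

This literal, inspectable encoding is precisely what the essay argues brains do not have: there is no place in a nervous system where "brain" sits as a retrievable pattern of symbols.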

computers really do operate on symbolic representations of the world. They really store and retrieve. They really process. They really have physical memories. They really are guided in everything they do, without exception, by algorithms.

Humans, on the other hand, do not – never did, never will. Given this reality, why do so many scientists talk about our mental life as if we were computers?...

In his book In Our Own Image (2015), the artificial intelligence expert George Zarkadakis describes six different metaphors people have employed over the past 2,000 years to try to explain human intelligence.

In the earliest one, eventually preserved in the Bible, humans were formed from clay or dirt, which an intelligent god then infused with its spirit. That spirit ‘explained’ our intelligence – grammatically, at least.

The invention of hydraulic engineering in the 3rd century BCE led to the popularity of a hydraulic model of human intelligence, the idea that the flow of different fluids in the body – the ‘humours’ – accounted for both our physical and mental functioning. The hydraulic metaphor persisted for more than 1,600 years, handicapping medical practice all the while.

By the 1500s, automata powered by springs and gears had been devised, eventually inspiring leading thinkers such as René Descartes to assert that humans are complex machines. In the 1600s, the British philosopher Thomas Hobbes suggested that thinking arose from small mechanical motions in the brain. By the 1700s, discoveries about electricity and chemistry led to new theories of human intelligence – again, largely metaphorical in nature. In the mid-1800s, inspired by recent advances in communications, the German physicist Hermann von Helmholtz compared the brain to a telegraph.

Each metaphor reflected the most advanced thinking of the era that spawned it. Predictably, just a few years after the dawn of computer technology in the 1940s, the brain was said to operate like a computer, with the role of physical hardware played by the brain itself and our thoughts serving as software. The landmark event that launched what is now broadly called ‘cognitive science’ was the publication of Language and Communication (1951) by the psychologist George Miller. Miller proposed that the mental world could be studied rigorously using concepts from information theory, computation and linguistics.

This kind of thinking was taken to its ultimate expression in the short book The Computer and the Brain (1958), in which the mathematician John von Neumann stated flatly that the function of the human nervous system is ‘prima facie digital’. Although he acknowledged that little was actually known about the role the brain played in human reasoning and memory, he drew parallel after parallel between the components of the computing machines of the day and the components of the human brain.

Propelled by subsequent advances in both computer technology and brain research, an ambitious multidisciplinary effort to understand human intelligence gradually developed, firmly rooted in the idea that humans are, like computers, information processors. This effort now involves thousands of researchers, consumes billions of dollars in funding, and has generated a vast literature consisting of both technical and mainstream articles and books. Ray Kurzweil’s book How to Create a Mind: The Secret of Human Thought Revealed (2013) exemplifies this perspective, speculating about the ‘algorithms’ of the brain, how the brain ‘processes data’, and even how it superficially resembles integrated circuits in its structure.


The information processing (IP) metaphor of human intelligence now dominates human thinking, both on the street and in the sciences. ...But the IP metaphor is, after all, just another metaphor – a story we tell to make sense of something we don’t actually understand. And like all the metaphors that preceded it, it will certainly be cast aside at some point – either replaced by another metaphor or, in the end, replaced by actual knowledge....

The faulty logic of the IP metaphor is easy enough to state. It is based on a faulty syllogism – one with two reasonable premises and a faulty conclusion. Reasonable premise #1: all computers are capable of behaving intelligently. Reasonable premise #2: all computers are information processors. Faulty conclusion: all entities that are capable of behaving intelligently are information processors.

Setting aside the formal language, the idea that humans must be information processors just because computers are information processors is just plain silly...

The idea, advanced by several scientists, that specific memories are somehow stored in individual neurons is preposterous; if anything, that assertion just pushes the problem of memory to an even more challenging level: how and where, after all, is the memory stored in the cell?...

A few cognitive scientists – notably Anthony Chemero of the University of Cincinnati, the author of Radical Embodied Cognitive Science (2009) – now completely reject the view that the human brain works like a computer. The mainstream view is that we, like computers, make sense of the world by performing computations on mental representations of it, but Chemero and others describe another way of understanding intelligent behaviour – as a direct interaction between organisms and their world....

the mainstream cognitive sciences continue to wallow uncritically in the IP metaphor, and some of the world’s most influential thinkers have made grand predictions about humanity’s future that depend on the validity of the metaphor.

One prediction – made by the futurist Kurzweil, the physicist Stephen Hawking and the neuroscientist Randal Koene, among others – is that, because human consciousness is supposedly like computer software, it will soon be possible to download human minds to a computer, in the circuits of which we will become immensely powerful intellectually and, quite possibly, immortal. ...

To understand even the basics of how the brain maintains the human intellect, we might need to know not just the current state of all 86 billion neurons and their 100 trillion interconnections, not just the varying strengths with which they are connected, and not just the states of more than 1,000 proteins that exist at each connection point, but how the moment-to-moment activity of the brain contributes to the integrity of the system. Add to this the uniqueness of each brain, brought about in part because of the uniqueness of each person’s life history...
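A rough tally of the quantities the passage enumerates shows the scale any complete description of a brain's state would have to capture. A sketch using only the figures given in the text:

```python
# Figures quoted in the excerpt: 86 billion neurons, 100 trillion
# interconnections, more than 1,000 proteins at each connection point.
neurons = 86_000_000_000
connections = 100_000_000_000_000
proteins_per_connection = 1_000

# Lower bound on protein states alone, before connection strengths,
# moment-to-moment activity, or individual life history are counted.
protein_states = connections * proteins_per_connection
print(f"{protein_states:.0e}")  # 1e+17
```

Even this lower bound, 10^17 protein states, is orders of magnitude beyond what any current recording or simulation technology can capture.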

Meanwhile, vast sums of money are being raised for brain research, based in some cases on faulty ideas and promises that cannot be kept. The most blatant instance of neuroscience gone awry, documented recently in a report in Scientific American, concerns the $1.3 billion Human Brain Project launched by the European Union in 2013. Convinced by the charismatic Henry Markram that he could create a simulation of the entire human brain on a supercomputer by the year 2023, and that such a model would revolutionise the treatment of Alzheimer’s disease and other disorders, EU officials funded his project with virtually no restrictions.
cognitive_science  brains  computers 
6 weeks ago
White Spots
Do you ever desire to escape from the information flows surrounding us?
The White Spots App visualizes the invisible electromagnetic cloud that we live in and offers a way out.
Use the App with Google Cardboard to travel from the online to the offline world in Virtual Reality, or use the White Spots world map to travel to places off the grid near you.

In VR mode, the network scanner shows the invisible digital signals around you in real time and takes you on a journey to the end of the Internet in immersive 360° stories.
mapping  electromagnetic_waves  telecommunications  escape  connectivity  making_visible_invisible  data_visualization 
6 weeks ago
Memory of Mankind: All of Human Knowledge Buried in a Salt Mine - The Atlantic
Martin Kunze wants to gather a snapshot of all of human knowledge onto plates and bury it away in the world’s oldest salt mine.

In Hallstatt, Austria, a picturesque village nestled into a lake-peppered region called Salzkammergut, Kunze has spent the past four years engraving images and text onto hand-sized clay squares. A ceramicist by trade, he believes the durability of the materials he plies gives them an as-yet unmatched ability to store information. Ceramic is impervious to water, chemicals, and radiation; it’s emboldened by fire. Tablets of Sumerian cuneiform dating from earlier than 3000 B.C.E. are still around today.

“The only thing that can threaten this kind of data carrier is a hammer,” Kunze says.

So far, he has created around 500 squares, which he allows anyone to design for a small donation. Many preserve memories of the lives or work of people involved in the project. Around 150 of the tablets showcase items from collections in Vienna’s museums of Natural History and Art History. Some local companies have been immortalized. One researcher’s CV now lies in the vault.

But Kunze aims to expand the project, to copy research, books, and newspaper editorials from around the world—along with instructions for the languages needed to read them. For this, the clay squares he’s currently using would take up far more space than could be set aside for such an audacious undertaking. So Kunze also has conceived of a much thinner medium: He will laser-print a microscopic font onto 1-mm-thick ceramic sheets, encased in wafer-thin layers of glass. One 20 cm piece of this microfilm can store 5 million characters; whole libraries of information—readable with a 10x-magnifying lens—could be slotted next to each other and hardly take up any space.
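A back-of-the-envelope check puts the claimed density in perspective. Assuming the "20 cm piece" is a 20 cm × 20 cm sheet (the article does not specify the shape) and comparing against a densely printed page of roughly 3,000 characters on about 600 cm²:

```python
# Density of the ceramic microfilm, under the assumed sheet geometry.
chars_per_sheet = 5_000_000
sheet_area_cm2 = 20 * 20          # assumption: a 20 cm x 20 cm square
chars_per_cm2 = chars_per_sheet / sheet_area_cm2
print(chars_per_cm2)              # 12500.0 characters per cm^2

# Rough comparison with dense print (~3,000 chars on ~600 cm^2).
print_density = 3_000 / 600       # ~5 characters per cm^2
print(chars_per_cm2 / print_density)  # ~2500x denser than print
```

Under these assumed figures, a single microfilm sheet holds on the order of a few paperback books — which is why Kunze expects whole libraries to fit in the vault.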

The goal of the project, which he calls the Memory of Mankind, is to build up a complete, unbiased picture of modern societies. The sheets will be stored along with the larger tablets in a vault 2 km inside Hallstatt’s still-active salt mine. If all goes according to plan, the vault will naturally seal over the next few decades, ready for a curious future generation to open whenever it’s deemed necessary.

To Kunze, this peculiar ambition is more than a courtesy to future generations. He believes the age of digital information has lulled people into a false sense that memories are forever preserved. If today’s digital archives disappear—or, in Kunze’s view, when they do—he wants to make sure there’s a real, physical record to mark our era’s place in history....

Much of this information goes into digital storage—ranging from servers on personal computers to colossal data centers, like the NSA’s facility in Utah.... But this method of storage has inherent problems. Digital space is finite and expensive. Digitally stored data can become corrupted and decay as electrical charges used to encode information into binary bits leak out over time, altering the contents. And any enduring information could be lost if the software to access it becomes obsolete. Or a potent, well-timed coronal mass ejection could cause irreparable damage to electronic systems.

“There’s no getting around the risk of catastrophic loss in our culture,” says Robert Darnton, the librarian emeritus at the Harvard University Library. “Digital texts are much more fragile than printed books.”...

As the project slowly starts to take shape, some are worried that its own place in collective memory may ebb over time. “The thing I don’t like about the time capsule is the sense that it’s frozen,” says Richard Ovenden, the director of the Bodleian Libraries at the University of Oxford. “Information is much more likely to be kept if it’s used. The danger is that [Kunze’s project] will end up being forgotten.”

To avoid this, Kunze plans to distribute ceramic tokens around the world to everyone who either funds, contributes to, or advises on the project. ... The location of the mine will be carved onto each token, and it will require geological knowledge similar to our own to find it, especially as land shifts with time. This would be a safeguard against unwanted discoveries if for some unpredicted reason—nuclear war, say—human civilization disappears or regresses to the Stone Age....

Kunze has teamed up with the Human Document Project, another preservation scheme, and University College London’s Heritage Futures project, to co-organize the event.
archives  preservation  geology  chemistry  data_centers 
6 weeks ago