Liberatory Archives: Towards Belonging and Believing (Part 2) – On Archivy – Medium
If you think of an idea that you think is ahead of the curve or new in any way, be assured that a woman — oftentimes a black woman, but not always — probably thought of the idea first. So do the research. Do the reading. Cite her work. And don’t be an oppressive, patriarchal jackass who erases and undermines the work of women and folks who don’t subscribe to the gender binary. Fellas, if you aren’t finding the sources that speak to whatever idea it is you’re interested in exploring, that isn’t because those sources don’t exist or haven’t been written. It’s likely because they haven’t been cited, and they likely haven’t been cited because she’s a woman. Just my thoughts.

Okay; back to the definition! Michelle Caswell, in her book chapter “Inventing New Archival Imaginaries,” really sets a fiery foundation on which to engage this concept of a liberatory archive. Again, please read this work in full if you haven’t. If you need access to a copy of it, holla at ya boy or contact Michelle directly. Once you read the chapter in its entirety, I’m sure you’ll be struck by this line that reads:
“…through the lens of liberatory archival imaginaries, our work as community-based archivists does not end with the limits of our collection policies, but rather, it is an ongoing process of conceptualizing what we want the future to look like.”
So you see in her definition that liberatory archives are not things so much as they are processes. Understanding them, then, is not a ‘what’ question as much as a ‘how’ question. Let me now expand on the ‘how’ question of liberatory archives and focus on two processes and actions for us to consider explicitly integrating into the work of community archives....

A project that embodies believing in the context of liberatory archives is Community Futurisms: Time & Memory in North Philly. Led by two black women artists who form the collective Black Quantum Futurism, Community Futurisms is:
“a collaborative art and ethnographic research project exploring the impact of redevelopment, gentrification, and displacement within the North Philadelphia neighborhood known as Sharswood/Blumberg through the themes of oral histories, memories, alternative temporalities, and futures… BQF Collective will operate Community Futures Lab, a gallery, resource and zine library, workshop space, recording booth, and time capsule, recording oral histories/futures in North Philly.”
Did you catch that at the end of their project description? They will be recording oral histories and futures. This isn’t an archival project that exists solely to recast the past. Rather, their efforts are about impacting the future, which can only happen if one 1) believes there is such a thing as a future and 2) believes that one’s fate in the future is not fixed.
archives  community_archives  citation 
11 hours ago
20,000 Hard Drives on a Mission | Internet Archive Blogs
Once a new item is created, automated systems quickly replicate that item across two distinct disk drives in separate servers that are (usually) in separate physical data centers. This “mirroring” of content is done both to minimize the likelihood of data loss or data corruption (due to unexpected harddrive or system failures) and to increase the efficiency of access to the content. Both of these storage locations (called “primary” and “secondary”) are immediately available to serve their copy of the content to patrons… and if one storage location becomes unavailable, the content remains available from the alternate storage location.

We refer to this overall scheme as “paired storage.” Because of the dual-storage arrangement, when we talk about “how much” data we store, we usually refer to what really matters to the patrons — the amount of unique compressed content in storage — that is, the amount prior to replication into paired-storage. So for numbers below, the amount of physical disk space (“raw” storage) is typically twice the amount stated.
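The mirroring and accounting described above can be sketched as a toy model. This is an illustration only: the class and method names are my own, not the Archive's actual systems.

```python
class PairedStorage:
    """Toy sketch of the "paired storage" scheme: every item is
    mirrored onto two independent stores at creation time, and reads
    fall back to the secondary if the primary is unavailable."""

    def __init__(self):
        self.primary = {}
        self.secondary = {}

    def put(self, item_id, content):
        # Replicate each new item across both storage locations.
        self.primary[item_id] = content
        self.secondary[item_id] = content

    def get(self, item_id, primary_up=True):
        # Serve from whichever storage location is available.
        store = self.primary if primary_up else self.secondary
        return store[item_id]

    def unique_bytes(self):
        # "How much" data is stored is quoted as unique content;
        # raw (physical) usage is roughly twice this number.
        return sum(len(c) for c in self.primary.values())
```

Reads succeed even with the primary marked down, which is the whole point of the arrangement.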

As we have pursued our mission, the need for storing data has grown. In October of 2012, we held just over 10 petabytes of unique content. Today, we have archived a little over 30 petabytes, and we add between 13 and 15 terabytes of content per day (web and television are the most voluminous).

Currently, Internet Archive hosts about 20,000 individual disk drives. Each of these is housed in specialized computers (we call them “datanodes”) that have 36 data drives (plus two operating-system drives) per machine. Datanodes are organized into racks of 10 machines (360 data drives), and interconnected via high-speed ethernet to form our storage cluster. Even though our content storage has tripled over the past four years, our count of disk drives has stayed about the same. This is because of disk drive technology improvements. Datanodes that were once populated with 36 individual 2-terabyte (2T) drives are today filled with 8-terabyte (8T) drives, moving single-node capacity from 72 terabytes (64.8T formatted) to 288 terabytes (259.2T formatted) in the same physical space! This evolution of disk density did not happen in a single step, so we have populations of 2T, 3T, 4T, and 8T drives in our storage clusters.
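The capacity arithmetic above can be reproduced with a one-liner; the 10% formatting overhead is inferred from the post's own figures (72 TB raw → 64.8 TB formatted), not an official spec.

```python
def datanode_capacity_tb(drive_tb, data_drives=36, format_overhead=0.10):
    """Raw and formatted capacity of one "datanode": 36 data drives,
    with formatting costing about 10% of raw capacity (a ratio
    inferred from the figures quoted in the post)."""
    raw = drive_tb * data_drives
    return raw, raw * (1 - format_overhead)

# A rack is 10 datanodes, so a rack of 8T-drive machines holds
# 10 * 288 = 2,880 TB raw.
```

With 2T drives this gives (72, 64.8); with 8T drives, (288, 259.2), matching the post.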
storage  archives  data_centers  repair  maintenance 
Should I Pursue My Passion or Business? – Medium
For the next 6 months, I am joining YCombinator Research’s New Cities project as an Explorer. My goal? Create an open, repeatable system for rapid cityforming that maximizes human potential. It is a vast and complex challenge — and one that makes me so happy that I want to tap dance to work. Like any other epic journey, we’ll start small and learn fast: everything we learn, we will be publishing online.
I am not giving up entrepreneurship. This is just another form. I am trusting that amazing experiences will teach me to be a better entrepreneur.
I can’t do this alone. YC can’t do this alone. This is our problem to solve together. To be successful, we’ll need investors, industries, governments, charities, citizens, and critics. I know many of you have been waiting for a project like this. (If you have lots of land for a new city, let us know.)
Why now?
I’m done complaining about cities. I want to be a part of a solution. I want cities for the poor and the rich, the locals and the transplants, the freaks and the geeks, and the young and old....

Affordable, dynamic cities are a sustainable solution to a world thirsting for innovation....

And cities are resilient. Rome. Tokyo. Istanbul. Lagos. Cities often outlast kings and empires. City-states were the original superpowers. Yet, mass migration to mega-cities has only occurred in the last 50 years. Cities are young trees of life that have just started to bear fruit....

Every great city benefited from historically advantageous starting conditions that cannot be recreated. But I believe technology can seed fertile starting conditions across nations and geographies.
urban_planning  solutionism 
MIT task force releases preliminary “Future of Libraries” report | MIT News
The MIT task force arranged ideas about the MIT Libraries into four “pillars,” which structure the preliminary report. They are “Community and Relationships,” involving the library’s interactions with local and global users; “Discovery and Use,” regarding the provision of information; “Stewardship and Sustainability,” involving the management and protection of MIT’s scholarly resources; and “Research and Development,” addressing the analysis of library practices and needs. The preliminary report contains 10 general recommendations in these areas.
For the “Community and Relationships” pillar, the report notes that MIT library users may have varying relationships to the system in the future, and suggests a flexible approach simultaneously serving students, faculty, staff, alumni, cooperating scholars, participants in MITx classes, the local Cambridge and Boston community, and the global scholarly community.
The task force also recommends further study of changes to on-campus library spaces, allowing for quiet study as well as new modes of instruction and collaboration. It suggests that in an evolving information landscape, libraries must teach students how to not only access and evaluate information, but also responsibly generate new knowledge and create systems and tools that others will use to discover, share, and analyze information.
In the area of “Discovery and Use,” the report suggests that the library system enhance its ability to disseminate MIT research to the world; provide “comprehensive digital access to content in our collections”; form partnerships to “generate open, interoperable content platforms” for sharing and preserving knowledge; and review the Institute’s Faculty Open Access Policy.  
Regarding “Stewardship and Sustainability,” the task force envisions the MIT Libraries as the leading repository of the Institute’s history and as a leader in the effort to find solutions for the “preservation of digital research,” which the report describes as a “major unsolved problem.”
Finally, in the area of “Research and Development,” the report proposes the establishment of an initiative for research in information science and scholarly communication, to support both research and development on the grand challenges in the field.
2 days ago
Turning the inside out – discontents
Kate and other historians of Chinese Australia have noted that the administration of the White Australia Policy was not uniform or consistent. Similar cases could result in quite different outcomes depending on the location and those involved. Understanding this is important, not only for documenting the workings of the system, but for recovering the agency of those subjected to it. Non-white residents were not mere victims; they found ways of negotiating, and even manipulating, the state’s racist bureaucracy. In her work on colonial archives, Ann Laura Stoler identifies this ‘disjuncture between prescription and practice, between state mandates and the manoeuvres people made in response to them’ as part of the ‘ethnographic space’ of the archive.4

How do we explore this space? One of the things I’ve found interesting in working with the closed files is the way we can use available metadata to show us what we can’t see. It’s like creating a negative image of access. Kate and I have been thinking for a number of years now about how we might use digital tools to mine the White Australia records for traces, gaps, and shadows that together build a picture of the policy in action. Who knew who? Who was where and when? What records remain and why?...
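One minimal way to build that "negative image of access" is to tally closed files by the reason given for withholding them. This is a sketch under assumptions: the field names and exemption codes below are illustrative stand-ins, not the real schema of any archival system.

```python
from collections import Counter

def access_shadow(records):
    """Tally closed files by the stated reason for restriction.
    `records` is a list of metadata dicts such as you might harvest
    from an archival catalogue; the "access_status" and "reason"
    keys are hypothetical field names used for illustration."""
    reasons = Counter()
    for rec in records:
        if rec.get("access_status") == "Closed":
            reasons[rec.get("reason", "unknown")] += 1
    return reasons
```

Charting the resulting counts shows which exemptions do the most work in keeping records hidden, even though the records themselves stay unseen.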

Just like systems of racial classification, intelligence services exist within a circle of self-justification. The fact they exist proves they need to exist. We are denied information that might enable us to imagine alternatives. And yet as limited as the provisions under the Archives Act are, we do have access.

How can we use this narrow, shuttered window to reverse the gaze of state surveillance and rebuild a context that has been deliberately erased? Just as with Closed Access and the White Australia records, can we give meaning to the gaps and the absences? Can we see what’s not there?

This is one of the questions being explored by Columbia University’s History Lab. They’ve created the Declassification Engine – a huge database of previously classified government documents that they’re using to analyse the nature of official secrecy. By identifying non-redacted copies of previously redacted documents, they’ve also been able to track the words, concepts and events most likely to be censored.
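The core move here, comparing a sanitized release against a later full release to recover what was hidden, can be sketched in a few lines. This is a toy version of the idea only: the `[REDACTED]` marker convention and the token-level alignment are my assumptions, not the History Lab's actual method.

```python
import difflib
from collections import Counter

def censored_terms(redacted_text, full_text, marker="[REDACTED]"):
    """Align the token streams of a redacted release and a full
    release of the same document, and count the full-text words that
    sit opposite redaction markers. Aggregated over many document
    pairs, these counts suggest what censors most often remove."""
    red = redacted_text.split()
    full = full_text.split()
    hidden = Counter()
    sm = difflib.SequenceMatcher(a=red, b=full, autojunk=False)
    for op, i1, i2, j1, j2 in sm.get_opcodes():
        # A 'replace' whose redacted side contains the marker maps
        # the marker onto the words it concealed.
        if op == "replace" and marker in red[i1:i2]:
            hidden.update(full[j1:j2])
    return hidden
```

Run over a corpus of such pairs, the counter surfaces the vocabulary most likely to be withheld.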

The History Lab’s collection of documents on foreign policy and world events is rather different to ASIO’s archive of the lives, habits and beliefs of ordinary Australians. But I’m hoping that they too can tell us something about the culture that created them....

Through trial and error I developed a computer vision script that did a pretty good job of finding redactions – despite many variations in redaction style, paper colour, and print quality. It took a couple of days to work through the 300,000 page images, but in the end I had a collection of about 300,000 redactions. Unfortunately about 20 percent of these were false positives, so I spent a number of nights manually sorting the results.
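A redaction detector of the kind described can be sketched as: binarize the page, find connected dark regions, and keep the large, nearly solid rectangular ones. This is a minimal pure-numpy illustration, not the author's actual script; the thresholds are illustrative, and the fill-ratio test is one plausible way to cut down the false positives mentioned above.

```python
import numpy as np

def find_redactions(page, dark_thresh=60, min_area=400):
    """Find solid dark rectangular regions (candidate redactions) in
    a grayscale page image (2-D uint8 array). Binarizes the page,
    then collects connected dark components with a stack-based fill
    and keeps those that are large and nearly solid."""
    dark = page < dark_thresh
    visited = np.zeros_like(dark, dtype=bool)
    boxes = []
    h, w = dark.shape
    for sy in range(h):
        for sx in range(w):
            if dark[sy, sx] and not visited[sy, sx]:
                # Flood-fill one connected component of dark pixels.
                stack = [(sy, sx)]
                visited[sy, sx] = True
                ys, xs = [sy], [sx]
                while stack:
                    y, x = stack.pop()
                    for ny, nx in ((y - 1, x), (y + 1, x),
                                   (y, x - 1), (y, x + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and dark[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                            ys.append(ny)
                            xs.append(nx)
                area = len(ys)
                if area >= min_area:
                    y0, y1, x0, x1 = min(ys), max(ys), min(xs), max(xs)
                    fill = area / ((y1 - y0 + 1) * (x1 - x0 + 1))
                    if fill > 0.9:  # nearly solid => likely a redaction
                        boxes.append((x0, y0, x1, y1))
    return boxes
```

Regions that are dark but ragged (photographs, heavy ink) fail the fill-ratio test, which is one way to trim false positives before the manual sorting pass.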
archives  classification  secrecy  redaction  machine_vision 
3 days ago
RHUNHATTAN: A TALE OF TWO ISLANDS | NYU Center for the Humanities
I strive to uncover invisible, suppressed stories that lie in the geopolitical shadows of colonialism and migration. As the 2016-17 Artist-in-Residence at the Asian/Pacific/American Institute at NYU, I will research the social history of plants via spice routes and botanical expeditions to create a multiplatform project, Rhunhattan, that will include psychogeographic and immersive tech experiences, as well as object and olfactory work to bring forth the historical and contemporary relationship between the islands of Rhun (located in present-day Banda Island Archipelago of Indonesia) and Manaháhtaan (original Lenape name of Manhattan).

During the 17th-century Spice Wars, Dutch Nieuw Amsterdam was captured by the British and renamed “New York.” By 1667, the Dutch relinquished their claim to the colony in exchange for Rhun, the sole British colony in the Banda Islands of present-day Indonesia, thereby gaining a monopoly on the lucrative nutmeg and mace trade. This pivotal moment came at a bloody cost for Indigenous peoples: both for the Bandanese and the Lenape people of Manaháhtaan. Over the centuries, as the spice trade faded, Rhun also settled into the background while Manaháhtaan rose to unprecedented financial success. The remaining colonial landmarks that continue to link these islands are the present-day National Museum of the American Indian at Bowling Green, which occupies the original site of Fort Amsterdam, and Fort Nassau of the Banda Islands; both forts share the same diamond-shaped architectural structure. In the visual narrative that I will be developing, I see the identical forts acting as portals between the two contested sites to collapse the time and distance of these two islands.

To tell this story of two islands with intertwined fates of land dispossession and erasure during the birthing of imperial globalization propelled forward by countless caravans and ships transporting spice, sugar, and silk, I am reeducating myself about the broken human relationship with land and waters. We are living in debt to our future generations and must learn how the Lenape sustainably managed the island for the sake of futurity over millennia. In a time when massive glaciers the size of lower Manhattan crashing into the ocean doesn’t make a media splash, we have a great responsibility to fight apathy. We are living in urgent times and there is a need to revitalize indigenous cultures and knowledge for environmental stewardship. We need a paradigm shift from falsely believing that human beings are landlords of Earth to seeing humans as being part of the ecosystem.
smell  taste  colonialism  trade  globalization  botany 
3 days ago
Amazon as an ISP Isn’t Bonkers—It Makes Perfect Sense | WIRED
AMAZON THE ISP. It sounds strange when you first hear it. Amazon, you think, is an online store. It lets you buy stuff over the Internet. Comcast and Verizon and Orange and Vodafone are the ISPs. They provide the Internet service to the world’s homes and phones.

But if you step back, just a little, you realize that Amazon is a natural ISP. One day, it could compete with the Comcasts and the AT&Ts—or at least try to. You can see this in the way Amazon has already built its business. And you can see it in the ambitions of other Internet giants like Google and Facebook....

The online news site says Amazon may sell Internet service directly to consumers alongside its streaming media offering, Prime, which delivers movies and TV shows via the Internet....

As it stands, Amazon is beholden to Comcast. But if it ran an ISP, it wouldn’t be. “One of the big challenges for companies providing over-the-top video services like Netflix and Amazon Prime Video is that they are still reliant on broadband providers, many of whom are also TV providers and so have an inherent conflict of interest in helping them reach customers with high-quality video services,” says Jackdaw Research analyst Jan Dawson, who has studied telecoms regulation and carrier strategy....

We’ve already seen much the same moves from Google and Facebook. Most notably, in 2011, Google started building an ultra-high-speed ISP, Google Fiber, in select American cities. In the beginning, it characterized this as an experiment meant to push other ISPs toward similar high-speed services. But as Google moves more and more into video and other digital media, Google Fiber has morphed into a full-fledged business. In fact, it’s now its own company, one of the business units spun out of Google under the new umbrella operation called Alphabet.

At the same time, Google is offering its own wireless Internet service, Project Fi. This service is built on the existing networks of entrenched mobile ISPs like Sprint and T-Mobile. But it’s a way of working around the limitations of even bigger mobile ISPs like Verizon and AT&T....

Facebook has taken a slightly different route. Through its program, it has partnered with ISPs in the developing world to offer free Internet service on mobile phones. This is a way of expanding the Internet into new areas, and it includes access to Facebook....

The world’s largest online retailer now controls so much of its own supply chain, from the massive fulfillment centers it operates across the globe to the brick-and-mortar stores that are popping up in places like New York and Seattle. This is just what Amazon does. It builds and operates its own infrastructure.
amazon  isp  internet  connectivity  infrastructure 
4 days ago
This New Code Ensures Buildings Designs are Internet Optimized | ArchDaily
Looking at a building, how good its internet is probably isn’t one’s first thought. But for the tenants and companies inside it, it’s a key building service that they rely on daily.

As Arie Barendrecht explains, “it’s vital to tenants of buildings and critical to attracting and retaining new tenants – it’s a non-negotiable design component."

Barendrecht is the co-founder and CEO of WiredScore, a company that ranks commercial buildings on their connectivity. Beginning in New York, the company has provided wired certification to over 300 buildings in the city, with further operations across several other US cities as well as London and Manchester in the UK. The company’s work is instrumental in showing architects how their designs need to prepare for the 21st century and acknowledging those that already do....

Space allocation, in particular, is a critical factor. It’s not unusual for tenants wanting to upgrade their connectivity to discover they can’t, simply because there is no room for it. A common example of this seen by WiredScore is not having the floor space for wireless equipment like DAS or small cells. The space for wireless is simply not included in a lot of current building designs, but increasingly needed by tenants given the rise of the mobile workforce.

It’s also important for spaces to be flexible, not just for the potential to free up more floor area, but also to support the installation of new technologies regardless of what sort of wired or wireless infrastructure is required. This is especially relevant for new buildings where technological requirements can easily change between the time of planning and its completion....

Nowadays, many companies depend on having connectivity 100 percent of the time, making a single point of failure especially risky. Instead, diverse conduit pathways provide a backup if one side suffers fire, flooding, or other physical damage. This involves having at least two different internet providers running their cables vertically through, and horizontally out of, different sides of the building.

Resiliency focuses on the protection of the equipment itself, such as placement above grade – a lesson many New Yorkers learnt following the flooding caused by Hurricane Sandy. It also covers placing telecom equipment so as to prevent day-to-day damage; the best-designed buildings for connectivity separate equipment from areas of the building where users could accidentally damage it...

Materiality also comes into play, especially its effect on wireless coverage. Energy-efficient glass, in particular, blocks external cellular networks from entering buildings. So for developers aiming for LEED certification, Arie suggests having wireless strategies in place to compensate for the typically worse cellular coverage caused by low-e glass.
media_architecture  internet  infrastructure  wires  connectivity 
4 days ago
Field notes for 'What We Left Unfinished' | Ibraaz
It is not simple to work with an archive in a country like Afghanistan, where books, films and monuments are all subject to burning; stupas are looted and statues shattered; and sites sacred for one reason or another are eroded by both natural and human disasters. Understandably, Afghans are wary of anyone who proposes to 'mine' any cultural resource they still possess.
If you want to work with an Afghan archive, therefore, you cannot address your desires to it directly. You must sidle up to it sideways, as if approaching a horse with an uncertain temper. ...

This indexing of the archive is critical, because your first approach to any archive is always through its metadata: not the content, but its descriptors. When you are engaged in a slantwise, shuffling sort of appeal to the archive – three steps forward, two steps back – your approach may be even more removed. First you must address the people who possess or are creating the descriptions, and then you must sort through their often conflicting and overlapping accounts. In short, you must perform some of the functions of the archive, or archivist, yourself. This performance, you hope, can be your contribution to the archive: a history of sorts, which you write as you find it and leave behind when you go....

Some performances alter the archive irrevocably, slashing and burning as they go, like the literal burning of film prints in the Afghan Films courtyard in 1996. Others are delicate insinuations, or daily rituals, whose effects are not visible until viewed from a distance – leaving out certain details while labelling a film canister, for example, because everyone knows those details, until no one is left who knows and the data transmutes from omitted to lost. Or the use of a cheap brand of tape to splice film in lean years, which decades later means that each time those reels run through a projector or telecine apparatus, they may break at the splice point. Or a particular method of cleaning prints with rags, which over the years accumulates as a fretwork of scratches on celluloid....

Some parts of the archive are always more visible than others. The archive has two faces: its public narrative and its private holdings. The public narrative, which is designed to be visible, is usually constructed from only a small portion of the private holdings, which remain largely invisible. The public narrative can be adapted by the archive's performers to meet the moment, by sampling from different parts of the private holdings to construct the order most likely to match present interests. In every archive, there exists a literal or metaphorical dusty drawer where past archivists have filed the private holdings deemed least likely to ever be of interest to anyone anywhere: unfinished projects, failed experiments, institutional embarrassments. If you are an artist, you probably want to find that drawer and rifle through it...

The Afghan Films archive is, however, a special case, where the entire negative archive and large portions of the print archive were hidden from 1996-2002, with the door to the negative archive completely bricked up and disguised behind a poster of Mullah Omar. In some ways, the whole archive was temporarily filed in the invisible dusty drawer, and only very gradually did it emerge from this position of retreat over the subsequent decade (2002-12)....

many of the prints are literally covered in dust, and need cleaning and checking to see whether they are still viable. The negatives, in their closed chamber, remained more pristine, but are plagued by the aforementioned splices, which need to be marked before any kind of large-scale telecine project can be undertaken. Multiple handwritten catalogues of prints and negatives exist but they are often contradictory or overlapping, and the handwritten labels on the film canisters also sometimes contradict the catalogues, or are inaccurate. This surplus of unreliable indices has produced some uncertainty about which films now (post-bonfire) may exist only as negatives, which may exist only as prints, and which may exist as both negatives and prints. Re-cataloguing will resolve this uncertainty. It also serves to discover which prints may still be useful for soundtrack digitization or circulation of films on film....

In the research project What we left unfinished, I will be looking for some of these unfinished films and the people who made them, trying to decipher, from the gaps between what was finished and unfinished, some clue to the gaps between how the Afghan Left imagined its re-invention of the state and how that project went so terribly wrong – the gaps between revolution, reconciliation and dissolution....

Archives often presume or present themselves to be keepers of facts, and, moreover, keepers of facts that serve as anchor points for the larger historical record. Artists prospecting in archives are sometimes suspected by archivists of taking these facts only to weave fictions around them. While this suspicion is not entirely unjustified, it also overlooks the multiple layers of constructed narrative that already surround most archival records – from provenance records, to finding aids, to placement and classification within the archive, to metadata tags, descriptions and annotations. Each of these layers has an individual author and thus allows subjective interpretations, human errors, fictions and inventions to accumulate around, and influence perceptions of, the original records....

Is it possible, however, to imagine some kind of ethics of archival research?

The media archive collective suggested in Ten theses on the archive that we approach the archive with intellectual propriety, rather than rigid notions of intellectual property.[2] I interpret this to mean that, as researchers, we should be sensitive to the origins and contexts of archival material, especially when considering how to deploy it within a new artwork. 'Fair use' is a legal doctrine but also an apt phrase: is your use of an existing work fair to the original creator?...

Intellectual propriety, however, would require that the original creators be sought out and consulted about their original intentions for the films, not only as a matter of intellectual curiosity, but also as an ethical prerequisite for taking their unfinished work and re-contextualizing it within a new artwork – especially when the work being appropriated was never made public in its original form. Ultimately, intellectual propriety might even require that the new artwork become a work of facilitation rather than a work of appropriation – that is, after consulting the original creators, it may appear more appropriate or desirable to create a system whereby they can finish their own unfinished work (with the interest coming from the gap between the moment of making and the moment of finishing), rather than subsuming their unfinished work into a new artwork....

In a country like Afghanistan, where iconoclasm is a very real and seemingly perpetual threat, preservation of cultural resources like the Afghan Films archive may best be achieved not by panicked moves to protect assets, but rather by a move to project those assets. That is, locking the films away for another decade in another dusty drawer would be less effective than digitizing the archive as quickly as possible and disseminating films as widely as possible, including placing copies of master files on servers both inside and outside the country. Broad dissemination would also allow a critical discourse to grow around the films, ultimately making an even stronger argument for their preservation....

When a collection becomes an archive, the linguistic shift registers a transformation from a group of objects that are, to a group of objects that were (the same, connected, part of a set, parts of a whole). In this sense, the archive is founded on a moment of passing into the past, a kind of death, and the impulse to archive is connected (as Derrida said, following Freud) to the death drive....

At the same time, the archive constantly engages in attempts to resuscitate its holdings, bringing them back to life in the present: translations to new formats; circulation to new audiences; new interpretations, orders, edits, narratives. If the archive is both founded on and pledged against disaster, we can interpret that founding moment as the archive's original attempt to preserve something that might otherwise be lost, and that pledge as the archive's continuing efforts to countermand the static nature of preservation by projecting its past memories into the present and the future.
archives  metadata  afghanistan  nationalism  remixing  preservation  projection 
5 days ago
New Columbia class aims to contextualize data in history, society - Columbia Daily Spectator
Columbia will offer a new course on how to interpret and evaluate the impact of data next semester in the hopes of facilitating greater understanding of how data is used.

History professor Matt Jones and applied math and physics professor Chris Wiggins announced the course at an event on the role of data science on Monday. The class will begin as a small discussion section under both the history and applied math departments, and the professors plan to eventually expand the course to lecture size.

Both professors said that the idea stemmed from a fear that while governments and large corporations are gathering more data, the public’s understanding of the way in which that data is used is not sufficient....

“What we’re seeing today is a real transition in the ability of data to impact the world,” he said. “We’ve done a great job over the last 100 years with thinking about what every citizen should know about the Greeks, but in the next century I think there’s a need for somebody to think through what every citizen needs to know about data.”

Jones explained that the course will accomplish its goal of providing a general education on data issues by including students from all disciplines in the same class....

One goal of the course is to teach students to evaluate claims based on data that has been interpreted by algorithms.

“If somebody says this algorithm exists, therefore you should believe in it, you should be critical of it,” Wiggins said. “Rhetorical literacy is recognizing that if somebody says this algorithm is true, somebody says that to you because they want something.”

Jones said he also hopes that the course will explore more political questions, explaining how expertise in history is useful to understanding why approaches to data collection and interpretation were established in their original forms.

Jones noted the original purpose for the introduction of modern statistical methods as an example of interesting background.

“We’re going to begin with classical statistics and teach all of the technical rigors that go along with that, but never neglect that the key context for that work was eugenics,” Jones said. “The science can be made independent of that context, but it’s important.”
data_science  data_literacy  liberal_arts 
5 days ago
Toward a Constructive Technology Criticism - Columbia Journalism Review
Journalism about technology looks like: reporting, facts, the fourth estate, agenda setting. This kind of writing is constrained by PR embargoes and exclusive access. It can suffer from regurgitating Silicon Valley jargon and from telling seductive stories, as in the case of Theranos being judged as a startup rather than a medical company. Producer and freelance writer Rose Eveleth points to the problem: “There’s so much glittery, breathless writing about technology that fails to slow down and think about why we’re making these things, who we’re making them for, and who we’re leaving out when we make them.”11 Dave Lee, tech reporter for the BBC, further asks if the role of technology journalism is meant to be “reporting every concocted venture capital investment, or being the first draft of our digital history.” 12

On the other hand, criticism about technology looks like: analysis, interpretation, commentary, judging merits, and unfavorable opinions. In the best cases, criticism offers the opportunity for context setting, and for asking questions beyond the tick-tock of technical development and into the how’s and why’s of a larger cultural shift. Criticism leaves room for interpretation, analysis, assessment, and more systematic inquiry. Popular criticism seeks to question established and unexamined knowledge—the assumptions and positions taken for granted. As author and contributor for The New York Times Virginia Heffernan reflects, criticism should “‘familiarize the unfamiliar’ and ‘de-familiarize the familiar.’”...

Style and Tactic Trap: Missing People

More than just missing the social and political factors that bring a technology into existence, Critics of technology often fail to address the people for whom the technology is made. In his review of Morozov’s To Save Everything, Alexis Madrigal points to the missing users: “Without a functioning account of how people actually use self-tracking technologies, it is difficult to know how well their behaviors match up with Morozov’s accounts of their supposed ideology.”79

Critics also tend to write in the idiomatic royal “we” without representing real users’ interests or perspectives. Madrigal again articulates the importance of talking to people: “It is in using things that users discover and transform what those things are. Examining ideology is important. But so is understanding practice.”...

Style and Tactic Trap: Generalizing Personal Gripes

Another common mode in mainstream technology criticism is for the Critic to generalize personal gripes about technology into blanket judgments about technological progress. This is the mode used by Franzen when he complains about Twitter, a technology that threatens his livelihood by distracting him from his writing practice and changing the way his readers consume media. It can also be seen in Morozov’s description of the safe in which he locks his internet router so he can write his damning screeds without distraction. ...

Style and Tactic Trap: Cults of Personality, Bullying, and Misrepresenting Ideas

Though it is important to understand the ideological positions of the titans of the tech industry, some technology Critics unduly focus attention on individual personalities in isolation from their contexts. Profiles and takedowns of Silicon Valley moguls like Elon Musk, Peter Thiel, Mark Zuckerberg, and Tim O’Reilly make for compelling (anti-)hero narratives, but they often miss the details of the larger system and the labor that surrounds them. These profiles also perpetuate the mystique of ownership and power attributed to these Silicon Valley leaders.

Morozov, in particular, is guilty of personal, vindictive, intellectual bullying of his targets, no matter what side of the argument they represent....

Style and Tactic Trap: Deconstruction Without Alternatives

One of the most widely recognized Critics of technology has made it his mission to destroy the industry and everyone associated with it. Writing against what he calls “solutionist” thinking, i.e. that all problems are potentially solvable (and often with technology), Morozov facilely avoids offering alternative solutions.
technology  criticism 
5 days ago
The Mission to Save Vanishing Internet Art - The New York Times
Now the digital art organization Rhizome is setting out to bring some stability to this evanescent medium. At a symposium to be held Thursday, Oct. 27, at the New Museum, its longtime partner and backer, Rhizome plans to start an ambitious archiving project. Called Net Art Anthology, it is to provide a permanent home online for 100 important artworks, many of which have long since disappeared from view. With a $201,000 grant from the Chicago-based Carl & Marilynn Thoma Art Foundation, Rhizome will release a newly refurbished work once a week for the next two years, starting with the 1991 “Cyberfeminist Manifesto.” By 2018, Rhizome will be presenting works by artists such as Cory Arcangel and Ms. Cortright.

In addition to salvaging the past, the aim is to tell the story of Internet-based art in an online gallery that serves much the same narrative function as the galleries in the Museum of Modern Art. “There’s a sense of amnesia about the history these things have,” Michael Connor, Rhizome’s artistic director, said as he sat in the New Museum’s ground-floor cafe. “This is an opportunity to really be rigorous.”...

Net Art’s political posture was characteristic of the feverish, techno-utopian excitement shared by netheads in general. “There was this radical idea that the internet was going to change the way art is made and shared,” said Lauren Cornell, who was Rhizome’s executive director from 2005 to 2012 and who has since moved to the New Museum as a curator and associate director of technology initiatives. “That it might even do away with traditional institutions and gatekeepers” — that is, museums and curators.

Instead, it was Net Art that started to disappear. Rhizome began trying to preserve it in 1999 with the creation of ArtBase, an online archive that has since grown to more than 2,000 works. The organization became an affiliate of the New Museum in 2003, saving the group from almost-certain oblivion. But even then it was apparent that to keep Net Art from vanishing into the ether, something drastic would have to be done.

Preserving this work is not just a matter of uploading old computer files. “The files don’t mean anything without the browser,” Mr. Connor, 38, said. “And the browser doesn’t mean anything without the computer” it runs on. Yet browsers from 15 or 20 years ago won’t work on today’s computers, and computers from that era are hard to come by and even harder to keep working.

Dragan Espenschied, Rhizome’s preservation director, has been working with the University of Freiburg in Germany to develop a sophisticated software framework that emulates outdated computing environments on current machines.

Another iteration of this approach is, which Rhizome began in December as a free service. Oldweb lets you time-travel online, viewing archived web pages from sources such as the Library of Congress in a window that mimics an early browser. A second Rhizome initiative is Webrecorder, a free program that lets users build their own archives of currently available web pages. That can help preserve online works being created today.
archive  net_art  media_archaeology  flow 
5 days ago
Why Being A City Geek Is So Cool Today | Co.Design | business + design
"I was really surprised that anyone was interested in the stuff I was doing around internet infrastructure, but I think a lot of the appeal has to do with a growing public anxiety over the opacity of networked systems," Burrington says. "More and more of everyday life is tied into networked systems that most people interface with via a scrying mirror, which tends to obscure all the algorithmic spells and hexes going on behind the scenes. Looking at data centers and cables and microwave towers doesn't really make those hexes any more legible, but it grounds this increasingly incomprehensible system in something real, something made by humans, something that could hypothetically be destroyed by humans. It's comforting, kind of."...

"There has been a tremendous interest in urban infrastructure over the past decade, and I believe it is connected to the fact that residents feel more empowered and informed about transit decision-making than ever before," Michelle Young says. "It's not that long ago that we were in the era of Robert Moses; now he's vilified for the type of [top-down] decisions he took."

Now, planners are advocating for participatory design, like letting residents vote on how to spend public funds to improve infrastructure.

Indeed, bottom-up planning has yielded some of the most influential urban design projects that involve infrastructure. The High Line—an elevated park on a formerly abandoned elevated railway—was the product of a grassroots organization, Friends of the High Line. What was once a blighted stretch of track is now one of the most popular destinations in Manhattan—for better or worse—and cities across the country are clamoring to create their own "X Lines."...

We have products that celebrate the beauty of infrastructure, celebrities who endorse infrastructural adaptive reuse, and infrastructure communicating with the public in 140 characters or less. But one of the most compelling pieces of evidence about infrastructure's resurgence has to do with how we define "infrastructure" in the first place.

"One thing I've noticed rising in tandem with the appeal of an infrastructural aesthetic is a massive expansion of the use of the term 'infrastructure' to describe lots of things that aren't manholes or bridges or railroads," Burrington says. "Software is infrastructure, social media is infrastructure, UX is infrastructure—that sort of thing. I've seen artists who would have called their work 'social practice' five years ago now describe it as 'making infrastructure.' And it seems like a really strategic choice—because infrastructure is also sort of assumed to be indispensable. Defining one's work as infrastructure valorizes it, elevates its importance in a system as something crucial and in need of attention, care, maintenance, and support."
infrastructure  infrastructural_tourism  infrastructure_art 
6 days ago
Michael Kreil: An Honest Picture of Metadata | Exposing the Invisible
I have a problem with the term “metadata.” I don't think that this term is precise, because, simply put, the basic idea of metadata is that it's data about data. For example, if I take a photo, I can add data like the camera model, time and geolocation, so, the additional information about when and where the photo was shot is called metadata. But, for example, if I take a lot of photos, I can use the metadata contained in these photos to connect the location in which I took them with the time I took them. The metadata can be used to track me. So, from that point of view, metadata is the data itself, and that’s the interesting aspect, not the photos themselves.

I think that only people who add data to the data can use the term metadata. But, in general, from a public point of view, everything is data, which is usually about persons. So let's stop calling it metadata.
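The point above — that per-photo metadata, aggregated, becomes tracking data in its own right — can be sketched in a few lines. This is a minimal illustration with hypothetical photo records (plain dicts, not real EXIF parsing); the field names and values are invented for the example.

```python
from datetime import datetime

# Hypothetical per-photo metadata records: "data about data" that,
# aggregated across many photos, becomes a movement track in itself.
photos = [
    {"file": "IMG_001.jpg", "time": datetime(2016, 11, 1, 8, 15), "lat": 52.520, "lon": 13.405},
    {"file": "IMG_002.jpg", "time": datetime(2016, 11, 1, 12, 40), "lat": 52.516, "lon": 13.378},
    {"file": "IMG_003.jpg", "time": datetime(2016, 11, 2, 9, 5), "lat": 48.857, "lon": 2.352},
]

def movement_track(records):
    """Sort photo metadata by timestamp into a (time, place) trail."""
    ordered = sorted(records, key=lambda r: r["time"])
    return [(r["time"].isoformat(), (r["lat"], r["lon"])) for r in ordered]

track = movement_track(photos)
for when, where in track:
    print(when, where)
```

Nothing here looks at the photos themselves — which is exactly Kreil's point that, from the subject's perspective, the metadata *is* the data.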

Have you got an alternative name? 


6 days ago
Coffee House Press: In the Stacks - In The Stacks with Matthea Harvey: Cloud Codes
On my second visit the book I fall in love with (and I fall hard) is Wind, Storm and Rain: The Story of Weather by Denning Miller, published by Coward-McCann in 1952. The cover is a marvel: a brown hardcover the color of slightly over-cooked caramel, covered with clouds, photographed from above. I would guess that the clouds are cumulus, but since I don’t know what height they’re at, they might be altocumulus or stratocumulus.
clouds  weather  libraries 
7 days ago
How the First Farmers Changed History - The New York Times
But as fascinating as this culture was, something else about Ain Ghazal intrigues archaeologists more: It was one of the first farming villages to have emerged after the dawn of agriculture.

Around the settlement, Ain Ghazal farmers raised barley, wheat, chickpeas and lentils. Other villagers would leave for months at a time to herd sheep and goats in the surrounding hills.

Sites like Ain Ghazal provide a glimpse of one of the most important transitions in human history: the moment that people domesticated plants and animals, settled down, and began to produce the kind of society in which most of us live today....

Agriculture originated in a few small hubs around the world, but probably first in the Fertile Crescent, a region of the Near East including parts of modern-day Iraq, Syria, Lebanon, Israel and Jordan. The evidence for full-blown agriculture there — crops, livestock, tools for food preparation, and villages — dates back about 11,000 years....

In recent years, Dr. Zeder and other archaeologists have overturned that consensus. Their research suggests that people were inventing farming at several sites in the Fertile Crescent at roughly the same time. In the Zagros Mountains of Iran, for example, Dr. Zeder and her colleagues have found evidence of the gradual domestication of wild goats over many centuries around 10,000 years ago.

People may have been cultivating plants earlier than believed, too.
archaeology  agriculture  urban_history 
7 days ago
Cuban internet delivered weekly by hand - BBC News
But many have found a surprising way to get their favourite web content in spite of restrictions.
Much of the largely offline nation simply receives internet "deliveries" - by hand.
Fifty-six years of communist rule and the US trade embargo have inspired some novel solutions to everyday frustrations and "El Paquete Semanal", the Weekly Package, is just one.
The Paquete is an alternative to the web in a country where, according to some estimates, fewer than 5% of homes are connected.
It consists of a terabyte of data bringing together the latest music, Hollywood movies, TV series, mobile phone apps, magazines and even a classifieds section similar to Gumtree or Craigslist.
cuba  internet  storage  materiality  awesome  connectivity 
7 days ago
Is life just a thread unspooling in the hands of the Fates? | Aeon Essays
I make maps for a living, and a lot of that work is making street maps. For a long time, my fieldwork has involved driving, walking, and every so often biking back and forth over a territory. It’s movement combined with a specific kind of alert attention: I check specific pieces of information against what’s around me on the streets. And this means that when I am done, and have made that map, I have captured that piece of the world. Even a year later, I can picture pieces of it in my mind’s eye better than some of the buildings I walk or drive by unthinkingly every day at home.

But these days, my map work involves less fieldwork and more adapting and tweaking of existing digital data sets. When I first began making maps of small areas, I traced scans of architectural drawings, and then walked the streets to correct and update them. Now I routinely never leave my desk. Instead, I access digital building shapes acquired from airborne laser-mapping or LIDAR flight, align them with recent satellite views, and label them with pretty reliable street lines from the census bureau. It’s a different world, and I miss the sense of actually knowing places, because abstracted data, no matter how rich, is not the same as the world itself. As my colleague Steven Holloway says in his manifesto, Right MAP Making (2007), there is something important about a commitment to a ‘relationship with the place’ and ‘deep listening through direct-contact and stopping’ that even the best digital data can’t equal.
cartography  mapping  fieldwork 
8 days ago
Syrian Archive | beta
The Syrian Archive is an initiative launched by a collective of human rights activists dedicated to preserving open source documentation relating to human rights violations and other crimes committed by all sides during the conflict in Syria.

Our goal is to preserve the most valuable material to ensure it is organized and accessible for use by current researchers, journalists and others with an interest in the conflict as well as to facilitate the work of future historians and investigators involved in transitional justice and accountability efforts.
archives  human_rights  middle_east  syria 
10 days ago
Are Smart Cities Really That Smart 19 November 2015 by Bristol Festival of Ideas
Cities, governments and companies are devoting enormous resources to making cities smart: driverless cars; demand management of traffic; digital mapping; Big Data; new forms of energy use; new ways to encourage the saving of high streets: all devoted to making traditional services and networks more efficient through the use of digital and telecommunication technologies, for the benefit of all. The Festival of the Future City has looked at many aspects of smart cities and now debates how valuable and efficient they will prove to be.
smart_cities  big_data  privacy 
10 days ago
Ephemeral Urbanism
Ranging from the scale of the small temporary infill within the urban, to the scale of the ephemeral mega cities, this project gives an overview of hundreds of cases depicting settlements or urban configurations that are constructed with an expiry date. Rahul Mehrotra & Felipe Vera started the ‘Research Project on the Ephemeral City’ at the Harvard Graduate School of Design in 2012 with the ambition to understand and frame the idea that nonpermanent configurations of the urban are a legitimate and productive category within discourse on Cities. Such exploration of temporal landscapes challenges the illusion of permanence surrounding the urban and provokes questions about seemingly permanent and explicitly impermanent urban configurations. The main argument is that in contemporary urbanism worldwide, it is becoming clear that for cities to be sustainable, they need to accommodate more temporary fluxes in their structure and broader ecology rather than being anchored solely to static material configurations.
urbanism  ephemeral_urbanism  india  infrastructure  zones 
10 days ago
Deconstructing Analysis Techniques | Johnny Holland
Techniques of Analysis
We can start to pull back the curtain on analysis by looking at the techniques that go into the process:

Deconstruction: breaking observations down into component pieces. This is the classical definition of analysis.
Manipulation: re-sorting, rearranging and otherwise moving your research data, without fundamentally changing it. This is used both as a preparatory technique – i.e. as a precursor to some other activity – or as a means of exploring the data as an analytic tool in its own right.
Transformation: Processing the data to arrive at some new representation of the observations. Unlike manipulation, transformation has the effect of changing the data.
Summarization: collating similar observations together and treating them collectively. This is a standard technique in many quantitative analysis methods.
Aggregation: closely related to summarization, this technique draws together data from multiple sources. Such collections typically represent a “higher-level” view made up from the underlying individual data sets. Aggregate data is used frequently in quantitative analysis.
Generalization: taking specific data from our observations and creating general statements or rules.
Abstraction: the process of stripping out the particulars – information that relates to a specific example – so that more general characteristics come to the fore.
Synthesis: The process of drawing together concepts, ideas, objects and other qualitative data in new configurations, or to create something entirely new.
Let’s take a look at each of these techniques in detail and discuss some of the ways in which each technique can be applied.
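Two of the techniques listed above — aggregation and summarization — are mechanical enough to sketch concretely. The sketch below uses invented field observations from two hypothetical research sessions; only the technique, not the data, comes from the article.

```python
from collections import Counter
from itertools import chain

# Hypothetical observations from two research sessions.
session_a = ["confused by menu", "liked search", "confused by menu"]
session_b = ["liked search", "ignored help link", "liked search"]

# Aggregation: draw together data from multiple sources into one
# "higher-level" collection.
aggregated = list(chain(session_a, session_b))

# Summarization: collate similar observations and treat them collectively.
summary = Counter(aggregated)
print(summary.most_common())
```

The other techniques (deconstruction, transformation, generalization, abstraction, synthesis) are more interpretive and resist this kind of one-liner, which is part of the article's point about analysis being a mix of mechanical and judgment-driven moves.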
methodology  deconstruction  critique  UMS  design_research 
10 days ago
Intro to Databases (for people who don’t know a whole lot about them) – Medium
At its most basic, a database is just a way of storing and organizing information. Ideally it is organized in such a way that it can be easily accessed, managed, and updated.
I like metaphors, so this simple definition of a database for me is like a toolbox. You’ve got lots of screws, nails, bits, a couple different hammers… A toolbox is a storage system that allows you to easily organize and access all of these things. Whenever you need a tool, you go to the toolbox. Maybe you have labels on the drawers — those will help you find, say, a cordless power drill. But now you need the right battery for the drill. You look in your “battery” drawer, but how do you find the one that fits this particular drill? You can run through all of your batteries using trial and error, but that seems inefficient. You think, ‘Maybe I should store my batteries with their respective drills, link them in some way.’ That might be a viable solution. But if you need all of your batteries (because you’re setting up a nice new charging station maybe?), will you have to access each of your drills to get them? Maybe one battery fits multiple drills? Also, toolboxes are great for storing disjointed tools and pieces, but you wouldn’t want to have to take your car apart and store every piece separately whenever you park it in the garage. In that case, you would want to store your car as a single entry in the database (*ahem* garage), and access its pieces through it....

- A query is a single action taken on a database, a request presented in a predefined format. This is typically one of SELECT, INSERT, UPDATE, or DELETE.
- We also use ‘query’ to describe a request from a user for information from a database. “Hey toolbox, could you get me the names of all the tools in the ‘wrenches’ drawer?” might look something like SELECT ToolName FROM Wrenches.
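The toolbox query above can be run for real against an in-memory SQLite database. A minimal sketch — the table name and contents follow the article's example; everything else (the `:memory:` database, the sample rows) is invented for illustration.

```python
import sqlite3

# In-memory database standing in for the "toolbox"; the table name
# Wrenches matches the example query in the text.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE Wrenches (ToolName TEXT)")
db.executemany("INSERT INTO Wrenches VALUES (?)",
               [("socket wrench",), ("torque wrench",), ("Allen key",)])

# The query from the text: ask the toolbox for the names of its wrenches.
rows = db.execute("SELECT ToolName FROM Wrenches").fetchall()
print([name for (name,) in rows])
```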
- A transaction is a sequence of operations (queries) that make up a single unit of work performed against a database. For example, Rob paying George $20 is a transaction that consists of two UPDATE operations; reducing Rob’s balance by $20 and increasing George’s.
ACID: Atomicity, Consistency, Isolation, Durability
In most popular databases, a transaction is only qualified as a transaction if it exhibits the four “ACID” properties:
- Atomicity: Each transaction is a unique, atomic unit of work. If one operation fails, data remains unchanged. It’s all or nothing. Rob will never lose $20 without George being paid.
- Consistency: All data written to the database is subject to any rules defined. When completed, a transaction must leave all data in a consistent state.
- Isolation: Changes made in a transaction are not visible to other transactions until they are complete.
- Durability: Changes completed by a transaction are stored and available in the database, even in the event of a system failure.
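The Rob-and-George transfer makes atomicity concrete. A minimal sketch using SQLite, whose connection object can act as a transaction context manager — commit on success, rollback on any exception. The account names and balances are taken from the article's example; the schema is invented.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
db.executemany("INSERT INTO accounts VALUES (?, ?)",
               [("Rob", 100), ("George", 50)])
db.commit()

def transfer(conn, payer, payee, amount):
    """Two UPDATEs wrapped in one transaction: all or nothing."""
    try:
        with conn:  # commits on success, rolls back on any exception
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE name = ?",
                         (amount, payer))
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?",
                         (amount, payee))
    except sqlite3.Error:
        pass  # data remains unchanged: Rob never loses $20 without George being paid

transfer(db, "Rob", "George", 20)
print(dict(db.execute("SELECT name, balance FROM accounts")))
```

If the second UPDATE failed, the rollback would undo the first one too — that is the atomicity guarantee in action.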
- A database schema is the skeleton or structure of a database; a logical blueprint of how the database is constructed and how things relate to each other (with tables/relations, indices, etc).
- Some schemas are static (defined before a program is written), and some are dynamic (defined by the program or data itself).
DBMS: database management system
Wikipedia has a great summary: “A database management system is a software application that interacts with the user, other applications, and the database itself to capture and analyze data. A general-purpose DBMS is designed to allow the definition, creation, querying, update, and administration of databases.” MySQL, PostgreSQL, Oracle — these are database management systems.
Database-oriented middleware is “all the software that connects some application to some database.” Some definitions include the DBMS under this category. Middleware might also facilitate access to a DBMS via a web server for example, without having to worry about database-specific characteristics.
Distributed vs Centralized Databases
- As their names imply, a centralized database has only one database file, kept at a single location on a given network; a distributed database is composed of multiple database files stored in multiple physical locations, all controlled by a central DBMS.
- Distributed databases are more complex, and require additional work to keep the data stored up-to-date and to avoid redundancy. However, they provide parallelization (which balances the load between several servers), preventing bottlenecking when a large number of requests come through.
- Centralized databases make data integrity easier to maintain; once data is stored, outdated or inaccurate data (stale data) is no longer available in other places. However, it may be more difficult to retrieve lost or overwritten data in a centralized database, since it lacks easily accessible copies by nature.
Scalability is the capability of a database to handle a growing amount of data. There are two types of scalability:
- Vertical scalability is simply adding more capacity to a single machine. Virtually every database is vertically scalable.
- Horizontal scalability refers to adding capacity by adding more machines. The DBMS needs to be able to partition, manage, and maintain data across all machines.
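One common way a DBMS partitions data across machines for horizontal scaling is hash partitioning. A minimal sketch, not tied to any particular DBMS — the node names are hypothetical, and real systems layer on replication and rebalancing (e.g. consistent hashing) that this deliberately omits.

```python
import hashlib

# Hypothetical machines in a horizontally scaled cluster.
NODES = ["db-server-0", "db-server-1", "db-server-2"]

def node_for(key: str, nodes=NODES) -> str:
    """Stable hash so every client routes the same key to the same node."""
    digest = hashlib.sha256(key.encode()).hexdigest()
    return nodes[int(digest, 16) % len(nodes)]

for tool in ["hammer", "drill", "socket wrench"]:
    print(tool, "->", node_for(tool))
```

Note the trade-off the article describes: adding a fourth node changes `len(nodes)` and therefore remaps most keys, which is exactly the "additional work" distributed databases require.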
11 days ago
Cloud Thinking – Medium
In 1884 the art critic and social thinker John Ruskin gave a series of lectures at the London Institution entitled The Storm-Cloud of the Nineteenth Century. Over the evenings of the 14th and 18th of February he presented an overview of descriptions of the sky and clouds drawn from Classical and European art, as well as the accounts of mountain climbers in his beloved Alps, together with his own contemporary observations of the skies of Southern England in the last decades of the Nineteenth Century....

Ruskin sought, in his analysis of the light which passed through cloud formations, to emphasise that the “’fiat lux’ of creation” — the moment when the God of Genesis says “Let there be light” — is also ‘fiat anima’, the creation of life. Light, he says, is “as much the ordering of Intelligence as the ordering of Vision”.
Just a few years previously, in 1880, Alexander Graham Bell first demonstrated a device called the photophone, a companion invention to the telephone, which enabled the first wireless transmission of the human voice. It worked by bouncing a beam of light off a reflective surface, which was vibrated by the voice of a speaker, and received by a primitive photovoltaic cell, which turned the light waves back into sound. The device was heavily dependent on clear skies for bright light, but even the weather could produce its own delights. Bell wrote to his father that “I have heard articulate speech by sunlight! I have heard a ray of the sun laugh and cough and sing! I have been able to hear a shadow and I have even perceived by ear the passage of a cloud across the sun’s disk.” ...

In January 1947, von Neumann and Zworykin shared a stage in New York at a joint session of the American Meteorological Society and the Institute of Aeronautical Sciences. Von Neumann’s talk on “Future Uses of High Speed Computing in Meteorology” was followed by Zworykin’s “Discussion of the Possibility of Weather Control”. The next day, the New York Times reported on the conference under the headline “Weather to Order”, commenting that “If Dr Zworykin is right the weather-makers of the future are the inventors of calculating machines.”
The inventor of calculating machines par excellence in 1947 was von Neumann himself, who had founded the Electronic Computer Project at Princeton two years previously, with the joint support of the Institute of Advanced Sciences and RCA. The project was to build upon both Vannevar Bush’s analog computer, the Bush Differential Analyser, developed at MIT in the 1930s, and von Neumann’s own contributions to the first electronic general-purpose computer, the Electronic Numerical Integrator And Computer, or ENIAC. ENIAC was formally dedicated at the University of Pennsylvania on February 15, 1946, but its origins were military: designed to calculate artillery firing tables for the United States Army’s Ballistic Research Laboratory, it spent the majority of its first years of operation predicting ever-increasing yields for the first generation of thermonuclear atomic bombs....

The meteorologists have already been working on the ENIAC continuously for almost a week, in eight-hour shifts supported by the programmers. Their intention is to perform the first automated 24 hour weather forecast. The boundaries of the calculation are the continental United States — a grid separated into 15 x 18 intervals — and the internal memory capacity of the ENIAC itself. The program consists of sixteen successive operations, each of which must be carefully planned and punched into cards, and which in turn produces a new output deck of cards which must be reproduced, collated, and sorted.
The entire run will end up taking nearly five weeks — although, as von Neumann will later point out, actual computation time will be about 24 hours, and, “one has reason to hope” that “Richardson’s dream of advancing computation faster than the weather may soon be realised.”

In the course of those five weeks, 100,000 IBM punch cards are produced, and a million multiplications and divisions performed. What strikes Platzman most deeply, and what he will recall most clearly decades later, is the strange interplay between electrical and mechanical components, and its resonance with scientific theory and technological process. The new era of virtualisation is a hybrid one: it absorbs both the physical and the digital, the climate both psychological and meteorological. Each acts on the other....

Silver iodide photography produced a revolution in seeing, and thus understanding, the world. The rays that affect silver salts overlap with, but are not the same as, the rays that the human eye sees. Early photographic plates are far more sensitive at the blues and violet end of the spectrum than they are at the red end. There has always been the possibility of capturing, on a photographic plate, marks of things we cannot see. Through its sensitivity to extrasensory wavelengths, and the ability to detect incredibly faint objects — such as distant stars — through long exposure, silver halide photography changed the consistency of the detectable universe.
Weather control through the use of silver iodide fulfils the promise of image-making to transform our understanding of, and thus agency in, the world. From representation of the environment and ourselves within it, with all its latent possibilities of apprehension and control, to direct manipulation of the environment, and thus ourselves....

On October 13 1947, GE Research collaborated with the US Army Signal Corps, the Office of Naval Research, and the US Air Force on the first instrumentalisation of cloud seeding: Project Cirrus, an attempt to modify a hurricane. Hurricane King, the eighth of the Atlantic season, was 400 miles off the East Coast and heading out to sea after already wreaking havoc in southern Florida. Shortly after the GE team dumped 180 pounds of dry ice into the heart of the storm it made a sudden hairpin turn and headed west, crashing back into the coast of Georgia and causing substantial damage.
Project Cirrus was succeeded by Project Stormfury, a large-scale, decades-long attempt to modify Atlantic hurricanes (and which Fidel Castro believed was a military project to turn counterrevolutionary hurricanes onto Cuba). It failed spectacularly, as tropical hurricanes contain little of the supercooled water found in typical storm clouds, and which can be affected by cloud seeding. But it did inspire military planners to return to the offensive possibilities of making it rain....

Since the Enlightenment, we have believed that by gathering empirical and objective data alone we can make sense of the world. Sense-making makes power: that which orders the world determines how we understand and make use of it. It is this progression from understanding to agency which we see in the history of meteorology: a progression from weather forecasting, to weather control.
Cloud computing is the full spectrum deployment of computational thinking to the world, and the internet makes of these clouds a single, vast, planetary weather system....

A deep sense of unease permeates the atmosphere. “Weather,” as the artist Roni Horn has observed, “is the key paradox of our time. Weather that is nice is often weather that is wrong. The nice is occurring in the immediate and individual, and the wrong is occurring systemwide.” Crisis is the new normal.
The cloud, however, remains a model of the world, just not the one we have taken it to mean. The apparent growth of crisis is, in part, a consequence of our new, technologically-augmented ability to perceive the world as it actually is, beyond the mediating prism of our own cultural sensorium. The stories we have been telling ourselves don’t bear out. They’re weak all over. The cloud reveals not the deep truth at the heart of the world, but its fundamental incoherence, its vast and omniferous unknowability.
In place of computational thinking, we must respond with cloud thinking: an accounting of the world which reclaims the recognition and the agency of unknowing. Aetiology is a dead end. The cloud, our world, is cloudy: it remains diffuse and forever diffusing; it refuses coherence. From our global civilisation and cultural history arises a technology of unknowing; the task of our century is to accommodate ourselves with the incoherence it reveals.
clouds  fiber_optics  telecommunications  infrastructure  computing  meteorology  weather  photography  materiality 
12 days ago
Spaces of the Learning Self - e-flux Architecture - e-flux
Chris Abel, for instance, an architect and urban planner who advised on the architecture of schools for the Greater London Council, in 1969 proposed designs for “mobile learning stations.” Conceived to be largely independent from their surroundings, these stations were to be equipped with different filing systems, display panels, work surfaces, technological aids and media.13 The individual student could either learn on their own by immersing themselves in a tailor-made, controlled and programmed environment, or by connecting with other stations to generate variable learning groups. ...

Drawing on environmental psychologist Robert Sommer’s 1969 classic Personal Space: The Behavioral Basis of Design, Van der Ryn emphasized the extent to which architecture shapes the individual’s learning and social intercourse, and how physical forms and administrative arrangements account for significant changes in patterns of human activity. He recommended that instead of being controlled by standardized institutional architecture, people should be encouraged to change their environments, benefiting from the flexibility that comes, for instance, with new plastic building materials....

“Spatial diffusion” is therefore key, given the “extended mobility of most students, and the non-spatial orientation of learning media such as computers and television.”17 The future of university education is, accordingly, media-based, “self-programmed instruction,” and “the place for individual learning will be the home, or a personal study station.” Van der Ryn was an early, though arguably unintentional, proponent of the Californian Ideology, a believer in the potential benefits of networked learning and the emancipatory effects of technological progress.
furniture  intellectual_furnishings  learning_space 
12 days ago
Introduction to the end of an argument (Muqaddimah Li-Nihayat Jidal)/ Speaking for oneself… Speaking for others… on Vimeo
With a combination of Hollywood, European and Israeli film, documentary, news coverage and excerpts of 'live' footage shot in the West Bank and Gaza strip, Introduction to the end of an argument... critiques representation of the Middle East, Arab culture, and the Palestinian people produced by the West.
The tape mimics the dominant media's forms of representation, subverting its methodology and construction. A process of displacement and deconstruction is enacted attempting to arrest the imagery and ideology, decolonizing and recontextualizing it to provide a space for a marginalized voice consistently denied expression in the media.
middle_east  stereotypes  film  archive_art 
12 days ago
DIY Syllabus: What Goes Into a Syllabus | Vitae
what should be on our syllabi.

Who we are and how to interact with us. Obviously we need to provide basic contact information — office location and hours, email address, and phone number. But we need to do more than that....

Our teaching philosophy. Explaining the benefits of visiting us during office hours is the first step of a more extensive description of our teaching philosophy and pedagogical approach to the course as a whole. If our approach is based on student-led discussion, or we use the Socratic Method, or the course is predicated on problem-based learning, we ought to explain our thinking right off the bat.

Sharing our philosophical approach with students via the syllabus allows them to see the course as the product of careful decisions intentionally made. That helps them see who we are as instructors, and gives them insight into the type of environment we’re hoping to create in the classroom. Articulating our philosophy on a syllabus is a useful exercise in and of itself, but it pays larger dividends in making students aware of our expectations and their opportunities for engaging with us, each other, and the course material.

Clear and assessable outcomes. I know that “learning outcomes” and “assessment” are fighting words in some quarters, as they often represent top-down policies that add to faculty workload without any corresponding benefits. But just because “assessment” is an oft-abused term doesn’t mean it isn’t an important principle behind what we do.... Our syllabi should tell students what they’re going to get out of the course: Where will it take them? What will they build? What tools will they take with them into the rest of their academic career? Framing our course objectives as answers to such questions — as opposed to a set of sterile-sounding bullet points labeled “student learning outcomes” — conveys the class goals, the activities, and their lasting value for students....

A road map. Effective syllabi contain a thorough and specific calendar of topics and assignments. Show students when things are coming, and how course components fit together. .... An effective course calendar organizes the semester, not just with dates but with general topics, specific issues, and guiding questions. Students should be able to look at the calendar and not only know what’s due for a particular day, but where that class session fits in the larger framework of the course. For example, a sociology course on contemporary social problems could have each unit’s theme presented as a question...
Effective syllabi, then, should show students:

Exactly what they’ll be asked to do in our courses, and why.
How we’ll assess that work.
The ways in which our course complements their academic program.
And what they’ll gain from the class beyond just basic content knowledge.
If there’s one recurring theme here, it’s putting student learning (not institutional policy) at the heart of the syllabus. A syllabus tells students what they can do in the space we’re creating with the course. It should eschew the easy temptation of listing what they can’t do — a litany of thou-shalt-nots militates against the type of learning environment we want to create. The overarching theme here is one of invitation.
teaching  advising  syllabus 
13 days ago
Design Fiction 2015 | Design Fiction and Imaginary Futures Course Blog @ CMU
This praxis-based course actively engages futures research through the integration of findings from critical readings, ethnographic research, mediated storytelling and hybrid prototyping. Using techniques of inversion, defamiliarization, uncertainty scenarios, everyday practice and good old-fashioned humor, we create objects, systems and experiences that stimulate conversation, debate and understanding. The course seeks to produce a diversity of “what will?” and “what if?” cultural provocations that deeply examine possible, unwanted and seductive futures.
design_fiction  speculation  speculative_design  syllabus 
13 days ago
Artist Profile: Ingrid Burrington | Rhizome
Putting things in a larger context is definitely part of it. One of my big reference points for thinking about artists looking at infrastructure is Robert Smithson’s Monuments of Passaic. I remember seeing it when I was an undergrad and realising, “oh okay, you can just do that.” Which, sure, you can just do that if you’re Robert Smithson and you’re a man and it’s the late 1960s, but more broadly it was realising the ways in which you can see a system differently when you start to incorporate it into a language or a context that seems wholly inappropriate in a way. It’s a useful trick, something that Smithson did very well, and that particular essay is like a standard for it: “I went to a weird place and I didn’t understand what I saw and then I wrote about it.” He nails it.

In terms of thinking about the shift from the “here’s a thing” framing, I think it’s the result of the increase in obscurity of the infrastructures that make everyday life possible in the Western world. Sometimes I think that one of the reasons that people were very interested in the Networks of New York field guide and the premise of being able to “see” the internet on the street wasn’t necessarily because they were all that interested in the street, but because of an anxiety that there is nothing to hold in relation to how we live with technology. This feeling of “I don’t know what Facebook’s doing, I don’t know what Google’s really doing, I don’t particularly feel that I can just trust these systems and there’s nothing tangible to connect back to or point at.” That’s something James Bridle has said a lot, that “we need things to point at!” because the network isn’t some massive abstraction that can only be comprehended by wizards—these are objects, these are systems, and there are human beings who are responsible for these objects and systems.
infrastructural_tourism  visibility  infrastructure 
13 days ago
Goethe’s Colorful & Abstract Illustrations for His 1810 Treatise, Theory of Colors: Scans of the First Edition | Open Culture
The polishing of lenses, and work in optics generally, has a long philosophical pedigree, from the experiments of Renaissance artists and scholars to the natural philosophers of the Scientific Revolution who made their own microscopes and pondered the nature of light. Over a century after Spinoza’s birth, polymath artist and thinker Johann Wolfgang von Goethe published his great work on optics, just one of the many directions in which he turned his gaze. Unlike Spinoza, Goethe had little use for concepts of divinity or for systematic thinking.

But unlike many freethinking aristocratic dilettantes who were a fixture of his age, Goethe — writes poet Philip Brantingham — “was a universal genius, one of those talents whose works transcend race, nation, language, and even time.” It’s a dated concept, for sure, but when we think of genius in the old Romantic sense, we most often think of Goethe, as a poet, philosopher, and scientist. When he turned his attention to optics and the science of color, Goethe challenged the theories of Newton and created some enduring scientific art, which would later inspire philosophical iconoclasts like Wittgenstein and expressionist painters like Wassily Kandinsky.

We’ve featured Goethe’s most important scientific work, Zur Farbenlehre (Theory of Colors), in a previous post. Now we can bring you the superior images above, from a first edition scan at Stockholm’s Hagströmer Medical Library, which hosts a collection of scanned illustrations from dozens of first editions of naturalist texts. The collection spans from a once suppressed physiology text by Descartes—another optics theorist—to Rachel Carson’s 1962 Silent Spring, the book that “launched the modern conservationist movement.” In between, find scans of illustrations and photographs from the works of Carl Linnaeus, Charles Darwin, Louis Pasteur, and dozens of other natural philosophers and scientists who made significant contributions to medical science.
goethe  color 
14 days ago
Visualizing Cities
Visualization as a tool for analysis, exploration and communication has become a driving force in the task of unravelling the complex urban fabrics that form our cities. This platform tries to bring together urban visualization projects from around the globe.
data_visualization  mapping  urban_media  urban_studies 
14 days ago
Looking for a Choice of Voices in A.I. Technology - The New York Times
Conversational computing is holding a mirror to many of society’s biggest preconceptions around race and gender. Listening and talking are the new input and output devices of computers. But they have social and emotional dimensions never seen with keyboards and screens....

Choosing a voice has implications for design, branding or interacting with machines. A voice can change or harden how we see each other. Where commerce is concerned, that creates a problem: Is it better to succeed by complying with a stereotype, or risk failure in the market by going against type?

For many, the answer is initially clear. Microsoft’s artificially intelligent voice system is Cortana, for example, and it was originally the voice of a female character in the video game “Halo.”

“In our research for Cortana, both men and women prefer a woman, younger, for their personal assistant, by a country mile,” said Derek Connell, senior vice president for search at Microsoft. In other words, a secretary — a job that is traditionally seen as female....

But sometimes, if you want people to figure out quickly that they are talking to a machine, it can be better to have a man’s voice. For example, IBM’s Watson, when it talks to Bob Dylan in television commercials, has a male voice. When Ashok Goel, a professor at the Georgia Institute of Technology, adapted Watson to have a female voice as an informal experiment in how people relate to conversational machines, his students couldn’t tell it was a computer....

Gender is just the starting point. Can your A.I. technology understand accents? And can it respond in a way that feels less robotic and at least mimics some sort of human empathy?

“You need a persona,” Mr. Shao said. “It’s a very emotional thing — people would get red, even get violent, if it didn’t understand them. When it did understand them, it felt like magic. They sleep next to them. This is heading for hospitals, senior care, a lot of sensitive places.”...

And, of course, there are regional issues to consider when creating a robotic voice. For Cortana, Microsoft has had to tweak things like accents, as well as languages, and the jokes Cortana tells for different countries.

If a French driver goes into Germany using driving directions voiced by Nuance Communications, the computer will mispronounce the name of a German town with a French accent. The idea is to keep the driver confident by sustaining the illusion that the computer is French.
voice  artificial_intelligence  gender  sound_design  language  place 
15 days ago
Librarians Versus the NSA | The Nation
Audrey Evans, who was a college student in Arkansas at the time, had what she calls a “crystallizing moment” when she heard about a library warning sign that read, “The FBI has not been here,” and then, in smaller type below: “Watch very closely for the removal of this sign.” The subversive message intrigued her. “It was an effort of resistance, and of getting around something, and simultaneously making the public aware of what was going on,” she told me recently. The conversation librarians were having about civil liberties “revealed the values of the public library as an institution, and became a grounding spot for me as a political-consciousness moment. I started thinking that maybe I would want to be a librarian one day.” She got an internship at the Clinton Presidential Library and, after college, went on to study law librarianship. Several other people that I spoke with for this story, including Alison Macrina, told me similar stories about how the activism of librarians after 9/11 shaped their political perspective.

The rebellion eventually attracted enough attention that in a September 2003 speech, Attorney General John Ashcroft attacked the librarians directly, accusing them of “baseless hysteria.” Records had not been sought from libraries under Section 215, Ashcroft insisted, and the FBI had no interest in “checking how far you have gotten on the latest Tom Clancy novel.” Ashcroft used the word “hysteria” five other times throughout the speech, and then again a few days later during a speech in Memphis.
librarians  privacy  civil_liberties 
17 days ago
The Unruly Pleasures of the Mid-Manhattan Library - The New Yorker
But while each of these branches has something unique to offer, the one I keep circling back to is the Mid-Manhattan. I tell myself it’s because they have an incredible selection of books in open stacks, cheerful librarians and guards, and a surprising trove of city services (last year, I applied for my IDNYC there). But really I think it’s because of the library’s waiting-room-at-the-end-of-the-world sense of freedom. I gave a book talk there recently, and it was one of the most engaged crowds I’ve ever spoken to. One woman in the front row cheered as if she were at a rock concert. To me, she exemplified the Mid-Manhattan spirit: a little daffy, infinitely welcoming.

In recent years, there has been much talk about what to do with the various library branches as they continue to adapt to the digital era. Originally, the Mid-Manhattan was due to be sold and its services moved into a fully renovated main building. Now those controversial plans have been scrapped in favor of a new “Midtown Campus Renovation,” and the Mid-Manhattan has been tapped for an overhaul, with designs set to be unveiled later this year, and completed by 2019. The carpets could certainly use a good cleaning, and it would be nice if the elevators and climate control worked better, but I hope the revamp won’t alter the branch’s freewheeling energy. On a recent morning, after checking out a pile of books at the Mid-Manhattan, I headed over to the Schwarzman to sit at an elegant wooden desk in the Allen Room, my work spread out under a pretty green-shaded desk lamp. Before long, someone’s phone pinged and the room grew tense, and I found myself missing the grubby comfort a block and a world away.
libraries  branch_libraries  public_space 
17 days ago
Building Your Sonic Brand — The California Sunday Magazine
BY 2018, the federal government will mandate that all new electric cars make an audible sound. The ostensible reason is to assist blind people — and the rest of us — echolocate the increasing number of silent 2-ton masses rolling down our streets... The regulations do not necessitate mimicking the rumble of a combustion engine. “With electric cars,” said Connor Moore, a 34-year-old San Francisco audio designer, “you can create the sound from scratch.”... Electric cars will make that same transition, including being branded by car model. We will know that it’s a Nissan Leaf; or an Uber self-driving car; or a Google, Apple, or Microsoft car before it even pulls into view. ...

Sonic branding: “My own knowledge of this was simply music for advertising,” he said, but what he discovered instead were “people crafting sound experiences for products and brands.” ... While sonic branding is narrowly defined as music composed for a specific product or company, the larger field Moore works in is technically known as sound design, a concept that extends the notion of our fabricated soundscape to include video games and movies, even the selection of musical tempos in hotel lobbies that alter one’s sense of how pleasant one’s stay has been. These related pursuits overlap, and the terms can get confusing: acoustic branding, sonic mnemonics, music branding, or even sogos (sound logos) and mogos (music logos)...

For BMW, maintaining consistent acoustic character throughout its fleet is an executive job description. BMW doors close with a clunk that must be in concert with the acoustics of the rest of the car, like the notes produced by the exhaust or the hum of the window motor. Most drivers of BMW’s M5, for instance, are probably unaware that the engine sound they hear when driving is a replication piped to the interior of the car via the speakers of the sound system....

“All brands exist on multiplatforms now,” Beckerman said, so that the whole point of a sonic brand is to create a sound that, first, is appropriate to the thing but then to vary it and deploy it in all kinds of locations — in TV ads, on hold with customers, on cellphones, in radio spots, as a browser alert...

For the longest time — in the late ’80s, early ’90s — the noise that greeted you when you booted up your Macintosh was a noxious beep.

Musicologically speaking, it was a tritone, an unpleasant combination of notes known as an augmented fourth. In the Middle Ages, it was called the Devil’s Interval, and the anxiety and melancholy it was alleged to induce in any listener caused the Catholic Church to ban its performance. The Devil’s Interval supplies the opening notes you hear in the title track of the album Black Sabbath....
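As a side note on the musicology (a sketch of mine, not from the article): the augmented fourth spans six of the twelve equal-temperament semitones, so its frequency ratio is 2**(6/12), the square root of two, an irrational ratio that never settles into a simple harmonic relationship. A few lines of Python make the numbers concrete (the function name is hypothetical):

```python
import math

# The tritone / augmented fourth is six semitones wide; in twelve-tone
# equal temperament each semitone multiplies frequency by 2**(1/12).
def tritone_hz(root_hz):
    """Return the frequency a tritone above the given root pitch."""
    return root_hz * 2 ** (6 / 12)

ratio = 2 ** (6 / 12)
print(round(ratio, 4))               # 1.4142 -- i.e. sqrt(2)
print(round(tritone_hz(440.0), 2))   # 622.25 Hz, a tritone above A440
```

The sqrt(2) ratio is why the interval divides the octave exactly in half, and why it cannot be expressed as a ratio of small whole numbers the way a fifth (3:2) can.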

Around the time Apple turned to Reekes, Microsoft hired Brian Eno to score the opening notes for its operating system. The difference is classic. Apple’s chime is elegant, optimistic, commencing. Eno’s score is like a symphony reduced to three seconds — rolling through at least three different movements resulting in the perfect Microsoft intonation: fussy, narrative, involved....

THE TINY RIFFS of sound that Reekes and Moore and Beckerman have produced operate in our ears, when done well, in subconscious ways. Beckerman has a cognitive scientist on staff who collaborates with them to generate sounds that involve “low cognitive load.”...

While some sounds can convey specific meaning, most of these communications are largely subjective. Still, categorizing these connotations is underway. Beckerman has compiled playlists of compositions that, according to his science, can alter one’s mood from optimistic to creative to calming...

Moore said that he is getting more and more queries about creating branded sounds for robots — especially robotic home companions, like Jibo or Echo. ...

Obviously, the least fatal way to communicate via Bluetooth or even a heads-up display in a moving car is through sound. Trying to coordinate those communications so that they happen below the surface of thought, Moore said, is the goal. He described how new compositions for the GPS in a car might not talk directly to the driver anymore, but instead “produce tiny pings on the left side of the car to turn left or on the right side to turn right and utilize the stereo field” in order to “give the driver information without having to look at a device and without screaming, ‘Turn right!’”...
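The left/right ping idea can be illustrated with ordinary constant-power panning, the standard way to place a sound in the stereo field. This is my own sketch, not Moore's actual design; the function name and gain law are assumptions:

```python
import math

# Constant-power panning: left/right gains follow cos/sin of a pan angle,
# so total acoustic power stays constant as the sound moves across the field.
def pan_gains(position):
    """position: -1.0 (hard left) .. +1.0 (hard right) -> (left, right) gains."""
    theta = (position + 1) * math.pi / 4
    return math.cos(theta), math.sin(theta)

left, right = pan_gains(-1.0)   # a "turn left" cue: all energy in the left channel
```

Scaling a short ping sample by these two gains before mixing it into the car's stereo output would place the cue on the appropriate side, which is the effect the quote describes.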

Beckerman said something similar about the home. “Technology is so interwoven into our lives that we now have a lot of computers in our houses,” he said, but “we have to break the tyranny of the screen in terms of how we connect. I don’t know about you, but I don’t want a hundred screens in my house as I start building my connected home.”...

Sonic branders talk about our interactions with our devices as if what they are composing is not so much music as sonic linguistics. Given the near future of a communication singularity among our houses, cars, robots, and our bodies, technology will move beyond the necessity of touching and toward something that altogether resembles a new language. ...

recognize something as spoken in an Amazon patois, a Cartier Mid-Atlantic, or a Walmart drawl
sound_design  acoustic_ecology  noise  gadgets  soundscape 
17 days ago
Some Sketches on Vertical Geographies - e-flux Architecture - e-flux
I’ve long thought that conventional understandings of geography were a little too “horizontal”. That geographical concepts such as production, uneven development, territory, scale, geopolitics and the like tended to be theorized on an assumed horizontal plane of human existence makes sense, because the vast majority of human activity does more-or-less conform to the relatively narrow vertical band on the earth’s surface that can support human life. But human infrastructures and activities also inhabit a vertical axis, from deep sea mining and undersea cables to outer, and even arguably interstellar, space. As others have observed, different topologies of development, politics, urbanism, and the production of space emerge when we begin to consider the vertical dimensions of human world-making.1
What follows are some sketches of case-studies from my own work that have been personally helpful in considering what a theory of vertical geography might encompass. There is nothing comprehensive here, nor anything actually theorized at all. These are simply some examples of things I think about.

-20,000ft (Undersea Cables)
More than 99% of the world’s data travels through fiberoptic cables draped across the ocean floor. Undersea cables encircle the globe at depths of 20,000ft (6,000m), connecting continents and providing the backbone of the world’s telecommunications infrastructure....

0 (Backbone)
Cable landing points are the places where undersea cables come onshore, usually connecting to a building called a cable landing station. The cable landing station is typically a windowless building that supplies power to an undersea cable’s amplifiers and repeaters (a typical undersea cable has between three and four thousand volts applied to it). In many cases, the cable landing station also connects the undersea cable to terrestrial “backhaul” cables, which lead to the common internet backbone, with switches, core routers and other equipment, effectively connecting the undersea cable to the terrestrial internet infrastructure....

2,500ft (Persistent Surveillance)
Modern aerostats began to see widespread use in the 1980s when the US Customs Service installed the Tethered Aerostat Radar System (TARS) at High Rock, Grand Bahamas and Fort Huachuca, Arizona as part of the Reagan-Era “War on Drugs.”4 These airships were designed to provide radar-surveillance of border regions, and have been subsequently deployed throughout the Caribbean and Southwest at: Cudjoe Key, Florida; Deming, New Mexico; Eagle Pass, Texas; Fort Huachuca, Arizona; Lajas, Puerto Rico; Marfa, Texas; Rio Grande City, Texas; and Yuma, Arizona....

25,000ft (Predators, Reapers, Sentinels)...

260,000ft (Numbers)
In the 1920s, amateur shortwave radio operators discovered something unusual about radio transmissions between 1.3 and 30 MHz: when radio transmissions at these frequencies encountered ionized air in the upper atmosphere, they were “backscattered”, “skipping” back to earth. This meant that shortwave signals could be used for long distance communication and other applications beyond the “line of sight” limitations of most radio transmissions. Throughout the Cold War and into the present, amateur radio enthusiasts and state entities have taken advantage of skipping in a number of ways, like stations broadcasting propaganda and news or militaries creating “over the horizon” radar systems and developing detection capabilities in otherwise inaccessible regions. Transmitting shortwave is inexpensive, difficult to censor and ownership of shortwave radio outside the western world is common. Moreover, much machine-to-machine communication happens over shortwave, especially in applications for synchronizing time across global infrastructures, oceanic air traffic control, and weather reporting.
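One standard rule of thumb behind these skip paths, not stated in the article but useful context, is the "secant law": the more obliquely a wave strikes the ionized layer, the higher the frequency it can carry and still be bent back to earth. A minimal sketch (the function name is mine):

```python
import math

# Secant law: the maximum usable frequency (MUF) for an ionospheric hop is
# the layer's critical frequency divided by cos(angle of incidence), where
# the angle is measured from the vertical at the reflection point.
def max_usable_frequency_hz(critical_freq_hz, incidence_deg):
    return critical_freq_hz / math.cos(math.radians(incidence_deg))

# A layer that turns back 7 MHz at vertical incidence supports roughly
# 20 MHz links at a 70-degree oblique angle, well inside the shortwave band.
print(round(max_usable_frequency_hz(7e6, 70.0) / 1e6, 1))   # -> 20.5
```

This is why long, shallow hops can use the upper end of the 1.3–30 MHz range even when the ionosphere would pass those frequencies straight through at steeper angles.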

The most unusual signals skipping between the earth surface and ionosphere are arguably the “number stations”, which typically consist of a computer generated voice reading seemingly random sequences of numbers, usually preceded by a signature piece of music or other unique sound to identify itself. ...

160–2,000km (Low Earth Orbit)
One of the most important things to understand about space, explains military space theorist Jim Oberg, is that “space is unearthly” and that “much ordinary ‘common sense’ doesn’t apply. One has to be cautious about making analogies with everyday life.”6 Oberg’s point is that objects in orbital space inhabit very different topologies than more familiar infrastructures on the planet’s surface. Orbital space isn’t even closely analogous to the strategic notion of “high ground” used in terrestrial military theory, but neither is orbital space smooth and undifferentiated, an unmodulated expanse of nothingness. Orbital space is rather a topology characterized by the gravitational interactions of the sun, earth, moon, and outer planets; by irregularities in the earth’s surface that translate into gravitational peaks and troughs in orbital space; by magnetic fields, solar radiation pressure, and by stray atmospheric molecules that travel upwards. What’s more, the topology of orbital space is strongly influenced by geopolitical and economic policies and conventions of spacefaring nations on the earth below.

Low Earth Orbit (LEO) is generally defined as the orbital region between about 160–2,000km from earth (objects cannot remain in orbit under 160km), and is where the vast majority of earth’s satellites are based. LEO is primarily used for remote sensing and imaging satellites, scientific monitoring, and communications infrastructures such as the Iridium system, the most popular satellite phone network. Low Earth Orbits are also the domain of optical and radar-imaging reconnaissance satellites, which take advantage of their closeness to earth to conduct high-resolution photography over vast swaths of land in short periods of time.
The fact that low earth orbit exists at all is the result of a geopolitical quirk that became a de facto convention. Before the Soviet Union launched Sputnik, no one knew whether a satellite in orbit could be said to violate the territorial integrity of the nations that it overflew. As historian Everett Dolman points out, the Eisenhower administration was secretly elated that the Soviet Union first established the precedent that a satellite in orbit did not violate the sovereignty of the countries that it overflew...

Still, to this day, there is no internationally agreed-upon vertical limit of a nation’s territory. The highest airships and balloons can operate up to an altitude of about 37km, while the lowest satellites can only operate at approximately 160km. ...

36,000km (Geostationary Orbit)
The “space” in “earth-orbit-space” that has most in common with terrestrial territory is the geostationary orbit (GEO), a thin gravitational ring only a few kilometers thick and wide, 36,000km directly above the equator. This space is important to the world’s militaries, intelligence agencies, and corporations because objects placed in GEO orbit the earth at the exact same rate that the earth itself rotates. An object in geostationary orbit effectively “hovers” over a particular place on the earth’s surface, making it an ideal location for communications satellites. Because the ring of space where the geostationary orbit “works” is so thin, and because prime geostationary “slots” are finite, the International Telecommunications Union (ITU) regulates the allocation of space within the geostationary belt.
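The 36,000km figure is not arbitrary: it falls out of Kepler's third law as the one circular orbit whose period matches the earth's rotation (one sidereal day). A minimal Python check, using standard constants (the function name is mine, not from the article):

```python
import math

# Kepler's third law for a circular orbit: r^3 = mu * T^2 / (4 * pi^2).
# Setting the period T to one sidereal day gives the geostationary radius.
MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
SIDEREAL_DAY = 86164.1      # one full rotation of the earth, seconds
EARTH_RADIUS_KM = 6378.0    # equatorial radius, km

def geostationary_altitude_km():
    r = (MU_EARTH * SIDEREAL_DAY ** 2 / (4 * math.pi ** 2)) ** (1 / 3)
    return r / 1000.0 - EARTH_RADIUS_KM

print(round(geostationary_altitude_km()))   # ~35786 km, the "36,000 km" belt
```

Because only one radius satisfies the equation, the "ring" really is as thin as the passage says: any satellite meaningfully above or below it drifts relative to the ground.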
satellites  infrastructure  space  cables  telecommunications  geography 
17 days ago
Libraries and Museums Advance the Digital Humanities: New Grant Opportunity | Institute of Museum and Library Services
All of these projects directly inform and support IMLS’s commitment to improve the National Digital Platform for libraries and museums. The National Digital Platform is the combination of software applications, social and technical infrastructure, and staff expertise that provide digital content, collections, and related services to users in the US. Through grants at a variety of levels across its programs, IMLS has funded a number of projects that contribute to digital humanities work at libraries. This funding has supported projects that capture and mine social media data, convene scholars and other specialists in planning digital collections projects, expand librarians’ expertise in data mining, and other related projects.
digital_humanities  funding  digital_collections  libraries 
19 days ago
nick lally // art, geography, software » Blog Archive » notes on satellite geographies
The geographic importance of satellites as both a tool (ie: remote sensing, satellite imagery, etc) and object of study (ie: mobile maps, military intelligence, surveillance, etc) is well known today in geography. But the geographic imaginary of seeing like a satellite and the desire to see these seeing machines has also been productive in geographic scholarship. For example, Doreen Massey, in her 1993 essay Power-geometry and a progressive sense of place, begins with the view of an imagined satellite, one that lies beyond existing satellites, and is able to zoom in on a place. She writes:

what gives a place its specificity is not some long internalized history but the fact that it is constructed out of a particular constellation of relations, articulated together at a particular locus. If one moves in from the satellite toward the globe, holding all those networks of social relations and movements and communication in one’s head, then each place can be seen as a particular, unique point of that intersection (61).

This is a different view than that of the totalizing vision of the technologically-enhanced primate eye seeing through a satellite that Donna Haraway has critiqued:

Vision in this technological feast becomes unregulated gluttony; all seems not just mythically about the god trick of seeing everything from nowhere, but to have put the myth into ordinary practice. And like the god trick, this eye fucks the world to make techno-monsters (581).

Massey doesn’t use the imaginary of satellite vision to produce an all-knowing perspective (she may even be reiterating contemporaneous critiques of David Harvey’s totalizing views, see for example Boys Town by Rosalyn Deutsche), instead she zooms in to a particular place. But the zoomed-out satellite view is still important in seeing the vast global networks that intersect in each particular place, allowing her to call for a “global sense of the local” (68).

If the imagined satellite view is productive in Massey’s work, Trevor Paglen flips the script by attempting to photograph the visual traces of satellites that are ostensibly hidden from view. His The Other Night Sky photographs are produced through detailed research that affords us a technologically mediated view of the watchers:

“The Other Night Sky” is a project to track and photograph classified American satellites, space debris, and other obscure objects in Earth orbit. The project uses observational data produced by an international network of amateur satellite observers to calculate the position and timing of overhead transits, which are photographed with telescopes, large-format cameras, and other imaging devices.
satellites  satellite_imagery  machine_vision  visibility  geography 
19 days ago
Moholy-Nagy, “Room of the Present,” 2009
Room of the Present (Raum der Gegenwart) is a multipart installation on view in the Guggenheim’s current exhibition Moholy-Nagy: Future Present, through September 7. The large-scale work is a present-day fabrication of an exhibition space the artist conceived in 1930, but that went unrealized in his lifetime. Room of the Present offers the experience of stepping inside one of Moholy-Nagy’s artworks. Film booths, revolving panoramic photography, and replicas of industrial designs are among the chorus of objects, images, and effects in the room that showcase vanguard mediums of the early 1930s.
installation  media_history  exhibition_design  moholy_nagy 
19 days ago
Moholy-Nagy: Future Present
Among Moholy-Nagy’s radical innovations were his experiments with cameraless photographs (which he dubbed “photograms”); his unconventional use of industrial materials in painting and sculpture; experiments with light, transparency, space, and motion across mediums; and his work at the forefront of abstraction, as he strove to reshape the role of the artist in the modern world. Moholy-Nagy: Future Present features paintings, sculptures, collages, drawings, prints, films, photograms, photographs, photomontages, projections, documentation, and examples of graphic, advertising, and stage design drawn from public and private collections across Europe and the United States.
On display in the museum’s High Gallery is Room of the Present (Raum der Gegenwart), a contemporary fabrication of an exhibition space conceived of by Moholy-Nagy in 1930, but not realized in his lifetime. On view for the first time in the United States, the large-scale work contains photographic reproductions and design replicas as well as his kinetic Light Prop for an Electric Stage (Lichtrequisit einer elektrischen Bühne, 1930; recreated 2006). Room of the Present illustrates Moholy-Nagy’s belief in the power of images and the significance of the various means with which to view and disseminate them—a highly relevant paradigm in today’s constantly shifting and evolving technological world.
moholy_nagy  exhibition_design  media_history  immersion 
19 days ago
The Avery Review | Reset, Modernity.
An exhibition–as–thought experiment runs the risk of being curatorially heavy-handed, sublimating authorship with argument and only mobilizing work as evidence of a claim formed before the show was conceived. But the format can also serve as a venue to test ideas and engage with others testing the same ideas in different fields and different media, transforming the gallery into a space of interdisciplinarity for more than interdisciplinarity’s sake and destabilizing the notion of the “art object.” Some critiques leveled at the show have suggested that Latour simply instrumentalized art and artists, undercutting their own authorship and meaning with didactic explanation to serve a linear argument. Reset Modernity! assembles a staggering collection of work from architects, filmmakers, sociologists, anthropologists, photographers, and painters. If the curators had not pulled together so robust and heterogeneous a show, then yes, it’s possible the work could have been subsumed by the curatorial hand. But here it stands on equal footing, in dialogue with each curatorial provocation as well as much, much more. The audience is, after all, free to dispense with the handy flipbook and wander the gallery on its own terms. Partitions interrupt a smooth and linear flow of foot traffic at unexpected angles, fragmenting the gallery floor and hinting that there are multiple paths through this field.
exhibition_design  actor_network  anthropocene  latour 
20 days ago
— Infrastructural Inversion: or how to “open” black...
Infrastructural inversion tries to deduce the underlying classification systems and standards from a small portion of the infrastructural complex. One example that comes to mind for databases is the project Other People Also Bought by artists Sebastian Schmieg and Jonas Lund (2013). They wrote a script that picks up the automated shopping recommendations that Amazon pulls from its database based on user behavior. The script then adds one of these recommended products to the shopping cart, thus altering not only the user data but also triggering another round of recommendations. With this playful project Schmieg and Lund not only explore the recommendation algorithms but, in a deeper sense, expose the particularities of data storage and the structuring of data. They address the very existence of a new kind of data – the meta-data recorded from user interactions. These recordings create a new form of data-based subjectivity, a particular data body of the particular user, who in most cases is forced into producing his/her own transactional data body.
Further, there are more artists’ works worth discussing from the perspective of database infrastructure – classics such as Lisa Jevbratt’s 1:1 (1999), Heath Bunting’s The Status Project (since 2004), Paolo Cirio’s Amazon Noir (2007/08), and Natalie Bookchin’s Testament (2009), all of which either created databases for their own use or explored existing ones. Although it was not their direct aim to address the database as infrastructure, and although they have mostly been interpreted in other contexts, a re-reading from a database perspective could reveal more about the underlying politics of databases.

Furthermore, non-artistic examples of infrastructural inversion of databases exist as well: for instance, one could explore the popular WordPress content management system, which runs on a MySQL database backbone, using the plugin Query Monitor. It makes visible each database call ‘inside’ the WordPress system and gives us an idea of what might be stored in database tables such as wp_posts, wp_comments or wp_users.
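As an illustrative sketch of the kind of query Query Monitor surfaces, here is a toy table modeled on the standard wp_posts schema (the column names match the stock WordPress schema, but this uses SQLite purely as a stand-in; real installs run MySQL, and the sample rows are invented):

```python
import sqlite3

# In-memory stand-in for a WordPress database, using a small subset of the
# columns that the real wp_posts table defines.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE wp_posts (
        ID INTEGER PRIMARY KEY,
        post_title TEXT,
        post_status TEXT,
        post_type TEXT
    )
""")
conn.executemany(
    "INSERT INTO wp_posts (post_title, post_status, post_type) VALUES (?, ?, ?)",
    [("Hello world!", "publish", "post"),
     ("Draft notes", "draft", "post"),
     ("About", "publish", "page")],
)

# The shape of call Query Monitor would expose when the front page renders:
# only published posts, not drafts or pages, are pulled from storage.
rows = conn.execute(
    "SELECT post_title FROM wp_posts "
    "WHERE post_status = 'publish' AND post_type = 'post'"
).fetchall()
print(rows)
```

Even this miniature inversion shows how much classification (status, type) is baked into the storage layer before any content reaches the page.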
Other, more limited means to explore DBMS are APIs, which are sets of commands to access the – so to say – public part of the large DBMS of companies such as Google, Facebook, Twitter, and so on. However, this seemingly open access is restricted in its own way. Exploration is only open to a certain extent that fosters, but does not jeopardize, corporate exploitation of data (cf. Taina Bucher).
While accessing databases through their interfaces forms one area of contesting DBMS, another form is infrastructural tourism (Shannon Mattern), which means actually visiting the material parts of a certain infrastructure. Boaz Levin and Ryan Jeffery did so when discussing the ideology behind the “Cloud” in their video essay All That is Solid Melts Into Data (2015), and Timo Arnall visited and filmed the Telefonica Data Center in Spain for his film Internet Machine (2014). Both films trace the physical layers of network infrastructure and – in my reading – also refer to infrastructural inversion.
We can only speculate whether analogous to “food porn” a genre called “infrastructure porn” will develop in the near future. Actually, there is a whole film genre that could be called “infrastructure porn”, but for reasons of deference it is labeled disaster movies. ...

Another infrastructural area of DBMS that could be considered and has not been addressed yet comprises the production process and its producers (often male engineers). If we look for access points where the un-black-boxing could take place, we may end up with the producers and managers who heavily influence the structure of data storage and who have yet to be recognized as addressees for political claims. We may also end up with legislation, regulation and standardization, yet another field that is highly unattractive for artists. There may, however, be hope: The Political Spectrum (2008) was a collaborative mapping effort initiated by artist Julian Priest to discuss the changing regulation of airwaves. It is a beautiful project that demonstrates the potentials and restrictions of artistic work in the infrastructural field.
infrastructure  databases  labor  programming 
20 days ago
Generous Thinking: Introduction – Planned Obsolescence
Those of us whose work focuses on During’s “core humanities” are often understandably queasy about our fields’ development out of the projects of nationalism and cultural dominance, and are left leery about stating clearly and passionately the values and goals that we bring to our work. And our preferred strategy for contending with such ambivalence is to complicate; to demonstrate from a rigorously theorized position the ways that we are engaged in a progressive, if not radical, project; to read, as they say, against the grain.

This form of complication is utterly necessary, not just to our methodologies but to our sense of ourselves in the act of our work: it is clear that the disciplinary force wielded by our fields has too often been put to dangerous use, and it is necessary to account for the subtleties of our positions. Our work thus repeatedly explores, as Rita Felski describes it, our suspicious “conviction” that not only the texts that we study but also the ways that we have been led to study them are “up to no good” (58). This is where my graduate students began their engagement with that article. The problem is not just that they were then unable to articulate in any positive sense what the article was actually trying to accomplish, but that the critical position they assumed was the only position they had available to them. And however much this internally-focused mode of critique has done to advance the field and its social commitments — and I will stipulate that it has done a lot — this form of engagement is too often illegible to the many readers around us, including students, parents, administrators, and policymakers. What they see looks like discomfort with the field in which we work, ambivalence about the materials we study, resistance to the culture in which we live, and a seemingly endless series of internal arguments, all of which might well lead them to ask what is to be gained from supporting a field that seems intent on self-dismantling....

David Scobey, then the dean of the New School for Public Engagement. His suggestion was that scholarly work in the humanities is in a kind of imbalance, that critical thinking has dominated at the expense of a more socially-directed mode of what he called “generous thinking,” and that a recalibration of the balance between the two might enable us to make possible a greater public commitment in our work, which in turn might inspire a greater public commitment to our work. I want, humanities scholar that I am, to revise this model slightly, by nudging us away from the notion that critical thinking and generous thinking are somehow opposed categories, in tension with one another, pulling us in different directions and requiring us to walk the tightrope between. Instead, I want to think about how these two modes of thought might be more fruitfully intertwined. What kinds of discussions might be possible — discussions among ourselves, discussions with our students, but also discussions with a much broader series of publics, with those whose support we require in order to keep doing the work we do — what such discussions might be possible if we understood the very foundation of our critical thinking practices to be generosity?

...a kind of generosity of mind, by which I mean to indicate an openness to possibility. That openness begins for me by cultivating a listening presence, which is to say a conversational disposition that is not merely waiting for my next opportunity to speak but instead genuinely hearing and processing what is being said to me, underwritten by the conviction that in any given exchange I likely have less to teach than I have to learn. It also means working to think with rather than against, whether the objects of those prepositions are texts or people. It means, as Lisa Rhody has explored in a brilliant blog post on the applicability of improvisational comedy’s “rule of agreement” to academic life, adopting a mode of exchange that begins with yes rather than no: as she describes it, among colleagues, the rule of agreement functions as “a momentary staving off of the impulse to assume that someone else’s scholarship is fashioned out of ignorance or apathy or even ill will or that the conversation was initiated in bad faith. Agreement doesn’t have to be about value: it’s not even about accuracy or support. The Rule of Agreement is a social contract to respect the intellectual work of your peers.” ...

A proper valuation of public engagement in scholarly life, however, will require a systemic rethinking of the role that prestige plays in the academic reward system — and this, as I’ll discuss later in the project, is no small task. It is, however, crucial to a renewed understanding of the relationship between the university and the common good.

Similarly, grounding our work in generous thinking might not only encourage us to adopt a position of greater dialogic openness, and might not only foster projects that are more publicly engaged, but it might also lead us to place a greater emphasis on — and to attribute a greater value to — collaboration in academic life. It might encourage us to support and value various means of working in the open, of sharing our writing at more and earlier stages in the process of its development, and of making the results of our research more readily accessible to and usable by more readers. Critical thinking often presupposes a deep knowledge of a subject, not just on the part of the speaker but of the listener as well. Generous, generative modes of thinking invite non-experts into the discussion, bringing them along in the process of discovery.....

But I want to acknowledge that adopting a mode of generous thinking is a task that is simultaneously extremely difficult and easily dismissible. We are accustomed to a mode of thought that rebuts, that questions, that complicates, and the kinds of listening and openness for which I am here advocating may well be taken as acceding to a form of cultural naïveté at best, or worse, a politically regressive knuckling-under to the pressures of neoliberal ideologies and institutions. This is the sense in which Rita Felski suggests that scholars have internalized “the assumption that whatever is not critical must therefore be uncritical” (9). Felski posits, in contrast to the most common assumptions within the field today, that the critical is not a project but instead a mood, a mode of self-performance, an affect — and one to which we have limited ourselves at great cost. ...

it is all but certain that at some future moment our own blind spots, biases, and points of general ignorance will have been uncovered. Refusing to countenance the possibility of this wrongness makes it all but inevitable, but perhaps keeping it in view might open us to some new possibilities. If everything we write today already bears within it a future anterior in which it will have been demonstrated to be wrong-headed, there opens up the opportunity to explore a new path, one along which we develop not just a form of critical audacity but also a kind of critical humility.

The use of this critical humility, in which we acknowledge the possibility that we might not always be right, is in no small part the space it creates for genuinely listening to the ideas that others present, really considering their possibilities even when they contradict our own thoughts on the matter. But critical humility, as you might guess, is neither selected for nor encouraged in the profession, and it is certainly not cultivated in grad school. ...

As I hope that many of us have learned from movements like Black Lives Matter and from the protests taking place on our campuses, what many have long known: the most difficult work of the ally may well be in adopting a position of listening in a mode that does not presume to know but that seeks instead to put the self aside in the hope of possible understanding. Genuine empathy, that is to say, is not a feel-good emotion, but an often painful, failure-filled process of what Dominick LaCapra has referred to as “empathic unsettlement,” in which we are continually called not just to feel for but to simultaneously acknowledge the irreconcilable otherness of the other, seeking to fully apprehend difference without tamping it down into bland “understanding.” ...

To ask us to open ourselves to the possibilities that love might present if it were to ground our work is not to take us back to some belletristic moment in which our energies are spent acting out our aesthetic appreciation or, worse, emoting over the text; there is still serious analytical work to be done, even if we allow ourselves to express the passion that our subject matter inspires. But we must also encounter in so doing that some of our anxieties about stepping back from the critical mood arise out of a worry that what we do will be perceived as purely affective or inspirational, unworthy of being supported as a serious form of research. We worry, as Deidre Lynch has explored, that genuinely loving something turns one into an amateur in the literal sense of the word: one so devoted to a practice that one ought to be willing to do it for free.
critique  liberal_arts  academia  UMS  advising  generosity 
21 days ago
The companies that control America's internet — Quartz
The question of who should bear those infrastructural costs (consumers, telecoms, or content providers) has a lot to do with how you conceive of the internet as infrastructure and utility (which, in theory, the FCC’s net neutrality rules are supposed to do). While the rhetoric of Silicon Valley might emphasize going it alone, staying nimble, moving fast, and breaking things, today’s major telecoms exist in the legacy of heavy infrastructure—rail, natural gas, mining—and corporate conquest. If we accept the idea of that infrastructure as a utility, then understanding who these companies are, what they’re actually doing, and how much of the network they control is really important for holding them accountable and keeping the possibility of an open internet alive.
infrastructure  ownership  internet  utilities 
21 days ago
The story of gutta percha, the humble tropical tree sap, which laid the foundation for telecommunications — Quartz
There was a deep irony at the core of the gutta percha-enabled telegraphy craze. While the latex was fuelling the state of the art in global electrical communications, the methods by which the substance was collected by local woodsmen in its native Southeast Asia remained basic—and environmentally destructive.
infrastructure  media_archaeology  media_history  fiber_optics  cables  materiality  nature  colonialism 
21 days ago
Foucault-Blog - UZH - Forschungsstelle für Sozial- und Wirtschaftsgeschichte: From Media Archaeology to Media Genealogy An Interview with Erkki Huhtamo
Zielinski did not speak of 'media archaeology' back then although he later claimed that he had invented the concept. The first time I referred to my historical studies explicitly as 'media archaeology' in a major context was in a keynote lecture that I gave at the International Symposium of Electronic Art in Helsinki in 1994. This speech was later published in slightly modified form in the journal Leonardo and in an edited volume titled Electronic Culture, next to articles by Kittler, Zielinski, Lev Manovich, Katherine Hayles, Sherry Turkle, and other theorists of digital media.[2]

As usual in the humanities, the concept 'media archaeology' emerged as a combination of various intellectual interests. For many of us, Foucault's The Order of Things and The Archaeology of Knowledge were highly influential.[3] These books were clearly an inspiration not only for me, but also for Kittler and Jonathan Crary, who published his controversial study Techniques of the Observer in 1990....

I'm not very much connected to this German tradition although I have been reading their works for a long time. I was more influenced by Anglo-American cultural studies but also by the French Annales School, by classics of cultural history like Johan Huizinga, and by cultural semiotics along the lines of Umberto Eco and Roland Barthes. Reading signs for hidden ideological formations is still important for me. After all, I'm a cultural historian by formation, a humanist who is interested in material devices and cultural techniques, but considers them as the outcome of discursive practices, not as factors that determine processes of communication.

Kittler, however, was never a professional historian, and it shows. Foucault was never a historian either, and it shows, too. I certainly understand the critique of these figures, coming from trained historians. Although I am intrigued by Foucault's and Kittler's writings, I cannot trust them when it comes to historical accuracy. Kittler's work is full of mistakes and weird misunderstandings as well as deliberate, playful superimpositions of contemporary ideas on the past. I don't think that Kittler with his brilliant but reckless philosophical mind was interested in a sort of historical exactness I strive for. He was more interested in projecting his reflections into a historical setting and seeing what emerges from such confrontations. ... However, Kittler's intellectual world is much broader than Wolfgang Ernst's, who is more narrowly focused on technological systems and their supposed agency. I don't share his 'cold gaze' that excludes the kind of discursive interests that occupy me. That's curious, because Wolfgang's academic formation is not very different from mine. He needed to break with it, go in a radically different direction....

The point is that my understanding of media archaeology has a very strong empirical foundation. I don't have much appreciation for scholarship that is based on shaky factual grounds. In my impression, there seem to be plenty of media theorists, even those who call themselves media archaeologists, who have three shelves of books which they study very deeply, and then write new books based on their deep study of three shelves of books. For me, this is an easy way out. I like to believe that I've taken the hard way out. I made a very serious effort for years to visit archive after archive, and I learned different languages to have the skills to do so. In other words, I approach media archaeology as a historian, not as a philosopher. However, there are also very serious and studious theorists who engage in deep historical research and develop very complex systems of ideas. Bernhard Siegert is one of the foremost....

For me, a technological device, a piece of hardware, is not a medium, it cannot be a medium. It only becomes part of media culture when it's put into practice. This practice has material aspects of course, but it also unfolds on much more abstract levels when a medium gets transfigured by the people who use it. This is a question that I explored in my book Illusions in Motion, investigating how the moving panorama could exist on different cultural levels that did not absolutely overlap.[9] Materially, there are colors put on a canvas. But there are also lecture performances where the canvas becomes a medium associated with all kinds of other devices and practices. It is also surrounded by various discourses, advertising, word of mouth. ...

My research is equally involved with synchronic and diachronic processes. Obviously, topoi travel in time, but I am not so much interested in the antiquarian question of their origins. I rather explore them in the contexts of their manifestations. Still, I have always been suspicious of Foucault's episteme. How does one get from symptomatic traces to a representative whole? How can one make general claims about overwhelming cultural transitions happening at a certain point in time? This is only possible if you deliberately exclude many aspects of that historical reality. As a historian, I was trained to critique such totalistic views of culture. ...

Well, a topos may be a cultural pattern, but it doesn't function like a clockwork. These formulas are not submitted to regulatory mechanisms. They appear from time to time, and it's worth asking why they appear at certain times but not others. The answer, however, doesn't stem from macro-level explanations. Culture is a multilayered phenomenon where changes never happen on all layers at once. The relations are too complex to be captured in such general assumptions....

there are risks in this dialogical approach. The main one is superimposing models from the present on the past and hence mistreating historical reality. And as I said, it's not by chance a typical accusation made against Foucault, Kittler, and also Jonathan Crary. Their histories are selective because they are pursuing certain agendas rather than respecting the complexity of a historical moment. In this regard, I'm with micro historians like Carlo Ginzburg who attempt to penetrate a situation back in time, a moment that only exists in the countless traces it has left behind...

Simon: There is no doubt that an interpretation of the past in terms of the present day is poor historiography. But when do we remain true to a historical moment? Only if we give the full account, a histoire totale? Or can we select elements that appear repeatedly and lead up to our present situation? ... The diversity of your topos reminds me of the dispositif. You know there are two major versions of this influential concept: the ahistorical apparatus theory stemming from Jean-Louis Baudry's dispositif cinématographique;[16] and Foucault's historical analysis of patterns of relations that connect heterogeneous elements.[17] I think one of the main challenges of today's media studies is the combination of these diverging applications of the dispositif concept.
media_archaeology  intellectual_history  methodology  historiography 
21 days ago
U.S. Tech Giants Are Investing Billions to Keep Data in Europe - The New York Times
In the battle to dominate Europe’s cloud computing market, American tech giants are spending big to build up their local credibility.

Amazon Web Services, the largest player, announced last week that it would soon open multiple data centers in France and Britain. Google, which already has sites in countries like Finland and Belgium, is expected to finish a new multimillion-dollar data complex in the Netherlands by the end of the year....

With many in Europe questioning why America’s largest tech companies control how many of the region’s 500 million citizens use everyday digital services, it is not surprising that the likes of Microsoft and Amazon are eager to play up their local roots....

As the European Union continues to clamp down on the perceived misuse of people’s digital information, analysts also say that many Silicon Valley giants are responding to these privacy concerns by increasingly offering individuals and companies the ability to keep information close to home, whereas in the past, data might have been stored solely in the United States.

“Countries like Germany are well aware of data privacy, and it has made them more wary of where data is kept,” said Gregor Petri, a cloud computing analyst at the technology research firm Gartner in Veghel, the Netherlands. “Local data sovereignty has become important, and American companies are now aware of that.”
data_centers  cloud  data_sovereignty  privacy 
21 days ago
Mapping Gentrification Risk in New York City – The Map Room
The Displacement Alert Project Map is a tool built by the Association for Neighborhood and Housing Development that maps, building by building, the risk of gentrification in New York City—i.e., where the rent is about to get too damn high. Intended for use by housing advocates, tenant organizers, community groups and others, the map calculates the risk of displacement—being pushed out of affordable housing—based on several factors. “Access to this data equips communities with information necessary to fight back against the displacement of residents who are being priced out and pushed out of their neighborhoods, to stop the harassment of tenants by bad landlords, and to prevent the expiration and loss of affordable housing units.”
mapping  methodology  housing  gentrification 
23 days ago
Enhance Library User Experience with ‘Design Thinking’ » Public Libraries Online

Describing how a design thinking process or exercise works in words is tough. There’s almost always a magic moment somewhere in the process where everyone looks around with that “eureka!” sparkle in their eyes. It’s experiential, iterative, and a lot of fun. The process facilitates suspension of judgment, rampant brainstorming, and the generation of crazy moonshot ideas.  It requires stepping out of your comfort zone, though, and can feel chaotic and raw.

How do I do this thing?

It’s important to understand that design thinking is not a discrete series of steps, and there’s no easy checklist that you can go through and be done. This is a mindset shift and, if you commit to it, it will change the way you see the world. The best tool I’ve found to help with this shift in thinking is IDEO’s design toolkit. IDEO is a design company whose website provides a wealth of ideas, developed specifically for libraries, on how to get started. The website has an array of tools, resources, and exercises that you can use, including “how might we?” statements, personas, and rapid prototyping. I’ve also co-written (with my colleague Callan Bignoli) a LibGuide on User Experience for Librarians where you can find a Prezi we use in our workshops, as well as a targeted series of blurbs, videos and exercises.

The only way to truly understand design thinking is to jump in with both feet. Grab some Post-Its and Sharpies and give it a go. One of the tenets is to fail early and often – that’s the only way to learn.
user_experience  libraries  design_thinking  tragic 
23 days ago
Cobalt mining for lithium ion batteries has a high human cost - Washington Post
This remote landscape in southern Africa lies at the heart of the world’s mad scramble for cheap cobalt, a mineral essential to the rechargeable lithium-ion batteries that power smartphones, laptops and electric vehicles made by companies such as Apple, Samsung and major automakers....

The world’s soaring demand for cobalt is at times met by workers, including children, who labor in harsh and dangerous conditions. An estimated 100,000 cobalt miners in Congo use hand tools to dig hundreds of feet underground with little oversight and few safety measures, according to workers, government officials and evidence found by The Washington Post during visits to remote mines. Deaths and injuries are common. And the mining activity exposes local communities to levels of toxic metals that appear to be linked to ailments that include breathing problems and birth defects, health officials say....

It moves from small-scale Congolese mines to a single Chinese company — Congo DongFang International Mining, part of one of the world’s biggest cobalt producers, Zhejiang Huayou Cobalt — that for years has supplied some of the world’s largest battery makers. They, in turn, have produced the batteries found inside products such as Apple’s iPhones — a finding that calls into question corporate assertions that they are capable of monitoring their supply chains for human rights abuses or child labor....

Apple, in response to questions from The Post, acknowledged that this cobalt has made its way into its batteries. The Cupertino, Calif.-based tech giant said that an estimated 20 percent of the cobalt it uses comes from Huayou Cobalt....

Scrutiny is heightened for a few of these minerals. A 2010 U.S. law requires American companies to attempt to verify that any tin, tungsten, tantalum and gold they use is obtained from mines free of militia control in the Congo region. The result is a system widely seen as preventing human rights abuses. Some say cobalt should be added to the conflict-minerals list, even if cobalt mines are not thought to be funding war. Apple told The Post that it now supports including cobalt in the law....

But how such serious problems could persist for so long — despite frequent warning signs — illustrates what can happen in hard-to-decipher supply chains when they are mostly unregulated, low price is paramount and the trouble occurs in a distant, tumultuous part of the world....

Lithium-ion batteries were supposed to be different from the dirty, toxic technologies of the past. Lighter and packing more energy than conventional lead-acid batteries, these cobalt-rich batteries are seen as “green.” They are essential to plans for one day moving beyond smog-belching gasoline engines. Already these batteries have defined the world’s tech devices....

The worst conditions affect Congo’s “artisanal” miners — a too-quaint name for the impoverished workers who mine without pneumatic drills or diesel draglines.

This informal army is big business, responsible for an estimated 10 to 25 percent of the world’s cobalt production and about 17 to 40 percent of production in Congo. Artisanal miners alone are responsible for more cobalt than any nation other than Congo, ranking behind only Congo’s industrial mines...

Diggers don’t have mining maps or exploratory drills. Instead, they rely on intuition. “You travel with the faith believing that one day you can find good production,” said digger Andre Kabwita, 49. Nature is said to be one guide. Yellow wildflowers are considered a sign of copper. A plant with tiny green flowers carries the telling name “la fleur du cobalt.”

Pay is based on what they find. No minerals, no money. And the money is meager — the equivalent of $2 to $3 on a good day, Nsenga said....

Cobalt is the most expensive raw material inside a lithium-ion battery. That has long presented a challenge for the big battery suppliers — and their customers, the computer and car makers. Engineers have tried for years to craft cobalt-free batteries. But the mineral best known as a blue pigment has a unique ability to boost battery performance. The price of refined cobalt has fluctuated in the past year from $20,000 to $26,000 a ton....

While a smartphone battery might contain five to 10 grams of refined cobalt, a single electric-car battery can contain up to 15,000 grams....

While cobalt mining is not thought to be funding wars, many activists and some industry analysts say cobalt miners could benefit from the law’s protection from exploitation and human rights abuses. The law forces companies to attempt to trace their supply chains and opens up the entire route to inspection by independent auditors.

But while Congo is a minor supplier of the four designated conflict minerals, the world depends on Congo for cobalt....

Starting next year, Apple will internally treat cobalt like a conflict mineral, requiring all cobalt refiners to agree to outside supply-chain audits and conduct risk assessments.

Apple’s action could have major repercussions throughout the battery world. But change will be slow. Apple spent five years working to certify that its supply chain was free of conflict minerals — and that action was enforced by law.
mining  geology  materiality  labor  technology  cell_phones  energy  supply_chain  minerals 
23 days ago
Police surveillance: The US city that beat Big Brother - BBC News
"We saw some things that raised questions. Why are they running fibre optic cables out there? That kind of thing," says BondGraham. Winston recognised the name of a security company on a council agenda and knew immediately what they were dealing with - a Domain Awareness Centre.

Most cities, including Oakland, have cameras monitoring traffic intersections and public areas. But a Domain Awareness Centre, or DAC, is far more sophisticated. It is still based around a bank of screens, but the camera feeds are augmented by data from weather reports, shipping movements, social media chatter, email records, emergency calls and other data sources. The port of Oakland had been given federal funds in 2008 to build a DAC as part of a post-9/11 push to protect critical infrastructure from terrorist attack.

At some point, the city council decided to extend the system to cover the whole of Oakland and its population of 400,000 people....

Hundreds of new cameras would be installed across the city and data would be incorporated from number plate readers, gunshot-detection microphones, social media, and, in later phases, facial recognition software and programs that can recognise people from the way they walk.
The city said it needed an early warning system to give "first responders" a head start when dealing with emergencies like chemical spills and earthquakes, as well as major crime and terrorist incidents.
But privacy campaigners in the city were alarmed at the thought of the Oakland Police Department having access to an all-pervasive real-time surveillance network - particularly one that did not have a policy on what data would be stored and for how long....

With the city council tied on the issue, Oakland's then mayor Jean Quan, who had originally been in favour of the DAC, used her casting vote to back a motion that would dramatically scale it back so that it would be focused solely on the port, as originally planned....

"It's not the ordinary citizens. We want cameras. We want our safety. Because you can't walk down your street without worrying about whether someone is going to randomly shoot at you. Every night you hear gunshots going off."
Oakland is a high-crime city, averaging 109 homicides a year for the past 45 years. Many residents and businesses have invested in their own security cameras and are happy to share their contents with law enforcement...

Brian Hofer agrees that security cameras can prevent crime but says there is no evidence that mass surveillance does. And he argues that police departments only turn to "shiny gadgets" when relations with the public they are meant to protect, and on whom they rely as witnesses, have broken down.
"Instead of trying to repair these relationships we are just throwing more surveillance equipment at the problem. We are smart people here in Oakland. We have Silicon Valley right up the road and we just think all these new tools are going to solve our problems but it just doesn't work."...

Last week the ACLU launched proposed legislation in 11 US cities, including New York and Washington DC, that would, if passed, establish community control over police surveillance.
The initiative is inspired, in part, by the Black Lives Matter campaign, although many of the guidelines, such as an annual surveillance audit, come straight from the Oakland Privacy playbook....

Many of the systems being offered for sale to law enforcement agencies across the US, and around the world, were developed by defence giants for use on the battlefields of Iraq and Afghanistan. Here is a small selection:
Stingray fake phone masts
About the size of a suitcase, Stingrays work by pretending to be a phone tower in order to strip data from nearby devices, enabling police to track suspects without a warrant. They are also capable of accessing the content of calls and texts. The next generation of the device, Hailstorm, is now on the market.
Number plate readers
Police cars mounted with automatic number plate readers are thought to be in use in many US cities, gathering data on the location and movements of drivers. Research in Oakland found black neighbourhoods were being disproportionally targeted.
Crime prediction software
Software is being used by police in the US and UK that analyses crime statistics to predict where it will happen next. Microsoft, IBM and Hitachi are among the big players moving into this market. The latest Hitachi "crime visualisation" software - effectively a Domain Awareness Centre on your computer desktop - is being trialled in Washington DC and is demonstrated in this YouTube video. There is also growing concern about the use of social media analysis software, which monitors hashtags such as BlackLivesMatter and PoliceBrutality to identify "threats to public safety".
Surveillance enabled light bulbs
LED light bulbs marketed as energy-efficient upgrades to existing light bulbs on city streets that can contain tiny cameras and microphones linked to a central monitoring station.
Through the wall sensors
These use radar to peer through the walls of buildings - currently precise enough to show how many people are in a particular room.
X-Ray, or 'backscatter" vans
Mobile units that use X-ray radiation to see underneath clothing and car exteriors.
Aerial surveillance
The use of light aircraft to record continuous high definition footage of a city - recently discovered, and stopped, in Baltimore, following a public outcry. Police departments across the US, and in cities around the world, are also buying drones for surveillance.
Listening devices
Shotspotter microphones have been around for more than a decade and are thought to be in use in at least 90 US cities. They are designed to improve police response times but there are concerns they could be used to listen in to conversations.
surveillance  urban_media  privacy 
24 days ago
Deep-Fried Data
Machine learning privacy archives preservation copyright

Machine learning is like a deep-fat fryer. If you’ve never deep-fried something before, you think to yourself: “This is amazing! I bet this would work on anything!”

And it kind of does....

In our case, the deep-fryer is a toolbox of statistical techniques. The names keep changing—it used to be unsupervised learning, now it’s called big data or deep learning or AI. Next year it will be called something else. But the core ideas don't change. You train a computer on lots of data, and it learns to recognize structure....

So what’s your data being fried in? These algorithms train on large collections that you know nothing about. Sites like Google operate on a scale hundreds of times bigger than anything in the humanities. Any irregularities in that training data end up infused into the classifier.

For this reason I've referred to machine learning as money laundering for bias. It's easy to put anything you want in training data....
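The "money laundering for bias" point can be made concrete with a toy sketch (my own illustration, not any system the author describes): a trivially simple frequency-based classifier trained on skewed examples will hand that skew right back as a "prediction", with no trace of where it came from.

```python
from collections import Counter

def train(examples):
    """Count how often each word co-occurs with each label."""
    counts = {}
    for text, label in examples:
        for word in text.lower().split():
            counts.setdefault(word, Counter())[label] += 1
    return counts

def predict(counts, text):
    """Return the label whose training words best match the input."""
    score = Counter()
    for word in text.lower().split():
        score.update(counts.get(word, Counter()))
    return score.most_common(1)[0][0] if score else None

# Skewed training data: "nurse" only ever appears with the label "she",
# "engineer" only ever with "he".
data = [("the nurse said", "she"), ("the nurse smiled", "she"),
        ("the engineer said", "he"), ("the engineer smiled", "he")]
model = train(data)
print(predict(model, "the nurse arrived"))  # prints "she" — the skew comes straight back out
```

The classifier is "just statistics", which is exactly the laundering: the bias was put in by whoever assembled the training set, but it comes out looking like an objective property of the model.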

A lot of the language around data is extractive. We talk about data processing, data mining, or crunching data. It’s kind of a rocky ore that we smash with heavy machinery to get the good stuff out.

In cultivating communities, I prefer gardening metaphors. You need the right conditions, a propitious climate, fertile soil, and a sprinkling of bullshit. But you also need patience, weeding, and tending. And while you're free to plant seeds, what you wind up with might not be what you expected....

My friend Sacha Judd, in an upcoming talk, describes something similar with fans of the boy band One Direction. This band has an obsessive following of young women, and in chronicling the lives of their beloved band members, they reach heights of technical achievement that rival anyone working in professional media. They are de facto professional archivists, developers, video editors, and journalists. But since “real” technologists don't take their interest seriously, these women don't recognize their own achievements. They would never dream of applying to the kind of jobs that they already excel at in their role as fans....

One approach is to go to the people who control the data—the big companies—and partner with them to study it.

It’s awkward because the very thing the Librarian of Congress objected to in the Patriot Act—the intrusive surveillance—is the bread and butter of online services. Much of the valuable information is collected in ways that would never pass ethical standards in academia, and ways that even the NSA would be legally prohibited from collecting.

But the data is there, and you can hear it calling to you.

“Study me,” it coos. “Preserve me,” it pleads, since fly-by-night companies obsessed with growth assuredly won’t....

I worry about legitimizing a culture of universal surveillance. I am very uneasy to see social scientists working with Facebook, for example.... So in working with the online behemoths, realize that the behavioral data they collect is not consensual. There can be no consent to mass surveillance. These business models, and the social norm of collecting everything, are still fragile. By lending your prestige to them, you legitimize them and make them more likely to endure....

Something similar threatens to happen on the web. Forgive me for being technical, but the average web page right now is a giant pile of steaming garbage. Only superhuman effort by browser developers makes it possible for anything to work at all. Pages get stitched together at load time through dozens of intermediaries, with live dependencies, and the bulk of rendering done in JavaScript.

So what does it mean to archive that? You can save the rendered image in the browser, but what about the dynamic behaviors? Does autocomplete fall within the purview of archiving? Is a dynamic ad an annoyance, or a valuable insight into 2016 life that we should save for posterity? And if so, which ad do we save, and how, when it’s pulled in at view time through a dozen different ad auctions and hypertargeted for the viewer?....

We have to learn how to send out ambassadors to online communities, like they do with uncontacted civilizations in the Andaman islands. You go out, they throw a few spears at your helicopter, but eventually you get to be on speaking terms and can learn from one another.

The task is pressing, because we’ve lost so much from the web already. Not only does something like 5% of URLs disappear every year, but things go up in big conflagrations when a company goes under, or makes a terrible decision.

I’ve saluted the efforts of Archive Team and the Internet Archive, but their activity is like having a museum curator that rides around in a fire truck, looking for burning buildings to pull antiques from. It's heroic, it's admirable, but it’s no way to run a culture....

Most important is to make materials available in open formats, without restrictions, and with a serious commitment to permanence. These all require institutional courage, too. What if somebody grabs all this data, and does something with it that’s not scholarly?

Well, that’s what you want! A sign of life!

Publish your texts as text. Let the images be images. Put them behind URLs and then commit to keeping them there. A URL should be a promise....

This is an area where the Copyright Office can do a great deal of good, too, by aggressively advocating for fair use, and defending the framers’ intent in creating copyright....
data  machine_learning  privacy  preservation  archives  copyright 
24 days ago
Generous Thinking: The University and the Public Good – Planned Obsolescence
Generous Thinking proposes to intervene in this situation in two ways: first, by demonstrating the importance of the humanities to the continued success of the twenty-first-century university; and second, by encouraging scholars in the humanities to consider the ways they communicate with the publics by which they are surrounded. It does so by reconsidering the primary mode within which the humanities have long operated: as a cluster of disciplines based in, and propelled by, critique. This is not to say scholars should abandon critical thinking; far from it. But if Bruno Latour is correct in suggesting that critique has begun to run out of steam as a means of effecting change in the world, or if Rita Felski is correct in arguing that critique’s dominance has limited the kinds of work that scholars can do — well, then what? Latour asks whether we might instantiate a new mode of scholarly work grounded in concern and care; Felski hints at the possibilities for work stemming from a much broader array of affective states. Neither, however, digs into what might be required in order for scholars to do so, or what the resulting work might look like....

That is where Generous Thinking will begin: by proposing that rooting the humanities in generosity, and in particular in the practices of thinking with rather than reflexively against both the people and the materials with which we work, might invite more productive relationships and conversations not just among scholars but between scholars and the surrounding community. Again, this is not to suggest that scholars must abandon critique or the social commitments that underwrite it, particularly in favor of an approach that might be more, as our students have it, “relatable.” After all, one might reasonably ask why the suggestions that literary studies be conducted in an easily comprehensible, jargon-free, friendly and appreciable fashion are never levied against high-energy physics. ...

The first chapter will call upon scholars to begin developing their relationships with their communities, and with one another, not by replacing critique with affective states such as empathy or optimism, each of which serves (if inadvertently) to reinscribe status-quo power relations, but rather by engaging in practices of listening. Listening has frequently been evoked in recent months as a key starting point for developing a just, inclusive society, but the philosophical and ethical significance of listening has not yet been fully explored in its application to critical scholarly work. By reframing our work as both scholars and instructors from the perspective of listeners rather than speakers, I will argue, we may create the possibility of more open, trusting, effective exchange with the communities we hope to engage.

From listening as the root of engaged critical practice, I will turn my attention to the act of reading and the ways it might be reframed through the lens of generosity. In particular, in this chapter, I will explore the possibility that we as scholars might reimagine some of the too easily discredited or dismissed modes of “popular” reading — reading for pleasure, reading for identification — as the starting point for a set of collective reading practices that focus on building community through textual engagement. ...

my next chapter will turn to writing as another element in scholarly practice that might benefit from being conducted in generous, open dialogue with a wider public. This is not to say that everything scholars write should be written for the public; in fact, there is good reason for scholars to direct much of their work to one another. But developing what might be thought of in relation to code-switching skills — the ability to speak comfortably both with specialists and with broader publics — is crucial, not just for individual scholars today but for the academy writ large. By working in public, by inviting general readers into our arguments, we do not simply demonstrate our relevance to the broader culture but create the potential for effecting the kinds of change our criticism wishes to promote.

Finally, I will turn my attention to the university’s role in civic life, by exploring the ways that our work might be opened up not just so that it can be accessed by others but also so that they might participate in it. In part, this chapter will focus on what has been called the citizen humanities, public projects whose interactivity allow them to do more than simply teach others, but that rather allow us to learn from them. This chapter is also concerned, however, with the university’s relationship to the public good, a phrase that is often used in institutional outreach materials but that does not always have concrete follow-through. What would it mean for a university to understand its primary function as public service? How could such an understanding be fostered in order to transform a campus culture that, at present, rewards prestige, competitiveness, and individual accomplishment?
academia  liberal_arts  critique  advising  generosity  public_humanities 
24 days ago
Shenzhen: The Silicon Valley of Hardware (Full Documentary) | Future Cities | WIRED - YouTube
We examine the unique manufacturing ecosystem that has emerged, gaining access to the world’s leading hardware-prototyping culture whilst challenging misconceptions from the west. The film looks at how the evolution of “Shanzhai” – or copycat manufacturing – has transformed traditional models of business, distribution and innovation, and asks what the rest of the world can learn from this so-called “Silicon Valley of hardware".
piracy  media_city  supply_chain  logistics  reuse  china  infrastructure  manufacturing  repair 
24 days ago
Experimental Preservation
These preservationists do not celebrate experimentalism for its own sake, as modernist architects, artists, and scientists once did. 3 Rather they put forth experiments that interrogate conventional ways of preserving objects and offer alternatives that, while practicable, reach beyond institutionalized modes. And they often come to preservation from different disciplines — art, architecture, engineering, history, data science, material science, philosophy — bringing with them novel methods. They are simultaneously outsiders and insiders.

Adam Lowe, one of the founders of the collaborative Factum Arte, has pioneered what he calls “non-contact” experimental techniques which both digitalize material objects and rematerialize facsimiles. Lowe, trained as an artist, suggests that one can alter an object without even touching it, by multiplying it; and he proposes an expansive understanding of object, as something enmeshed in networks of social relations, media representations, political manipulations, and technological multiplication....

The word preservation is no less problematic. It has come to be associated with a sort of deference to the past over the needs of the present that subjugates contemporary action, normalizing and confining it via legal regulations and thwarting alternatives to the status quo. Experimental preservationists question the longstanding identity of preservation with the governmental protection of cultural objects, and the largely unquestioned narrative that preservation bureaucracies always act for the common good. 4 The Turkish artist Tayfun Serttas has even argued that there is no cultural heritage, only political heritage created by regimes whose goal is to consolidate and perpetuate their own power. His 2014 installation, Cemetery of Architects, consisting of plaques and cornerstones with the names of late 19th-century non-Muslim Turkish architects whose buildings have been systematically demolished, is clearly an instance of political resistance and an effort to visualize an alternative and more cosmopolitan future for Istanbul....

“The Authorized Heritage Discourse focuses attention on aesthetically pleasing material objects, sites, places and/or landscapes that current generations ‘must’ care for, protect and revere so that they might be passed to nebulous future generations for their education, and to forge a sense of common identity based on the past.” 5 In contrast, experimental preservationists guard their freedom to choose objects that might be considered ugly or unsavory, or unworthy of preservation, objects that might have been ignored or excluded by official narratives, perhaps because they embody the material, social, and environmental costs of development which governments and corporations seldom account for....

Some criteria, such as “historical significance,” were established in the late 19th century. Which does not mean these old ideas are not valuable; but they were created in an era when much of what defines our contemporary environment hadn’t been developed or didn’t even exist: electrification, cars, films, computers, digital codes, plastics, wearable technology, smartphones, smart buildings, robots, satellites, and so on. How then can we bridge the distance between these older ideas and values and the new material world?...

Heritage objects by definition represent not individual preferences but collective choices; and in their choice of objects, experimental preservationists do not attempt to speak for culture but rather to solicit a cultural response. ...

To choose an object is to take it, to appropriate it not only physically but also mentally; to alter its physical appearance and to modify its conceptual meaning. By the 1980s, historians began to notice the role of preservationists in choosing heritage. Eric Hobsbawm and Terence Ranger’s now classic The Invention of Tradition (1983) initiated a critical genre in which preservation was depicted as the deceitful manipulation of the past, artifice posing as truth in the service of sinister interests ranging from corporate profiteers to authoritarian governments. 13 What was new here was the historicizing of the discipline and the critique of its institutionalization; what was not new was the charge of deceitfulness. “The thing is a Lie from beginning to end,” declared John Ruskin in 1849. 14 In the modernist intellectual tradition that spans from Ruskin to Hobsbawm, the preservationist is condemned as deceitful if she does not visually express the manipulation of the historic object and leave a mark upon it; for without this obvious trace the future historian might be misled and draw inaccurate conclusions. 15

Under the auspices of UNESCO, the 1964 Venice Charter made this standard into an international practice convention: “Replacements of missing parts,” states article 12, “must integrate harmoniously with the whole, but at the same time must be distinguishable from the original so that restoration does not falsify the artistic or historic evidence.” ...

The Venice Charter obliges preservationists — the lay practitioners of this dogma — to visually confess to historians — the scholarly theoretical clergy — when they transgress the bodily sanctity of the object. ...

What is at stake in Akšamija’s effort to account for the chosen objects? Carrying a clipboard and pad — and fortified with irony and humor — Akšamija impersonated a preservation bureaucrat laboring to document every received object into a pseudo-official register called the “Future Heritage Collection.” Akšamija’s performance, the fake but convincing protocols of her project, make painfully clear the biases and shortcomings of the new state, which has established no preservation authority capable of documenting, let alone storing and conserving, the objects that constitute its heritage. For the next and still more powerful stage of the project, Akšamija dressed up as a future archeologist who finds fragments of these objects — artifacts never accessioned into state collections, with no provenance, no history of associations or interpretations — and cannot make sense of them. ...

Experimental preservationists gently frustrate and subvert illusory belief by choosing, as heritage, objects that have appeared too imaginary, too fantastic, too subjective to be understood as real heritage. By insisting on the illusory nature of heritage objects, experimental preservationists are opening up new and vital questions about the reality of heritage as an open-ended process of social negotiation.
preservation  architecture  archives  archaeology 
26 days ago
This Rant is for Social Scientists | Library Babel Fish
we’re schooled to write in an inaccessible style, as if our ideas are somehow better if written in a hard-to-decipher script that only the elite can decode because if people who haven’t been schooled that way can understand it, it’s somehow base and common, not valuable enough. If you’re able to read this message, welcome! You’re one of us. The rest of you are not among the elite, so go away.
Even worse, we think our hazing rituals around publication and validation are more important than the subjects of our research, who couldn’t afford to read it even if we chose to write in a manner that didn’t require an expensive decoder ring with a university seal on it. We say “it’s for tenure” or “that’s the best journal” and think that’s reason enough to make it impossible for people without money or connections to read it.
I don’t know how else to put this: it’s immoral to study poor people and publish the results of that study in a journal run by a for-profit company that charges more for your article than the household you studied has to spend on food this week. I cannot think of any valid excuse for publishing social research this way.
Because you don’t have to. SocArXiv makes it easy to share your research. Your institution may have a public research repository. There are open access journals that don’t charge authors and have the same peer review standards as other journals. You can reserve the right to share your work, and we’re finding sustainable ways to fund public knowledge. Will it take a little more of your time? Yeah, it’s a cultural shift, which is obviously complex, and you’re so busy.
academia  publishing  open_access  repositories 
26 days ago
AA School of Architecture - Exhibitions
Unknown Fields is a nomadic design studio that ventures out on expeditions through the shadows cast by the contemporary city to trace the alternative worlds, alien landscapes, industrial ecologies and precarious wilderness that its technology and culture set in motion. These distributed landscapes, the iconic and the ignored, the excavated, irradiated and the pristine, are embedded in global systems that connect them in surprising and complicated ways to our everyday lives. Systems that form a network of vast but elusive tendrils, twisting threadlike over everything around us, crisscrossing the planet, connecting the mundane to the extraordinary.
In The Dark Side of the City Unknown Fields takes us on a road trip through a reimagined city that stretches across the ends of the earth. It is a portrait of a place that sits between documentary and fiction, a city of fragments; of drone footage and hidden camera investigations, of interviews and speculative narratives, of toxic objects, reimagined landscapes and distributed matter from distant sites. The Dark Side of the City is a collection of stories from the constellation of elsewheres that are conjured into being by the city’s wants and needs, fears and dreams.
logistics  infrastructure  landscape  fieldwork  unknown_fields 
26 days ago
The Bird-Based Color System that Eventually Became Pantone
An effort to describe the diversity of birds led to one of the first modern color systems. Published by Smithsonian ornithologist Robert Ridgway in 1886, A Nomenclature of Colors for Naturalists categorizes 186 colors alongside diagrams of birds. In 1912, Ridgway self-published an expanded version for a broader audience — Color Standards and Color Nomenclature — that included 1,115 colors. Some referenced birds, like “Warbler Green” and “Jay Blue,” while others corresponded to other elements of nature, as in “Bone Brown” and “Storm Gray.”...

Color systems date back centuries, at least to Richard Waller’s 1686 Tabula colorum physiologica. Yet bird-watching hones a sharp eye for color differentiation, so Ridgway had an edge — as well...

Ridgway’s scientific work was inspired by Milton Bradley, who, along with selling board games, was a proponent of color education. He published Elementary Color in 1895 and manufactured a color wheel that, when spun, visually mixed different hues, part of a drive for perfection enabled by 19th-century synthetic dye advancements.
color  ornithology  birds  standards  classification 
26 days ago
Who Funds Infrastructure for Journalism and Civic Tech? – Medium
But no program and no organization lasts forever. And this is just the latest reminder that the work we do every day so often depends on the work of many others who have come before us.

It’s also a sobering reminder that ensuring long term sustainability for this sort of open source work is really, really hard. The Ford Foundation has done some research in this area but there are still more questions than answers.

I’m not going to say it’s easy (it’s never easy), but I will say it’s easier to get a grant to fund a shiny new project. Release an app. Provide an API to help people build stuff with your data. Keep that thing running for a couple years. Training! Webinars! Best practices!...

But I can tell you from experience that it’s a lot harder to get the next round of funding to keep the servers running, apply updates, keep pace with changing technology, improve and iterate (more on that in a future post)....

That said, when you’re doing mission-driven work and have a couple years of funding to do or build a particular thing, it’s easy to overlook or delay the necessary planning to ensure the thing you’re building is set up to survive for the long-term. And even if you do that planning, one, two or even three years may not be enough to build a viable thing AND figure out how to pay for its life after your grant runs out (more on that, also, in a future post).
As well-meaning as former staffers, volunteers and open source contributors might be, maintaining a popular open source project is hard. And it’s that much harder if you’re running a hosted service with servers to pay for, an API (or 12) and users/customers who have questions and need ongoing support. Further, if you’re lucky enough to have a project that does get really popular and has lots of people using and relying on it, your cost for infrastructure and support can dramatically increase. Where does that extra money come from?...

I hope Sunlight is indeed able to find a new home for their projects and form partnerships to continue their work, but I also hope that we can figure out a way to think beyond two year, grant funded projects and figure out a real plan for sustaining promising projects so that we don’t see critical infrastructure for journalism and civic tech wither and disappear.
open_source  open_data  maintenance  funding  labs  sustainability  infrastructure  knowledge_structures 
27 days ago
Everything Is Broken – The Message – Medium
It’s hard to explain to regular people how much technology barely works, how much the infrastructure of our lives is held together by the IT equivalent of baling wire...

Software is so bad because it’s so complex, and because it’s trying to talk to other programs on the same computer, or over connections to other computers. Even your computer is kind of more than one computer, boxes within boxes, and each one of those computers is full of little programs trying to coordinate their actions and talk to each other. Computers have gotten incredibly complex, while people have remained the same gray mud with pretensions of godhood.

Your average piece-of-shit Windows desktop is so complex that no one person on Earth really knows what all of it is doing, or how....

Security is taken more seriously than ever before, and there’s a network of people responding to malware around the clock. But they can’t really keep up. The ecosystem of these problems is so much bigger than it was even ten years ago that it’s hard to feel like we’re making progress....

That computers don’t serve the needs of both privacy and coordination is not because it’s somehow mathematically impossible. There are plenty of schemes that could federate or safely encrypt our data, plenty of ways we could regain privacy and make our computers work better by default. It isn’t happening now because we haven’t demanded that it should, not because no one is clever enough to make that happen.
error  hacking  security  networks 
27 days ago
About | Describing Architecture

Describing Architecture explores how buildings and spaces are designed and documented. Over the past five years it has provided innovative ways of engaging diverse publics outside of traditional gallery contexts. The annual exhibition reveals unseen aspects of architecture as a creative practice, alongside its critical relationship to the visual arts and the work of artists. It includes work across a wide range of media – drawing, photography, model and film – and from a broad spectrum of participants, including among others, established architectural firms, artists, architects, students and graduates.

Describing Architecture’s aim is to increase the public’s understanding of architecture and engagement with the built environment. The work and themes of the exhibition are mediated and explored through a free programme of events, talks, guided tours and workshops. The online catalogue forms an integral part of the curatorial strategy for the exhibition, enabling continued engagement with the work. The website functions as an open archive, specifically designed to chronicle each year’s contributions, critical conversations and outputs.
architecture  archives  documentation  exhibition 
28 days ago
Sneak Peek Inside the Archives of Storefront for Art & Architecture at Industry City | Untapped Cities
Even if you haven’t been to an exhibit at the Storefront for Art and Architecture in Nolita, you’ll recognize its unique deconstructed facade of windows that open and close. Some visitors don’t even know which opening is the official front door and people have been known to climb in through the windows, Storefront tells us. Founded in 1982 and dedicated to presenting innovative and provocative work at the intersection of art and architecture, the Storefront for Art and Architecture has an impressive archival collection of material that includes original artwork and wild conceptual designs, from some of today’s leading architects like Diller + Scofidio, Steven Holl and Lebbeus Woods.

Led by curator Chialin Chou, who began work on the archives two years ago, the Storefront for Art and Architecture archives will officially open next Thursday in Industry City. We’re excited to offer this sneak peek of the space as well as announce a new partnership with Storefront to show readers materials from the archive, as a new primary source for our column The New York City That Never Was.
archives  architecture 
28 days ago
The Secrets of the Wave Pilots - The New York Times
The Marshalls provide a crucible for navigation: 70 square miles of land, total, comprising five islands and 29 atolls, rings of coral islets that grew up around the rims of underwater volcanoes millions of years ago and now encircle gentle lagoons. These green dots and doughnuts make up two parallel north-south chains, separated from their nearest neighbors by a hundred miles on average. Swells generated by distant storms near Alaska, Antarctica, California and Indonesia travel thousands of miles to these low-lying spits of sand. When they hit, part of their energy is reflected back out to sea in arcs, like sound waves emanating from a speaker; another part curls around the atoll or island and creates a confused chop in its lee. Wave-piloting is the art of reading — by feel and by sight — these and other patterns. Detecting the minute differences in what, to an untutored eye, looks no more meaningful than a washing-machine cycle allows a ri-meto, a person of the sea in Marshallese, to determine where the nearest solid ground is — and how far off it lies — long before it is visible....

In order to become a ri-meto, you had to be trained by a ri-meto and then pass a voyaging test, devised by your chief, on the first try. ....

Other species use far more sophisticated cognitive methods to orient themselves. Dung beetles follow the Milky Way; the Cataglyphis desert ant dead-reckons by counting its paces; monarch butterflies, on their thousand-mile, multigenerational flight from Mexico to the Rocky Mountains, calculate due north using the position of the sun, which requires accounting for the time of day, the day of the year and latitude; honeybees, newts, spiny lobsters, sea turtles and many others read magnetic fields. Last year, the fact of a ‘‘magnetic sense’’ was confirmed when Russian scientists put reed warblers in a cage that simulated different magnetic locations and found that the warblers always tried to fly ‘‘home’’ relative to whatever the programmed coordinates were. Precisely how the warblers detected these coordinates remains unclear. As does, for another example, the uncanny capacity of godwits to hatch from their eggs in Alaska and, alone, without ever stopping, take off for French Polynesia....

Efforts to scientifically deduce the neurological underpinnings of navigational abilities in humans and other species arguably began in 1948. An American psychologist named Edward Tolman made the heretical assertion that rats, until then regarded as mere slaves to behavioral reinforcement or punishment, create ‘‘cognitive maps’’ of their habitat. ... Tolman hypothesized that humans have cognitive maps, too, and that they are not just spatial but social. ‘‘Broad cognitive maps,’’ he posited, lead to empathy, while narrow ones lead to ‘‘dangerous hates of outsiders,’’ ranging from ‘‘discrimination against minorities to world conflagrations.’’ ...

The cognitive map is now understood to have its own physical location, as a collection of electrochemical firings in the brain. In 1971, John O’Keefe, a neuroscientist at University College London, and a colleague reported that it had been pinpointed in the limbic system, an evolutionarily primitive region largely responsible for our emotional lives — specifically, within the hippocampus, an area where memories form. ....found that our brains overlay our surroundings with a pattern of triangles. Any time we reach an apex of one, a ‘‘grid cell’’ in an area of the brain in constant dialogue with the hippocampus delineates our position relative to the rest of the matrix. In 2014, O’Keefe and the Mosers shared a Nobel Prize for their discoveries of this ‘‘inner GPS’’ that constantly and subconsciously computes location....

he showed that when cabbies frequently access and revise their cognitive map, parts of their hippocampuses become larger; when they retire, those parts shrink. By contrast, following a sequence of directional instructions, as we do when using GPS, does not activate the hippocampus at all...

What seems clear is that our ability to navigate is inextricably tied not just to our ability to remember the past but also to learning, decision-making, imagining and planning for the future. And though our sense of direction often feels innate, it may develop — and perhaps be modified — in a region of the brain called the retrosplenial cortex, next to the hippocampus, which becomes active when we investigate and judge the permanence of landmarks. ...

Joel immediately asked Genz to bring scientists to the Marshalls who could help Joel understand the mechanics of the waves he knew only by feel — especially one called di lep, or backbone, the foundation of wave-piloting, which (in ri-meto lore) ran between atolls like a road. Joel’s grandfather had taught him to feel the di lep at the Rongelap reef: He would lie on his back in a canoe, blindfolded, while the old man dragged him around the coral, letting him experience how it changed the movement of the waves. ... When oceanographers from the University of Hawaii came to look for it, their equipment failed to detect it. The idea of a wave-road between islands, they told Genz, made no sense.

Privately, Genz began to fear that the di lep was imaginary, that wave-piloting was already extinct....

Huth began creating possible di lep simulations in his free time and recruited van Vledder’s help. Initially, the most puzzling detail of Genz’s translation of Joel’s description was his claim that the di lep connected each atoll and island to all 33 others. That would yield 561 paths, far too many for even the most adept wave pilot to memorize. Most of what we know about ocean waves and currents — including what will happen to coastlines as climate change leads to higher sea levels (of special concern to the low-lying Netherlands and Marshall Islands) — comes from models that use global wind and bathymetry data to simulate what wave patterns probably look like at a given place and time. Our understanding of wave mechanics, on which those models are based, is wildly incomplete. To improve them, experts must constantly check their assumptions with measurements and observations. Perhaps, Huth and van Vledder thought, there were di leps in every ocean, invisible roads that no one was seeing because they didn’t know to look....

he felt tremendous ambivalence about what gaining resources to preserve his culture, or any native culture, seemed to require: allowing outsiders, whether academics or reporters, to commodify it. Secrecy and hands-on training is integral to the tradition of wave-piloting; explaining the di lep would disrupt those features of it even while immortalizing it in books and journals, perhaps inspiring more Marshallese children to become ri-metos....

To teach way-finding, the Marshallese use stick charts, wood frames crosshatched like dream catchers to represent swells coming from four cardinal directions, with shells woven in to symbolize the position of the atolls. These meant nothing to the first European explorers to see them, just as Mercator projections meant nothing to the Marshallese....

Over the last several years, organizations like the United States military and the Federal Aviation Administration have expressed concern about their overwhelming reliance on GPS and the possibility that the network’s satellite signals could be sabotaged by an enemy or disabled by a strong solar flare. The United States Naval Academy has once again begun training midshipmen how to take their position from the stars with a sextant....

they found that the path they had taken was exactly perpendicular to a dominant eastern swell flowing between Majuro and Aur. And at places where the swell, influenced by the surrounding atolls, turned slightly northeast or southeast, the path bent to match. It was a curve. Everyone had assumed that a wave called ‘‘backbone’’ would look like one. ‘‘But nobody said the di lep is a straight line,’’ van Vledder said.

What if, they conjectured, the ‘‘road’’ isn’t a single wave reflecting back and forth between every possible combination of atolls and islands; what if it is the path you take if you keep your vessel at 90 degrees to the strongest swell flowing between neighboring bodies of land? Position your broadside correctly, smack in the di lep’s path, and your hull would rock symmetrically, side to side
cognitive_mapping  cartography  mapping  navigation  marshall_islands 
29 days ago
The User-Centered Library: Digital UX Workshop Preview
By now the concept of user experience (UX) has shown up on most librarians’ radar at some point. Whether you’ve found yourself curious about how better digital design could help your library’s traffic, you wish you had a UX specialist on staff, you’re engaged by Aaron Schmidt’s The User Experience column, or you’ve considered learning more about user-centered design yourself, the chance to improve the library’s user experience is within everyone’s reach.
The upcoming Digital UX Workshop: Crafting Exceptional Digital Experiences for the User-Centered Library is the most recent in Library Journal’s series of Lead the Change online courses. Created in collaboration with Electronic Resources & Libraries (ER&L), the five-week online workshop runs from October 20 through November 17, and will feature a roster of library and UX experts tapping a wide range of experience. Library staff members who want to develop introductory or intermediate UX skills, web and general managers working on library projects, and librarians already concentrating on UX who are looking for fresh inspiration all stand to benefit from the workshop, as well as the hands-on experience and networking opportunities provided.
“Libraries exist in the digital world and are user-centered at their core, so these skills and ways of thinking are critical for the future of all types of libraries,” explained Bonnie Tijerina, ER&L president and founder and a 2010 LJ Mover & Shaker. “This is why I see the workshop as being valuable for a seasoned web professional, a newbie to UX, a library team working on a digital project, and anyone thinking about their users’ online experience.”
user_experience  libraries  interfaces 
29 days ago
What is Cities and Memory?
Cities and Memory is a global field recording & sound art work that presents not only the present reality of a place but also its imagined, alternative counterpart – remixing the world, one sound at a time.

Every faithful field recording document on the sound map is accompanied by a reimagination or an interpretation that imagines that place and time as somewhere else, somewhere new.

The listener can choose to explore locations through their actual sounds, to explore reimagined interpretations of what those places could be – or to flip between the two different sound worlds at leisure.

A world of sounds
There are currently over 1,300 sounds featured on the sound map, spread over more than 55 countries.
sound_art  sound_map  mapping  sound_history 
4 weeks ago
Explore the World’s Sounds Through a Map of Field Recordings and Remixes
Since 2014, hundreds of artists have been making field recordings, transforming them into new sounds, and sending both files to the online project Cities and Memory. Its founder, the Oxford-based musician Stuart Fowkes, has steadily been building a global, collaborative sound map that highlights the relationship between sound and personal memories and experiences. To date, it features over 1,300 sounds from more than 55 countries, with each sound accompanied by its remix: one user has tweaked a rush of water from Finland’s Lake Kilpisjärvi into a slow, ambient track; another, the shrill singing of cicadas in Sandy Bay, into a heavy, thrumming drone.
sound  sound_map  field_recording  sound_space  sound_history 
4 weeks ago
Palantir Files Nasty Lawsuit Claiming Early Investor Stole Its Ideas
In 2006, then-New York City mayor Michael Bloomberg issued an executive order establishing the Office of Special Enforcement, a citywide agency responsible for enforcing “quality of life” regulations—a nebulous, ideologically charged concept that refers to anything from music venues with too many noise complaints to nightclubs that facilitate prostitution to decrepit structures that pose a fire hazard....

New York City enlisted the CIA-backed data analysis firm Palantir Technologies. In December 2011, the city granted Palantir the first of at least five contracts, ultimately amounting to more than $2.5 million, according to a review of public records obtained by Gizmodo. Palantir’s software has since become a centerpiece of New York’s mission to improve “community livability and property values”—that is to say, quality of life.

When Bill de Blasio took office in 2014, he doubled down, and paid Palantir $907,413 for 24 “Gotham” server cores and licenses for the Department of Finance. Later that same year, the City paid $20,000 to provide 10 inspectors from the Office of Special Enforcement (OSE) with Palantir’s mobile technology, connecting them to everything the city knows about every place within it. ...

Palantir’s platform, in the company’s own words, “enables organizations to integrate disparate data sets and conduct rich, multifaceted analysis across the entire range of data.” It’s basically a high-powered version of Excel, rendering an incomprehensible amount of information legible, parsable, and ripe for analysis....

In its first purchase order to Palantir, New York City’s Department of Information Technology and Telecommunications (DOITT) paid $1.2 million for 32 perpetual server core licenses, at $37,500 each. (A “perpetual server core license” means the city is licensed to use the Palantir software in perpetuity.) ...

In December 2013, MODA issued its first annual report, lauding the Department of Finance’s use of Palantir, which enabled it “to better understand tax fraud in NYC.” The New York City Sheriff’s Office—a division of the Department of Finance—was using the technology to track illegal cigarette importation rings and “developing their own in-house intelligence team,” chief analytics officer Michael Flowers wrote. The report also revealed a pilot project in which MODA and OSE, using Palantir, “developed a tablet application that allows inspectors in the field to easily see everything that the City knows about a given location.”...

Palantir described what the mobile program would look like in a September 2013 statement of work. “Palantir Mobile takes the capabilities of the Palantir Platform out of the enterprise and into the field, where mobile users can collect information in real time and relay it back to the Mayor’s Office (headquarters) for deeper analysis,” the document reads. In so doing, inspectors from the Office of Special Enforcement leverage “information from data sources across the city to hold building owners and establishments responsible for the upkeep and maintenance of the City’s quality of life.”

That information is maintained by something called “DataBridge,” a repository of data from 50 sources belonging to “roughly 20 agencies and external organizations.” There are nine datasets—updated nightly—that are integrated and managed using Palantir, a mayoral spokesperson told Gizmodo: “This is how we regulate and maintain oversight for all access and privacy issues.” The City Hall official discussed the city’s use of the data-mining technology on background, and declined to provide the full list of data sources or describe what is contained in the datasets.

All city agencies have access to DataBridge, but protocols regulate their access. “If they are requesting use of data that they are not providing themselves, they have to get specific permission to use specific data within Databridge, and will not be able to access anything beyond that,” the spokesperson wrote in an email. “Any data shared with Databridge has a designated point person or agency who needs to give explicit permission before that data is used, often requiring in-depth scrutiny of that use before permission is granted.” ...

...consolidating “everything that the City knows about a given location” into one place also raises questions about privacy and civil liberties—questions Palantir has tried to preempt with a white paper posted on its website. The document outlines a variety of “access restrictions” that law enforcement agencies can choose from, and in 2014 the New York Times reported that Palantir’s safeguards include “audit trails” that third-party investigators can review. These safeguards and restrictions are not mandatory....

The NYPD—whose record on privacy and civil liberties includes extensive monitoring of Muslim communities without warrants or specific threats—also uses Palantir...

As Palantir expands beyond the world of law enforcement and national security, its impact on our civil liberties—and whether rigorous safeguards are enacted to prevent overreach—remains to be seen. As Jay Stanley, a senior policy analyst at the ACLU, pointed out last year, “Government security agencies and others using data for ‘risk assessment’ purposes are trying to decide who should be blacklisted, scrutinized, put under privacy-invading investigatory microscopes, or otherwise limited in their freedom and opportunity.”...

“It seems like the trickling down of ‘Big Data’ approaches from prosecuting real, significant wrongdoing into the enforcement of petty rules and regulations,” Stanley told Gizmodo. “Maybe it’s just enabling better management, but it seems like there is real potential for selective enforcement. If the authorities have enough data about you, they can always find something you’ve done wrong.”
big_data  smart_cities  archives  law_enforcement  surveillance 
4 weeks ago
In 1997, a partially interactive version was created in collaboration with Eric Rosolowski for SUNY-Buffalo’s Electronic Poetry Center. In that early version, columns from the periodic table were isolated on screen; the element symbols were replaced by words that contain the symbols. For example “Cu,” which is the symbol for copper, was replaced by the word “cucumber”; “Ag,” the symbol for silver, was replaced by the word “aggressions,” and so on....

The language of the poem was sourced and assembled from research into the history of the element, as well as research into the word “cucumber”—a collage procedure ultimately followed for each of the small poems in the piece. My primary sources for the research were The Oxford English Dictionary, The Periodic Kingdom by P.W. Atkins, and an online database of the elements from the Los Alamos National Laboratory....

Then I tried to figure out what catalysts could lead to a chemical reaction. What would those catalysts look like in print? I asked my father, a chemist for the U.S. Department of Agriculture, if he could send me his favorite reactions. He kindly sent me the note below which, with an amusing prescience, also touches on the themes of versioning and loss.
digital_preservation  preservation  textual_form  poetry  chemistry 
4 weeks ago
Visiting the BBC Sound Archives
There is something incredibly seductive about old recordings. In “The Recording that Never Wanted to Be Heard and Other Stories of Sonification,” from The Oxford Handbook of Sound Studies, Jonathan Sterne and Mitchell Akiyama question the desire for “sonification” of ever-older recordings, especially when such desires manifest in the creation of a digital sound file in 2008 for “the world’s oldest recording,” a phonoautogram from 1860, which was nevertheless never intended to be played back—the phonoautograph was intended as a device to make the aural visual (555). ...

With a little leap of the imagination, it’s not difficult to see the parallels with the reality of sound recording limitation. The wax cylinders could only be played a few times before the sound degrades completely. Tin cylinders are not much better. This is the reason why the two Gladstone voices could be both “real” and “fake.” Celluloid is more durable, yet witness the reluctance of Dell to play one for longer than a few seconds, for preservation reasons.

Sound recordings are only as good as the medium on which they are recorded, a fact that surprisingly holds true even today. We were told by our BBC hosts that discs of shellac, vinyl, and acetate whose contents have already been digitised will not be discarded—digital recordings are ultimately taken from these physical originals....

As Josh Shepperd puts it brilliantly, “Sound trails continue where paper trails end.” As Director of the Radio Preservation Task Force at the Library of Congress, his efforts have underlined the fact that often it is the local and the rural whose radio or audio history vanishes more quickly than the national or the metropolitan. This would historically be the case with the BBC as well, which for a long time privileged London sound above regionalism (and, some would argue, still does). Since 2015, the British Library (and the Heritage Lottery Fund) have invested significantly in the Save Our Sounds campaign, positing that the world’s sound recordings must be digitized within 15 years, before the recordings degrade or we no longer have the means to play the material.
archives  radio  sound  media_archaeology  sound_history 
4 weeks ago
5300 Rare Manuscripts Digitized by the Vatican
When it comes to manuscripts, the Vatican Library is no less an embarrassment of riches. But unlike the art collections, most of these have been completely inaccessible to the public due to their rarity and fragility. That’s all going to change, now that ancient and modern conservation have come together in partnerships like the one the Library now has with Japanese company NTT DATA. Their combined project, the Digital Vatican Library, promises to digitize 15,000 manuscripts within the next four years and the full collection of over 80,000 manuscripts in the next decade or so, consisting of codices mostly from the “Middle Age and Humanistic Period.” They’ve made some excellent progress. Currently, you can view high-resolution scans of over 5,300 manuscripts, from all over the world. We previously brought you news of the Library’s digitization of Virgil’s Aeneid....

Further up, from a similar time but very different place, we see a Pre-Columbian Aztec manuscript, equally finely-wrought in its hand-rendered intricacies. You’ll also find illustrations like the circa 17th-century Japanese watercolor painting above, and the rendering of Dante’s hell, below, from a wonderful, if incomplete, series by Renaissance great Sandro Botticelli (which you can see more of here). Begun in 2010, the huge-scale digitization project has decided on some fairly rigorous criteria for establishing priority, including “importance and preciousness,” “danger of loss,” and “scholar’s requests.” The design of the site itself clearly has scholars in mind, and requires some deftness to navigate. But with simple and advanced search functions and galleries of Selected and Latest Digitized Manuscripts on its homepage, the Digital Vatican Library has several entry points through which you can discover many a textual treasure.
digitization  archives  manuscripts 
4 weeks ago
Faire Europe: Ortelius, Mercator, and the continents
Of course, this wasn’t always the case. For people in the 16th and 17th centuries, geography and cartography were rapidly changing and expanding fields, as European knowledge of other parts of the world grew by leaps and bounds. Maps were often large, unwieldy things, designed to be hung on walls, or rolled up and stored in a traveling map cylinder.

While there had been other bound collections of maps before 1570 (notably Ptolemy’s Geographia, which was reprinted with regularity), “nobody had taken the trouble to engrave to a uniform pattern a methodically selected spread of modern maps, and to market them with minimal text as a generically novel product.”

Nobody until an Antwerp map-colorist and seller named Abraham Ortelius, who is now credited as being the first to put together a modern atlas.

Rather than attempt to create maps of all different parts of the world from scratch, Ortelius crowd-sourced. He took existing maps from many known and notable cartographers and used them to fill his Theatrum Orbis Terrarum, which was first printed in Antwerp in 1570. He even credited all of the people whose maps were used!...

Mercator, a close friend of Ortelius, is best known today for his projection; that is, a methodology for consistently representing a round, three-dimensional world onto a rectangular, flat piece of paper. Mercator’s world map of 1569 was an immediate success, but he, like Ortelius, wanted to create an atlas, a single bound volume that could contain more detailed maps of many locations. However, unlike Ortelius, Mercator intended to create all of the maps himself.
atlas  mapping  cartography  printing  textual_form 
4 weeks ago
