IDEO builds interactive font map using artificial intelligence
kevin ho (KH): seeing all the recent advances in AI and machine learning made me eager to explore how the technology could be applied in design. in particular, I’ve read about a lot of work in the AI research community around computer vision, where algorithms are now able to perform some basic visual recognition tasks as well as people can. this made me wonder whether this new capability in visual recognition could be applied to the visual decisions designers make in their process. when thinking about these decisions, font choice came to mind since there is some subjectivity to comparing fonts and therefore no clear way that fonts relate to each other, unlike other aspects of visual design such as color....

I’m eager to explore whether visualizing popular pairings on the font map could potentially surface patterns that were previously not known. finally, I’m excited to explore how the font map could evolve into a generative tool — now that we have this map, there is probably a way to explore the space between fonts, allowing designers to generate new fonts that don’t yet exist.
artificial_intelligence  machine_learning  typography 
9 hours ago
About the Database Challenge - David Wojnarowicz Knowledge Base
Researchers involved in the development of this resource were challenged by the nature of David Wojnarowicz’s life and art. His works often do not fit into typical art historical classifications of medium and style, nor do they fit into standard archival descriptions. As depicted in the diagram on the left designed by project researcher Francisco Chaparro, there is a complex web of relationships between David Wojnarowicz, his artworks, people he knew and worked with, places associated with him and his art, texts related to his work, and external resources that may be helpful to users of this knowledge base. This recognition drove us away from designing a hierarchical database towards a wiki platform that affords multiple linkages within the resource, promoting a deeper understanding of these relationships.

These complex relationships stem in part from Wojnarowicz’s habit of using the same images, objects, and references in multiple contexts, allowing them to develop multiple meanings and functions. One example of an object that was presented in multiple contexts is a life-sized shark that he covered with maps and exhibited as a component of his Burning Child installation at the Gracie Mansion Gallery in 1984. The shark was also shown as a hanging sculpture apart from the installation in addition to appearing in photographs. It exists in his personal papers as both an independent artwork and an element of another artwork.

The Magic Box, depicted below on the right, is another example. It contains sixty-nine objects collected by Wojnarowicz, including plastic toys, jewelry, stones, feathers, seeds, and photographs. The Magic Box disrupts archival and art historical concepts of classification, provenance, context and description since specific functions of the box and its contents are not known. Yet the combination of objects holds complex symbolic and material values that shed light on Wojnarowicz’s life and art. Some objects were integrated into his art production and appear in his photographs and films; others were not. To make these links discoverable, should we identify the Magic Box and/or its components as artworks? Should we link them to multiple artworks on the wiki?
archives  collection 
Planet enlists machine learning experts to parse a treasure trove of Amazon basin data | TechCrunch
Planet, the satellite imaging company that operates the largest commercial Earth imaging constellation in existence, is hosting a new data science competition on the Kaggle platform, with the specific aim of developing machine learning techniques around forestry research. Planet will open up access to thousands of image ‘chips,’ or blocks covering around one square kilometre, and will give away a total of $60,000 to participants who place in the top three when coming up with new methods for analyzing the data available in these images.

Planet notes that each minute, we lose a portion of forest the size of approximately 48 football fields, which is a heck of a lot of forest. The hope is that by releasing this data and hosting this competition, Planet can encourage academics and researchers worldwide to apply advances in machine learning that have been put to great use in efforts like facial recognition and detection, to this pressing ecological problem....

The goal is to see if competitors can come up with new ways to monitor these situations with machine learning tools created to make sense of the data. It’s a bit like finding a needle in a haystack, according to Scott, which is why the need exists for this machine learning-driven approach, taken on from multiple teams tackling the data from multiple angles.
machine_vision  satellite_imagery  mapping 
3 days ago
Jller – Prokop Bartoníček & Benjamin Maus on Vimeo
Jller is part of an ongoing research project in the fields of industrial automation and historical geology. It is an apparatus that sorts pebbles from a specific river by their geologic age. The stones were taken from the stream bed of the German river Jller, shortly before it merges with the Danube, close to the city of Ulm. The machine and its performance are the first manifestation of this research.
A set of pebbles from the Jller are placed on the 2x4 meter platform of the machine, which automatically analyzes the stones in order to then sort them. The sorting process happens in two steps: Intermediate, pre-sorted patterns are formed first, to make space for the final, ordered alignment of stones, defined by type and age. Starting from an arbitrary set of stones, this process renders the inherent history of the river visible....

Technology: The machine works with a computer vision system that processes the images of the stones and tracks each stone’s location on the platform throughout the ordering process. The features extracted from each stone are its dominant color, color composition, and histograms of structural features such as lines, layers, patterns, grain, and surface texture. This data is used to assign the stones to predefined categories. Those categories represent the range of stones that can be found in the specific river and correspond directly to the age of the stone. They are the result of a classification system that is trained on sets of manually selected and labeled stones. Because there are only a limited number of stone types that can be found in a specific river, this system proves to be very accurate.
The stones get picked up by an industrial vacuum gripper, which can rotate around its own axis. This way the pebbles can also be aligned.
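The classification step described above — hand-labelled training stones defining age categories, new stones assigned by their visual features — can be sketched as a nearest-neighbour rule. Everything below (the three features, the numbers, the two categories) is invented for illustration; the actual Jller feature set and classifier are more elaborate than this.

```python
import numpy as np

# Each stone is reduced to a feature vector (here: mean hue, mean
# brightness, edge density), and a nearest-neighbour rule assigns it to
# one of the manually labelled, age-correlated categories.

# Manually labelled training stones: feature vector -> category index.
train_x = np.array([
    [0.10, 0.80, 0.20],   # light, smooth stone
    [0.12, 0.70, 0.30],
    [0.55, 0.30, 0.70],   # dark, layered stone
    [0.50, 0.35, 0.80],
])
train_y = np.array([0, 0, 1, 1])  # 0 = "younger", 1 = "older" (illustrative)

def classify(stone):
    """Assign a stone's feature vector to its nearest labelled example."""
    distances = np.linalg.norm(train_x - stone, axis=1)
    return train_y[np.argmin(distances)]

print(classify(np.array([0.11, 0.75, 0.25])))  # -> 0
print(classify(np.array([0.52, 0.32, 0.75])))  # -> 1
```

Because a given river contains only a limited repertoire of stone types, even a simple rule like this can separate the categories cleanly once the training examples are well chosen.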
sorting  classification  geology  automation  machine_vision 
3 days ago
Official Google Australia Blog: Books and blockchains: new possibilities for digital literature
Today we’re excited to release two new books which, we hope, will continue to inspire fresh conventions around how we think of books and ‘bookness’, and how authors can work with developers and designers to create new formats of non-linear, dynamic literature.
A Universe Explodes, by Google’s own Tea Uglow, is on one level the story of a parent losing their grip on reality.
On another it is an exploration of the idea of ownership in digital culture, asking whether it is even possible to own a digital artefact in the same way we own a physical book or a CD, and using Blockchain to experiment with new models for owning and exchanging digital goods.
textual_form  fiction  blockchain  reading 
4 days ago
USAFacts is a new data-driven portrait of the American population, our government’s finances, and government’s impact on society. We are a non-partisan, not-for-profit civic initiative and have no political agenda or commercial motive. We provide this information as a free public service and are committed to maintaining and expanding it in the future.
We rely exclusively on publicly available government data sources. We don’t make judgments or prescribe specific policies. Whether government money is spent wisely or not, whether our quality of life is improving or getting worse – that’s for you to decide. We hope to spur serious, reasoned, and informed debate on the purpose and functions of government. Such debate is vital to our democracy. We hope that USAFacts will make a modest contribution toward building consensus and finding solutions.
data  open_data 
4 days ago
Hopeful Resilience - e-flux Architecture - e-flux
Resilience plays important roles in many different fields, ranging from economics to engineering to forestry. The understanding of resilience most crucial to this discussion is the one that was first forged in ecology discourse during the 1970s, and especially in the work of C.S. Holling, who established a key distinction between “stability” and “resilience.” Working from a systems perspective and interested in the question of how humans could best manage elements of ecosystems that were of commercial interest (e.g., salmon, wood, etc.), Holling developed the concept of resilience to contest the premise that ecosystems were most healthy when they returned quickly to an equilibrium state after being disturbed. Holling called the return to a state of equilibrium “stability,” but argued that stable systems were often unable to compensate for significant and swift environmental changes. As Holling put it, the “stability view [of ecosystem management] emphasizes the equilibrium, the maintenance of a predictable world, and the harvesting of nature’s excess production with as little fluctuation as possible.” Yet this very approach assures that “a chance and rare event that previously could be absorbed can trigger a sudden dramatic change and loss of structural integrity of the system.”

Resilience, by contrast, denoted for Holling the capacity of a system itself to change in periods of intense external perturbation as a mode of persistence. The concept of resilience enabled a management approach to ecosystems that “would emphasize the need to keep options open, the need to view events in a regional rather than a local context, and the need to emphasize heterogeneity.” Resilience is, in this sense, defined in relationship to crisis and states of exception; that is, it is a virtue when such states are assumed to be either quasi-constant or the most relevant for managerial actions. Holling also underscored that the transition from valuing stability to valuing resilience depended upon an epistemological shift: “Flowing from this would be not the presumption of sufficient knowledge, but the recognition of our ignorance: not the assumption that future events are expected, but that they will be unexpected.”
Contemporary planning, finance, and design practices abstract the concept of resilience from an ecological systems approach and transform it into an all-purpose epistemology and value. These fields posit resilience as a general strategy for managing uncertainty without endpoint, while also presuming that our world is so complex that unexpected events are, indeed, the norm. Resilience also functions in the landscape of planning and management to collapse the distinction between emergence (which would simply denote something new) and emergency (which denotes something new that threatens). In this sense, the term operates in the interest of producing a world where any change can be technically managed and assimilated while maintaining the ongoing survival of the system, even at the cost of its particular components, be they individuals, ecosystems, or species.
sustainability  resilience 
4 days ago
Machine learning is being used to find Mexico's missing people — Quartz
Or at least there hasn’t been—until now. A multi-country team of researchers, data scientists, and statisticians is using machine learning to predict which counties in Mexico are most likely to have hidden graves. If their model works as well as they hope, it will be a powerful application of an emerging technology that provides answers to one of the most difficult aspects of the desaparecidos problem: knowing where to look.
The team is composed of three separate groups: the Programa de Derechos Humanos at the Ibero-American University in Mexico City; data-focused non-profit Data Cívica, also based in Mexico City; and the Human Rights Data Analysis Group (HRDAG), a San Francisco-based organization that applies scientific analysis to human rights violations (first two links in Spanish).
machine_learning  archives  erasure 
4 days ago
How to Get Forgotten: What Ever Happened to Aaron Kuriloff? | ARTnews
I found Kuriloff’s own account of his work in an undated statement, titled “The Epistemological Question of Reality.” Considering the beauty of the functional objects involved, Kuriloff argues, the “machine esthetic” found in “the Dada work of Duchamp and Picabia” reveals the inseparability of art and life, and demonstrates that “heightened meaning can be found anywhere.” This new art shifted its “identification from the landscape of nature to the new American landscape—the object.” Through “the process of mass culture,” the object—the functional object—attained a “timeless” quality, an unchanging ideal, “like a hardware store or a Sears-Roebuck catalogue.” Kuriloff’s new American landscape was the store.

Kuriloff continued to map the qualities of this capitalist landscape in his second show at Fischbach, in May 1965. He arranged multiple functional objects in compositions on monochrome bases, creating visually abundant displays of, say, drawer pulls, shelf brackets, or gardening gloves....

Two years later, Fischbach mounted what would be Kuriloff’s final solo show, where he unveiled a new body of work he called “photo factuals,” artworks that shifted his American landscape from the store to the office, and the advertisement. They were large-scale, black-and-white photographs of office equipment—file cabinets, computer banks, cardboard boxes, desk chairs—based on images from trade magazine advertisements, meticulously uninflected, and stripped of logos or spatial context. Hung flush with the floor and unframed, the photo factuals turned Fischbach’s new 57th Street space into an austere, concrete loft simulacrum of a corporate office, like a set from a dark version of Playtime if the 1967 film had been directed by Michelangelo Antonioni instead of Jacques Tati.
conceptual_art  aesthetics_of_administration  filing  intellectual_furnishings  things 
5 days ago
Welcome - C2O library & collabtive
Established in mid-2008 in the centre of Surabaya, C2O library & collabtive is an independent library and coworking community space. 

With more than 7,000 books, journals, magazines, etc., C2O library houses quality prints and audio-visual collections on various subjects, with emphasis on history, social sciences, humanities and literature—in English and Indonesian.

More than just a comfortable place with books, C2O is built to be a strategic hub for learning and organising progressive activities: book discussions, film screenings, workshops, a monthly farmers’ market, meetings of various communities, walking tours, and various other events.
little_libraries  libraries  reading 
6 days ago
Google’s Dueling Neural Networks Spar to Get Smarter, No Humans Required | WIRED
In 2014, while still a PhD student at the University of Montreal, Goodfellow dreamed up an AI technique called “generative adversarial networks,” or GANs, after a slightly drunken argument at a bar. But however beer-soaked its origins, it’s a wonderfully elegant idea: One AI works to create, say, realistic images, while a second AI analyzes the results and tries to determine whether the images are real or fake. “You can think of this like an artist and an art critic,” Goodfellow says. “The generative model wants to fool the art critic—trick the art critic into thinking the images it generates are real.” Because the second AI is working so hard to identify images as fake, the first learns to mimic the real in ways it couldn’t on its own. In the process, these two neural networks can push AI toward a day when computers declare independence from their human teachers.
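The artist-and-critic loop Goodfellow describes can be sketched in a few lines. The toy setup below is entirely invented for illustration — a one-dimensional "generator" with two parameters and a logistic-regression "critic," where real GANs use deep networks on images — but it shows the adversarial alternation: the discriminator learns to tell real samples from fakes, and the generator follows the discriminator's gradient to become harder to catch.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real" data the generator must learn to mimic: samples from N(4, 1).
def real_batch(n):
    return rng.normal(4.0, 1.0, n)

# Generator g(z) = a*z + b, starting far from the real distribution.
g_a, g_b = 1.0, 0.0
# Discriminator D(x) = sigmoid(w*x + c): probability that x is real.
d_w, d_c = 0.1, 0.0

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr = 0.02
for step in range(3000):
    z = rng.normal(0.0, 1.0, 32)
    fake = g_a * z + g_b
    real = real_batch(32)

    # Discriminator update: push D(real) toward 1 and D(fake) toward 0.
    dr, df = sigmoid(d_w * real + d_c), sigmoid(d_w * fake + d_c)
    grad_real, grad_fake = dr - 1.0, df          # d(cross-entropy)/d(logit)
    d_w -= lr * np.mean(grad_real * real + grad_fake * fake)
    d_c -= lr * np.mean(grad_real + grad_fake)

    # Generator update: push D(fake) toward 1, i.e. fool the critic.
    df = sigmoid(d_w * fake + d_c)
    gx = (df - 1.0) * d_w                        # gradient w.r.t. each fake
    g_a -= lr * np.mean(gx * z)
    g_b -= lr * np.mean(gx)

print(f"generator offset after training: {g_b:.2f} (real data centered at 4.0)")
```

The generator never sees the real samples directly; its offset drifts toward the real data's mean purely by chasing the critic's verdicts, which is the mechanism the article's artist/art-critic analogy is gesturing at.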
artificial_intelligence  evaluation 
6 days ago
A new documentary goes inside the bleak world of content moderation - The Verge
In this new documentary from the film group Field of Vision, the leader of a session on content moderation asks a simple question: when you open Facebook, you don’t, as a rule, see pornography. Why?

“The Moderators” explores the answer, going inside an office in India where the process happens. Directed by Ciaran Cassidy and Adrian Chen, a journalist who has written about the moderation process, the documentary is a window into a largely hidden part of the internet work force: employees who sit in an office and make decisions about whether to remove explicit photos, or who examine dating site profiles to weed out fakes.

The work turns increasingly bleak, and a group leader explains what it all means for the staff when he tells them they may find some images “disturbing,” before providing examples.

“Be mentally prepared for your job,” he warns.
internet  ethics  digital_labor  content_moderation 
6 days ago
Mapping Human Settlement Around the Earth - CityLab
As the human population grows, so does its footprint. To map these changes, researchers often turn to satellite imagery, because government-collected data can be infrequent and outdated. In particular, nighttime light images can offer a wealth of information about human activity. In fact, as CityLab’s Richard Florida has written, more than 3,000 studies since 2000 have used nighttime lights as a proxy for all sorts of economic activities....

To find a better way, Khandelwal teamed up with economists and geographers at Columbia University, Arizona State University, and the Big Pixel Initiative at University of California San Diego. Together, they created the “Worldwide: Mapping Urbanization” campaign, an effort to track urbanization through daytime imagery, looking specifically at where built environments lie, pixel by pixel. And they’re asking the public to help. “The basic idea behind this project is to use daytime images in combination with nighttime light to refine the measure of where people are located around the globe,” Khandelwal tells CityLab....

The campaign launched last week through the crowdsourcing site Tomnod, where contributors can help researchers identify objects and places in satellite images. In each round of this project, users are given a random image with a pink box in the center. They’re asked a simple question: is more or less than 50 percent of the space inside that box built? That is, are there more buildings and sidewalks as opposed to grass and bodies of water? The location of the image is purposely hidden so people will focus just on what they see inside the box and use their best judgment to determine whether there is a human-made structure.

Khandelwal’s team wouldn’t be the first to focus on the human population through the lens of the built environment. Last October, during the momentous UN Habitat III conference, the European Commission’s Joint Research Center launched a comprehensive open database looking at the past 40 years of human settlements via some 12.4 billion satellite images....

His team is trying to develop machine-learning methods that could change the way cities are mapped. The hope is to get hundreds, even thousands, of mapping enthusiasts to participate over the next month. Their responses will be fed into an algorithm that will boost its accuracy in predicting what area is considered “built up.” Down the line, the researchers hope to train the algorithm to predict things like how “economically vibrant” a city is or how much wealth is in an area based on, say, the type of structure recorded in the image.
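The pipeline from crowd answers to training data can be sketched as a majority vote with a confidence score. The tile ids and votes below are invented, and the article does not describe Tomnod's actual aggregation logic; this is only the general shape of turning redundant yes/no judgements into labels an algorithm can train on.

```python
from collections import Counter

# Several volunteers independently answer "is >= 50% of the box built-up?"
# for each image tile; a majority vote yields one label per tile, and the
# vote margin doubles as a confidence weight for training.
votes = {
    "tile_001": [True, True, True, False],
    "tile_002": [False, False, True],
    "tile_003": [True, True],
}

def aggregate(vote_lists):
    """Collapse each tile's votes into (label, share of voters agreeing)."""
    labels = {}
    for tile, vs in vote_lists.items():
        counts = Counter(vs)
        label = counts[True] >= counts[False]   # majority rule (ties -> built)
        confidence = counts[label] / len(vs)
        labels[tile] = (label, confidence)
    return labels

for tile, (label, conf) in aggregate(votes).items():
    print(tile, "built" if label else "not built", f"{conf:.2f}")
```

Low-confidence tiles (close votes) could then be down-weighted or sent back out for more judgements before the labels are fed to the learning algorithm.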
urbanization  methodology  mapping  cartography  light  economy  crowdsourcing 
7 days ago
Google Arts & Culture Experiments
At the Google Cultural Institute’s Lab, a team of Google software engineers, artists and creative coders come together to experiment at the crossroads of art and technology. We believe that through the collaboration with the cultural sector, curators and artists we can develop the best tools and technology for cultural institutions around the world. We created this space for you to explore the Google Arts & Culture Experiments. The Experiments are aimed at discovering new ways people can explore art and browse the collections of our partner museums from around the world.
archives  google  classification  interaction_design  machine_learning 
8 days ago
Chrome Experiments
Chrome Experiments is a showcase of web experiments written by the creative coding community.
interaction_design  data_visualization  pedagogy 
8 days ago
Abstracts | The Conquest of Ubiquity
In 1947, a group of seasoned photojournalists including Robert Capa and Henri Cartier-Bresson founded the international photo cooperative Magnum. At first, Magnum’s challenge was to cover the world with its limited network of photographers, and to get their pictures to as many magazine clients as possible before the novelty of those pictures expired. A decade later, Magnum’s problems had to do with filing cabinets, log books, storage space, and “dead” material. In 1958, Magnum’s New York-based executive editor John Morris begged photographers to “STOP shooting for a period of one month” so that staff could figure out a better system for editing, captioning, and selling their stories.
filing  archives  photography  information_overload 
8 days ago
Passing Current 32: The oblique function
He was obsessed with the mute stubbornness of the long-abandoned bunkers of Fortress Europa, massive blunt shapes tilting and shifting in the dunes of the beaches. (He wrote the concrete monoliths an extraordinary love letter in 1976’s Bunker Archaeology, the record of a decade on-and-off spent looking out at the line of the blank horizon under the low ceiling of many feet of reinforced concrete overhead, in buildings sinking under their own weight where you couldn’t tell where the floor ended and the wall began.) He wanted buildings that made us feel more alive and aware of natural forces, gravity, and the planet. With Claude Parent, Virilio worked on an architectural principle they called the “oblique function”: everything sloping and inclined, with no flat surfaces. Instead of furniture and comfort conducive to disembodied concentration, we would have dark, windowless chambers made of steep inclines; we wouldn’t just inhabit, they wrote, but “traverse” our shelter. Always off-balance, squatting, leaning, huddling, constantly resisting inertia or momentum, we would live as “perpetual dancers,” as if on the cliff faces, screes and caves of a high-altitude mountainside....

Günther Feuerstein thought that comfort could numb and console, helping people reconcile themselves to a society they should oppose: that the bourgeois apartment, with its thoughtful design and labor-saving appliances, could be a training facility for the bourgeois society. Writing for the German wing of the Situationist International in 1961 – who sought to make society more adventurous through, among other things, removing all destination information from train stations, opening rooftops and subways to pedestrian traffic, and total political revolt – Feuerstein outlined his personal work on “impractical apartments.” He sawed a few inches off two legs of every table, so they wobbled violently, installed locks that failed to keep anyone out and hinges that shrieked and groaned in use, nailed windows open so the apartment could sweat in summer, echo with honking cars and jackhammers, and fill with drifts of snow in winter, drilled holes in the walls, and put in obstacles and baffles through which he could clumsily crash at night in the dark, as the miswired switches turned on lights in other rooms....

Madeline Gins and Shusaku Arakawa designed for immortality: living spaces so unexpected, stimulating, destabilizing, and uncomfortable that they could break lifetimes of habit and force continuous openness to experience – a theory they called “reversible destiny.” They could extend the human lifespan by this architecture, they believed, both by the overwhelming immediacy of constant disorientation, and in having to adapt and move in an environment that resists you.... Reversible Destiny houses and apartments feature poles to cling to while ascending or descending the bumpy, pebbled, sloping floors around the kitchen pit – a terrain of gullies, humps, and ditches, sometimes broken up by maze-like arrangements of barriers. The study is a golden egg with no flat surfaces, which functions as an echo chamber. (The acoustics make the whole building hum and croak.) Switches are at ankle height, and electrical outlets mounted on the ceiling, along with rows of hooks from which to hang guywires, hammocks, festoons, swings, and any clothing you might need. Every surface and object is painted one of thirty-odd bright, contrasting shades.
intellectual_furnishings  detournement  comfort  habitus  architecture 
8 days ago
Study finds female professors outperform men in service -- to their possible professional detriment
“We find strong evidence that, on average, women faculty perform more service than male faculty in academia, and that the service differential is driven particularly by participation in internal rather than external service,” the study says. “When we look within departments -- controlling for any type of organizational or cultural factor that is department specific -- we still find large, significant differences in the service loads of women versus men.”
All that matters because service loads “likely have an impact on productivity in other areas of faculty effort such as research and teaching, and these latter activities can lead directly to salary differentials and overall success in academia,” the paper says. “In the urgency to redress not only differences in time use but compensation imbalances, as well, the service imbalance is one that deserves to rise to the forefront of the discussion.”....

The authors assert that service is an area of inequity that can be addressed relatively easily, via careful monitoring of service requests and allocations. Female faculty members, it says, “could be mentored to show more selectivity in their service-related choices and cultivate their ability to say no to requests.” Department chairs and deans, meanwhile, “could be made to be more fully aware of how service assignments are being meted out. A simple increase in overall awareness of this issue may improve overall attitudes toward service loads, remove traces of gender bias from service expectations and enable both women and men to accept or decline service requests with equal ease and impunity.”
Guarino in an interview underscored the concept of awareness, saying that women don’t necessarily know they’re doing or -- as the case may be -- being asked to do more until they see objective proof of service imbalances between male and female faculty members.
academia  labor  committees 
12 days ago
Points of Presence - YouTube
Few users of social media and mobile devices recognise how their everyday swipes, likes, and retweets mobilise a global megastructure that spans the earth, impacts ecologies, and plunges under the sea. This experimental 20-minute video submerges the audience in the socio-ecological tangles of the materiality of the internet. It shows what can be seen and mediates the unseen. The video focuses not on the consumerism surrounding digital culture but rather on the symbiotic relationship between information infrastructure and the geographic, geologic, oceanographic, and atmospheric elements, immersing the audience in the textures, sounds, and vertical vision of the digital ecology of the North Atlantic. 'Points of Presence', though tracing several undersea cables, reveals how the internet is a material political object intertwined with the natural environment, human labour, and the mobility of data.
infrastructure  infrastructural_tourism  internet  film 
12 days ago
Abigail Reynolds is awarded the next BMW Art Journey - Announcements - e-flux
Abigail Reynolds’ artistic practice is closely linked to books and libraries. Having studied English Literature at Oxford University, she frequently draws inspiration from literary essays and figures to imagine places and moments from the past, present and future. Given this deep connection to libraries and literature, it is no surprise that Reynolds’ BMW Art Journey project for 2016/2017, The Ruins of Time: Lost Libraries of the Silk Road, will allow her to connect the complex religious and secular narratives of Europe and Asia and to expand her current interests and working methods through an extensive multi-continent series of visits to historic and fabled repositories of books. The artist will trace 16 sites of libraries lost to political conflicts, looters, natural catastrophes and war. Conceptually, Abigail Reynolds intends to explore blanks and voids, with the library symbolising the impossibility of encompassing all knowledge. "The research I have done towards this journey privileges the known," the artist stated in her proposal for the Art Journey, "but it will bring me to question what we understand as knowledge. I do not want to embark on a history lesson, but on a philosophical journey."
Along the way, Reynolds will gather representations in various forms: 3D scans, photography, microscope imagery, written text, plans or cataloguing systems. Based on this extensive research, she intends to create a cluster of book forms, prints, collages and moving-image works, the latter being her first attempt to work in this medium. Images, texts and other documents originating from the experience will, after its conclusion, be included in a book—thus completing a journey that both starts and ends with the institution of the library.
libraries  deep_time  history  travel  library_art 
13 days ago
Ithaka S+R, OCLC Research to examine how universities, libraries are changing
How do you measure the impact of a library when the number of books on its shelves is no longer its defining characteristic?
The research arms of Ithaka and the library collaborative OCLC have launched a joint project to find out. Over the next 14 months, researchers with the organizations plan to survey the higher education landscape to identify how colleges and universities are differentiating themselves, explore the different types of services libraries are investing in, and help college librarians articulate the new ways in which they are creating value for their institutions.
“Our research question is: What happens when libraries differentiate themselves in terms of services, not collection size; are there multiple models of success?”...

“If these books that are filling the shelves and occupying an awful lot of prime real estate on campus aren’t being used, what else should the facilities be used for, and what is the right kind of support for the faculty and students in the institution?” Marcum, a senior adviser for Ithaka S+R, said. “Just as there are different types of institutions, there are going to be different measures of success for libraries.”
Two recent projects highlight some of the directions university libraries are headed in. Georgia Institute of Technology, with its focus on STEM fields, has decided to move virtually all of its physical books to a storage facility. Arizona State University, in comparison, will also move much of its physical collection out of its main library, but use the space to better showcase its special collections and, perhaps, exhibit rotating collections organized around a monthly theme.
Those are two examples of the “clusters” of similar institutions that Dempsey and Marcum said their project may outline. For example, their research could find groups of colleges defined by their focus on teaching students or their faculty’s research output. At the same time, the project will look at which “bundles” of services libraries at those institutions are prioritizing, with the goal of producing a framework that can be used to display a library’s strengths in key areas.
libraries  academic_libraries  collections  service 
13 days ago
Mayor de Blasio Brings NYC's First Neighborhood Innovation Lab for Smart City Technologies to Brownsville | City of New York
Mayor Bill de Blasio, Chief Technology Officer Miguel Gamiño, and New York City Economic Development Corporation President James Patchett today announced that Brownsville, Brooklyn will be home to the City’s first Neighborhood Innovation Lab. The tech equity initiative brings together community members, government, educators, and tech companies to help address neighborhood concerns with cutting-edge technologies.

Brownsville’s Neighborhood Innovation Lab will kick off this week with a series of strategic planning sessions for community leaders. Over the next four months, these community advisors will work with the City to define neighborhood needs and explore how smart city technologies can help improve quality of life and support local economic development. The first community forum, with activities for all ages, is scheduled for May 2017. Also, beginning this summer, the first set of new technologies – including trash cans that alert sanitation workers when they are full, solar-powered benches that offer free cell phone charging, and interactive digital kiosks – will be rolled out in Brownsville. Community residents will be invited to test out these devices and share feedback that City agencies will use to evaluate the impact and value of these technologies. ...

“Neighborhood Innovation Labs provide a unique opportunity to strengthen our collaboration with community, and also open new doors for local residents to learn about careers in technology, a fast-growing sector of our economy.”

The model for Neighborhood Innovation Labs was first announced at the White House in conjunction with President Obama’s Smart Cities Initiative in September 2015, and fine-tuned as part of the Envision America program in 2016. Neighborhood Innovation Labs are a public-private partnership led by the Mayor’s Office of Technology and Innovation, New York City Economic Development Corporation, and NYU’s Center for Urban Science and Progress. Brownsville Community Justice Center will serve as the lead community partner for the City’s first Neighborhood Innovation Lab, and Osborn Plaza will serve as the anchor site for public programs and initial technology demonstrations. ...

“We have identified the smart cities and civic tech industry as having major potential for job growth in New York City," said NYCEDC President and CEO James Patchett. "By connecting this industry with neighborhoods across the city, we can both increase the impact of smart cities solutions and teach communities about an entirely new segment in our economy. This is all part of the de Blasio Administration’s strategy to invest in high-growth industries and connect New Yorkers to better opportunities by creating 100,000 jobs over the next ten years. We are proud to partner with the Mayor’s Office of Technology and Innovation and NYU's Center for Urban Science and Progress on this important initiative and look forward to seeing its impact across our city.”
smart_cities  civic_engagement  public_process  brownsville 
15 days ago
What if you could listen in on the chemical communication within your body? – We Make Money Not Art
The Rhythm of Life, first shown at Museum Boijmans van Beuningen in Rotterdam, offered visitors a chance to listen in on the electro­chemical messages transmitted by their bodies, hearing their emissions as complex percussive rhythms. But before placing their hands in the PMT and engaging with the work, the visitors had to agree to donate their personal body data to scientific and artistic research....

Although the transformation of the functional state of the living organism into sound was an important dimension of the work, the artists and designers were also interested in looking at the processes and authoritative gestures that legitimise the collection of personal information and how informed consent is attained and defined.

In the age of the quantified self, what does it mean to donate biological data? How much does it (or should it) matter to us that we can keep control over it? Does this biological data have more value for us than other types of data?...

The tension within the work arose from the placing of a piece of scientific equipment within an artistic context (an art museum) and the questions of legitimacy that this raises. Due to the nature of the biophoton data, there were certain practices participants were required to perform to ensure the “purity” of the measurements, such as sanitising their hands or wearing a pair of black gloves to block off ambient light. This placed considerable emphasis on the performativity of implied consent during the donation process. The written consent forms for the donation of the data were worded and designed to be as close to legitimate consent forms used within lab practice as possible. ...

The partnering artwork developed with the Data and Ethics Working Group deliberately probed the authoritative gestures associated with data consent and ownership, occupying a grey zone between subversive performativity and bureaucracy. In doing so, it operated around, rather than in strict accordance with, a typical clinical methodology demanded in scientific studies....

Biophotons are emitted within the visible light spectrum and although sensors in our retinas can respond to individual photons, neural filters prevent us from eliciting a conscious response. They become “invisible” to the naked eye, to prevent us from constantly seeing too much noise in the low light range. Therefore, as a communicative process, it exists beyond the range of our conscious (human) perception. As we mentioned above, there was an interest in the rhythmical qualities of biophoton emissions and, from the perspective of “sensing data”, we were triggered by the possibility of making this communicative process perceptible. As sound is absorbed through the whole body, we were most triggered by the ambiguous experience of the fluctuations in these body emissions over time, to think more abstractly about the meaning and experience behind these rhythmical patterns.
tools  measurement  self_tracking  quantified_self  methodology  data_art  data_sonification 
15 days ago
Technical manuals, reports, operational flowcharts, memos, and other such administrative documents are rarely afforded much critical weight. This is unsurprising: their everydayness and utilitarian design provide little spectacle; their messages are often either banal and self-evident, or overly complex and jargonistic. We ordinarily find ourselves inundated with these materials in familiar administrative spaces such as the office, the academy, the hospital, the bank. Their affected neutral tone might inform us of the terms of our contract of employment, how to "manually handle" a cardboard box, who our line manager is, or remind us to fill the form out in block capitals. Such documents can be considered as a particular form of media in their own right—a form of media that has its own powerful and paradoxical capacity to mediate.

Such documents are what Fuller and Goffey (2012) call grey media. Grey media are the artefacts of institutional bureaucracy—the enframing and didactic materials that ostensibly formalise behaviour and communication in institutional spaces. Their greyness facilitates their falling away into the background, to be read once and acknowledged (perhaps with a signature—a favoured implement of bureaucracy), and then filed away. But it is precisely within this grey, recessive banality that their latent power resides. If their purpose is to make a gesture towards standardising operating procedures, then they are at their most institutionally potent in times of crisis or error: when procedure fails, the power of grey media is formally acknowledged. They are duly drawn out of their recessive state to become the template of rigorous analysis, the basis for establishing what—or who—did not follow procedure, and what the disciplinary consequences are.

Grey media thus provide a valuable subject of analysis for those who wish to understand the mechanisms of institutionality—what can be understood as the technologies through which the institution governs its personnel—for they offer one way of peering into the internal logic of institutional control. Sometimes, these documents help us to see how an institution sees itself, and perhaps even demonstrate how it designs itself. Grey media's administrative weight varies from the imperative to the perfunctory: payroll spreadsheets and out-of-office replies; corporate profit reports and petty cash books; internal security reviews and kitchen hygiene reminders posted on the communal fridge; and so on.

...a second moment when grey media's power is established: that of the leak. The tactical contravention of the institution's confidentiality clause—the giving-over of private administrative documents for public scrutiny—has featured prominently in the mass media and, as history demonstrates, with great consequence. Grey media often lies at the core of the political leak.
administration  gray_literature  handbooks  management  standards 
16 days ago
From Architecture to Kainotecture - Accumulation - Accumulation
All the architecture that we know of is architecture of the Holocene. Architecture has had to deal with a lot of unpredictable factors, but the climate of the Holocene has always been an assumed constant. This is the case even with architectures that deal with very unpredictable conditions. ...

So while there have been architectures for unpredictable climates, they have been unpredictable within the general form of the Holocene. What does not yet exist is a way of building for a climate that is outside the parameters of the Holocene...

The version of architecture’s potential future I want to concentrate on is symbebekotecture: building for the accident, and also to some extent, accidental building. It is a ridiculously awkward term for an often ridiculous and awkward set of building practices that have already taken place, yet their accidental and improvised quality make them precursors to any practice of kainotecture....

Mentioning the accident and architecture together immediately suggests the work of Paul Virilio, particularly his Bunker Archaeology.2 One could think of the bunker as a kind of symbebekotecture, a structure built to withstand the most likely accidents inflicted upon it by attack....

How can one construct a knowledge of whether the “built form” of a mechanized invasion can happen on a given terrain about which one cannot gather direct evidence? This might be the key point on which one might usefully think of Bernal as a precursor to any possible kainotecture, as this is where he really shone as a symbebekotect. By what experimental accidents could one construct a knowledge with which to design this mass accident? ...

It may seem cynical to advocate a kind of territorial conquest for a discipline as a possible advantage to be derived from the Anthropocene, but I don’t think anything much is going to be accomplished in the Anthropocene by good intentions, moral appeals or dispiriting disquisitions on how awful it already is and will continue to be. Rather, it is a time to orient knowledge production around likely future contingencies that give knowledge workers useful capacities that they will need sooner or later, and which indeed are already proving useful now. To be a designer of accidents in a tide of accidents ought not to happen entirely by accident, as it did in the case of Bernal and his co-workers.
architecture  defense  militarism  resilience 
16 days ago
How to Own a Pool and Like It - Triple Canopy
KJL0 was one of 630 speakers recorded in 1986 at the facilities of Texas Instruments in Dallas, where researchers from the electronics company, Massachusetts Institute of Technology, and non-profit research institute SRI International were working to develop the first acoustic-phonetic database of American English—a definitive corpus, or collection of utterances, that would provide an empirical basis for linguistic research. Researchers sampled newspapers, novels, literary fragments, recipes, and self-help guides from an older, text-based corpus to compose 450 unique phrases, known as elicitation cues; these were meant to be short and easy to say, while prompting a comprehensive range of American English accents, dialects, and pronunciations. ...

Unsurprisingly, many speakers exhibited KJL0’s stilted elocution, known as “lab speech,” which was blamed on the contrived recording environment.
Funded by the United States Defense Advanced Research Projects Agency (DARPA) through its Information Science and Technology Office, this five-month recording project was designed to create a database that could be used for automatic speech recognition (ASR), which entails the conversion of speech into data that can be stored, processed, and manipulated. ...

DARPA aimed to establish a single resource for representing “realistic” speech. This standardization of speech, and of the speaking subject, is at the foundation of today’s proliferating voice-controlled search engines, autodidactic artificial intelligence systems, and forensic speaker-recognition tools...

In order to get machines to recognize the many permutations of speech in the United States, the researchers at Texas Instruments had to construct speaker archetypes that could represent each region and demographic, and then make their characteristic dialects and patterns legible as data. ...

SRI International engineer Jared Bernstein created two shibboleths meant to demonstrate drastic differences in pronunciation between regions: “Don’t ask me to carry an oily rag like that” and “She had your dark suit in greasy wash water all year.”...

Before being recorded, every speaker was classified by level of education, race, age, and height (earlier research had shown a correlation between size and vocal resonance). A speech pathologist identified irregularities such as vocal disorders, hearing loss, and linguistic isolation...

They constructed the typical American English speaker as white, male, educated, and, oddly, midwestern. This assumption was based in part on decades of American broadcasting, which had established the midwestern dialect of anchors like Walter Cronkite as neutral and accent-less—what was referred to as General American or Broadcast English...

Researchers at Texas Instruments did not seem particularly concerned with the social interactions and forms of stratification that might occur within a dialect region, and how that might shape one’s speech; in order to create a workable, stable corpus, they conceived of dialects simply as reflections of particular places at particular times, nullifying past and future migrations and social and economic shifts. They neutralized the richness of language as spoken in these regions by selecting speakers that were considered to be more useful in demonstrating dialectic variation: Men were believed to be more likely than women to use regional forms, so twice as many of them were chosen to represent each area...

The United States Air Force used the corpus to help pilots in noisy cockpits verbally interface with their equipment. The Department of Defense used the corpus to develop a system that aimed to automatically identify the sex of the speaker. ...

TIMIT’s ubiquity can be explained by its extensive coverage of the micro-sounds that make up all speech, and not of American speech variations, as originally intended....

TIMIT is still the only fully annotated phonetic resource available to researchers in the booming field of ASR. Consequently, the corpus is likely to have been incorporated into any ASR technology that has launched in the last three decades.
TIMIT has even found its way into non-English ASR research. Because TIMIT’s phones were transcribed in the International Phonetic Alphabet, which provides a common notation system for the world’s languages, they can be “mapped” onto target languages that lack a phonetic database...
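TIMIT's time-aligned transcriptions are what make this kind of “mapping” tractable: each utterance is annotated, line by line, with a start sample, an end sample, and a phone label. As a rough illustration only (the annotation text and the phone-mapping table below are invented for the sketch, not drawn from TIMIT itself), parsing such a file and remapping its phones onto another inventory might look like this:

```python
# Sketch: parse a TIMIT-style time-aligned phone annotation and remap
# its labels onto a hypothetical target-language inventory.
# Each annotation line has the form "<start_sample> <end_sample> <phone>".

SAMPLE_RATE = 16_000  # TIMIT audio is sampled at 16 kHz

# Illustrative annotation text (not an actual TIMIT file).
annotation = """\
0 2000 h#
2000 4500 sh
4500 7000 iy
7000 9000 h#
"""

# Hypothetical phone mapping for a target language lacking its own corpus.
phone_map = {"sh": "ʃ", "iy": "i", "h#": "<sil>"}

def parse_phones(text):
    """Yield (start_sec, end_sec, phone) tuples from annotation lines."""
    for line in text.strip().splitlines():
        start, end, phone = line.split()
        yield int(start) / SAMPLE_RATE, int(end) / SAMPLE_RATE, phone

segments = [(s, e, phone_map.get(p, p)) for s, e, p in parse_phones(annotation)]
print(segments[1])  # → (0.125, 0.28125, 'ʃ')
```

The same parse-then-relabel step is, in spirit, how an annotated phonetic resource can be reused for a language it was never recorded in.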

In the 1960s, linguists began to utilize computers’ newfound storage capacities to process large volumes of samples and develop a more nuanced understanding of speech. In the nascent field of corpus linguistics, researchers collected vast swathes of printed materials in order to train machines to parse large bodies of literature and identify the grammatical function of each word; determine consistent terms associated with genres; and survey mass market publications to demonstrate that certain words were falling out of usage or assuming new meanings...
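At its simplest, the kind of corpus analysis described above (surveying dated samples to show that certain words were falling out of usage) reduces to token counting. A toy sketch with invented data; the two sample texts stand in for real dated collections:

```python
from collections import Counter
import re

# Toy corpus samples keyed by year, standing in for dated text collections.
samples = {
    1961: "the wireless set crackled while the gramophone played",
    2016: "the podcast streamed while the playlist played",
}

def tokens(text):
    """Lowercase word tokens, a crude stand-in for real corpus tokenization."""
    return re.findall(r"[a-z']+", text.lower())

# Per-year word-frequency tables.
counts = {year: Counter(tokens(text)) for year, text in samples.items()}

# A word "falling out of usage": present in the early sample, absent later.
fading = [w for w in counts[1961] if w not in counts[2016]]
print(sorted(fading))  # → ['crackled', 'gramophone', 'set', 'wireless']
```

Real corpus linguistics adds part-of-speech tagging, genre balancing, and statistical tests on top of this, but frequency tables over dated samples remain the base layer.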

The Brown Corpus included five hundred samples that comprised more than one million words (all from prose printed in the United States in 1961), and served as a source for TIMIT’s elicitation cues.
Like TIMIT, the Brown Corpus was ambitious in scope but profoundly limited by time and resources. It was constructed somewhat haphazardly, drawing from newspapers, textbooks, men’s health magazines, and pulp novels, among them How to Own a Pool and Like It, Advances in Medical Electronics, With Gall & Honey, and Try My Sample Murders....

The irreducible singularity of each human voice makes it fodder for digital assistants and cockpit interfaces, but also an enticing tool for securing and controlling populations. The voice may be law enforcement’s ideal metric for identifying suspected criminals. Whether ASR meets the standards of evidence for criminal investigations remains a question; nevertheless, researchers and private practitioners periodically claim to have solved the puzzle presented by the full spectrum of human voices, resulting in waves of scientific and legal controversy....

The use of voiceprints and spectrographic methods in courts quickly declined. (Voiceprints have only been accepted by a court once since 1995.) While the Daubert ruling certainly played a large role, there were other factors: namely, the emergence of more promising mathematical and probabilistic methods due to advances in computing. The past two decades have seen most forensic labs in the United States outfitted with some form of Forensic Automatic Speaker Recognition (FASR). Like ASR, FASR uses computer programs to measure the physical parameters of speech in order to extract distinctive features like dialect, cadence, or speech abnormalities. The forensic component is the creation of speaker profiles, which are meant to determine the speaker’s physiological traits as well as physical surroundings.
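The “physical parameters of speech” that such systems measure begin with simple frame-level features. The sketch below is illustrative only (real FASR pipelines rely on much richer features, such as cepstral coefficients, and on statistical speaker models): it frames a synthetic signal and computes per-frame energy and zero-crossing rate, two classic low-level speech measurements.

```python
import math

SAMPLE_RATE = 8_000
FRAME = 200  # 25 ms frames at 8 kHz

# Synthetic stand-in for a speech signal: a 440 Hz tone with a silent tail.
signal = [math.sin(2 * math.pi * 440 * n / SAMPLE_RATE) for n in range(1600)]
signal += [0.0] * 800

def frame_features(x, frame=FRAME):
    """Per-frame (energy, zero-crossing rate) over non-overlapping frames."""
    feats = []
    for i in range(0, len(x) - frame + 1, frame):
        w = x[i:i + frame]
        energy = sum(s * s for s in w) / frame
        # Count sign changes between adjacent samples, normalized by frame size.
        zcr = sum(w[j] * w[j + 1] < 0 for j in range(frame - 1)) / frame
        feats.append((energy, zcr))
    return feats

feats = frame_features(signal)
# Voiced frames carry energy; the silent tail does not.
print(feats[0][0] > 0.1, feats[-1][0] == 0.0)  # → True True
```

Sequences of such frame-level measurements are what downstream systems compare when deciding whether two recordings plausibly come from the same speaker.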
language  speech_recognition  databases  artificial_intelligence  standardization  voice  identity 
17 days ago
VoCA Journal Back in Time with Time-Based Works
Artists’ relationships to publishing and the distribution of printed matter took a conceptual turn in the 1960s. Self-publishing and using the format of a book or journal as a container for new ideas were strategies artists seized upon for connecting more directly with audiences and for allowing experimental projects to be transported globally through the mail via small books, magazines, and ephemera. Artists’ publications facilitated alternative networks of discourse and exchange of ideas, and several artist-run organizations founded in the mid-1970s focused on the promotion and critical reflection of this medium. In many ways, it aligned with other new media practices, like the nascent video art scene of the late 1960s and early 1970s. There was a critical significance to decentralizing the production and distribution of media.

Several important hubs for the distribution of artists’ books and related materials popped up around the world, fostering a network that linked individual artists and collectives. For example, the artist collective General Idea founded a bookstore and archive called Art Metropole in 1975 for artists’ publications and multiples in Toronto. Also in 1975, Ulises Carrion started a bookstore (later turned archive), Other Books and So, in Amsterdam, which served as an important European node in this artists’ book network. In New York, two important organizations bubbled up around this same time: Franklin Furnace and Printed Matter. Printed Matter was founded in 1976 by a collective of artists and art workers that included Lucy Lippard, Sol LeWitt, Edit DeAk, Pat Steir and Walter Robinson, among others. The focus of the organization at its founding was to develop a reliable distribution system for books made by artists.

...the two organizations quickly recognized the redundancy of their respective intentions and missions, which led to an interesting agreement: it was decided that Franklin Furnace would become an archive and meeting space for artists’ publications and Printed Matter would continue on as the distributor and commercial venue for the sales and promotion of these books. This agreement clearly set the agendas of the two organizations. Franklin Furnace transformed into an independent archive and quickly became the leading repository for experimental publications by contemporary artists. In addition to the book archive, Franklin Furnace ran an active exhibition and performance program...

This correlation was further explored in a series of four exhibitions that took place at Franklin Furnace in 1980-81, collectively entitled, “The Page as Alternative Space.” Taking inspiration from Pindell’s essay, Martha Wilson invited specialists in the field to organize different iterations of the show. The documentation of these projects, including checklists, invitation cards, and installation photographs, formed the end-point for our recent exhibition at MoMA. ...

As word of the archive spread, it gained a more global reach, attracting donations of books and announcements from a diverse international group of artists/publishers from Europe, South America and Asia.

In this way, the archive simultaneously promoted and fostered a network of distribution in the moment while preserving the content of this artists’ movement for future researchers. As part of the process of donating books to the Franklin Furnace collection, artists and publishers were asked to complete submission forms, which included basic bibliographic information like author, title, date of publication, and publisher. The forms also included space for artists to provide a statement. In some cases, artists chose to leave this field blank; in others, they provided a statement that gives direct insight into the spirit and intention of the work. ...

“Artists’ books spread the word—whatever that word may be. So far the content of most of them hasn’t caught up to the accessibility of the form. The next step is to get the books out into the supermarkets, where they’ll be browsed by women who wouldn’t darken the door of Printed Matter or read Heresies and usually have to depend on Hallmark for their gifts. I have this vision of feminist artists’ books in school libraries (or being passed around under the desks), in hairdressers, in gynecologists’ waiting rooms, in Girl Scout Cookies…”

...When I founded Franklin Furnace in 1976, I invited artists to read to the public. Every single artist I invited to read chose to manipulate the performative elements of light, sound, relationship to the audience, props, costume and time as part and parcel of the work. (The misnomer “performance art” had not taken hold as yet.) The word in vogue at the time was “piece,” which encompassed the thought, the action, the documentation—drawn or photographed or filmed or published or taped—whatever.
textual_form  artists_books  conceptual_art  publishing  bookstores  archives  performance 
17 days ago
What does fake news tell us about life in the digital age? Not what you might expect » Nieman Journalism Lab
Amidst all of the panic, finger-pointing, hype, bandwagons, and fatigue, what are we to make of this highly mediatized and politicized issue? How are we to understand and collectively respond to the phenomena which are the center of concern? As a network of researchers specialising in digital methods for social, political, and cultural inquiry, over the past few months, we’ve been engaged in a number of projects to trace the production, circulation, and reception of fake news online — and to see how we might bring fresh perspectives and unfamiliar angles to the public debate...

Below are four ways of seeing fake news differently, drawing on our ongoing research collaborations around A Field Guide to Fake News with the Public Data Lab. The guide focuses not on findings or solutions, but on starting points for collective inquiry, debate, and deliberation around how we understand and respond to fake news — and the broader questions they raise about the future of the data society.
fake_news  my_work 
17 days ago
Letter from the Editors | continent.
...ordering development practices anew and offering alternative metaphors of fulfillment and care, complete with their knots, complexities, and breakdowns. They address how we organize to sustain technologies across time, charting formations of labor that catalyze around recuperating social and material order. They also explore what happens when we let go of these ties – whether through acts of abrupt severance or slow forgetting – revealing the negotiated limits of repair and its aftermath. In short, this collection affirms there is life beyond design – a welcome thought to those of us who have always found Silicon Valley rather problematic in its universal approach to material production. But it also suggests that we are always already living this possibility, even if our stories so often work to obscure it.

Contributors to the R3pair Volume follow and unsettle objects across all these themes and registers. Marisa Cohn follows a long-lived space mission, tracing the temporal logics of design and repair work and how these are knitted and intertwined through the maintenance of the craft. Lara Houston follows this theme to Kampala, Uganda, where the livelihood maintenance and mending of mobile phones de-naturalizes linear life cycles, extending and complicating the timeliness of repair. Linda Hilfling Ritasdatter’s “Bugs in the War Room” manoeuvres between the imaginaries and reals of supposedly-retired computer languages, the Y2K bug, and software development outsourcing. Benjamin Sims extends questions of timeliness in yet another direction, showing how the repair of High Performance Computing systems at Los Alamos is not only reactive, but also inherently forward-facing and anticipatory. By contrast, Jamie Allen tells a story of longevity, and shows how accounts of resilient technological temporalities (here, the Centennial Light Bulb) may be hijacked by motivations of monetary profit.
repair  maintenance 
18 days ago
Helping libraries keep pace with the demands of the digital age - Knight Foundation
Charlotte Mecklenburg Library | $250,000 | Charlotte, North Carolina: For a design and visioning process for transforming the Charlotte Mecklenburg Library into a 21st century urban library.
The MIT Media Lab | $250,000 | Cambridge, Massachusetts: To build a library residency program in which librarians and Media Lab technologists can collaborate.
libraries  knight_foundation 
20 days ago
Full text of "Official catalogue of the New-York exhibition of the industry of all nations. 1853"
Division D. Courts 11, 16, 17, 22.

1 Morse's patent electric telegraph apparatus in operation, and the wires in direct connection with the principal lines in the United States. — Wm. M. Swain (President of Morse's Magnetic Telegraph Company); offices in New-York, Philadelphia, and other cities.

2 House's electro-magnetic printing telegraph, in operation. — John B. Richards, prop. 621 Grand street, New-York City.

3 Electric telegraph register; various kinds of telegraph insulators; electro-magnetic telegraph battery. — John M. Batchelder, inv. & prop. 83 Washington street, Boston.

4 Planetarium of entirely new construction, showing the actual motions of the Sun, Earth, Moon, Mercury, and Venus, with all the phases of the planets, changes of the seasons, and other astronomical phenomena. — Thomas H. Barlow, prop., Lexington, Kentucky.

6 Riker's manual orrery, and other astronomical apparatus. — J. L. & D. J. Riker, manu. IS Suffolk street, New-York City, ...

Philosophical Instruments, etc. — Class 10.

48 Specimens of the daguerreotype art on extra large plates. — A. Bisbee, Daguerrean Artist, Dayton, Ohio.

49 Portraits in daguerreotype. — S. P. Barnaby, Daguerrean Artist, Dayton, Ohio.

50 Specimens of daguerreotyping. — William E. North, Daguerrean Artist, Cleveland, Ohio.

61 Specimens of the daguerreotypic art. — E. L. Webster, Daguerrean Artist, Louisville, Kentucky.
media_history  exhibition  media_archaeology  maps  objects  epistemology 
20 days ago
Study: Library directors moving ahead with plans to rethink libraries
“Academic libraries are in transition away from serving principally as collection builders and content providers, where size is a metric of success. Many leaders see a future where they will be valued for the contributions they make in support of instruction and learning, and in the case of research universities, in support of research, including their distinctive collections.”...

Compared to those at baccalaureate and master’s institutions, library directors at doctoral institutions were more likely to name a lack of employee skills in key areas as one of their primary constraints. And looking ahead to the next five years, those library directors said they expect to hire new staffers who will focus on a range of specialized topics, among them data analytics, digital humanities and preservation (click the thumbnails to see full-size images)....

Faculty members are also concerned about their students’ ability to find and evaluate scholarly information -- slightly more than half of the 9,203 faculty members surveyed rated students’ skills as “poor.” About one-third of library directors said the same.
academic_libraries  information_literacy 
21 days ago
Open Media Scholarship | open access in media studies
Open Media Scholarship is a nonprofit site, established in 2017, dedicated to promoting open access (OA) in media, communication, and film scholarship and teaching. The site features OA journals, books, and resources, along with searches for OA articles and datasets. The rationale for Open Media Scholarship is that media scholars, for a variety of reasons, should be at the forefront of the OA movement. That argument is elaborated in “Open Media Scholarship: The Case for Open Access in Media Studies” (2016) by Jeff Pooley (Muhlenberg College), who created and maintains this site.
There are three ways to stay updated about new content: Twitter, RSS, or a monthly newsletter.
media_studies  publishing  open_access 
22 days ago
Epochal Aesthetics: Affectual Infrastructures of the Anthropocene - Accumulation - Accumulation
And, it is through the violent infrastructures of geology that new forms of politics are emerging, such as those at Standing Rock around the Dakota Access Pipeline that insist on a different vision of temporal affiliation and material filiation. As Lauren Berlant argues, “An infrastructural analysis helps us see that what we commonly call ‘structure’ is not what we usually call it, an intractable principle of continuity across time and space, but is really a convergence of force and value in patterns of movement that’s only solid when seen from a distance. Objects are always looser than they appear. Objectness is only a semblance, a seeming, a projection effect of interest in a thing we are trying to stabilize.”2 If infrastructures are also structures of feeling and convergences of force, then the appreciation of those affects needs to reach down below the surface into the substratum to see how those forces both maintain and disrupt edifices of intention on the surface.

The underground spaces of extraction and the cavernous holes of excavated fossil fuels are now the curatorial spaces that await an anticipatory geology and a future direction beyond human agency. These spaces of extraction are nonetheless tied to the global socio-economic expulsions of late neoliberal subjective life through the invisible labors of subterranean workers and the toxicities that accompany these material mobilizations. The material destratifications of marine, mineral and chemical flows of carbon and nitrogen are returning in various accumulative modalities of pollution, toxicity and anthropogenic climate change, which in turn reconfigure the biopolitical possibilities of life. These affectual material infrastructures are shifting a sense of the planetary by generating new orders of time and a geo-logics of existence, simultaneously hacking and re-syncing the planet and its temporal structures to produce an arrangement of the future that looks decidedly irrational and unthought: infrastructures of geologic mobility that far outstrip, but are a direct consequence of, conventional forms of material communication and transnational infrastructures; geo-logics of social and material expulsions whose visible effects have been hidden in plain sight all along, understood as the waste and excess of normative modes of agency, architectural planning and capital accumulation. The task of geomorphic aesthetics is to think these new durations and material recombinations as an unthought affective infrastructure that subtends the architectures of materiality and resource distribution that themselves inscribe the planetary present of global-world-space. Specifically, geoaesthetics might make sensible how geologic forces move across time and space to disrupt the provisional unity of global-world-space and render new geographical imaginations of intemperate locations in both political and geologic time.
This new form of geopolitics would understand the “geo” as a temporal disfigurement of political space rather than as a descriptive mode of spatiality....

The geologic and how it comes to matter beyond representational genres as a sensibility of time as well as a quality of materiality is firstly an archeology of the unthought; an arena of building and accumulation that has been rendered invisible and contractually mute. In this zone of agitation between matter, what comes to matter and that which has seemingly escaped material memory, aesthetics become a crucial space for engaging with geologic force and time and its proposition to stand against the architectures of agency (and reason) that brought the Anthropocene into being....

The development of actual geographic infrastructures that deliver a convincing architecture of global world-space3 began in earnest in the early twentieth century, but the origins of the desire for real-time globality and telepresent communications are already evident in the colonial networks of Empire, the telecommunication and transportation networks powered by coal and before coal by slavery. These architectures of globalizing space came to a point of culmination with the International Geophysical Year (IGY) of 1957-58 and the deployment of a line of weather stations that circled the globe, watching the meteorological shifts in weather patterns for the telling presence of ballistic missiles. That is, the synoptic accumulation of weather that later became the baseline correlative of climate was initially envisaged as a screen for watching incoming rocketry. Alongside these geopolitical experiments in the vertical architectures of aerial and atmospheric control, Sputnik was launched to provide the first geographical “elsewhere” of territory (after the oceans and Antarctica). The new satellite also launched the geographic imagination of an extra-territorial planetary state of sensing and consolidated the Earth as an artifactual sphere of operations. The launch of artificial satellites and the landing on the Earth’s own satellite (the moon) initiated the beginnings of what Jennifer Gabrys calls a Program Earth (2016). As Gabrys notes, Sputnik “activated a multitude of new experiences for inhabiting the earth,” first as an audio map of a new orbital environment, and then as a programmable space.....

“Satellites were promoted as making an easy transition from military research and development to ecological and social applications. Remote sensing developed into a critical technology and method within environmental science and became a crucial way in which to study environmental change on a global scale.”5 While the globe appears in the context of the Cold War as the ultimate commodity fetish of the military-industrial complex, how these geographical imaginations of universal spatiality create a homogenizing surface of projection matters in the operationalization of materiality, particularly in how the earth as a global world-space is viewed as infrastructure of intentional propositions rather than a geography undercut by geological processes....

These bipolar measurements attempted to achieve the first synoptic measurements as a co-present image of the world’s weather, which laid the foundation for climate to be constructed as a globalized space of data exchange. This first global meteorological model of climate circulation served as the baseline data point for anthropogenic climate change and ozone depletion.
Climate change (both long term climate shifts and anthropogenic-induced) introduced fractures into universalizing architectures and ruptured the neat enclosure of the whole, both in representational and socio-political terms. Just as global-world-space is nearly mapped from pole to pole, climate change introduces a shock to the imagination of earth systems as independent from social action. Climate change becomes representative—in the double sense of representing practices and ontological arrangements—of the unintended global (side-)effects of these totalizing Western colonial visions of the world and their inability to deal with the material and representative excess of closed world systems. It also demonstrates how world resources have been mobilized for Western growth, while entropy or waste has been exported both into the atmosphere and various colonial and neo-colonial sacrifice zones that constitute the dynamics of global-world-space. ...

Rather than looking at the futurism of climate apocalypses to come or dwelling in the cathartic images of catastrophe, the search might be for a geologic imaginary that disrupts notions of the uninterrupted agency of the liberal subject (and its reins on biologic life) within the ever-increasing accumulations of planetary-scale architectural signatures of the human. Such a notion of geologic life might turn towards the intercedial registers of mineralogical agency and its trajectories and modes of stratifications. Examining inhuman and nonhuman excess in the aesthetics of identity formation in the originary scene of the Anthropocene prompts a further question about the role of the nonlocal—or inhuman—in subjectivities and its identifications, determinations, and qualities.
infrastructure  geology  temporality  satellites 
23 days ago
A key to democratizing urban solutions is building better models
So how do urbanists seek information on the potential efficacy of urban planning solutions? We build models. As Sidewalk reimagines cities for the digital age, Model Lab is exploring new ways to approach modeling as a means toward addressing big urban challenges. Models can be used to shape, test, get stakeholder feedback on, and adjust ideas related to land use, transportation, government processes, and many other areas of city life....

The conceit of transportation models (and those looking at other areas of urban life) is that by representing individual decision-making we can learn something about how these decisions lead to a collective impact on a system, and in turn, about how the impacted system affects individual decision-making....

What would make a traveler more or less willing to share a ride? What is the traveler trying to optimize? Cost? Time? Comfort? Convenience? All of the above? A first pass of model variables may include: time and cost, relative to other alternatives; a willingness to share a small space with other people; the physical difficulty of entering or exiting a shared vehicle; and, the perceived burden of out-of-my-way travel necessary to serve other passengers. Does your model have any variables we missed?...

Thinking about how individual preferences intersect with policies — or infrastructure and services — is only the beginning of the fun of building models. Our next step is to translate these preferences and policies into mathematical expressions and then turn these mathematical expressions into computer simulations. Then we compare, over and over, the performance of these simulations against observed outcomes. When we’re done we have a model — one that we hope is useful...
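
The step the passage describes — translating traveler preferences into mathematical expressions and then into simulation — is classically done with a discrete-choice model such as the multinomial logit. A minimal sketch in Python; the alternatives and utility coefficients below are illustrative assumptions, not Model Lab's actual parameters:

```python
import math

# Hypothetical utility coefficients (negative values = disutility).
# Illustrative only, not calibrated figures.
BETA = {"time_min": -0.05, "cost_usd": -0.30, "shared_ride": -0.40}

def utility(alt):
    """Linear-in-parameters utility for one travel alternative."""
    return sum(BETA[k] * alt[k] for k in BETA)

def choice_probabilities(alternatives):
    """Multinomial logit: P(i) = exp(U_i) / sum_j exp(U_j)."""
    utils = {name: utility(a) for name, a in alternatives.items()}
    denom = sum(math.exp(u) for u in utils.values())
    return {name: math.exp(u) / denom for name, u in utils.items()}

# Three alternatives for one hypothetical trip.
trip = {
    "drive":   {"time_min": 20, "cost_usd": 6.0, "shared_ride": 0},
    "shared":  {"time_min": 28, "cost_usd": 3.5, "shared_ride": 1},
    "transit": {"time_min": 35, "cost_usd": 2.5, "shared_ride": 0},
}
probs = choice_probabilities(trip)
```

Calibration is then the "compare, over and over" loop the passage mentions: adjust the coefficients until the simulated choice shares match observed travel behavior.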

Models provide the opportunity to create a dispassionate venue in which ideas can be explored and tested by anyone interested in the topic at hand. To achieve this goal we must first build models that resonate with decision-makers and the public as credible, legible, realistic, and compelling. We must then allow anyone the ability to create their own solutions, and investigate and explore solutions created by others....

Efforts to improve models using these advances are already underway. Firms such as AirSage, Teralytics, and Streetlight Data have introduced mobile location data to the transportation planning field. Researchers and practitioners are now using location-based data to test the usefulness of machine learning techniques to urban planning problems....

The model provides immediate feedback on how each alternative performs in terms of traffic congestion, transit travel times, transit mode share, greenhouse gas emissions, and other measures. The small groups tinker with their ideas, seeking to optimize the measures they feel are most important — informing their group discussion. The modeling tool used in the workshop is the same one that will be used throughout the project’s life. This consistency allows people to engage at the same level as professional planners do.... The city planners encourage the participants to be skeptical of the model results and question the assumptions driving the simulation. Models are only as good as the information we put into them, and important assumptions are often contentious. The modeling software allows the groups to adjust certain assumptions about behavior or future land developments. After leaving the meeting, residents are encouraged to try out other ideas and alternatives with the model, which are accessible online....

Today, the planning work I’ve described above can take months, if not years. If cities can reduce this time to weeks and extract good ideas from the community, we can create a future in which residents are more engaged and governments are more nimble, responsive, and effective.
urban_planning  models  modeling  simulation  governance  public_process 
23 days ago
The Future Agency - The Verge
The urban AI, hologram genie, and smart bathroom were part of the Museum of Future Government Services, a series of seamless interactive installations that demonstrated to attendees — Emirati politicians and civil servants, as well as foreign dignitaries and business leaders — how the UAE would serve its citizens several decades hence...

Of course, none of the products demonstrated at the 2014 summit actually existed. Rather, Tellart’s job is to create believable, immersive visions of the future based on the needs of its clients, which range from the UAE to Google, Purina, and the California Academy of Sciences — anyone who needs a little bit of tomorrow today. As the company’s co-founder Nick Scappaticci says, “We are the industrial designers of the 21st century.”...

Design fiction is created by a loose confederation of agencies, artists, engineers, and designers who are shaping our expectations of technology and society in decades to come by showing us what that incipient world might look like, down to its cliche brand logos. It’s science fiction made real in the form of interactive exhibitions, product demonstrations, and behind-the-scenes consulting work. And it tends to pop up at any event Davos-ish enough to include the word “influencers.”...

Data-driven future prediction emerged around 1948 with the launch of RAND Corporation. The nonprofit think tank’s “scenario analysis” practice connected military planning with private technology development. RAND’s tactics were adopted by Shell in the 1970s, creating a precedent for the corporate future-consulting we see today....

1991 saw the launch of IDEO, known for helping companies develop new products; the Dutch conceptual design group Droog was founded in 1993; digital-savvy ad makers Blast Radius and Mother in 1996; Tellart in 2000; and Barbarian Group in 2001. Younger competitors like Superflux, Red Paper Heart, Midnight Commercial, and Marshmallow Laser Feast arose in the 2010s....

UAE’s self-conscious utopianism has been labeled “Gulf Futurism.” It can be seen as the deployment of technology in “the proto-fascism of a society that privileges success and speed over human life,” as the Dubai and Brooklyn-based writer Rahel Aima put it in a 2013 interview. ...

The idealized narrative that Tellart created is meant to comfort one of the wealthiest and yet most ecologically imperiled regions in the world. In this future, the money from oil has solved all the problems that oil dependency creates — thanks to technology, the desert becomes a permanent oasis. Tellart’s work reassures its viewers that the environment is an issue that will simply be fixed one day, through no effort on their part, save perhaps cultivating a taste for bugs.
futurism  futuring  speculative_design  government  smart_cities 
24 days ago
A Journey Into the Merriam-Webster Word Factory - The New York Times
But at the center of the main upstairs work area stands a howling mass of irreplaceable historical chatter: the Consolidated Files.

The files, kept in red cabinets that snake around the middle of the room, contain millions of citations: small slips of paper documenting individual word uses, drawn from newspapers, books, radio, packaging and other sources, stretching from the 1980s back well into the 19th century.
dictionary  language  filing  cabinets  intellectual_furnishings 
24 days ago
Trading One Bad Map for Another? - Atlas Obscura
Earlier this month, the social studies classrooms of Boston Public Schools underwent a slight but significant change in decor. Down came the Mercator Projection—a common choice of world maps for schools—which distorts the size of each land mass but keeps continental shapes intact. Up went a different map, the Peters, which stretches out the world in order to give each continent a proportionally accurate amount of room. On the Peters, Canada—so huge on the Mercator—shrinks to its proper size, while Africa, which the Mercator shows shrunk and jammed beneath a too-large Europe, stretches out.

Boston educators are celebrating the choice. “It was amazingly interesting to see [the students] questioning what they thought they knew,” social studies director Natacha Scott told NPR after the Peters was introduced. And news articles about the swap tell a tidy story, in which a more enlightened representation of the world rightfully replaces an outmoded one.

But many map historians are privately disheartened—not by the switch itself, but by the resuscitation of the Mercator-Peters rivalry, a conflict that has bedeviled public-facing cartography for decades, and which they see as a manufactured, false choice. “News of Boston public schools’ decision to go with the Peters projection has gone viral over the past week, and my teeth have not stopped itching,” Jonathan Crowe writes on his blog, The Map Room. “It is incredibly short-sighted and narrow-minded to say it should be one or the other,” says Mark Monmonier, author of Rhumb Lines and Map Wars: A Social History of the Mercator Projection. Even Ronald Grim, curator of the Norman B. Leventhal Map Center at the Boston Public Library, had concerns: “In my mind, both the Mercator and the Peters are controversial projections,” he says in a phone interview. “But we were not asked to be part of the decision.”...

white supremacists have celebrated these geometric incidentals. “[The Mercator] has been used by some pro-Western, pro-Imperial types in the 19th and 20th centuries to map the world, as Europe and North America look much bigger than they are vis-a-vis the more tropical areas they exploited,” writes Matthew Edney, a professor of cartographic history at the University of Southern Maine, in an email. Others argue that by enabling exploration in the first place, the Mercator is an inherently colonialist map. And in the present day, its ubiquity contributes to a virulent strain of racially inflected geographic ignorance—for instance, it makes Africa appear to be the same size as Greenland, when it’s really about fourteen times bigger.
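
The Greenland comparison can be checked with the projection's own math: the Mercator inflates local area by sec²(latitude), so land near the poles balloons while the equator is untouched. A back-of-the-envelope sketch in Python (the areas and centroid latitudes are rounded reference figures, not from the article):

```python
import math

def mercator_area_inflation(lat_deg):
    """Mercator scales length by sec(lat) in both directions,
    so local area is inflated by sec^2(lat)."""
    return 1.0 / math.cos(math.radians(lat_deg)) ** 2

# True areas in millions of km^2 (rounded reference figures).
AFRICA_AREA, GREENLAND_AREA = 30.4, 2.17

true_ratio = AFRICA_AREA / GREENLAND_AREA          # ~14x larger
greenland_inflation = mercator_area_inflation(72)  # ~10x, far north
africa_inflation = mercator_area_inflation(0)      # 1x, on the equator

# Apparent size ratio on a Mercator map.
apparent_ratio = true_ratio * africa_inflation / greenland_inflation
```

With Greenland's area inflated roughly tenfold and Africa's barely at all, the true fourteen-to-one ratio collapses to near parity on the map — the "same size" illusion the article describes.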

It’s this last issue that Boston Public Schools, which serves a student population that is 74 percent black, is responding to. “We were primarily concerned with the notion of decolonizing our curriculum"...

To other experts, though, trading the Mercator for the Peters isn’t a step up. Instead, they say, it swaps geographic distortion for a kind of historical ignorance. When the historian Arno Peters came up with his projection, in the 1970s, he was unwittingly copying a much older map, the Gall Projection, which was invented by a Scottish minister in the 1860s. (The Peters projection is also known as the Gall-Peters, for this reason.) On its own, it was never particularly popular. “Arthur Robinson famously said that it looks like long underwear hung out on a clothesline,” says Monmonier. “Most cartographers were not big fans of it.”

Peters’ main success, then, was in rebranding. In order to push for his map’s adoption, he constructed a careful case, based primarily on comparing it to the Mercator. The Mercator “over-values the white man and distorts the picture of the world to the advantage of the colonial masters of the time,” Peters wrote. Only his own equal-area map, he said, avoided these problems while still preserving accuracy and clarity.....

Actual cartographers, though, remained—and remain—unimpressed. “Arno Peters concocted a veritable farrago of lies to sell his map,” writes Edney. “He came up with a number of properties so that he could say, ‘only two maps meet all properties properly, Mercator’s projection and mine—and Mercator’s suffers from all these political problems, so use mine!’ All of his properties are complete B.S.”...

Even without this backstory, Edney, Monmonier and others say there are plenty of better equal-area maps than the Peters. Edney recommends the Eckert IV projection, which preserves the continents’ proportionality and positions without sacrificing too much of their shape. Monmonier thinks anyone concerned with geographic fairness should be using demographic base maps, in which each country’s size is based on its population. Grim thinks the more, the merrier: “If they asked me, what I would advocate is that they have many maps, or several at least,” he says. He is working on an editorial to this effect, which he hopes to publish sometime soon.
maps  projection  mapping  cartography 
25 days ago
It’s the end of the world as we know it, and Norway’s gonna save the printed word » MobyLives
Norway, reports Huffington Post UK’s Oscar Williams, is already on it. In fact, they’ve been on it for a while. Nearly a decade ago, in 2008, the nation constructed, on the island of Spitsbergen about 800 miles from the North Pole, the Svalbard Global Seed Vault, a secure seed bank (the world’s largest) capable of surviving nuclear disaster.

And now, this year, Norway has completed its second “doomsday vault” in Svalbard — and this one’s a library. It’s called the World Arctic Archive. That’s right: to Norway’s mind, preserving humanity’s written and printed culture (one of the many things that the Trump administration, and plenty of other world leaders, are determined to undermine through both intimidation and budget cuts) is second in importance only to the ability to grow food (which is to say, to continue living).

The World Arctic Archive, nestled deep in the permafrost, utilizes a mass storage technique developed by a firm called Piql. Piql uses film to physically store information for up to an estimated 1,000 years. Their technology does not rely on digital storage, which is more vulnerable to hacking and wear. To this point, Piql’s Stefan Axelsson, an expert in computer security and Associate Professor at the Department of Informatics and Media Studies at NTNU in Gjøvik, notes that runes carved in stone by Vikings a millennium ago are still readable today. Comparatively, current standards of media storage are “highly volatile.”
preservation  archives  temperature  analog 
25 days ago
T Shanaathanan: A Cabinet of Resistance | Smriti Daniel
At the Kochi-Muziris Biennale, in Thamotharampillai Shanaathanan’s Cabinet of Resistance, there are 30 drawers. In each is a card bearing a sliver of history. They take the form of narratives, photographs and drawings that come from three decades of war, when the Sri Lankan state fought the militant separatist group Liberation Tigers of Tamil Eelam (LTTE). These histories of resistance chart how people negotiated the vagaries of war, including economic embargos, displacements, restricted travel and censored communications....

Shanaathanan staged it in the Jaffna Library, which had been burned down by an organised mob in 1981. The library used to have one of the largest collections of books in Asia, but at the time it was rebuilt, many shelves were yet to be filled. The artist co-opted these. Pieces of barbed wire, old ID cards, spent bullets—all were placed inside rows of clear plastic bottles which took the place of books in the library rooms. There were no other identifiers or explanations. Unable to separate their grief from that of their neighbour’s, people sometimes broke down weeping, confronted by their collective suffering....

A senior researcher for the Centre for Policy Alternatives, Hattotuwa is fascinated by how each drawer in the library cabinet opens up entire worlds. “By presenting the lives of others through this work, Shanaa is able to transport us, through index cards, into the banality of violence, in all its forms – beyond the enfilade of a battlefield, to the domestic; beyond the headline grabbing deaths, to loss so painful, it can only ever be told as parable; away from the communal to the deeply personal.”
library_art  sri_lanka  card_catalogue  violence 
26 days ago
Anna-Sophie Springer & Etienne Turpin | Necroaesthetics | 22.04.2016 - YouTube
This talk discusses cultural, colonial, economic, and environmental stories and histories to present a mediated history of the deanimated specimen of natural history. By focusing on the progressive scientific discovery of the birds of paradise, Anna-Sophie Springer and Etienne Turpin consider three specific moments: first, the groundbreaking role of British naturalist Alfred Russel Wallace's 1850s bird collection; second, Sir David Attenborough's first capture of birds of paradise on moving image in 1957 for his BBC nature documentary Zoo Quest; and, third, the high definition imagery produced in this current decade by nature photographer Tim Laman and ornithologist Ed Scholes in the context of the digitized database specimen. By moving among these examples, the presentation makes explicit the mediated history of deanimation, thus asking: can these collections of dead specimens and their various mediated specs be renegotiated through a materialist history of exhibition making? And can such an approach facilitate and embody aesthetic and political commitments at odds with the modernist project of colonial science?
colonialism  ornithology  birds  natural_history  epistemology 
26 days ago
carmelo battaglia's journey to the invisible cities of rome
designer carmelo battaglia presents his latest piece, a time-lapse video called ‘viaggio nelle città invisibili di roma’ (journey to the invisible cities of rome) where he takes us to abandoned places in rome that he brings back to life with a camera, natural light, and a very precise eye. battaglia’s immersive journey transports the viewer into the heart of forgotten roman realities like the half-built skeleton of santiago calatrava’s sport city, the gasometro, the abandoned flaminio stadium, and the never finished piazza dei navigatori....

carmelo battaglia chose the time-lapse technique for his photographic architectural project because it conveys the concept of time: by capturing frames in sequence, you can compress time, making it possible to observe and focus on the things that blur past in everyday life. it reveals the architectural elements in a different way as they are animated by the movement of light, which creates a dance of shadows and lends dynamism to their static nature.
media_city  film  palimpsest  temporality 
26 days ago
Clemson doctoral student produces rap album for dissertation; it goes viral | Clemson University News and Stories, South Carolina
The album, “Owning My Masters: The Rhetorics of Rhymes and Revolutions” uses hip-hop to explore such ideas as identity, justice, economics, citizenship and language. The songs have garnered tens of thousands of views on YouTube, more than 50,000 streams and downloads on SoundCloud and hundreds of thousands of hits on Facebook, all before Carson defends them as a whole to his doctoral committee Friday in the Watt Family Innovation Center auditorium.

Using a music album for a dissertation, as opposed to the usual written document, has never been done at Clemson before, but Carson says it was the only way he could do it.
dissertations  multimodal_scholarship  music 
29 days ago
Jane Jacobs and the Death and Life of American Planning
a swelling perception, especially among young scholars and practitioners, that planning is a diffuse and ineffective field, and that it has been largely unsuccessful over the last half century at its own game: bringing about more just, sustainable, healthful, efficient and beautiful cities and regions. It was there because of a looming sense that planners in America lack the agency or authority to turn idealism into reality, that planning has neither the prestige nor the street cred to effect real change.

To understand the roots of this sense of impotence requires us to dial back to the great cultural shift that occurred in planning beginning in the 1960s. The seeds of discontent sown then brought forth new and needed growth, which nonetheless choked out three vital aspects of the profession — its disciplinary identity, professional authority and visionary capacity....

It is well known that city planning in the United States evolved out of the landscape architectural profession during the late Olmsted era. Planning’s core expertise was then grounded and tangible, concerned chiefly with accommodating human needs and functions on the land, from the scale of the site to that of entire regions. One of the founders of the Chapel Hill program, F. Stuart Chapin, Jr. (whose first degree was in architecture), described planning as “a means for systematically anticipating and achieving adjustment in the physical environment of a city consistent with social and economic trends and sound principles of civic design.” 3 The goal was to create physical settings that would help bring about a more prosperous, efficient and equitable society. And in many ways the giants of prewar planning — Olmsted Jr., Burnham, Mumford, Stein and Wright, Nolen, and Gilmore D. Clarke — were successful in doing just that...

The postwar period was something else altogether. By then, middle-class Americans were buying cars and moving to the suburbs in record numbers. The urban core was being depopulated. Cities were losing their tax base, buildings were being abandoned, neighborhoods were falling victim to blight. Planners and civic leaders were increasingly desperate to save their cities. Help came soon enough from Uncle Sam. Passage of the 1949 Housing Act, with its infamous Title I proviso, made urban renewal a legitimate target for federal funding. Flush with cash, city redevelopment agencies commissioned urban planners to prepare slum-clearance master plans. Vibrant ethnic neighborhoods — including the one my mother grew up in near the Brooklyn Navy Yard — were blotted out by Voisinian superblocks or punched through with expressways meant to make downtown accessible to suburbanites. Postwar urban planners thus abetted some of the most egregious acts of urban vandalism in American history. Of course, they did not see it this way. Most believed, like Lewis Mumford, that America’s cities were suffering an urban cancer wholly untreatable by the “home remedies” Jane Jacobs was brewing and that the strong medicine of slum clearance was just what the doctor ordered....

Thus ensued the well-deserved backlash against superblock urbanism and the authoritarian, we-experts-know-best brand of planning that backed it. And the backlash came, of course, from a bespectacled young journalist named Jane Jacobs. Her 1961 The Death and Life of Great American Cities, much like the paperwork Luther nailed to the Schlosskirche Wittenberg four centuries earlier, sparked a reformation — this time within planning. To the rising generation of planners, coming of age in an era of cultural ferment and rebellion, Jacobs was a patron saint. ... But change did not come easily; the field was plunged into disarray. A glance at the July 1970 Journal of the American Institute of Planners reveals a profession gripped by a crisis of mission, purpose and relevance....

So thoroughly internalized was the Jacobs critique that planners could see only folly and failure in the work of their forebears. Burnham’s grand dictum “Make no little plans” went from a battle cry to an embarrassment in less than a decade. Even so revered a figure as Sir Ebenezer Howard was now a pariah. Jacobs herself described the good man — one of the great progressives of the late Victorian era — as a mere “court reporter,” a clueless amateur who yearned “to do the city in” with “powerful and city-destroying ideas.” 6 Indeed, to Jacobs, not just misguided American urban renewal but the entire enterprise of visionary, rational, centralized planning was suspect. She was as opposed to new towns as she was to slum clearance — anything that threatened the vitality of traditional urban forms was the enemy. It is largely forgotten that the popular United Kingdom edition of Death and Life was subtitled “The Failure of Town Planning.” How odd that such a conservative, even reactionary, stance would galvanize an entire generation.....

The Jacobsians sought fresh methods of making cities work — from the grassroots and the bottom up. The subaltern was exalted, the master laid low. Drafting tables were tossed for pickets and surveys and spreadsheets. Planners sought new alliances in academe, beyond architecture and design — in political science, law, economics, sociology. But there were problems. First, none of the social sciences were primarily concerned with the city; at best they could be only partial allies. Second, planning was not taken seriously by these fields. ...

This brings us to the first of the three legacies of the Jacobsian turn: It diminished the disciplinary identity of planning. While the expanded range of scholarship and practice in the post-urban renewal era diversified the field, that diversification came at the expense of an established expertise — strong, centralized physical planning — that had given the profession visibility and identity both within academia and among “place” professions such as architecture and landscape architecture. ...

The second legacy of the Jacobsian revolution is related to the first: Privileging the grassroots over plannerly authority and expertise meant a loss of professional agency. In rejecting the muscular interventionism of the Burnham-Moses sort, planners in the 1960s identified instead with the victims of urban renewal. New mechanisms were devised to empower ordinary citizens to guide the planning process. This was an extraordinary act of altruism on our part; I can think of no other profession that has done anything like it....

The fatal flaw of such populism is that no single group of citizens — mainstream or marginalized, affluent or impoverished — can be trusted to have the best interests of society or the environment in mind when they evaluate a proposal. The literature on grassroots planning tends to assume a citizenry of Gandhian humanists. In fact, most people are not motivated by altruism but by self-interest....

...the same community activism has at times devolved into NIMBYism, causing several infill projects to be halted and helping drive development to greenfield sites. (Cows are slow to organize.) It’s made the local homeless shelter homeless itself, almost ended a Habitat for Humanity complex in Chapel Hill, and generated opposition to a much-needed transit-oriented development in the county seat of Hillsborough (more on this in a moment). And for what it’s worth, the shrillest opposition came not from rednecks or Tea Party activists but from highly educated “creative class” progressives who effectively weaponized Jane Jacobs to oppose anything they perceived as threatening the status quo...

The third legacy of the Jacobsian turn is perhaps most troubling of all: the seeming paucity among American planners today of the speculative courage and vision that once distinguished this profession. ...

Most of what was embraced post-Jacobs must remain — our expertise on public policy and economics, on law and governance and international development, on planning process and community involvement, on hazard mitigation and environmental impact, on ending poverty and encouraging justice and equality. But all these should be subordinated to core competencies related to placemaking, infrastructure and the physical environment, built and natural. I am not suggesting that we simply toss in a few studio courses and call it a day. Planners should certainly be versed in key theories of landscape and urban design. But more than design skills are needed if planning is to become — as I feel it must — the charter discipline and conscience of the placemaking professions in coming decades....

in addition to being taught courses in economics and law and governance, students should be trained to be keen observers of the urban landscapes about them, to be able to decipher the riddles of architectural style and substance, to have a working knowledge of the historical development of places and patterns on the land. They should understand how the physical infrastructure of a city works — the mechanics of transportation and utility systems, sewerage and water supply. They should know the fundamentals of ecology and the natural systems of a place, be able to read a site and its landform and vegetation, know that a great spreading maple in the middle of a stand of pines once stood alone in an open pasture. They need to know the basics of impact analysis and be able to assess the implications of a proposed development on traffic, water quality and a city’s carbon footprint. And while they cannot master all of site engineering, they should be competent site analysts and — more important — be fluent in assessing the site plans of others.
urban_planning  jane_jacobs  urban_history  pedagogy 
29 days ago
Home | International Cloud Atlas
Since the International Cloud Atlas (ICA) was last updated (four decades ago in the case of Volume I; three in the case of Volume II) our understanding of some types of clouds and other meteorological meteors has advanced, and technology has fundamentally changed our world. We have witnessed the creation of the internet, email and mobile telephones with digital cameras. Yet the cloud atlas has only been available in print format.

Accurate and consistent cloud and weather observations remain critically important for weather, climate and hydrology, so ensuring that observations are globally standardized remains an important need. In the absence of on-line access to the ICA, alternative atlases began to appear on the web, and with them returned a threat to the global standardization of cloud classification, a key reason for the original development of the ICA in 1939.
clouds  meteorology  atlas 
4 weeks ago
Arizona State U library reorganization plan moves ahead
Many other universities are reorganizing their libraries as they see an increase in the use of electronic resources and demand for cafes, multimedia classrooms, maker spaces, writing centers and other spaces devoted to teaching, learning and research. ASU, which under Crow's leadership has relentlessly pursued an innovation agenda, joins their ranks to argue for the benefits of libraries at a time when federal funding is on the chopping block.
The university in October 2014 hired James J. O’Donnell, a classical scholar who previously served as provost at Georgetown University, to lead the university library through the reorganization process. In an interview, he said one of his priorities since taking the job has been to figure out what to do with the 4.5 million physical items in the library’s collections.
“It’s time to realize that all of our users are primarily online users of our collections,” O’Donnell said. Reorganizing a university library around that concept “means changing your service model, your staffing structure and organization, and bringing in a bunch of new people,” he said....

The university last year received a $50,000 grant from the Andrew W. Mellon Foundation to support that work. O’Donnell said he plans for the renovated library to highlight a “carefully chosen print collection.” Its special collections feature prominently in those plans, as they will be moved from their current location “hidden away on the fourth floor” to the main floor, he said.
“We want it to be a place that says libraries are important because libraries have the good stuff,” O’Donnell said. “Libraries have and manage access to the best-quality learning and research resources, and we have the wizards to help you find what you need. We can take you to lots and lots of places that the open internet just can’t plain take you, and we can show you how to get there.”

O’Donnell also said the library is considering a future in which it will feature smaller “thematic exhibits” with accompanying events on a rotating basis. One semester might be devoted to Italy; the next, sustainability....

The library is taking some cues from the retail world on how to design the rotating exhibits to invite visitors to attend and explore, O’Donnell said. The retail angle extends to how the library is talking about its operations. The library will store the rest of its collections in off-site shelving on its Polytechnic campus, some 20 miles away from the Hayden Library. But librarians don’t refer to the off-site shelving as “storage,” he said. Instead, they are being encouraged by Crow to see it as a “fulfillment center,” similar to those used by online retailers.
An informational website that the university set up to raise awareness about the library renovation completes the comparison to Amazon. It explains that books “will remain accessible to the ASU community through expedited delivery options similar to the Amazon Prime service.”...

Off-site storage has become a popular solution for university libraries looking to free up some space by removing stacks. The Georgia Institute of Technology, for example, is engaged in its own library renovation project that involves moving virtually all of its physical books to a facility it shares with nearby Emory University (but keeping some as a “visual cue,” administrators said last year).
Irene M. H. Herold, president of the Association of College & Research Libraries, said in an interview that the trend of using off-site storage is one example of how the university library profession is changing.
“Our focus is where it has been all along,” said Herold, university librarian at the University of Hawaii at Manoa. “We’re not just knowledge preservers and information-literacy, critical-thinking instructors. We’re also engaged in knowledge creation. It’s just that the knowledge that’s being created is able to be accessed and shaped and shared in such different ways than in the past.”...

O’Donnell expanded on his vision for the renovated building in an email. “I want a building that is a showplace (a sign of ASU's academic achievement) and a showcase (a place to make people aware of library treasures and resources and of the achievements of student and faculty partners) and a showroom (a place for users to go to find out about and road test and learn how to use information resources for best contribution to academic work and ambition).”
libraries  academic_libraries  collections  collection_management  offsite_storage  storage  pedagogy  architecture 
4 weeks ago
The DATA2GO.NYC mapping and data tool allows you to access federal, state and local data concerning the economic well-being of all of the city's neighborhoods. Using the tool you can view information in many different areas, such as educational attainment and average incomes in each New York neighborhood.

Using the DATA2GO.NYC interactive map you can view information on over 300 different indicators, in areas such as education, demographics and the economy. If you select an indicator from the map's drop-down menu you can see a choropleth of that data on the city map. If you select any of the neighborhoods on the map you can also see how it compares to all other New York neighborhoods for the selected indicator on an accompanying chart.
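As a rough sketch of the mechanics behind such a view (not DATA2GO.NYC's actual implementation, and with invented `incomes` values): a choropleth layer typically buckets an indicator's values into a handful of shade classes, here by quantiles, and the comparison chart amounts to ranking the selected neighborhood against all the others.

```python
import numpy as np

def choropleth_classes(values, n_classes=5):
    """Assign each area a shade class by quantile binning.

    values: one indicator value (e.g. median income) per neighborhood.
    Returns an integer class 0..n_classes-1 per area, the kind of
    bucketing a choropleth layer uses to pick each area's fill color.
    """
    cut_points = np.quantile(values, np.linspace(0, 1, n_classes + 1)[1:-1])
    return np.searchsorted(cut_points, values, side="right")

def percentile_rank(values, index):
    """Share of neighborhoods the selected one exceeds, as a percentage."""
    values = np.asarray(values)
    return 100.0 * (values < values[index]).mean()

# Invented indicator values for seven hypothetical neighborhoods.
incomes = [31000, 45000, 52000, 60000, 75000, 88000, 120000]
print(choropleth_classes(incomes))  # [0 0 1 2 3 4 4]: darker class = higher income
print(percentile_rank(incomes, 4))  # how the fifth area compares to the rest
```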
mapping  cartography  data_sets  new_york 
4 weeks ago
Wreck Park: Interview
I get impatient about materialism—or, materiality. I shouldn’t say “materialism,” because that’s actually an interesting Marxist thing. I mean, studying materiality. . . . It’s often actually displaced and transfigured connoisseurship. So, for me, your first few chapters instead do materiality in a theoretical way that’s robust. It’s not just about, “Well, I know my stuff because I spent my time looking at fiber optic cables.” You mobilize specialist knowledge in a very different way.


It’s really unfortunate. So many recent books on digital media base their authority on the fact that their author has actually touched the fiber optic cable or looked at the code, rather than on well-built arguments....

The utter banality of it is why I only wanted to take pictures of the outside rather than the inside; it’s because there’s no mystical secret that you can find by getting access to these buildings. You’re not going to read some code in the blinking lights that will somehow undermine algorithmic control....

Tracing the chain of consumption is popular partly because scholars like the gadgets—the batteries, plastic surfaces, and so on—that they play with. What often gets lost are the people not only behind the scenes but also not even considered legitimately part of the scene.

The most recent thing I’ve written, a thing I just finished earlier today, is about understanding so-called pirates, spammers, or people who write fraudulent messages as an integral part of the cloud’s system of work. They aren’t far off from content moderators or microlaborers who are treated as disposable, as essentially human spam.
my_work  infrastructure  tourism  infrastructural_tourism  materiality  internet  labor 
4 weeks ago
Hyperserfs - The Chronicle of Higher Education
When universities enter into agreements with corporations to sponsor research, the extent of their claim on the intellectual property is spelled out. But when it comes to curricular or resource-based arrangements like the Hyperloop contest, what the university stands to really gain is not so clear. And yet, strapped for funding and competing to stay relevant, many universities still eagerly support such high-profile projects in hopes of accruing cultural capital, raising their profile among peer institutions and prospective students, or striking it rich. As a result, universities have commodified not only student knowledge but learning and student life for profit.

In so doing, they are betraying their public promise, which is to advance science and technology for the benefit of everyone. We can appreciate the results of science or the technological breakthroughs brought about in the postwar university, and learning by doing is an essential pedagogical technique. University and student-powered research led to innovations such as oral contraceptives, the polio vaccine, the computer, and the internet — in a climate that fostered the pursuit of human knowledge, for the sake of all humans.

This climate has given way to one where, as Jacob Rooksby has argued, companies are in the classroom at the same time universities increasingly act as companies — in a climate that fosters the pursuit of intellectual property, patents, copyright, and branding.

In today’s environment, universities must do more to ensure that the public and social investment in STEM benefits the common good, rather than serving primarily private interests. We need more research-based public service in STEM fields, and it needs to be free not only of student debt but also of political and commercial restraints.
academia  sponsored_research  funding  neoliberalism 
4 weeks ago
Data61 revamps government data to make it more publicly accessible | ZDNet
The Commonwealth Scientific and Industrial Research Organisation's (CSIRO) Data61 is looking to make government data more accessible to all Australians, revamping the federal open data portal and working on the creation of government dashboards, to name just a few of the projects currently underway.

The Australian government initially launched the portal in 2010 as a tool for publishing open data across all jurisdictions of government in the country. In 2013 it evolved to offer application programming interface (API)-style access to datasets.

Since then, more data has been collected and more government departments, enterprises large and small, and the average Australian citizen have become eager to get their hands on the information.

Last year, the Department of the Prime Minister and Cabinet went to Data61 with a brief to build "world-leading" data infrastructure -- essentially a revamp of the portal.
open_data  dashboards  australia 
5 weeks ago
Artificial data give the same results as real data — without compromising privacy | MIT News
Although data scientists can gain great insights from large data sets — and can ultimately use these insights to tackle major challenges — accomplishing this is much easier said than done. Many such efforts are stymied from the outset, as privacy concerns make it difficult for scientists to access the data they would like to work with.
In a paper presented at the IEEE International Conference on Data Science and Advanced Analytics, members of the Data to AI Lab at the MIT Laboratory for Information and Decision Systems (LIDS) describe a machine learning system that automatically creates synthetic data, with the goal of enabling data science efforts that, for lack of access to real data, might otherwise never get off the ground. The paper is by Kalyan Veeramachaneni, principal research scientist in LIDS and the Institute for Data, Systems, and Society (IDSS), and co-authors Neha Patki and Roy Wedge. While authentic data can raise significant privacy concerns, the synthetic data is entirely distinct from that produced by real users, yet can still be used to develop and test data science algorithms and models.
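The core idea (fit a statistical model to the real data, then sample new rows from that model) can be sketched in a few lines. This is a deliberately minimal stand-in, not the authors' system, and the toy age/income columns are invented for illustration: a multivariate normal is fitted to the real numeric columns, and fresh rows are drawn from it that reproduce the columns' means and correlation without copying any actual record.

```python
import numpy as np

def synthesize(real, n_rows, seed=0):
    """Fit a multivariate normal to numeric data and sample fresh rows.

    The synthetic rows follow the real columns' means and covariance
    without duplicating any real record.
    """
    rng = np.random.default_rng(seed)
    mean = real.mean(axis=0)
    cov = np.cov(real, rowvar=False)
    return rng.multivariate_normal(mean, cov, size=n_rows)

# Invented "real" dataset: two correlated columns (age, income).
rng = np.random.default_rng(1)
age = rng.uniform(20, 70, 500)
income = 1000 * age + rng.normal(0, 5000, 500)
real = np.column_stack([age, income])

fake = synthesize(real, n_rows=500)
print(fake.shape)                                 # (500, 2)
print(np.corrcoef(fake[:, 0], fake[:, 1])[0, 1])  # correlation is preserved
```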
machine_learning  data_science  artificial_intelligence 
5 weeks ago
The Center for Land Use Interpretation
Underground Mines Turned Into Farms, Night Clubs, Data Centers, Physics Labs, and Paintball Fields

A former underground sandstone mine in Festus, Missouri, was turned into a roller rink and night club called Caveland, where musical acts including MC5 and Bob Seger once played (though not at the same time, unfortunately). It closed in 1985, and sat idle until 2003, when it was bought on eBay and turned into a private home. It is located down the street from Festus RV and Boat Storage (inside another mine, in the same bluff).

Some former mines have been turned into physics research facilities, like portions of the Soudan Iron Mine, in Minnesota, which has the MINOS neutrino detector installed deep in the mine, measuring neutrinos sent through the ground from Fermilab, outside Chicago, hundreds of miles away.

The Sanford Lab in the former Homestake mine, in Lead, South Dakota, is another particle physics lab inside a mine. The underground mine is huge, sometimes referred to as the largest mine in the western hemisphere. Over 125 years, more than 40 million ounces of gold were extracted from the mine, until it closed in 2001. There are 370 miles of tunnels, as deep as 8,000 feet down. With private and public funding, the mine has been turned into a facility for research on dark matter and neutrinos, whose faint signals are easier to detect deep underground, where the earth above filters out cosmic radiation.

Part of a decommissioned copper mine in White Pine, Michigan, has been converted by Prairie Plant Systems into an underground grow house and plant research facility called SubTerra. Doing this kind of research in an underground facility isolates it from the outdoor environment, limiting the possibility of contamination in either direction, which is especially important when working on genetic modifications....

Banking Bunkers and other Purpose-Built Underground Storage Spaces...

Off-site underground bank vaults built during the Cold War are not uncommon, though few are in use as records storage vaults now. There is another underground banking bunker in rural northern Connecticut, similar to the vault in Pepperell, though a bit larger, at 10,000 square feet. It was built in 1962 by the Underground Record Protection Cooperative Trust, a group of banks and insurance companies. Its primary function was to store records out of the way of nuclear attack (outside the city, and underground), but, like Pepperell, it could also house a few dozen people, presumably executives associated with the Trust, for a few weeks, and had decontamination showers, cots, and food rations. After its original purpose ended, sometime in the early 1990s, the bunker changed hands a few times, then fell into disuse. In 2013, it opened as a secure wine storage facility called Horse Ridge Cellars...

One of the most notorious banking bunkers is the former Federal Reserve facility at Mount Pony, near Culpeper, Virginia, a three-level underground vault built by the Federal Reserve and the Treasury Department in 1969 to house the hub of their nationwide communications network, and to store $241 billion in cash (including rows of palletized $2 bills), which would be used to jump start the economy following a nuclear attack. After being offered for sale in the early 1990s, the facility was purchased by philanthropist and film preservationist David W. Packard, and thoroughly redeveloped over many years into the National Audio Visual Conservation Center of the Library of Congress. Inside are millions of movies, TV episodes, and audio recordings on every conceivable recording format, including nitrate film, kept in one of the two underground film vaults there....

A considerable part of the history of document management and archiving centered around another film medium, microfilm, which was first developed by the banking industry in the 1920s to record and store documents more efficiently. Kodak bought these technologies and expanded them into a standard for archiving newspapers, printed on paper that degraded quickly. As printed information in all forms continued to test the storage limits of archives and libraries, microfilm became the standard master format for much of the printed material produced in the world. Library basements and off-site library storage areas are often full of microfilm, though it is going away—being digitized, like everything else.

One of the largest collections of microfilmed records is inside the Granite Mountain Records Vault, the principal storage facility for the genealogical research programs of the Mormon Church...

The corporation with the most underground space is—or was—AT&T. AT&T’s communications infrastructure of the 1950s to 1990s included hundreds of underground facilities, ranging from single story equipment vaults to multi-level underground system control centers. Many of these were hardened concrete structures to help vital communication equipment survive a nuclear attack. It also reduced the likelihood of damage from vandals, as many of these facilities were unmanned, and sometimes very remote...

One of the most common types of large underground telco vault can be found at repeater stations along the national microwave tower and coaxial cable networks. Starting in the 1990s, many of these facilities were sold off. ... For more than a decade, the Netcong Bunker was the principal network operations center for AT&T....

The only entity to rival AT&T in the development of underground space is its partner throughout the Cold War, the US Government itself. Major underground military command and control centers exist underneath the Pentagon; at Raven Rock, Pennsylvania; and at Cheyenne Mountain, Colorado. At Offutt Air Force Base in Nebraska, the underground command center for Stratcom is being upgraded with a new $1.2 billion HQ right now.

And then there is FEMA, the bunker masters, operating out of the underground motherearthship, Mount Weather, in Virginia, with somewhere around 600,000 square feet of subterranean space, and accommodations for hundreds of government officials, if not thousands, including the president. ...

thousands of backyard bomb shelters (and storm shelters too) built for family use, and numerous apocalyptic (or pragmatic, depending on your POV) group facilities, made by secretive survivalist preppers, and religious groups, like the Scientology bunker in Trementina, New Mexico (one of three operated by Scientology’s Church of Spiritual Technology), or the $25 million shelter complex at the Royal Teton Ranch, built for a few hundred members of the Church Universal and Triumphant, near Gardiner, Montana....

At the other end of the nuclear weapons underground spectrum from missile silos and Cold War bunkers is the other small matter of radioactive waste storage, which has resulted in two of the largest purpose-built underground facilities in the country.
underground  mining  media_archaeology  infrastructure  microfilm 
5 weeks ago
The New Version of Administrative Creep | Vitae
When it starts to happen, administrative creep typically isn't felt as an overly onerous task. Requests for faculty involvement in administrative tasks are framed as opportunities to volunteer. The faculty member is presented with an ultimatum that runs along the lines of "if you don't volunteer for X committee/task force/working group, it will be done by administrators without you." The subtext is that not being present will result in negative effects for faculty — as if what is in the best interest of administration is not in the best interest of faculty.

Unfortunately, our presence on those groups often legitimizes their administratively-focused decisions. Additionally, the more that we are brought into administrative decision-making, the more an administrative identity begins to shape who we are and what we do. We start to think like administrators — focused not on what we do best (teaching and research), but rather what administration requires.

Another example of administrative creep is the focus on filling classes — i.e., selling credits. Too often, we spend time discussing a strategy to "offer the classes that will fill," rather than a strategy to "offer the classes our students need to succeed." Administrators want faculty members to be nimble, to adjust our course offerings to meet students' demands. That means adding sections in high-demand courses, and cancelling sections of under-enrolled classes.

In that model, administrators view professors as parts in a machine, easily shifted. While that is a problem in and of itself, the more pernicious problem is the degree to which faculty members start discussing which courses to offer based on fill rates, and planning our hiring around the fill-rate paradigm....

Just Don't Do It. One of the most important and effective responses to administrative creep is for faculty to just say no. A significant percentage of the administrative tasks we are asked to perform are simply unnecessary. Here's how the game works: Administrators are hired. They require certain tasks of faculty (writing reports and assessments, for example) and are then responsible for evaluating the results. Those administrators have created a loop that justifies their positions but does nothing to strengthen teaching and learning for our students. Why not eliminate the very administrative positions that our universities have created to serve this circular logic? A step in that direction is to say no to administrative requests that support the loop. That doesn't mean that we should decline all such requests — just that we limit our involvement to the ones that reflect faculty interests in improving our curricula, teaching students, building our scholarship, and creating knowledge.
academia  administration  bureaucracy 
5 weeks ago
DHQ: Digital Humanities Quarterly: The .txtual Condition: Digital Humanities, Born-Digital Archives, and the Future Literary
In 1995 in the midst of the first widespread wave of digitization, the Modern Language Association issued a Statement on the Significance of Primary Records in order to assert the importance of retaining books and other physical artifacts even after they have been microfilmed or scanned for general consumption. "A primary record," the MLA told us then, "can appropriately be defined as a physical object produced or used at the particular past time that one is concerned with in a given instance" (27). Today, the conceit of a "primary record" can no longer be assumed to be coterminous with that of a "physical object." Electronic texts, files, feeds, and transmissions of all sorts are also now, indisputably, primary records. In the specific domain of the literary, a writer working today will not and cannot be studied in the future in the same way as writers of the past, because the basic material evidence of their authorial activity — manuscripts and drafts, working notes, correspondence, journals — is, like all textual production, increasingly migrating to the electronic realm. This essay therefore seeks to locate and triangulate the emergence of a .txtual condition — I am of course remediating Jerome McGann’s influential notion of a “textual condition” — amid our contemporary constructions of the "literary", along with the changing nature of literary archives, and lastly activities in the digital humanities as that enterprise is now construed. In particular, I will use the example of the Maryland Institute for Technology in the Humanities (MITH) at the University of Maryland as a means of illustrating the kinds of resources and expertise a working digital humanities center can bring to the table when confronted with the range of materials that archives and manuscript repositories will increasingly be receiving.
archives  materiality 
5 weeks ago
"Smart Cities" Are Too Smart for Your Privacy | Center for Internet and Society
all providers of broadband Internet access are required to build in wiretap capabilities under the Communications Assistance for Law Enforcement Act (CALEA), 47 U.S.C. §§ 1001 et seq.  There is no exception for municipalities providing such services.  That being the case, one wonders exactly how federal, state and local authorities will use the back door in a municipal network. How will municipalities execute wiretaps or obtain customer information from their own municipal provider?  Will the municipal provider be transparent with its customers and provide notice of such requests where lawful much as other commercial providers do today?  A smart city would work that out in advance of deploying services.

For another thing, cities are not subject to suit under the federal wiretap law for wrongfully intercepting and disclosing communications between citizens according to the Court of Appeals for the Seventh Circuit. See Seitz v. City of Elgin, 719 F.3d 654 (7th Cir. 2013). To explain, even though the relief section of the Wiretap Act says that a claim may be brought against a “person or entity” that violates the Act, the substantive prohibitions in section 2511 of Title 18 apply only to a “person.”  According to Seitz, Congress did not include cities in the definition of “person.”  The Sixth Circuit reached a different conclusion a decade earlier, so at best, there is an apparent split in the circuits, but the Seventh Circuit probably has it right. See Adams v. City of Battle Creek, 250 F.3d 980 (6th Cir. 2001). Legislative arcana aside, it is a serious issue for municipal providers, raising concerns ranging from E&O insurance to qualified privileges and immunity to employee misconduct to contingent liability and litigation risks.

The answers for smart cities may be to provide a dumb pipe, but don’t bet on that future, especially because providing such services will raise revenue for cities from subscription fees to advertising and analytics dollars.  Some cities may choose to outsource all of the platform or network operations of a smart city, and that also will raise a host of privacy questions about ownership and use of data.  
smart_cities  surveillance  privacy 
5 weeks ago
(2) All You Can Do with Catalogs: Accessing, Organizing, Disseminating Local and Global Knowledge (15th-19th Centuries) | Paola Molino, Dagmar Riedel, Guy Burak, Martina Siebert, and Seth Kimmel -
It all began with a serendipitous crossing of the paths of four scholars working on the transmission of knowledge and the history of science in European, Middle Eastern, and East Asian societies. We all share an abiding interest in the composition of finding aids between 1400 and 1800, when the transformation of feudal societies into territorial states prompted the ruling elites to invest in the construction of imperial libraries and archives, whose design projected transregional connections and supranational ambitions to the world at large. Although new cataloging principles emerged for the collections housed within these new physical spaces, their compilers did not explicitly break with the already recognized knowledge traditions, attempting rather to harmonize the established authoritative epistemes into new classificatory regimes. The finding aids of early modern societies are fascinating objects in their own right: As artifacts they are primarily paper tools and, yet, their written contents can also be understood as a graphic representation of ideas....

Paola Molino, Islam Dayeh, and Martina Siebert investigated how the construction of libraries and the design of their research facilities developed in conjunction with the organization of finding aids. Particular attention was given to the technical terminology of classification schemes with regard to the various purposes of bibliographical information, and to the appreciation of finding aids as intellectual achievements in their own right. In the discussion, we explored the possibility of a methodology for the study of finding aids as sources for a transregional history of knowledge. What is the impact of ideology on classification schemes?...

Christian Jacob discussed lists and catalogs from a meta-perspective, exploring their morphology and their uses. He stressed that every form of knowledge needs a set of practices and a space, namely a materiality that allows for its representation. He furthermore posited that the power of catalogs cannot be understood without examining their relationship to geographical maps. Seth Kimmel pushed this thesis further by investigating how bibliography and cartography were intertwined in the cataloging project of Hernando Colón (1488–1539), a cartographer, explorer, and bibliophile who had established a private universal library in Seville. In contrast, Cevolini focused on a mechanical indexing device for the storage of written notes and excerpts, known as the "ark of studies" and designed by Thomas Harrison (1595–1649). Cevolini interpreted the ark as an external memory device which illustrated how new cognitive habits were accompanied by new organizational strategies....

Since readers increasingly rely on global online catalogs in order to access books as digital surrogates, what will happen to the relationship between a library's spatial organization and the systematics of its catalogs? Richard took as his starting point the cataloging practices in Muslim societies since the tenth and eleventh centuries. Although there is much evidence for vibrant library traditions in Turkey, Iran, and India, very few catalogs of historical library collections have come down to us. Richard observed that the librarian's personal responsibility for a collection under his care might have worked as a disincentive for the compilation of publicly available finding aids, since a catalogue can also be used to control the work of the librarian....
archives  libraries  cataloguing  finding_aids  classification 
5 weeks ago
The Dark History of HathiTrust
This research explores the ways values, power, and politics shape and are shaped by digital infrastructure development through an in-depth study of HathiTrust’s “dark history,” the period of years leading up to its public launch. This research identifies and traces the emerging and iterative ways that values were surfaced and negotiated, decision-making approaches were strategically modified, and relationships were strengthened, reconfigured, and sometimes abandoned through the process of generating a viable, robust and sustainable collaborative digital infrastructure. Through this history, we gain deeper understandings and appreciations of the various and sometimes surprising ways that values, power, and politics are implicated in digital infrastructure development. Shedding light on this history enables us to better contextualize and understand the affordances, limitations, and challenges of the HathiTrust we know today, better envision its range of possible futures, and develop richer appreciations for digital infrastructure development more broadly.
archives  books  libraries  google  hathi_trust  infrastructures  labor  copyright 
5 weeks ago
Mondothèque: A Radiated Book / Un livre irradiant / Een irradiërend boek (2016) [EN, FR, NL] — Monoskop Log
“This Radiated Book started three years ago with an e-mail from the Mundaneum archive center in Mons, Belgium. It announced that Elio di Rupo, then prime minister of Belgium, was about to sign a collaboration agreement between the archive center and Google. The newsletter cited an article in the French newspaper Le Monde that coined the Mundaneum as ‘Google on paper’. It was our first encounter with many variations on the same theme.

The former mining area around Mons is also where Google has installed its largest datacenter in Europe, a result of negotiations by the same Di Rupo. Due to the re-branding of Paul Otlet as ‘founding father of the Internet’, Otlet’s oeuvre finally started to receive international attention. Local politicians wanting to transform the industrial heartland into a home for The Internet Age seized the moment and made the Mundaneum a central node in their campaigns. Google — grateful for discovering its posthumous francophone roots — sent chief evangelist Vint Cerf to the Mundaneum. Meanwhile, the archive center allowed the company to publish hundreds of documents on the website of Google Cultural Institute.

While the visual resemblance between a row of index drawers and a server park might not be a coincidence, it is something else to conflate the type of universalist knowledge project imagined by Paul Otlet and Henri Lafontaine with the enterprise of the search giant. The statement ‘Google on paper’ acted as a provocation, evoking other cases in other places where geographically situated histories are turned into advertising slogans, and cultural infrastructures pushed into the hands of global corporations.

An international band of artists, archivists and activists set out to unravel the many layers of this mesh. The direct comparison between the historical Mundaneum project and the mission of Alphabet Inc speaks of manipulative simplification on multiple levels, but to de-tangle its implications was easier said than done. Some of us were drawn in by misrepresentations of the oeuvre of Otlet himself, others felt the need to give an account of its Brussels’ roots, to re-insert the work of maintenance and caretaking into the his/story of founding fathers, or joined out of concern with the future of cultural institutions and libraries in digital times.” (from the Introduction)
google  otlet  archives 
6 weeks ago
The open source city as the transnational democratic future | Transnational Institute
In City of fears, City of hope (2003), Zygmunt Bauman talks about two important concepts related to the modern city: mixophobia (the fear used by institutions to discourage the use of the public space) and mixophilia (human and cultural mixing in cities). His main conclusion, however, is that nation-states are in decline and cities are our era’s principal political space....

The mutation of the global city into the global street is a desirable political agenda for the planet. The global street (a space both physical and semantic) and the rebel cities (as a combative remixing of the right to the city) have become narrative expressions of the global “outside”. Indeed, some of the most important social uprisings in recent times, such as the Gezi Park revolt in Turkey, the Movimento Passe Livre (MPL) in Brazil and the Gamonal protest in Burgos (Spain), have had the urban space as their initial cause. The city is also the setting for the continuation of many revolts: in Augusta Park in São Paulo, Can Batlló in Barcelona or the community-managed Embros Theatre in Athens, among many others....

These revolts have also allowed for constructing new models of participation and governance. During the Acampada Sol camp-out by Spain’s 15M in Madrid, which lasted for several weeks in May and June 2011, an online tool called Propongo was developed to allow anyone to make policy proposals. Although these policy proposals did not necessarily translate into policy changes, the online tool, whose source code was later used by the government of Rio Grande do Sul in Brazil, revealed society’s longing for participatory democracy. ...

A society’s operating system would therefore be a series of common practices and human relationships, not just a set of online platforms. ...

to a large extent about promoting voluntary work by citizens in order to justify the disappearance of the welfare state. To avoid reinforcing this, city autonomies and citizen self-management and collaboration have a crucial role to play as an incentive for mutual complementarity between public administration and citizens....

As well as using free technology, any city council that wishes to build an open source city will therefore have to recognise and protect existing citizen practices (as well as foster new ones) that reproduce the commons and strengthen that new, post-capitalist mode of “production” whether they are community centres, self-managed spaces, gardening networks or peer-to-peer file sharing networks....

The participatory repertoire of the Barcelona en Comú political confluence, which is currently governing the city of Barcelona, is seen as one of the models to be replicated. “Its radical democracy draws on a set of tools, techniques, mechanisms and structures to develop municipal policies from the bottom up. These include assemblies at various levels (neighbourhood, thematic, coordination, logistics, media, communication etc) and online platforms (for communicating, voting, working).”...

The fact that different cities are sharing the code for their digital platforms breaks with the smart city’s logic of proprietary technology and the paradigm of branded cities competing with each other. What has now been baptised as Spanish “intermunicipalism” seeks to create a network of “rebel cities for the common good” which share repositories, tools, digital platforms and methodologies.
cities  protest  public_sphere  public_space  smart_cities 
6 weeks ago
Will Democracy Survive Big Data and Artificial Intelligence? - Scientific American
Today, Singapore is seen as a perfect example of a data-controlled society. What started as a program to protect its citizens from terrorism has ended up influencing economic and immigration policy, the property market and school curricula. China is taking a similar route. Recently, Baidu, the Chinese equivalent of Google, invited the military to take part in the China Brain Project. It involves running so-called deep learning algorithms over the search engine data collected about its users. Beyond this, a kind of social control is also planned. According to recent reports, every Chinese citizen will receive a so-called “Citizen Score”, which will determine under what conditions they may get loans, jobs, or travel visas to other countries....

The new, caring government is not only interested in what we do, but also wants to make sure that we do the things that it considers to be right. The magic phrase is "big nudging", which is the combination of big data with nudging. To many, this appears to be a sort of digital scepter that allows one to govern the masses efficiently, without having to involve citizens in democratic processes. Could this overcome vested interests and optimize the course of the world?..

In a rapidly changing world a super-intelligence can never make perfect decisions (see Fig. 1): systemic complexity is increasing faster than data volumes, which are growing faster than the ability to process them, and data transfer rates are limited. This results in disregarding local knowledge and facts, which are important to reach good solutions. Distributed, local control methods are often superior to centralized approaches, especially in complex systems whose behaviors are highly variable, hardly predictable and not capable of real-time optimization. This is already true for traffic control in cities, but even more so for the social and economic systems of our highly networked, globalized world.

Furthermore, there is a danger that the manipulation of decisions by powerful algorithms undermines the basis of "collective intelligence," which can flexibly adapt to the challenges of our complex world....

By allowing the pursuit of various different goals, a pluralistic society is better able to cope with the range of unexpected challenges to come.
Centralized, top-down control is a solution of the past, which is only suitable for systems of low complexity. Therefore, federal systems and majority decisions are the solutions of the present. With economic and cultural evolution, social complexity will continue to rise. Therefore, the solution for the future is collective intelligence. This means that citizen science, crowdsourcing and online discussion platforms are eminently important new approaches to making more knowledge, ideas and resources available....

We are at the historic moment where we have to decide on the right path—a path that allows us all to benefit from the digital revolution. Therefore, we urge adherence to the following fundamental principles:
1. to increasingly decentralize the function of information systems;
2. to support informational self-determination and participation;
3. to improve transparency in order to achieve greater trust;
4. to reduce the distortion and pollution of information;
5. to enable user-controlled information filters;
6. to support social and economic diversity;
7. to improve interoperability and collaborative opportunities;
8. to create digital assistants and coordination tools;
9. to support collective intelligence, and
10. to promote responsible behavior of citizens in the digital world through digital literacy and enlightenment....

Several types of institutions should be considered. Most importantly, society must be decentralized, following the principle of subsidiarity. Three dimensions matter.
Spatial decentralization consists in vibrant federalism. The provinces, regions and communes must be given sufficient autonomy. To a large extent, they must be able to set their own tax rates and govern their own public expenditure.
Functional decentralization according to area of public expenditure (for example education, health, environment, water provision, traffic, culture etc) is also desirable. This concept has been developed through the proposal of FOCJ, or “Functional, Overlapping and Competing Jurisdictions”.
Political decentralization relating to the division of power between the executive (government), legislative (parliament) and the courts. Public media and academia should be additional pillars.
smart_cities  surveillance  big_data  china  democracy  governance  nudge  digital_literacy 
6 weeks ago
Most of the time, innovators don’t move fast and break things | Aeon Essays
The history of technology is too important to be left to the technologists. Relying on PayPal’s founders Elon Musk or Peter Thiel to tell us how that history goes is like turning to Bill Clinton or Newt Gingrich to tell the political history of the 1990s. Books such as Walter Isaacson’s The Innovators (2014) or Steven Johnson’s How We Got to Now (2015) give us accounts of lone genius men toiling in industrial labs and Bay Area garages. This view of innovation – narrow and shallow – casts a long shadow, one that obscures the broad and deep currents that actually drive technological innovation and shape its impact on society....

Over the past two centuries, almost all professional scientists and engineers have worked not to cut down the old trees of technologies and knowledge and grow new ones, but to nurture and prune the existing ones. In corporate-based science and technology, disruption is very rare, continuity rules, and makes change and advance possible. At different times in history, such disruption was even discouraged. At the great industrial labs of the early 20th century, companies such as General Electric (GE) or AT&T didn’t want their engineers and scientists to create excessive technological novelty – tens of millions of company dollars had been invested to build existing technological systems. Instead, research managers such as Willis R Whitney, head of GE’s research, sought incremental improvements that would marginally advance the company’s technologies and extend its intellectual property regime.....

As political artefacts, standards embody certain ideologies. For the internet, it is an aspiration towards openness – open systems, open access, open source. In the US, this ideology has deep historical roots. Some ideas inherent in this openness can be traced from the civil liberties driving resistance towards England’s Stamp Act in the mid-18th century to 20th-century ideals of open societies as alternatives to fascist and communist regimes. The philosopher Langdon Winner argued in 1980 that artefacts have politics, beliefs and assumptions about the world and society that are embedded and written into their very fabric.

As a result, technical standards – the very ‘things’ that allow my laptop and your iPhone to seamlessly (more or less) connect to networks as we move about the planet – require the International Organization for Standardization (ISO), as well as recognition and cooperation from state agencies such as the US Federal Communications Commission or the International Telecommunication Union. Techno-libertarians might claim ‘I made it’, but the reality is that, without international standards, whatever they made wouldn’t work very well.
innovation  methodology  great_man_theory  research  standards 
6 weeks ago
Nothing Tweetable: A Conversation or How to “Librarian” at the End of Times – In the Library with the Lead Pipe
There’s that Kurt Vonnegut quote from A Man Without a Country, “So the America I loved still exists, if not in the White House or the Supreme Court or the Senate or the House of Representatives or the media. The America I love still exists at the front desks of our public libraries” (Vonnegut, 2005, 103).
6 weeks ago
The Avery Review | Hudson Yards: A Sustainable Micropolis
The island of Manhattan is slowly tilting toward Hudson Yards. When completed, the project will have added 17 million square feet of residential and commercial space, 14 acres of public open space, 100 shops and restaurants, a cultural space, luxury hotel, and public school. We’ve grown familiar with the ingredients that go into making this kind of urban development soup: sustainable design, infrastructure upgrades, and a mixture of retail, commercial, and residential space. ...

One thing that is surely being sustained at Hudson Yards is the public funding of private ventures. For anyone doubting the level of engagement between the state and business, consider that about $3 billion in taxpayer money was poured into infrastructure improvements targeted toward Hudson Yards in order to entice investment....

Within this whirlwind of development, the Metropolitan Transportation Authority (MTA), owner of the twenty-six-acre Hudson Yards site, is, for the first time, using real estate to secure its debt, raising over $1 billion in bonds backed by expected returns from lease agreements with Related and Oxford....

Every mayor in New York since the 1970s attempted to jump-start development in Hudson Yards. Most recently, in January of 2005, the land between Twenty-Eighth and Forty-Second Streets and Eighth and Twelfth Avenues was rezoned from industrial to mixed-use residential and commercial (MTA’s railyards run from Thirtieth to Thirty-Third Streets and Tenth to Twelfth Avenues). The extension of the No. 7 line was proposed in the mid-2000s alongside plans to build a new Jets Stadium over the railyards as part of a strategy to secure the 2012 Summer Olympics. At the time, the MTA did not have the funds available to move ahead with the extension, and following the defeat of the new stadium proposal the MTA solicited bids for a mixed-use development from developers....

The management of the environment cannot be thought of in isolation from land use questions and, in turn, planning. Hudson Yards therefore situates itself squarely in the middle of the set of questions circling around the role of the state, and, by extension, of planning in public-private partnerships. ...

Hudson Yards is America’s largest private real estate development. Once completed, Related Companies, the major stakeholder in the project, estimates it will add 2.5 percent to New York City’s domestic product. Hudson Yards is estimated to cost the developers $20 billion. And Related’s Stephen Ross is securing commercial tenants at project cost—the first of whom were Coach and L’Oréal—in order to finance the retail and residential portions of the project, where he aims to make his profit....

Related was able to raise $600 million for the initial phase of the Hudson Yards development through EB-5 and is aiming to raise another $600 million. In a $20 billion project these amounts don’t add up to much but do lead to a pertinent question: How much of the $20 billion was amassed through federal programs meant to distribute economic well-being equitably rather than concentrating it in a commercial enterprise zone? In a political landscape that asks the citizen to step aside to give room to the developer to exercise his or her “private risk,” it is worth asking just how much of that risk is, indeed, private....

What, then, is public space in Hudson Yards? It is as powerful an agent as the buildings themselves are. If we shift our focus from how the individual parcels and buildings are zoned to how this development connects to its global context, we see the emergence of what Keller Easterling defines as “infrastructure space,” which, while material, is hidden and silent. ...

Hudson Yards is dizzying in its list of interconnected technologies, a system rather than a set of buildings. Shannon Mattern offers a thoughtful account of the data-driven infrastructure that the Hudson Yards development promotes as sustainable design. “While such systems are environmentally ‘smart’—they eliminate noisy, polluting garbage trucks; minimize landfill waste, and reduce offensive smells—they also cultivate an out-of-sight, out-of-mind public consciousness,” Mattern explains. Our definitions of sustainability tend to act like putty, stretching and squeezing to fill the holes in our ecological thinking so that we can get to a holistic picture of what it means to live responsibly on our planet. The aesthetics of sustainability emerge from the urgency to suppress, cover up, and ultimately control our environment....

Thanks to “smart” technology, residents and tenants will be connected to an energy-monitoring system, curbing consumption inefficiencies. Thad Sheely, senior vice president of operations at Related Companies, admits that this may be something we will have to wait for. Hudson Yards is laying down the infrastructural foundation in order to collect energy data, even though they’re not quite sure what to do with it all yet. He explains, “There’s something here, but Google and Apple haven’t figured it out yet, either. But this is a road we have to be on, but no one knows where it goes … our focus has been to get the hardware right, so that we have the bandwidth, the connectivity, two-way communication, and the ability to upload and download and collect data.”

The project is so far ahead of itself; it hasn’t yet caught up to its promise, though this kind of data collection is fundamental in the design of Hudson Yards. The infrastructure of energy use monitoring is not simply integrated into the project but is an active agent in shaping and designing the buildings, open spaces, and even natures that comprise this development. With the thumbprint swipe of a screen one can reflect on one’s participation in Hudson Yards, coded through the metrics of energy use. Hudson Yards is as much space as it is interface, with all of the potential impacts (foreseeable and unforeseeable) that this entails....

What these sustainable aesthetics and rhetorics of efficiency occlude is the messy and vital public that is excluded from a publicly funded private development—the many, many New Yorkers that real estate projects like Hudson Yards do not sustain. The aesthetic of sustainable design is an easier sell, and easier signifier, than the more complex and invisible changes to development that could otherwise move toward environmental and social equity. ...

As a set of technological and rational ideas meant to move a city forward, modernism persists. The city continues to be managed as a machine even when it is likened to an ecology, holding the promise of an explicit spatial order that needs to be restored in order for urban health to be maintained. Concerns surrounding equity, sustainability, and the environment, entering mainstream development discourse relatively recently, are subsumed within the perception of the city as economic engine of growth.
As the Hudson Yards project underscores, the process through which the image of sustainable development is upheld is not neutral. Rather than focusing on the city as an artifact of an urbanization process, evaluating the process of urbanization itself requires pulling in the economic, political, environmental, and social movements that shape urban design and development.
smart_cities  hudson_yards  sustainability  funding 
6 weeks ago
The Neural Network Zoo - The Asimov Institute
With new neural network architectures popping up every now and then, it’s hard to keep track of them all. Knowing all the abbreviations being thrown around (DCIGN, BiLSTM, DCGAN, anyone?) can be a bit overwhelming at first.

So I decided to compose a cheat sheet containing many of those architectures. Most of these are neural networks, some are completely different beasts. Though all of these architectures are presented as novel and unique, when I drew the node structures… their underlying relations started to make more sense.
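Many of the "underlying relations" between those architectures come down to how a single cell is wired. As a minimal sketch (not from the article, and using invented weights), compare a plain feed-forward layer with a recurrent cell: the only structural change is that the recurrent cell takes its own previous output back in as an extra input.

```python
import numpy as np

def feedforward(x, W, b):
    # Dense layer: the output depends only on the current input x.
    return np.tanh(W @ x + b)

def recurrent_step(x, h_prev, W_x, W_h, b):
    # Recurrent cell: the previous hidden state h_prev is wired back in.
    # That feedback loop is the structural difference an RNN diagram shows.
    return np.tanh(W_x @ x + W_h @ h_prev + b)

rng = np.random.default_rng(0)
x = rng.normal(size=3)                     # a single 3-dimensional input
W, b = rng.normal(size=(2, 3)), np.zeros(2)
y = feedforward(x, W, b)

h = np.zeros(2)                            # hidden state starts at zero
W_x, W_h = rng.normal(size=(2, 3)), rng.normal(size=(2, 2))
for _ in range(4):                         # unroll the cell over a short sequence
    h = recurrent_step(x, h, W_x, W_h, b)

print(y.shape, h.shape)                    # both (2,): same cell size, different wiring
```

The same move, swapping one connection pattern for another, generates many of the zoo's other entries (convolutions share weights across positions, autoencoders mirror the layer sizes, and so on).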
networks  neural_nets  machine_learning  artificial_intelligence 
6 weeks ago
Activists Rush to Save Government Science Data — If They Can Find It - The New York Times
As thousands of academics, librarians, coders and science-minded citizens have gathered at what are called “data rescue” events in recent weeks — there were at least six this past weekend alone — the enormousness of the task of extracting even the easily found government data has become apparent, as has the difficulty of tracking down the rest.

Some open-data activists refer to it as “dark data” — and they are not talking about classified information or data the government might release only if compelled by a Freedom of Information Act request.

“It’s like dark matter; we know it must be there but we don’t know where to find it to verify,” said Maxwell Ogden, the director of Code for Science and Society, a nonprofit that began a government-data archiving project in collaboration with the research libraries in the University of California system.

“If they’re going to delete something, how will we even know it’s deleted if we didn’t know it was there?” he asked.

The obstacles have spurred debate among open-data activists over how to build an archiving system for the government’s science data that ensures that the public does not lose access to it, regardless of who is in power.

“No one would advocate for a system where the government stores all scientific data and we just trust them to give it to us,” said Laurie Allen, a digital librarian at the University of Pennsylvania who helped found Data Refuge. “We didn’t used to have that system, yet that is the system we have landed with.”

At the moment, the closest thing to a central repository is Data.gov, which, under a 2013 Obama administration directive, is supposed to link to all of the public databases within the government. But it relies on agencies to self-report, and the total size of all the data linked to by the directory, Mr. Ogden recently found, comes to just 40 terabytes — about as much as would fit on $1,000 worth of hard drives.

NASA alone provides access to more than 17.5 petabytes of archived data, according to its website (a petabyte is 1,000 times bigger than a terabyte), over dozens of different data portal systems.

And one-third of the links on Data.gov, Mr. Ogden found, take users to a website rather than the actual data, which makes it hard to devise software that can automatically copy it...

Even databases that are listed on Data.gov — and there are more than two million, according to Mr. Ogden’s published logs — often sit behind an interface designed for ease of use but built with proprietary code almost impossible to reproduce...
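Part of what makes automated harvesting hard is the triage step: telling a directly downloadable data file apart from an HTML landing page in a catalog record. A minimal sketch of that check, assuming CKAN-style metadata of the kind open-data catalogs expose (the field names `resources`, `format`, and `url` are standard CKAN, but the sample record and its URLs are invented):

```python
# Formats a bulk archiver can fetch and verify directly.
DATA_FORMATS = {"csv", "json", "xml", "zip", "geojson", "netcdf"}

def harvestable(resources):
    """Split a CKAN-style resource list into data files vs. web pages."""
    data, pages = [], []
    for r in resources:
        fmt = r.get("format", "").strip().lower()
        (data if fmt in DATA_FORMATS else pages).append(r["url"])
    return data, pages

# Invented sample mimicking the 'resources' field of a CKAN dataset record.
sample = [
    {"url": "https://example.gov/air-quality.csv", "format": "CSV"},
    {"url": "https://example.gov/viewer", "format": "HTML"},
]
data, pages = harvestable(sample)
print(data)   # ['https://example.gov/air-quality.csv']
print(pages)  # ['https://example.gov/viewer']
```

Records that land in the `pages` bucket are the ones that defeat automatic copying: the data sits behind a proprietary interface rather than at a stable file URL.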

Andrew Bergman, a graduate student in applied physics at Harvard, along with two physics department colleagues, suspended his studies to help found the Environmental Data and Governance Initiative, which has also helped to organize the events....

The transition to digital distribution that made government documents more accessible, librarians say, has also left them more at risk. Without physical copies in libraries, the internet’s promise of making government information more widely available has made it far more centralized.

Except when certain data is the subject of a lawsuit or multiple F.O.I.A. requests, it remains unclear what compels an agency to keep it online.

“Destroying federal records is a crime,” said Patrice McDermott, who heads a public advocacy organization called Open the Government. “Taking them off of the internet does not have the same penalty.”

In a recent letter to the federal Office of Management and Budget, Ms. McDermott’s group cited a clause in the 1995 Paperwork Reduction Act that requires agencies to “provide adequate notice when initiating, substantially modifying, or terminating significant information dissemination products.”

But what that means for the age of big data has not been defined.
data  archives  digital_archives  deleting  big_data 
6 weeks ago
Building Technology Heritage Library : Free Texts : Download & Streaming : Internet Archive
The Building Technology Heritage Library (BTHL) is primarily a collection of American and Canadian, pre-1964 architectural trade catalogs, house plan books and technical building guides. Trade catalogs are an important primary source to document past design and construction practices. These materials can aid in the preservation and conservation of older structures as well as other research goals.
About the Building Technology Heritage Library

The BTHL contains materials from various private and institutional collections. These materials are rarely available in most architectural and professional libraries. The first major architectural trade catalog collection is that of the Canadian Centre for Architecture, which encompasses more than 4,000 catalogs from the early 19th century through 1963. In addition to the architectural trade catalogs, the initial contributions include a large number of house plan catalogs, which will be of great interest to owners of older homes. The future growth of the Building Technology Heritage Library will also include contemporary materials on building conservation.
archives  architecture  furniture 
6 weeks ago
How to Escape Your Political Bubble for a Clearer View - The New York Times
The filter bubble describes the tendency of social networks like Facebook and Twitter to lock users into personalized feedback loops, each with its own news sources, cultural touchstones and political inclinations. We seem to like these places, and so do social media companies — they keep us clicking from one self-affirmation to another. But now our bubbles are being blamed for leading us toward the most divisive presidency in recent memory, and suddenly, the bubble doesn’t feel so inviting anymore. So media and tech companies are pivoting, selling us a whole suite of offerings aimed at bursting the bubbles they helped to create.

Few people get a kick out of acknowledging their own biases, so new digital features are easing the way with candy-colored visuals and interactive quizzes. Download the Chrome extension PolitEcho and watch as it crawls through your Facebook network and visualizes its political bias based on how many of your friends “like” pages dedicated to Breitbart, Marco Rubio, Bernie Sanders or NPR. Then hop over to the PBS website and take a quiz, conceived by the libertarian Charles Murray, that rates your affiliation with “mainstream American culture.” (Rack up real-American points for having evangelical Christian friends, eating at IHOP and watching “Dr. Phil.”)....

Other tech products invite us to reach out and understand other people without the hassle of actually talking to them. FlipFeed, a Twitter plug-in created by M.I.T. researchers, provides a voyeuristic thrill: Click a button, and your regular Twitter feed is replaced by that of a random, anonymous user of a different political persuasion. (It’s perfect for seeing how the other half live-tweets a Trump news conference.) And the iPhone app Read Across the Aisle gamifies political outreach — as you read articles from The Huffington Post or The Federalist through the app, you’ll see a meter turn red or blue based on the particular site’s ideological bent....

The real ingenuity of these solutions lies in stripping opposing ideas of their negative emotional impact. It’s not too hard to find people who disagree with you online — just create a Twitter account, state an opinion and watch the haters roll up — but the heated social media climate provides a tense, abstracted version of human connection that often leaves both sides more polarized. To alleviate the tension, BuzzFeed is testing a new feature, “Outside Your Bubble,” which pulls in opinions from across the web and gives them a neutral platform. A curator takes the often-emotional comments, removes them from their combative context and rephrases them as cogent, dispassionate bullet points...

Meanwhile, a new crop of online media offerings comes equipped with like-minded guides who travel to the other side and present their findings. Every week, the Washington journalist Will Sommer publishes a kicky newsletter digest, “Right Richter,” which aggregates right-wing perspectives for left-leaning audiences. Slate’s “Today in Conservative Media” feature provides a similar service. And Crooked Media, a political podcast network created by former Obama staffers, just debuted a new show, “With Friends Like These,” in which the liberal journalist Ana Marie Cox shepherds listeners through conversations with conservative guests....

A cynical impulse lies behind many of these kumbaya vibes. The same social media networks that helped build the bubbles are now being framed as the solution, with just a few surface tweaks. On the internet, the “echo chambers” of old media — the ’90s buzzword for partisan talk radio shows and political paperbacks — have been amplified and automated. We no longer need to channel-surf to Fox News or MSNBC; unseen algorithms on Facebook learn to satisfy our existing preferences, so it doesn’t feel like we’re choosing an ideological filter at all....

as Mr. Trump rose, Facebook found itself assailed by critics blaming it for eroding the social fabric and contributing to the downfall of democracy. Facebook gave people what they wanted, they said, but not what they needed. So now it talks of building the “social infrastructure” for a “civically-engaged community.” Mr. Zuckerberg quoted Abraham Lincoln as inspiration for Facebook’s next phase.

The agitators and audiences for these new fixes have an ulterior motive for expanding their horizons, too. Recent calls to burst the filter bubble have come largely from liberals and #NeverTrump conservatives alarmed by their election losses. Their bipartisan spirit has partisan roots. President Trump’s critics feel the practical need to break down these ideological cocoons, so they can win next time.
epistemology  information_literacy  bubbles 
6 weeks ago
Kameelah Rasheed: Who Will Survive in America? – Guernica
When I was in middle school and high school, the thing that was most frustrating about how we learned history was the assumption that it charted along a linear path and that there were no interruptions, no moments when we were able to stop and think about why decisions were made. We were just told the decisions were made, and assigned to write a paper. I wanted to figure out, “What are those places in history where things don’t make sense and everyone has to pause because nobody has the answers?”

I’ve been reading a lot of Susan Howe. She’s a poet in her own right, but she also does a lot of work on Emily Dickinson and this idea of a glitch or a disobedient history—those points in history when things stop or stutter. I’m really interested in this idea of the stutter in history. As an educator, I used a lot of primary sources with my students in order to say, “Yes, this thing happened, but let’s analyze how it happened, and why it happened.”

In my own practice, I like thinking about those stutters, again, but also about footnotes—those parts of history that are so minute that they don’t end up in history books, but are still worth exploration. Maybe they focus on a particular neighborhood or a particular personal life experience. I spend a lot of time researching and thinking about the macro-narratives that exist around specific moments in history.... I believe that how people engage with my practice is a pedagogical experience. I see my work as an opportunity to do the kind of historical thinking in a public space that I wasn’t given the opportunity to do as a student and that I tried to give to my students when I was a teacher....

I reject the way that we have imagined the making of the archive as an administrative, objective, almost sterile process. I feel like archives are very dirty, very messy, really. Archiving is a subjective process; it’s a process that I hope engages and is relational and is not about someone sitting alone in an office. I have around four thousand found images of black families, and for me, making that archive is no different than creating an installation, because in both circumstances I’m collecting; I’m accumulating and I’m also trying to establish relationships between the things that I’m collecting. So in this archive of four thousand found images of black families, I’m making decisions....

I think that archiving is an art. To be able to organize things in a way that makes sense to others, in a way that’s inviting, in a way that tells a story, is an artistic process. In its best form, archiving is about storytelling. No one collects or creates an archive just for the purpose of having it. It’s about wanting to tell a story, wanting that story to be available to people in the future, and wanting that story to be interrupted by people who have other materials to contribute in the future.

I’m not coming to this practice of archiving as someone who has studied it in school; I’m learning both about the ways that it’s done professionally and about the ways that black families have done it for centuries, just to hold onto things. I’m trying to figure out what’s the best middle ground between the institutional questions and the ways that grandmothers and aunts put stuff in plastic bags underneath their beds, or organized photo albums, or sewed things into socks. There are all of these different ways that black folks have been archiving for centuries because we’ve been very much aware of the possibility of someone saying that we never existed. I’m interested in validating the institutional forms of archiving as well as the very home-grown forms of archiving, which obviously deserve credit because, for the most part, what we know about our own history has not come from institutions doing this work; it’s come from us holding onto things....

There is a reality that we were never meant to survive in this American context as anything more than slaves. Why do you need to archive a slave? Why do you need to archive property other than on a bill of payment or a bill of sale? We’ve always had to be responsible for ourselves. Institutions like the Schomburg [a branch of the NYPL which specializes in African American life and history] have the stated purpose of doing this and other spaces have been created, like the new museum [of African American history and culture] that just opened up in DC. They exist as repositories for a lot of private collections. The reality is that for a long time we had to do it on our own....

We are at a point in history when visibility and inclusion are often conflated with radical change. If I hire a black person to be a screenwriter on my show, then radical change has occurred; if a person from a marginalized community who doesn’t traditionally get to be in the spotlight gets their fifteen minutes then, woah!, radical change has occurred. I’m really interested in interrogating visibility as a concession, as a premature celebration, because visibility in and of itself without the rigor of analyzing why certain people were invisible to begin with is limited. It is much more productive to think about how individuals can become not the first and only but the first of many.

I’m really focused on distinguishing between the optics of diversity and the actual structural impact of diversity. Everyone wants to hire a black person at their job, everyone wants to have a party around diversity, but are they really willing to do the work and make the sacrifices to get their organization (or our nation) to make structural change? That doesn’t come from cherry-picking people who will be hoisted up as markers of inclusion....

a lot of the radical work done in movements prior to our generation was not necessarily done through hyper-visibility. People covertly published things, and covertly educated people, and covertly got training. So I’m interested in how we can think about, not so much hiding, but strategic opaqueness—refusing to be legible.

There is also a persistent notion that in order to become palatable we need to package our history in a particular way and generalize it, and make it easy to understand. In doing so, though, we lose all of the nuance. I think it’s OK to be illegible; I think it’s OK to allow people to be confused because that’s a productive moment to incite people to do the real rigorous work of learning about history. We don’t always have to be visible in ways that are comfortable for other people because there is a sacrifice in that quick reading....

Then there’s Aria Dean, who wrote an essay that I keep referencing in almost every conversation I have with people, called “Poor Meme, Rich Meme,” in Real Life magazine, where she talks about black illegibility and the fact that for black people, on our own, there is no necessity for coherence. We can be as varied as we want to be. It is only when people need to read us and make sense of us that we consolidate ourselves, and in doing that we lose the variance in who we are. I’m trying to figure out the best ways to make us understandable without risking homogeneity or histories with easy narrative arcs.
historiography  archives  informal_archives  community_archives  diversity  visibility 
6 weeks ago
A Geology of Media | Public Seminar
Once you start digging beyond the idea that media is about interpreting signs, there’s no end to how deep the rabbit hole can become. Behind the system of signs is the interface that formalizes them (Manovich) or simulates them (Galloway). Behind that is the information turbulence the interface manages (Terranova), the hardware it runs on (Chun) and the stack of levels that processes it (Bratton). All of which incorporates the labor that operates it (Berardi) or is enslaved by it (Lazzarato) and which is incorporated within integrated circuits (Haraway). The class of workers who make the content might be doubled by a class of hackers who make the form (Wark).

The rabbit hole keeps going, becoming more of a mineshaft. For some the chemical and mineral dimension is also a big part of what appears when one looks behind the sign (Negarestani, Leslie, Kahn), which brings us to Jussi Parikka’s A Geology of Media (U. Minnesota Press, 2015). Which tunnels down into the bowels of the earth itself. Parikka: “Geology of media deals with the weird intersections of earth materials and entangled times.” (137)

In this perspective, “Computers are a crystallization of past two hundred to three hundred years of scientific and technological development, geological insights, and geophysical affordances.” (137) But one could also reverse this perspective. From the point of view of the rocks themselves, computers are a working out of the potentials of a vast array of elements and compounds that took billions of years to make but only decades to mine and commodify – and discard. History is a process in which collective human labor transforms nature into a second nature to inhabit. On top of which it then builds what I call a third nature made of information, which not only reshapes the social world of second nature, but which instrumentalizes and transforms what it perceives as a primary nature in the process. There’s no information to circulate without a physics and a chemistry.  “The microchipped world burns in intensity like millions of tiny suns.” (138)...

We’re used to thinking about a geopolitics of oil, but perhaps there’s a more elaborate Great Game going on these days based on access to these sometimes rare elements. Reza Negarestani’s Cyclonopedia is an extraordinary text which reverses the perspective, and imagines oil as a kind of sentient, subterranean agent of history. One could expand that imaginary to other elements and compounds. For instance, one could imagine aluminum as an agent in the story of Italian Fascism. Since bauxite was common in Italy but iron was rare, aluminum rather than steel became a kind of ‘national metal’, with both practical and lyrical properties. The futurist poet Marinetti even published a book on aluminum pages. What aluminum was to twentieth century struggles over second nature, maybe lithium will be to twenty-first century struggles over third nature....

It might make sense, then, to connect the study of media to a speculative inquiry into geology, the leading discipline of planetary inquiry. (A connection I approached in a different way in Molecular Red, by looking at climate science.) Parikka: “Geology becomes a way to investigate the materiality of the technological media world.” (4) James Hutton’s Theory of the Earth (1778) proposed an image of the temporality of the earth as one of cycles and variations, erosion and deposition. Hutton also proposed an earth driven by subterranean heat. His earth is an engine, modeled on the steam engines of his time. It’s a useful image in that it sees the world outside of historical time. But rather than having its own temporality, Hutton saw it as oscillating around the constants of universal laws. This metaphysic inspired Adam Smith. Hence, while usefully different from and deeper than historical time, Hutton’s geology is still a product of the labor and social organization of its era.

Still, thinking from the point of view of the earth and of geological time is a useful way of getting some distance on seemingly fleeting temporalities of Silicon Valley and the surface effects of information in the mediated sphere of third nature. It also cuts across obsolete assumptions of a separate sphere of the social outside of the natural....

Parikka: “Media work on the level of circuits, hardware, and voltage differences, which the engineers as much as the military intelligence and secret agencies gradually recognized before the humanities did.” (3)...

But besides the intriguing spatial substitution, bringing the depths of geology into view, Parikka is also interested in changing temporal perspectives. German media theorist Wolfgang Ernst has written of media as a temporal machine, paying close attention to the shift from narrative to calculative memory. Also of interest is Siegfried Zielinski’s project of a media studies of deep time. Zielinski was trying to escape the teleological approach to media, where the present appears as a progressive development and realization of past potentials. He explores instead the twists and cul-de-sacs of the media archive. Parikka takes this temporal figure and vastly expands it toward non-human times, past and present....

A manifesto-like text in Mute Magazine once proposed we move on from psychogeography to a psychogeophysics. Drawing on the rogue surrealist Roger Caillois, the new materialism of Rosi Braidotti and Timothy Morton’s studies of hyperobjects, Parikka develops psychogeophysics as a low theory approach to experimentally perceiving the continuities of medianatures. “Perhaps the way to question these is not through a conceptual metaphysical discussion and essays but through excursions, walks, experiments, and assays? … Instead of a metaphysical essay on the nonhuman, take a walk outside…” (63)...

Here we might learn more from natural scientists trying to reach into the humanities than from philosophers trying to reach into the natural sciences. Parikka usefully draws on Stephen Jay Gould’s model of evolutionary time as a punctuated equilibrium, as a succession of more or less stable states in variation alternating with moments of more rapid change. There’s no sense of progress in this version of deep time, no necessary evolution from lower to higher, from simple to complex....

However, one limit to Parikka’s project is suggested by this very figure of the fossil, particularly if we think of what Quentin Meillassoux calls the arche-fossil. How is it possible to have a knowledge of a rock that existed before humans existed? How can there be knowledge of an object that existed in the world before there could be a correlative subject of knowledge? I’m not sure Parikka’s double articulation of media and geology really addresses this proposition.
chemistry  geology  media  deep_time 
6 weeks ago
Wendy's Subway / About
Wendy’s Subway is a non-profit library and writing space located in Bushwick, Brooklyn. It provides an open, versatile space where cultural production flourishes through reading, research, and collaborative practice, and is manifested in performance, publication, and education. Wendy’s Subway hosts a range of public programs, including readings and screenings, interdisciplinary talks and lectures, discussion and reading groups, and writing workshops. The non-circulating library holds a collection of books and documents with a special focus on poetry, art, theory, and philosophy, as well as the Laurin Raiken Archive, an extensive resource for the study of art history and criticism. Wendy’s Subway is operated by its membership of poets, curators, novelists, artists, and critics with an interest in hybrid forms and cross-disciplinary discourse.
libraries  library_art 
6 weeks ago
The Rise of the Modern Kitchen | Architect Magazine | Products, Kitchen, Interior Design, Cabinets
Kitchen renovation is one of the largest markets in the remodeling industry. In this month’s exploration of the BTHL, we trace the evolution of the residential kitchen from the simple cupboard of the early 20th century to the unified cabinetry and coordinated finishes of today. Our story begins with the Hoosier cabinet, a free-standing cupboard that incorporated storage space and a working countertop. Its name derives from the marketing reach of the Hoosier Manufacturing Co. of Indiana. At the time, a typical kitchen had a series of separate cabinets and appliances. By the mid-1920s, several manufacturers of kitchen cabinets were marketing cabinets and other millwork that could be joined to create a more unified appearance.

The major breakthrough leading to the kitchen of today occurred in the 1930s with the introduction of modular kitchen cabinets and continuous countertops. That era also corresponded to design changes and innovations within the Modern movement in materials, appliances, and plumbing fixtures. The period was a truly remarkable decade of residential transformation and the kitchen was the place where many Americans got their first chance to express their Modern design sensibilities.
archives  furniture  cabinets  storage  architecture 
7 weeks ago