Best (and Worst) Practices for Designing Learning Spaces | Library Babel Fish
A new Project Information Literacy report by the ever-curious researcher, Alison Head, has just been published, the first in a new “practitioner’s series.” Planning and Designing Academic Library Learning Spaces involved interviewing 49 librarians, architects, and consultants involved in 22 library construction projects that were completed between 2011 and 2016. The research probes how these three parties negotiate their values and incorporate them into designs, what kinds of learning these new and renovated spaces are meant to support, and what best practices (and worst practices) might inform libraries embarking on a renovation.
It’s interesting to read this report against Scott Bennett’s influential study from 2003, Libraries Designed for Learning. He interviewed library directors about recent building projects and argued that we need to do a better job of considering “the library in the life of the user” rather than “the user in the life of the library,” that we should focus planning on student learning rather than on library functions. This study suggests we’ve made some progress. The most important goal for these renovations was creating flexible user-defined spaces for collaborative and individual learning. None of them focused on the needs of the library staff.
media_architecture  libraries  academic_libraries  planning 
2 days ago
How to survive your first year as department chair (essay)
Before you think about quitting, let’s first think about what you would need to be successful in this new role. Let me suggest a few strategies for moving forward.
Develop a new mentor map. Whenever we transition from one role to another (in your case from faculty member to administrator) it’s a whole new ball game. There are new rules (written and unwritten), you need a different set of skills to be successful, and you need a whole new set of mentors, sponsors, support and community. In other words, it’s perfectly normal for you to have a wide range of needs during your transition, including chair-specific professional development, emotional support, a positive community of peers, accountability, role models, sponsors, access to opportunities and substantive feedback on your performance....

Seek external training. It sounds as if your campus provided a basic nuts-and-bolts overview of your new role, but that hasn’t been sufficient to meet your professional development needs. You may want to consider external professional development opportunities that are geared toward department chairs, such as ACE’s Leadership Academy for Department Chairs. Also, many disciplines have a “chairs conference” or training at their annual meetings that provide excellent information and networking. If you’re unable to attend an external conference, also consider online training opportunities for skill development...

Create a mastermind. I would also be remiss if I didn’t share my favorite strategy for transitioning into a new role: creating a mastermind group. It’s a simple, free and powerful tool that will provide you with emotional support, community and accountability. All you have to do is find a small group of positive peers (department chairs, either on or off the campus) who are willing to commit to meeting weekly for the purpose of tackling challenges and supporting one another in solving problems.
Here’s how it works:
You invite a group of peers (three or four other department chairs, working in a similar type of institution).
The group picks a weekly one-hour meeting time that every member agrees to hold sacred in their calendar (no canceling and no interruptions).
You meet at the designated time (face-to-face, by conference call or Google Hangout).
Every person gets an equivalent amount of time to present a challenge and let the group brainstorm on how to resolve it. Start the meeting with a quick round of wins and end with everyone clearly stating their concrete step forward.
academia  administration  chair 
2 days ago
Theory & Craft of Digital Preservation: My Next Book | Trevor Owens
Interdisciplinary dialog about digital preservation often breaks down when an individual begins to protest “but that’s not preservation.” Preservation means a lot of different things in different contexts. Each of those contexts has a history. Those histories are tied up in the changing nature of the mediums and objects for which each conception of preservation and conservation was developed. All too often, discussions of digital preservation start by contrasting digital media to analog media. This contrast forces a series of false dichotomies. Understanding a bit about the divergent lineages of preservation helps to establish the range of competing notions at play in defining what is and isn’t preservation.

Building on work in media archeology, this chapter establishes that digital media and digital information should not be understood as a rupture with an analog past. Instead, digital media should be understood as part of a continual process of remediation embedded in the development of a range of new mediums which afford distinct communication and preservation potential. Understanding these contexts and meanings of preservation establishes a vocabulary to articulate what aspects of an object must persist into the future for a given preservation intent.
preservation  digital_preservation 
3 days ago
bulletproof dress — how to write a personal statement for graduate...
The personal statement is a slightly misleading title for this document. It is not primarily about you holistically in the way your college personal statement was. It serves ONE MAJOR PURPOSE: to demonstrate to a department that you understand how to formulate and pursue a research question, and that there is a good fit between your question and the department.
Your personal statement (for the humanities and social sciences) should follow roughly this outline.
advising  UMS 
3 days ago
Necsus | The audiovisual essay as performative research
All three of the audiovisual essays I have selected and discussed work in a number of ways as standalone forms of research expression. They convey meanings that can be summarised in writing; indeed, some of these are partly conveyed in writing in the films, as I have noted. But their self-contained performative acts (to rework Derrida) do not merely come back ‘to a constative or descriptive discourse’; they perform, they accomplish, they do what they say they do.[35] Yes, they can communicate with, take up their place in, and make a direct and original research contribution to bodies of work that do not take audiovisual forms. In this respect, they are not only multimodal artefacts but also transmedia ones. Yes, their accompanying written statements, or exegeses, certainly help to situate them, and possibly make them more legible, in these wider research threads and traditions. But these separate sets of words do not, indeed cannot replace or stand in for a key part of the ‘original knowledge’ that the audiovisual essays themselves generate, because the latter is performative, an integral part of ‘the force and effect of a creative production’
multimodal_scholarship  video  video_essays  performance 
3 days ago
The Internet Archive is building a Canadian copy to protect itself from Trump - The Verge
The Internet Archive, a digital library nonprofit that preserves billions of webpages for the historical record, is building a backup archive in Canada after the election of Donald Trump. Today, it began collecting donations for the Internet Archive of Canada, intended to create a copy of the archive outside the United States....

Kahle estimates it will cost “millions” of dollars to host a copy of the Internet Archive in Canada, but it would shield its data from some American legal action.

The future of privacy and surveillance under the Trump administration remains unpredictable, but the president-elect has shown support for greater law enforcement surveillance powers and legal censorship, including “closing that internet up in some ways” to fight terrorism. “Somebody will say, 'Oh freedom of speech, freedom of speech.' These are foolish people,” ...

Kahle notes that moving the Internet Archive would both insulate it from efforts to take down specific content, and make it harder to request data on user activity — something that more traditional librarians fought when American surveillance powers expanded under George W. Bush. And whatever happens, a Canadian copy would create more redundancy for data that can be seemingly ubiquitous but deceptively fragile. “The history of libraries is one of loss,” writes Kahle. “The Library of Alexandria is best known for its disappearance.”
libraries  digital_preservation  LOCKSS 
4 days ago
Tactical Poetics: FloodNet's Virtual Sit-ins | Rhizome
But Electronic Disturbance Theater had little interest in playing the role of a shadowy underground resistance. Rather, they simply wanted to show that there were thousands of users watching, ready to act in solidarity with the Zapatistas in whatever ways possible. Further, they eschewed the notion of the necessity for a hacker skill set in order to be an effective online activist. They harnessed a mundane function of the internet in a way that your average user could replicate, even creating a do-it-yourself “Disturbance Developers Kit” for widespread public use.

FloodNet’s mundane engagement of the internet’s infrastructure makes it all the more interesting today. EDT used the native language of the internet to create gaps in its functionings. Brett Stalbaum notes that part of EDT’s thinking was an awareness that “there is an important distinction between representation and engagement.”8 Rather than amplifying a representation of the struggles of the Zapatistas—by way of, say, an awareness campaign in wealthy western countries—the group engaged the Mexican government with direct action.

The question of representation versus engagement when it comes to online activism has re-emerged as a central problem in recent years. The accelerated rise of social media platforms has caused a radical shift. Activists can disseminate their messages far and wide on platforms like Twitter and Facebook. On these platforms, communities have been extended beyond their geographical limitations. One can organize with greater ease and reach. There is always the possibility and aim of going viral. But most of these, like other popular forms of online activism, belong to the realm of representation and do little for engagement. The internet has become a fantastic place to make things visible, but an increasingly difficult place to cause a disturbance.

With FloodNet, EDT showed that symbolic and direct action could be combined. The project had a direct effect on the infrastructure of its target, but it was framed as a symbolic intervention, the inscription of an error message onto a remote server. For Dominguez, the affective and the effective potential of the project could not be separated from one another:

The question of aesthetics, at least for us, creates a disturbance in the “Law” to the degree that it cannot easily contain the “break” and it is forced to enter into another conversation—a conversation that power-as-enforcement may not want to have.

Spreading a hashtag is one way to use the infrastructure built for us by Silicon Valley, but while it has many merits, it is too reliant on and easily contained by the intended functions of the social media system. Where are the gaps? The weak points? How can we create the break that cannot be contained, and force the conversation that power-as-enforcement does not want to have? FloodNet sets a precedent for online art and activism that feeds off of the network’s own indeterminacy and an awareness that the virtual is closer to the real than we imagine.
civil_disobedience  hacking  infrastructure 
5 days ago
By looking at imagery metadata, interviews, meeting minutes of space treaties, and technological failures, Uncharted is an ongoing research inquiry into the rhetoric of the contemporary globe—and by what mechanisms it came to be.
cartography  satellites  mapping 
6 days ago
Sidewalk Labs Spinoff Could Be Coming to Your City – Next City
A year after Google-backed smart city tech company Sidewalk Labs launched, the company plans to expand its reach with a network of themed labs. The labs will work with cities to churn out products focused on issues such as citizen engagement, internet connectivity, transportation and access to public space.

Dan Doctoroff, CEO of Sidewalk Labs, explained the move in a blog post Wednesday, saying that the labs will be led by entrepreneurs and consist of “hyper-focused, cross-disciplinary teams of policy experts, engineers, product managers, and designers  —  a full range of urbanists and technologists.”

The first eight or nine could be up and running within six months to a year, with more to follow. The first four have already been named: Build Lab will tackle housing affordability by exploring ways to construct more affordable and flexible buildings; Care Lab will look at health challenges faced by low-income residents; Manage Lab will focus on how cities can use data to relieve budget pressures and improve efficiency; and Model Lab will explore tools to make transportation more affordable and sustainable.

“The aim is to keep these labs open: engaging the public, sharing what we’ve learned, and refining our ideas,” Doctoroff wrote.

The idea is also that these entities will spin off as separate companies, creating a network of tiny Sidewalk Labs across the U.S. They’ll also partner with other organizations or city agencies, similar to Sidewalk’s current partnerships with U.S. DOT and Transportation for America.

Doctoroff elaborated on the creation of a “living laboratory district” too. He wrote that there isn’t a “single city today that can stand as a model for our urban future,” and that such a district with “real-world conditions” could be the best way to try out solutions.

It could take the form of a similar “living laboratory” effort recently started in Spokane, Washington. Six partners, including the city and Washington State University, are using the 770-acre University District as a “blank canvas” to test out smart city tech. The effort is driven by the city’s new smart cities lab, and aims to answer “some of the thornier questions around smart cities collaborations,” such as who governs the partnership and who owns data gathered by a private company for the city.
sidewalk_labs  smart_cities  urban_planning 
6 days ago
Google Might Hold a Contest to Make One City the Most Advanced in the World |
Now, Sidewalk Labs is shedding more light on those plans. In a blog post yesterday, CEO Dan Doctoroff confirmed that the company is scouting locations for the city of the future, and might hold a competition to determine the eventual location.

Doctoroff writes that the project would serve to "explore coordinated solutions, showcase innovations, and establish models for others to follow." The city of the future, in Sidewalk Labs' view, would offer free high-speed Wi-Fi for all and would include automated trash systems, sustainable energy, and self-driving cars.

When applied on a citywide scale, Doctoroff says, these advancements could reduce greenhouse emissions by two-thirds and save the average resident an hour of time each day, due largely to transportation improvements.

Many of the urban improvements Sidewalk Labs envisions stem from autonomous driving. The company predicts this innovation will reduce the need for on-site storage, since people will be able to cheaply order goods on demand, which means they need less living space and pay cheaper rents. And eliminating parking spaces would mean more outdoor open areas. "It would put everyone within a short walk of a park," Doctoroff writes.
sidewalk_labs  urban_planning  smart_cities 
6 days ago
Panels | The Sonic Boom: Sustaining Sound Studies
The Sonic Boom Conference is designed to be interdisciplinary in scope, echoing the manner in which Sound Studies links together and enriches disparate fields of learning and practice. Our panel sessions draw together distinguished scholars and practitioners from a wide variety of fields — history and music, film studies and neuroscience, otolaryngology and video game design. We have organized these speakers into three focused panels, in the hopes of fostering new dialogues on three core topics: sound in culture and society, sound science, and sound practices.
sound_studies  sound_design  sound_space  archives  preservation  listening  neuroscience 
7 days ago
China’s New Tool for Social Control: A Credit Rating for Everything - WSJ
Hangzhou’s local government is piloting a “social credit” system the Communist Party has said it wants to roll out nationwide by 2020, a digital reboot of the methods of social control the regime uses to avert threats to its legitimacy.

More than three dozen local governments across China are beginning to compile digital records of social and financial behavior to rate creditworthiness. A person can incur black marks for infractions such as fare cheating, jaywalking and violating family-planning rules. The effort echoes the dang’an, a system of dossiers the Communist Party keeps on urban workers’ behavior.

In time, Beijing expects to draw on bigger, combined data pools, including a person’s internet activity, according to interviews with some architects of the system and a review of government documents. Algorithms would use a range of data to calculate a citizen’s rating, which would then be used to determine all manner of activities, such as who gets loans, or faster treatment at government offices or access to luxury hotels.
smart_cities  social_engineering  big_data  infrastructure  surveillance  quantified_self  censorship 
7 days ago
Alternative Art School Fair | Pioneer Works
The Alternative Art School Fair presents an introduction to alternative art schools from around the US and the world.

Art education is a reflection of social and cultural evolution; it engages with structures of meaning-making and considers different frameworks for experience. The impetus to create an alternative art school is rooted not only in a desire to create “better” art, but to create the conditions for greater freedom of expression. Alternative art schools are often run as free, artist-run initiatives, and their values and visions vary widely in methodology, mission and governance. But even when they are relatively small in scale they provide vital models of cultural critique and experimentation.
alternative_school  pedagogy  teaching 
9 days ago
How to Be Perfect by Ron Padgett | Poetry Foundation
Get some sleep.

Don't give advice.

Take care of your teeth and gums.

Don't be afraid of anything beyond your control. Don't be afraid, for
instance, that the building will collapse as you sleep, or that someone
you love will suddenly drop dead.

Eat an orange every morning.

Be friendly. It will help make you happy.

Raise your pulse rate to 120 beats per minute for 20 straight minutes
four or five times a week doing anything you enjoy.

Hope for everything. Expect nothing.

Take care of things close to home first. Straighten up your room
before you save the world. Then save the world.
poetry  smart  liberal_arts  UMS 
9 days ago
Chirologia, or The Natural Language of the Hand (1644) | The Public Domain Review
John Bulwer (1606 – 1656), an English doctor and philosopher, attempted to record the vocabulary contained in hand gestures and bodily motions and, in 1644, published Chirologia, or the Naturall Language of the Hand alongside a companion text Chironomia, or the Art of Manual Rhetoric, an illustrated collection of hand and finger gestures that were intended for an orator to memorise and perform whilst speaking.

For Bulwer, gesture was the only form of speech that was inherently natural to mankind, and he saw it as a language with expressions as definable as written words. He describes some recognisable hand gestures, such as stretching out hands as an expression of entreaty or wringing them to convey grief, alongside more unusual movements, including pretending to wash your hands as a way to protest innocence, and to clasp the right fist in the left palm as a way to insult your opponent during an argument. Although Bulwer’s theory has its roots in classical civilisation, from the works of Aristotle, he was inspired by hundreds of different works, including biblical verses, medical texts, histories, poems and orations, in order to demonstrate his conclusions.

The language of gesture proved a popular subject in the age of eloquence, and inspired many similar works. Bulwer’s work was primarily meant for the pulpit, but also had applications for the stage. Although we do not know if these hand gestures were ever used by public speakers as they were intended, there is some evidence of the book’s impact on popular culture. Laurence Sterne’s novel Tristram Shandy (completed in 1767) features characters who clasp their hands together in the heat of argument, one who dramatically holds his left index finger between his right thumb and forefinger to signal a dispute, and another who folds his hands as a gesture of idleness.
gesture  touch  index  catalogue  hands 
9 days ago
The Case for a New Kind of Core - The Chronicle of Higher Education
Selectivity is an obvious challenge in this exercise. I have arbitrarily limited my core curriculum to eight one-semester courses, which would amount to no more than half of an undergraduate education, so it would not eliminate the ability to have a major or to choose elective courses. Here goes:

Information Acquisition

Google is a life-changing tool that we all use, but it doesn’t overlap perfectly with one of the core methodological skills of college students, which is locating usable information. To do that well, one has to know something about the sociology of knowledge — that is, who creates information, under what conditions, subject to what distorting pressures. It is pretty easy to cure students of the idea that everything they encounter online, or elsewhere, is true; a more challenging and important task is communicating a basic typology of information (academic, documentary, journalistic, governmental, crowdsourced, and so on) along with the idea that information isn’t cleanly divided into true and false, but is instead created through constant contention and revision. Some of the purpose of this course would be to give students a basic user’s guide to higher-education study: how to use libraries and online databases, how to distinguish among a multiplicity of sources, especially online, and how to perform a basic literature review. The kind of assignments that might go with this course would ask students to write a basic summary of what’s known about a subject, or to adjudicate between two widespread conflicting claims.

Cause and Effect

This is something like a course in the basics of the scientific method, aimed at people who aren’t necessarily going into science. The core thinking process entails stating what question you’re trying to answer, then establishing a hypothesis as to what the answer might be, then finding a way to test the hypothesis by gathering material that would settle its degree of trustworthiness. The title of the course refers to the idea that causation is a key concept in almost all fields of inquiry, which is too often used sloppily or instinctively, with unfortunate results. One could teach this course using primarily scientific examples, but that isn’t strictly necessary; for years, I have been teaching a version of it to journalists, using news stories as the main material. What might explain, for example, why violent crime has decreased so much more in New York City than in Chicago? What’s important is conveying the idea that making inferences is a skill, and that a series of thinking techniques is powerfully helpful in performing it.


The focus here is on close reading of texts, a fundamental academic skill that students may have missed in high school, that they will need to succeed in college, and that will also prove to be both practically helpful and emotionally enriching as they go through life. There are a number of ways to teach it from different disciplines, which could fruitfully be combined in the course: literary reading, analytic reading, and so on. Therefore this course could have elements of an English class, or a social-science class, or a class in law or religious studies. The main idea is to learn to read for meaning, for subtlety, for contradiction and ambiguity, and for connection to other texts. Some of the same skills could potentially be applied to material from film or drama. Assignments in this course would be traditional analytic papers, whether on the full meaning of a biblical passage or the governing principles embedded in the U.S. Constitution.


I am persuaded by the broad argument that the political scientist Andrew Hacker makes (talking about elementary and secondary school rather than college) in The Math Myth and Other STEM Delusions (The New Press, 2015). For purposes of general education, not the specific education of people going into fields that require mathematics, colleges should require undergraduates to take a course that familiarizes them with the quantitative world. It is deeply present in just about everything, including not-obviously-scientific realms like politics and government. This need not be a math course per se. Hacker suggests pulling examples out of everyday life that illustrate the broad applicability of being able to think confidently about numbers — poll results, sports statistics, stock-market indicators, government economic data (these examples are mine, not Hacker’s). The idea is to make students understand how numbers are generated, how to compare quantities from different realms, and some of the basic concepts underlying probability and statistics.


Most people, including students entering college, believe that the world as it appears to them and the people around them is the world as it is. It is crucial, and not easy, to teach people that they actually have a particular perspective, which inescapably has its limits — and then to help them understand that other people experience the world profoundly differently, which ought to be understood rather than dismissed. This project is central to a number of disciplines, including sociology, anthropology, literature, psychology, and even the client-oriented aspects of professional education, any of which might be brought in. Courses on diversity or understanding other cultures would have some overlap with what I am proposing, but I worry that those sometimes take the edge off the complexity and difficulty of the subject by communicating the idea that through tolerance, respect, and understanding, a person can successfully adopt a benign, universal perspective that can honor all other perspectives. That’s appealing, but it’s important not to let students believe that their own viewpoint can ever escape being limited in important ways, or that fundamental conflicts between perspectives can ever be entirely avoided.

The Language of Form

The course title is a slightly modified version of a term that the digital humanist Johanna Drucker uses in Graphesis: Visual Forms of Knowledge Production (Harvard University Press, 2014). She focuses on how we increasingly get our information in the form of visual displays rather than texts or numbers. She explores mainly a deep understanding of charts and graphs, which are ubiquitous in the life of every educated person, but the method could be extended to the third dimension so that questions of how space and volume are arranged could also be considered. This course would have elements of design, architecture, planning, art, and even ecology. I want to distinguish it, though, from "design thinking," as promoted at Stanford and elsewhere, which understands design not as encompassing everything visual and volumetric, but as more specifically about the process of making things. This should be a course in intelligently seeing and producing visual information, not in prototyping products and training people to plan and iterate projects in teams, which is useful but less universal than what I have in mind.

Thinking in Time

This, to some extent, is a course on the historical method, but it’s meant to do more than teach people to do historical research per se. To most students arriving at college, the past often seems safer than it actually was, outcomes more inevitable than they were, and operative assumptions closer to the ones we use today. Historical thinking is a powerful way of opening people’s minds to unfamiliar possibilities and ways of thinking, a process central to a liberal education. It can make students see that everything could have turned out differently, that individual people always operate within social, economic, and cultural contexts. One could teach such a course by focusing on a period in history, but that wouldn’t be strictly necessary, and the primary aim would not be to teach students the procession of significant events in a particular time and place. Similarly, it would be a good idea to study original historical documents in this course, but that’s a means to an end, not the end itself.


Back in the 19th century, when undergraduate core curricula were the rule rather than the exception, practically everybody had to take a course in rhetoric or oratory. The requirement often had roots in the colleges’ original mission of training ministers, and it usually vanished with the advent of the elective system. This course would aim to revive the tradition by teaching students how to make a compelling and analytically sound argument, both written and spoken (and probably also, inevitably, in PowerPoint). It is an endeavor with centuries of interesting thought behind it, so one can imagine the course drawing on philosophy, law, theology, even drama — with the opportunity to consider exemplary arguments from the past. It should be obvious that the assignments would ask students to practice the skills the course is teaching them, in writing and in performance.
teaching  pedagogy  liberal_arts  curriculum 
10 days ago
Introducing Documenting the Now - Maryland Institute for Technology in the Humanities
open source Web application called DocNow that will allow researchers and archivists to easily collect, analyze and preserve Twitter messages and the Web resources they reference. The second is to cultivate a much needed conversation between scholars, archivists, journalists and human rights activists around the effective and ethical use of social media content...

many of the conversations at the conference in DC centered on the ongoing protests over the killing of Michael Brown, an unarmed African American teenager, by white police officer Darren Wilson. News of the killing and ongoing protests spread initially in social media... Even as traditional media began reporting on the story, their narrative was challenged, and reframed by the conversation in Twitter. While the democratizing role of social media is ideologically complex, Sarah Jackson and Brooke Foucault Welles have uncovered evidence that in Ferguson, Twitter allowed individual initiators to raise awareness about the events in the initial hours following the death of Michael Brown....

relationship between the DocNow application we are building and other projects in the Web archiving space: specifically the Social Feed Manager from George Washington University and Rhizome’s WebRecorder project....

In DocNow we are explicitly interested in using the social media stream as a lens for finding and evaluating Web content... our primary use cases involve curation...

our approach to visualization and analysis. In order to allow curators and archivists to build collections of social media and Web content we will necessarily need to build views into the collected data. We can anticipate a set of views, or a dashboard of sorts that provides insight into the conversation and the Web content, as well as functionality to collect and annotate it.
archives  social_movements  black_lives_matter 
15 days ago
Data colonialism through accumulation by dispossession: New metaphors for daily data
After the hubristic declaration of the ‘end of theory’ more nuanced arguments have emerged, suggesting that increasingly pervasive data collection and quantification may have significant implications for the social sciences, even if the social, scientific, political, and economic agendas behind big data are less new than they are often portrayed. Compared to the boosterish tone of much of its press, academic critiques of big data have been relatively muted, often focusing on the continued importance of more traditional forms of domain knowledge and expertise. Indeed, many academic responses to big data enthusiastically celebrate the availability of new data sources and the potential for new insights and perspectives they may enable. Undermining many of these critiques is a lack of attention to the role of technology in society, particularly with respect to the labor process, the continued extension of labor relations into previously private times and places, and the commoditization of more and more aspects of everyday life. In this article, we parse a variety of big data definitions to argue that it is only when individual datums by the million, billion, or more are linked together algorithmically that ‘big data’ emerges as a commodity. Such decisions do not occur in a vacuum but as part of an asymmetric power relationship in which individuals are dispossessed of the data they generate in their day-to-day lives. We argue that the asymmetry of this data capture process is a means of capitalist ‘accumulation by dispossession’ that colonizes and commodifies everyday life in ways previously impossible.
big_data  colonialism  accumulation  methodology  privacy  labor  digital_labor 
17 days ago
BAK/ Future Vocabularies/ Instituting Otherwise/ Itineraries/ Research/ Learning Laboratories
A research exhibition and a symposium, Learning Laboratories: Architecture, Instructional Technology, and the Social Production of Pedagogical Space Around 1970 sets out to reconstruct educational imaginaries—the past’s conceptions of the future of education—in an archaeological excavation of learning spaces and knowledge environments of the 1960s and 1970s. In stark contrast to the present condition of crisis in education—one defined by containment and separation, extreme economization and commodification, neoliberal managerialism and an “outcome”-oriented fetishization of measurability—the architectural programs and educational research around 1970 gave rise to a number of experimental building principles and pedagogical ideals such as the “comprehensive school,” the “open plan school,” and “schools without walls.”

Through a number of selected case studies in the edu-architectural design and learning technologies of the period, Learning Laboratories explores the experimental embodiment of several spatio-pedagogical ideologies, opening out to developments in educational design, politics, and psychology. ...

The case studies include, among others: the Laboratory School and the Oberstufen-kolleg, Bielefeld, both based on the concept of the “de-schooled school” (research prepared by Gregor Harbusch); televisual education—otherwise known as “tele-learning”—developed in Ivory Coast with the aim of drawing rural populations into the educational system; the extraterritorial “pilot schools,” and the nomadic schools carried through the Liberated Zones by guerrilla fighters in the anti-colonial liberation wars in Guinea-Bissau, presented here by Filipa César and Sónia Vaz Borges. The exhibition also comprises works by: Hartmut Bitomsky and Harun Farocki; Darcy Lange; Wendelien van Oldenborgh; and Florian Zeyfang, Alexander Schmoeger, and Lisa Schmidt-Colinet. These works explore through photographic, video, and written materials various learning processes, technologies, and facilities from a range of contexts in Germany, the United Kingdom, the Netherlands, and Cuba.
education  learning_space  pedagogy 
17 days ago
The NSA’s Spy Hub in New York, Hidden in Plain Sight
It was an unusually audacious, highly sensitive assignment: to build a massive skyscraper, capable of withstanding an atomic blast, in the middle of New York City. It would have no windows, 29 floors with three basement levels, and enough food to last 1,500 people two weeks in the event of a catastrophe.

But the building’s primary purpose would not be to protect humans from toxic radiation amid nuclear war. Rather, the fortified skyscraper would safeguard powerful computers, cables, and switchboards. It would house one of the most important telecommunications hubs in the United States — the world’s largest center for processing long-distance phone calls, operated by the New York Telephone Company, a subsidiary of AT&T.

The building was designed by the architectural firm John Carl Warnecke & Associates, whose grand vision was to create a communication nerve center like a “20th century fortress, with spears and arrows replaced by protons and neutrons laying quiet siege to an army of machines within.”....

True to the designers’ original plans, there are no windows and the building is not illuminated. At night it becomes a giant shadow, blending into the darkness, its large square vents emitting a distinct, dull hum that is frequently drowned out by the sound of passing traffic and wailing sirens.

...An investigation by The Intercept indicates that the skyscraper is more than a mere nerve center for long-distance phone calls. It also appears to be one of the most important National Security Agency surveillance sites on U.S. soil — a covert monitoring hub that is used to tap into phone calls, faxes, and internet data....

Documents obtained by The Intercept from the NSA whistleblower Edward Snowden do not explicitly name 33 Thomas Street as a surveillance facility. However — taken together with architectural plans, public records, and interviews with former AT&T employees conducted for this article — they provide compelling evidence that 33 Thomas Street has served as an NSA surveillance site, code-named TITANPOINTE.
Inside 33 Thomas Street there is a major international “gateway switch,” according to a former AT&T engineer, which routes phone calls between the United States and countries across the world. A series of top-secret NSA memos suggest that the agency has tapped into these calls from a secure facility within the AT&T building. The Manhattan skyscraper appears to be a core location used for a controversial NSA surveillance program that has targeted the communications of the United Nations, the International Monetary Fund, the World Bank, and at least 38 countries, including close U.S. allies such as Germany, Japan, and France....

The 2011 guide, written to assist NSA employees visiting various facilities, discloses that TITANPOINTE is in New York City. The 2013 guide states that a “partner” called LITHIUM, which is NSA’s code name for AT&T, supervises visits to the site.

The 33 Thomas Street building is located almost next door to the FBI’s New York field office — about a block away — at Federal Plaza. The 2011 NSA travel guide instructs employees traveling to TITANPOINTE to head to the FBI’s New York field office....

It is not clear how many people work at 33 Thomas Street today, but Warnecke’s original plans stated that it would provide food, water, and recreation for 1,500 people. It would also store 250,000 gallons of fuel to power generators, which would enable it to become a “self-contained city” for two weeks in the event of an emergency power failure. The blueprints for the building show that it was to include three subterranean levels, including a cable vault, where telecommunications cables likely entered and exited the building from under Manhattan’s bustling streets....

But the site has other capabilities at its disposal. The NSA’s documents indicate that it is also equipped with powerful satellite antennas — likely the ones located on the roof of 33 Thomas Street — which monitor information transmitted through the air.

The SKIDROWE spying program focuses on covertly vacuuming up internet data — known as “digital network intelligence” — as it is passing between foreign satellites. The harvested data is then made accessible through XKEYSCORE, a Google-like mass surveillance system that the NSA’s employees use to search through huge quantities of information about people’s emails, chats, Skype calls, passwords, and internet browsing histories....

As in 1983, AT&T may not be completely alone at 33 Thomas Street. Earlier this year, a technician working at the building — who did not want to be named because he was not authorized to speak to the media — told The Intercept that a handful of Verizon employees were still based inside. However, the NSA’s documents do not suggest that Verizon is implicated in the surveillance at the TITANPOINTE facility, and instead only point to AT&T’s involvement. Verizon declined to comment for this story.
surveillance  media_architecture  telecommunications  NSA 
18 days ago
National Security Agency Said to Use Manhattan Tower as Listening Post - The New York Times
From a sidewalk in Lower Manhattan, the building at 33 Thomas Street, known as the Long Lines Building, looks like nothing less than a monument to the prize of privacy.

With not a window in its walls from the ground up to its height of 550 feet, 33 Thomas looms over Church Street with an architectural blank face. Nothing about it resembles a place of human habitation, and in fact it was built for machines: An AT&T subsidiary commissioned the tower to house long-distance phone lines. Completed in 1974, it was fortified to withstand a nuclear attack on New York, and the architect made plans to include enough food, water and generator fuel to sustain 1,500 people for two weeks during a catastrophic loss of power to the city.

Now, an investigative article in The Intercept and an accompanying 10-minute documentary film, “Project X,” opening on Friday at the IFC Center in Greenwich Village, say the building appears to have served another purpose: as a listening post code-named Titanpointe by the National Security Agency. The article and film say that Titanpointe was one of the facilities used to collect communications — with permission granted by judges — from international entities that have at least some operations in New York, such as the United Nations, the International Monetary Fund, the World Bank and 38 countries.

...Equipment in the building monitored international long-distance phone calls, faxes, videoconferencing, and voice calls made over the internet.

Much of the documentation for the article and film draws on material provided by Edward Snowden, a former contractor for the agency who released information in 2013 about the N.S.A.’s collaboration with telecommunication companies in vast surveillance programs. Laura Poitras, who collaborated with Henrik Moltke on the documentary film, was a member of a group of journalists awarded a 2014 Pulitzer Prize for its reporting on Mr. Snowden’s revelations.

....the building’s location about a block from F.B.I. offices at 26 Federal Plaza; and a reference to satellite intercepts for a program called Skidrowe. The building has satellite dishes on the roof and is the only site in New York City where AT&T has a Federal Communications Commission license for such stations, according to Mr. Moltke, who wrote the article with Ryan Gallagher....

The New York Times and ProPublica reported in August 2015 that AT&T had had a close relationship with the N.S.A. for decades and had been lauded by the agency for its “extreme willingness to help.”

However, neither the materials from Mr. Snowden nor the new reports state with certainty that the N.S.A. was using AT&T space or equipment. As it happens, while AT&T Inc. owns the land at 33 Thomas, it has only about 87 percent of the floor space; the balance is owned by Verizon.

Asked about the Intercept report, Fletcher Cook, an AT&T spokesman, did not directly respond but said the company provided information when legally required or in specific emergency cases. “We do not allow any government agency to connect directly to or otherwise control our network to obtain our customers’ information,” he said.
surveillance  media_architecture  NSA  infrastructure  listening  telecommunications 
19 days ago
Tunisians are being encouraged to read by turning taxis into libraries — Quartz
But taxi driver Ahmed Mzoughi, 49, has taken a more cerebral approach to his vehicle’s decor. Scattered on the seats and lining the dashboard are slim volumes of poetry, fat novels, and psychology books. Stuck on a side door is a decal that says, “Attention: This Taxi Contains a Book.”
That’s the tagline for a literary initiative launched in October by online book-sharing platform YallaRead (“Come on, Read” in Arabic). In collaboration with E-Taxi, an Uber-style cab-hailing service, YallaRead has put books in a select number of cabs like Mzoughi’s, giving passengers the chance to skim a few pages of Paulo Coelho or Naguib Mahfouz from the comfort of the backseat. Traffic jams are common enough in Tunis that you can read at least the first few paragraphs of a book in one trip, while a journey across the city lends itself to a full chapter.
libraries  little_libraries  transportation 
19 days ago
Watch Soundbreaking, PBS' 8-Part Documentary Exploring the History of Recorded Music | Open Culture
From November 14 through November 23, PBS is airing an eight-part series, Soundbreaking, which explores the art of recording music and the moments when new sounds were born. The series features “more than 160 original interviews with some of the most celebrated recording artists of all time,” highlighting the “cutting-edge technology” that transformed the way we make music. You can now stream 3 of the first 8 episodes online, with the rest soon to come. If there are any geo-restrictions, we apologize in advance.
music  recording  documentary 
20 days ago
The Complicated History of the Beloved Composition Notebook | Eye on Design
Most people can trace the notebook back to around 1887, when manufacturers brought the model over from France to the U.S., where brands like Mead, Norton, Roaring Springs, and dozens of others have been pumping them out ever since. No matter what corner store you buy one in, and regardless of the maker, they all look relatively similar because there’s no copyright for the marbled cover or any of its components. If you ask why no one has thought to differentiate their brand from the rest, you might also ask why a company like Mead would mess with a good thing; its composition book has been a top seller for decades. 
notebooks  writing 
20 days ago
The File Room | Net Art Anthology
Antoni Muntadas’s The File Room is a temporary physical installation and an open-ended online database that contain records of past cases of censorship around the world. Its mission—to reintroduce deleted and suppressed material back into the public record—could never be completed, and so visitors were invited to add their own instances of artistic and cultural censorship to an open-ended archive, which is now accepting submissions anew following its restoration by Rhizome.
archive_art  database_art  censorship  erasure  aesthetics_of_administration  bureaucracy 
21 days ago
Spatial Thought - e-flux Architecture - e-flux
More like an environment than a traditional exhibition, Les Immatériaux was a scenography, an informational space or interface where objects, sounds, projections, music, and texts conveyed an image bordering on an “overexposition,” as Lyotard says, drawing on Paul Virilio’s concept of the “overexposed city.” Unlike the nineteenth-century world exhibitions, the aim of such an overexposure was not to project a sense of newness and amazement—not to simply affirm the seductive power of the new—but rather to trigger a “reflexive unease” in our relation to things that we already dimly sense.
In a conversation with Hans Ulrich Obrist, Philippe Parreno recalls visiting Lyotard’s exhibition:
Les Immatériaux was an exhibition producing ideas through a display of objects in space. It was very different from writing a book or developing a philosophical concept. And that’s precisely what I loved in that exhibition, that it wasn’t a conceptual exhibition. I learned later that Lyotard wanted to do another exhibition, Resistance. “Resistance” isn’t a good title. You immediately think of a series of moral issues. But when I met him, I understood that he meant in fact resistance in another way. In school when you study physics you are told that frictional forces are not important—the forces of two surfaces in contact let certain axioms become uncertain. I think that’s what Resistance was to be about.
exhibition_design  exhibitions  epistemology  media_architecture  materiality 
21 days ago
Cellphone Smudges Yield a Trove of Forensic Data - WSJ
Those smudges on your cellphone reveal intimate details about your lifestyle, a new study says, potentially offering a new tool for criminal profiling, airport screening, clinical trials and environmental exposure studies.

Traces of molecules and microbes left when you handle your phone can add up to a composite portrait, including gender, diet, medications, clothing, beauty products, and places visited, researchers at the University of California, San Diego said Monday.

Such chemical signatures likely build up whenever someone regularly touches a phone, keys, credit cards, or other personal possessions—and can linger for months, they said.

The new forensic technique, reported in the Proceedings of the National Academy of Sciences, isn’t yet admissible in court, nor is it precise enough to indisputably identify a single person, like a fingerprint or a DNA sample, the researchers said. But it has the potential to help investigators use objects found at crime scenes to narrow the range of potential suspects.
forensics  data  archives  cell_phones  biomedia  microbes 
21 days ago
The ethics of smart cities and urban science | Philosophical Transactions of the Royal Society of London A: Mathematical, Physical and Engineering Sciences
Such a framing led initial spatial and urban science to be roundly criticized within the social sciences for being too closely aligned with positivist thinking, being reductionist, mechanistic, atomizing, essentialist, deterministic and parochial, collapsing diverse individuals and complex, multidimensional social structures and relationships to abstract data points and universal formulae and laws [25], and producing policy interventions that not only failed to live up to their promises but also did much damage to city operations [26]. These approaches also wilfully ignored the metaphysical aspects of human life and the role of politics, ideology, social structures, capital and culture in shaping urban relations, governance and development [27]. ...

While current urban science undoubtedly draws on positivistic ideas, notably that emanating within social physics which seeks to identify the social determinants and ‘laws’ of cities while largely ignoring the longer canon and critique [30,31]—and is open to the same criticisms as earlier manifestations—it should be noted that its approach is shaped by two more recent epistemological positions [1]. The first is a form of inductive empiricism in which it is argued that through data analytics urban big data can speak for themselves free of theory or human bias or framing. Such an approach is best exemplified by Anderson [32] who argues that ‘the data deluge makes the scientific method obsolete’ and that within big data studies ‘correlation supersedes causation, and science can advance even without coherent models, unified theories, or really any mechanistic explanation at all’. In other words, rather than being guided by theory, the data can be wrangled through hundreds of algorithms to discover the most salient factors with regards to a particular phenomenon. The second is data-driven science that seeks to hold to the tenets of the scientific method, but seeks to generate hypotheses and insights ‘born from the data’ rather than ‘born from the theory’ [33, p. 613]. It uses guided knowledge discovery techniques to mine the data to identify potential hypotheses, before a traditional deductive approach is employed to test their validity. It is contended that data-driven science will become the new dominant mode of scientific method in the big data age because its epistemology is suited to exploring, extracting value, and making sense of massive, interconnected datasets; it extracts additional, valuable insights that traditional knowledge-driven science would fail to generate and it produces more holistic and extensive models and theories of entire complex systems rather than elements of them [16,33].
Both approaches are evident in urban science/informatics, with a preference for the latter...
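The "born from the data" workflow described above — mine the data for a candidate hypothesis, then test it deductively — can be sketched as a split-sample procedure. This is an illustrative toy only; the variables, the `make_block` generator, and the planted effect are all invented for the example, not drawn from any actual urban dataset.

```python
import random

random.seed(0)

# Invented city-block records: by construction, only 'transit_access'
# actually drives 'footfall'; the other variables are pure noise.
def make_block():
    transit = random.random()
    return {
        "transit_access": transit,
        "tree_cover": random.random(),
        "streetlights": random.random(),
        "footfall": transit + random.gauss(0, 0.2),
    }

data = [make_block() for _ in range(400)]
explore, confirm = data[:200], data[200:]   # mining half vs. testing half

def corr(rows, x, y):
    """Pearson correlation between fields x and y."""
    n = len(rows)
    xs, ys = [r[x] for r in rows], [r[y] for r in rows]
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    return cov / (vx * vy) ** 0.5

# 1. Inductive step: mine the exploratory half for the strongest association.
candidates = ["transit_access", "tree_cover", "streetlights"]
hypothesis = max(candidates, key=lambda v: abs(corr(explore, v, "footfall")))

# 2. Deductive step: test that single hypothesis on the held-out half.
print(hypothesis, round(corr(confirm, hypothesis, "footfall"), 2))
```

The split matters: mining and testing on the same records would let a spurious correlation "confirm" itself, which is exactly the failure mode the critiques of purely inductive empiricism point at.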

(a) Datafication and privacy

Privacy—to selectively reveal oneself to the world—is considered a basic human right in many jurisdictions (particularly democratic states), enshrined in national and supra-national laws in various ways. How privacy is understood both as an everyday and legal concept, however, varies between cultures and contexts. In general terms, privacy debates concern acceptable practices with regards to accessing and disclosing personal and sensitive information about a person [34]. Such sensitive information can relate to a number of personal facets and domains, creating a number of inter-related privacy forms, including [35,36]:

— identity privacy (to protect personal and confidential data);

— bodily privacy (to protect the integrity of the physical person);

— territorial privacy (to protect personal space, objects and property);

— locational and movement privacy (to protect against the tracking of spatial behaviour);

— communications privacy (to protect against the surveillance of conversations and correspondence); and

— transactions privacy (to protect against monitoring of queries/searches, purchases, and other exchanges)....

First, there needs to be a re-orientation in how the city is conceived. Rather than being cast as bounded, knowable and manageable systems that can be steered and controlled in mechanical, linear ways, cities need to be framed as fluid, open, complex, multi-level, contingent and relational systems that are full of culture, politics, competing interests and wicked problems and often unfold in unpredictable ways. Reducing this complexity into models and then using the outcomes to drive urban management produces a reductionist and limiting understanding of cities and overly technocratic forms of governance. Rather these models need to be complemented with other forms of knowledge such as phronesis and metis. In other words, city analytics and its instrumental rationality should not be allowed to simply trump reason and experience, or other sources of information and insight such as those based on ‘small data’ studies, in shaping and driving urban governance. Instead, they should be used contextually and in conjunction with each other.

Second, there needs to be a re-casting of the epistemology of urban science. This re-casting involves recognizing that the realist assumptions, which posit urban science can reveal essential truths about the city, are flawed. Urban science does not, and cannot, provide objective, neutral, God's eye views of the city. Instead, it produces a particular view through a specific lens. On the one hand, the data used do not exist independently of the ideas, instruments, practices, contexts, knowledge and systems used to generate and process them [74]. In other words, data are never raw, but always already cooked [75]. On the other hand, databases and data analytics are similarly not neutral, technical means of assembling and making sense of data but rather are socio-technical in nature, shaped by philosophical ideas and technical means. ...

Third, the ethical dimensions of smart city technologies and urban science need to be much more thoroughly mapped out and addressed. While some might argue that new ethical frameworks based on a gift or sharing basis, in which individuals swap their data for a tangible return (usually a service or knowledge, but also including monetary reward), are in operation or offer an alternative underpinning for a big data economy, smart cities and urban science, the present reality is that many smart city technologies capture data without consent or notice with respect to such a ‘gifting’ and are so pervasive that the gifting is compulsory with no alternatives. Moreover, the benefits of ‘sharing’ data are most often stacked in favour of those capturing the data, especially when they are monetized or shared with third parties and used against individual interests. In order for a sharing notion of ethical practice to be enacted, those gifting the data must have full details of what data are being generated, what additional data are being inferred from them, and to have shared control and benefit in how all data relating to them are subsequently used. This requires full notice and consent, as well as full transparency with respect to the actions of data controllers and processors.
big_data  epistemology  data_science  smart_cities  privacy  ethics 
24 days ago
SLSA 2016 // Digging, Driving, Decoding, Describing: Media Historical Methodologies - Paused
On this panel, I will discuss the digital materials that we don't want to archive, or that don't want to be archived—the trash, cruft, detritus and intentionally opaque hoard of documents and artifacts that constitute our digital middens. I will focus on two from my own research: the archives of spam, which we'd all rather forget, and the records of the communities and marketplaces of the so-called “Dark Web,” which would prefer to be forgotten. How best to make them into and understand them as archives?...

Middens are pits of domestic refuse filled with the discards and byproducts of material life: the gnawed bones, ashes, fruit stones and potsherds, shells and chips and hair and drippings—together, the photographic negative of a community in action and an invaluable record for archeologists. Along with presenting some practical tools and techniques for both finding and making the middens of our subject, I will discuss ways that we can think of digital historiography in terms of these accidental or unwanted archives. Finally, I'll pose some questions about doing research with other kinds of eccentric, troubling, or speculative archives, like blockchains and doxxes.
archives  waste  spam 
24 days ago
Giving Today’s Car a Well-Tuned Interior - The New York Times
It can make engines sound like purring pussycats — or growling tigers.

Through the wizardry of digital technology some of today’s most sophisticated vehicles, like the GMC Sierra Denali, are designed to keep annoying engine noise from seeping into the cabin.

Others, like the Lexus NX F Sport, include digital tuners to accentuate the engine’s throaty growl to satisfy the primal urges of driver and passengers.

And sometimes — in a seeming contradiction — the same car does a bit of both.

In the Nissan Maxima, for example, noise-cancellation technology helps suppress undesirable droning frequencies from the engine. But the throb of horsepower is acoustically amplified when the driver steps on the gas.

“It’s about the driver’s comfort,” explained Aaron Gauger, a product planning manager at Nissan. “But we also want the driver to have a good experience during acceleration.”

All of this, like so much else in modern automobiles, happens through the magic of digital software and hardware.....

Noise-cancellation systems use multiple microphones (usually positioned near the driver’s and passengers’ ears in the liner of the vehicle’s ceilings) to detect sounds in the interior, isolating particular unwanted wavelengths and frequencies. Software and digital signal processors then use the car’s audio system to create countervailing waveforms that are broadcast over the speakers to block the original noise.

Not all unwanted sounds can be eliminated. The noises in and around a car — like the whine of tires on different surfaces, the rush of wind through an open window or a road crew’s jackhammers — are too varied and changing to cancel out completely....
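The countervailing-waveform idea described above can be reduced to a minimal numerical sketch: sample an unwanted tone, emit its phase-inverted twin, and the two cancel. This is an idealized illustration only — the frequency, sample rate, and single-tone "drone" are invented for the example, and real automotive systems must estimate the noise adaptively from microphones rather than knowing it exactly.

```python
import math

FS = 8000                     # sample rate in Hz (illustrative)
F = 120                       # hypothetical engine-drone frequency in Hz
N = FS // 10                  # 0.1 seconds of audio

# The unwanted cabin noise, as the microphones might capture it.
drone = [0.5 * math.sin(2 * math.pi * F * n / FS) for n in range(N)]

# The "countervailing waveform" broadcast over the speakers:
# the same signal, phase-inverted.
anti_noise = [-x for x in drone]

# What the passenger hears is the superposition of the two.
residual = [d + a for d, a in zip(drone, anti_noise)]

def rms(sig):
    """Root-mean-square level, a rough proxy for perceived loudness."""
    return math.sqrt(sum(x * x for x in sig) / len(sig))

print(round(rms(drone), 4))   # audible drone
print(rms(residual))          # silence: perfect cancellation
```

In this idealized case the residual is exactly zero; in a moving car the anti-noise is only an estimate, which is why varied, fast-changing sounds like jackhammers defeat cancellation while steady engine harmonics do not.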

“There’s also a signature to every engine,” Mr. Pelliccio said, “and automakers are particular about which harmonics they want to enter the cabin and which ones they don’t.”

That is where the ability to enhance engine sounds can come into play. Because the current generation of smaller, more fuel-efficient engines and turbochargers often does not generate the sort of throaty resonance drivers expect, automakers design systems to augment the sonic experience.
noise  sound_design  noise_cancellation  cars 
24 days ago
What Trump’s Win Compels Scholars to Do - The Chronicle of Higher Education
As a social scientist, I’m especially aware of how universities have contributed to another problem in American political culture: excessive faith in Big-Data analytics, and insufficient concern about the quality of the data we’re producing. Even those of us who merely consume that data are implicated....

In retrospect, there were flaws with both the data (its reliance on land lines, and its failure to account for those who didn’t feel comfortable acknowledging that they’d vote for Trump) and the models used to analyze them (in crucial states, turnout levels were far lower than in the recent past). And the truth is that preventing those kinds of mistakes is extremely difficult. It’s one reason many of us — even, or perhaps especially, in the social sciences — worry that the Big Data revolution may go too far, pulling resources away from other valuable forms of knowledge production, such as ethnography, history, and philosophy. ...

Of course, universities aren’t the only institutions that are pushing the Big Data revolution. It comes from engineering firms in Silicon Valley, media companies in New York City, and a variety of other industries. But universities have played a special role in legitimating this emerging form of research, and emphasizing it over other ways of knowing. Did the new data-driven polling we helped pioneer affect how the candidates ran their campaigns, shaping who and where they targeted scarce resources and time? If so, perhaps now is the time to ask whether we’ve overstated the public benefits of Big Data analysis while underestimating the risks....

There’s one other thing that universities must do better: teach students skills for learning, discerning, reasoning, and communicating in an informational environment dominated by quick hits on social media like Twitter and Facebook. Like it or not, social media is at the center of the new public sphere. This election leaves no doubt that candidates, campaigns, and their surrogates can make great use of it: planting memes, spreading rumors, building communities. Professors know how to help students work through difficult ideas in books and articles. But except for some of us in the learning sciences, few of us have thought much about how to help students develop critical-thinking skills for the media that they use most.
I don’t blame academics for neglecting this kind of pedagogy, and we did not create this civic problem. But professors — particularly humanists and social scientists — are well positioned to help students navigate the new informational environment. We should rise to the challenge.
liberal_arts  higher_education  pedagogy  public_sphere 
24 days ago
How Did Trump Get Elected? Take a Look in the Mirror - The Chronicle of Higher Education
What responsibility do members of the academy bear for the shocking devolution of American politics that has just occurred? Quite a bit, I’d say.

For one, the university’s historical role in purveying "truth" has diminished qualitatively. That it has become obligatory to put this term in quotation marks is a good indication of how far we have fallen. Whereas the pursuit of truth may retain its value at those bastions of educational privilege where a liberal education has remained meaningful, elsewhere the ideals of humanistic study have been essentially left for dead. In this respect, we have met the enemy and he is "us."

The triumph of identity politics has also played a deleterious role. Amid the vogue of multiculturalism, the humanities have exempted strong claims to group identity — so-called "subject positions" that are embraced, sometimes inflexibly, by ethnic and cultural groups — from scrutiny, thereby sparing them from the type of withering interrogations that, since the Renaissance, have defined the culture of critical discourse.

Abetting these trends, university presidents have readily jettisoned a commitment to higher cultural and intellectual goals. In their rush to demonstrate the payoff of a four-year degree, they have shamelessly and enthusiastically hopped on the "relevance" and "bottom line" bandwagons.
Historically, one of the central missions of higher education, in addition to preparing students for the rigors of the job market, has been to nurture the values of active citizenship — the encouragement and cultivation of character traits that are epitomized by the idea of "autonomy." Brusquely put, this means producing individuals who are capable of making thoughtful and mature political judgments as well as intelligent life decisions.

This approach was exemplified by the philosopher John Dewey’s conviction that the key to developing virtues conducive to democratic citizenship lay with the anti-authoritarian, dialogic approach of the Socratic method. Thus Dewey held that emancipatory pedagogy required the abandonment of mind-numbing, rote instruction in favor of honing the skills of critical thinking. Dewey was convinced that the experience of participatory learning was an apprenticeship for the practice of democratic citizenship. To the nation’s detriment, the academy has turned its back on Dewey’s insight.
liberal_arts  public_sphere  higher_education  totalitarianism 
24 days ago
Confronting Our Failure of Care Around the Legacies of Marginalized People in the Archives – On Archivy – Medium
The politics of what we’ve traditionally preserved means the archive is filled with silences, absences, and distortions, mostly affecting the legacies of the less privileged, including black women, LGBTQ people, immigrants, poor people, and victims of police violence, to name a few. In the name of neutrality, we’re erasing people, communities and their humanity from the historical record.
The more selective and specialized space of digital collections prioritizes professionalism, technical expertise, and standards over a critical interrogation of the cultural character of our records. So this is certainly an appropriate venue to ask questions about the diversity represented in our historical records. Because for digital collections, who gets represented is closely tied to who writes the software, who builds the tools, who produces the technical standards, and who provides the funding or other resources for that work....

[Theaster Gates has taken on] projects like transforming a boarded-up and abandoned home into a community-centered library, archive, and arts space on the South Side of Chicago, or converting an abandoned bank building into a thriving arts center. In many ways Gates’ work is about radical inclusion and transformation and I think archivists can learn a lot from that. In an interview earlier this year about his new exhibition, How to Build a House Museum, Gates talked about the politics of what gets preserved, how we decide what is worthy of memorialization, and why those things matter. It’s a fascinating interview where he also touched on the awesome potential of house museums as a powerful way of remembering how local people or communities have contributed to our shared culture.... While describing his work on building house museums as a way of challenging the traditional notions of what should be preserved, Theaster asked, “Who feels responsible for the failure of care around the legacies of great black people around the world?” ...

The evidence is abundant that people other than white men contributed to building this country. Land, labor, wealth, and life stolen from Native Americans and enslaved Africans are but a few examples. Slavery and extreme violence against black bodies were the foundation of American capitalism. Without those two evils we would be living in a different America today. If we accept the historical fact that African Americans were at the center of American progress from the very beginning, it begs the question then, why is the historical record filled with so many silences, distortions, and erasures around Black people’s lives?...

Baptist describes how he wanted to set up the book so there could be no doubt as to the centrality of forced African labor to the economic foundation of the country. He set up the chapters in a way that presents a powerful image of the entire American experiment sitting on top of a black human body. Chapters are titled feet, head, right hand, left hand, tongues, breath, seed, blood, backs, and arms. I thought this was an effective way to represent the truth about black labor and how it drove American progress....

Neutrality is a threat to the legacies of marginalized people and by extension their lives. In our line of work neutrality is a dangerous idea that prioritizes dominant culture, white male culture. So I want to push back and say that I’m interested in a #BlackLivesMatter care ethic for building our collections in the future, or better yet, a #BlackTransLivesMatter care ethic....

In his 1970 address to the Society of American Archivists annual conference, which was later published as, Secrecy, Archives, and the Public Interest, Howard Zinn cautioned against the prioritization of professionalism and neutrality by archivists. He said, and I quote, “The archivist, even more than the historian and the political scientist, tends to be scrupulous about his neutrality, and to see his job as a technical job, free from the nasty world of political interest: a job of collecting, sorting, preserving, making available, the records of the society. But I will stick by what I have said about other scholars; and argue that the archivist, in subtle ways, tends to perpetuate the political and economic status quo simply by going about his ordinary business. His supposed neutrality is, in other words, a fake. If so, the rebellion of the archivist against his normal role is not, as so many scholars fear, the politicizing of a neutral craft, but the humanizing of an inevitably political craft.”
archives  silence  race  labor  embodiment  care 
25 days ago
Is DNA the Future of Data Storage? - WSJ
This recent data-to-DNA conversion, completed in July, totaled 200 megabytes—which would barely register on a 16-gigabyte iPhone. It’s not a huge amount of information, but it bested the previous DNA storage record, set by scientists at Harvard University, by a factor of about 10. To achieve this, researchers concocted a convoluted process to encode the data, store it in synthetic DNA and then use DNA sequencing machines to retrieve and, finally, decode the data. The result? The exact same files they began with....
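The encode/decode round trip described above can be illustrated with the simplest possible mapping: two bits of data per nucleotide (A=00, C=01, G=10, T=11). The actual Microsoft–University of Washington scheme is far more elaborate, adding error correction, addressing, and chemical constraints on the sequences; this naive Python sketch only shows why the retrieved files come back bit-identical.

```python
# Naive two-bits-per-base codec -- an illustration of the round trip,
# not the researchers' actual encoding scheme.
BASES = "ACGT"  # the index of each letter is its two-bit value

def encode(data: bytes) -> str:
    """Turn each byte into four nucleotides, high bits first."""
    return "".join(
        BASES[(byte >> shift) & 0b11]
        for byte in data
        for shift in (6, 4, 2, 0)
    )

def decode(strand: str) -> bytes:
    """Reassemble bytes from consecutive groups of four nucleotides."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | BASES.index(base)
        out.append(byte)
    return bytes(out)

original = b"archive"
strand = encode(original)
assert decode(strand) == original  # lossless round trip
```

At this density each byte costs four bases, so the 200-megabyte conversion corresponds to on the order of 800 million nucleotides before any redundancy is added — which is why the error-correcting overhead of the real scheme matters.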

Easy, economical data access could also address concerns about data sovereignty, a hot-button topic in Europe, where regulators are pressing companies that hold sensitive information—financial services firms, health care organizations—to store information locally. DNA storage could, eventually, provide a cheaper and more eco-friendly alternative to huge server farms.

Perhaps more important, DNA could prove a far more durable storage medium than our present options. “If you look at digital data storage, it’s an ephemeral thing,” says Bill Peck, chief technology officer at Twist Bioscience, a San Francisco startup that’s creating synthetic DNA for the Microsoft-University of Washington team. Hard disks and flash drives can crash without warning, and some last just a few years. Magnetic tape may survive a few decades, and DVDs even longer, but they are by no means immortal. Data stored in DNA, provided it’s kept cold and dry, could last for thousands of years.

While the concept is promising, the technology is years, perhaps even decades, from moving out of labs and into everyday use.
archives  biomedia  storage  DNA 
29 days ago
Introducing "The Social History of the Archive" - Past and Present
We are answering Eric Ketelaar’s call for a social history of archives; we endeavour to draw attention to the lived practices that underpin the formation of archives rather than simply focusing on them as static repositories. Therefore, our volume looks at the wider cultural practice of record-keeping not only in emerging and expanding official archives, but in the daily life of individuals, families, and communities. Managing information through recording, ordering, and preserving it was crucial; the early modern period seems to have been transformative in this respect. The dynamics of the age — expanding global trade, burgeoning state bureaucracies, advances in technology — necessitated documentation and information management. Our volume explores this apparent evolution and proliferation of record-keeping practices between the fifteenth and eighteenth centuries....

the social history of the archive tries to capture the bustling life of documents. The paper or parchment now in our reading rooms lay once at the heart of transactions and relationships. Therefore, the contributions all focus on process over product: the practices which brought the documents and collections into existence, rather than their eventual outcome. ...

The first section, Creation, Curation, and Expertise, highlights the people at the heart of record-keeping, and discusses how some built a career around writing and record-keeping. The records under discussion in this volume were never abstract, detached texts, but part of past people’s livelihoods and creative output.

The essays in the second section, Credibility, Testimony, and Authenticity, discuss records that helped build arguments and through which credibility was acquired. The instrumental nature of these documents alerts us to their rhetorical power and how this shaped their content.

The meaning of repositories is further explored in the third section, Collecting, Compiling, and Controlling Knowledge. All four essays discuss in depth the reasons for gathering, ordering, and preserving information when dealing with the past, making sense of the present, and envisaging a future. They encourage us to look more directly at the structures through which the documents we study have reached us, and what these structures might tell us.

The impact of forgetting and remembering during the period runs through most of the volume, but is most explicitly discussed in the final part, Memory, History, and Oblivion. Deliberate early modern practices of creating, manipulating, and silencing the past show us very directly how archives mediate and construct our perception of it.
archives  records  paperwork  bureaucracy 
29 days ago
The Knowledge School and an Election Mandate | R. David Lankes
All of these works will be useful and will help us to better understand a new reality where polls and predictive models failed spectacularly. However, while we may do some of this kind of work, our role as a knowledge school is different. Our role is not simply to document the campaign. Our role is not simply to analyze the data generated by the candidates. Our role is to act.

Democracy is not about voting. Voting is a periodic decision: democracy is sustained conversation, oversight, and advocacy. The work of a citizen did not end yesterday; it began. For those who chose or opposed a candidate, there is now the vital work of holding those chosen to account. An election doesn’t change a constitution. An election doesn’t change demographics.

The role of a knowledge school, the role of you and me, is to reinforce the values and principles we hold dear and support the communities that make up our nation and the world. We cannot simply explore the racial makeup of the electorate without providing opportunity for all races. Our values of diversity in decision making and ensuring equity of opportunity for all religions, classes, and ideology must be put into action, not theses.

If this election has shown us anything it is not that our communities are facing too little information, it is that information alone is insufficient for informed action. The problem our communities face is not one of access to resources, but of access to the right resources provided to the ready learner. The problems are about creating a culture of literacy, open civic conversation, and knowledge. We must not mistake describing people’s opinions for facilitating learning....

Tomorrow we have a new administration that needs oversight and a direct link to the best scholarship we can provide. We have a country that is increasingly living in echo chambers built on walls of selective data, and that needs action to push it to greater insight. Tomorrow, in essence, needs us: scholars, librarians, information professionals, staff, faculty, alumni, and students focused on making a better world.
libraries  information  democracy  access  discourse  public_sphere  data  epistemology 
29 days ago
A Neural Network Playground
It’s a technique for building a computer program that learns from data. It is based very loosely on how we think the human brain works. First, a collection of software “neurons” are created and connected together, allowing them to send messages to each other. Next, the network is asked to solve a problem, which it attempts to do over and over, each time strengthening the connections that lead to success and diminishing those that lead to failure. For a more detailed introduction to neural networks, Michael Nielsen’s Neural Networks and Deep Learning is a good place to start. For a more technical overview, try Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville.
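The learning loop described here — try the problem repeatedly, strengthening the connections that lead to success and weakening those that lead to failure — can be sketched with a single software “neuron” (a perceptron). This is an illustrative toy, not the playground’s actual implementation; the problem (logical OR) and update rule are my own simplifying choices.

```python
# A single "neuron" whose input connections (weights) are strengthened
# or weakened after each mistake until it solves the problem.
def train_perceptron(samples, epochs=10):
    """Learn two weights and a bias from (inputs, target) pairs."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            error = target - pred  # +1, 0, or -1
            # Adjust each connection by its contribution to the error.
            w[0] += error * x1
            w[1] += error * x2
            b += error
    return w, b

# The logical OR function as training data.
samples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(samples)
predictions = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
               for (x1, x2), _ in samples]
print(predictions)  # the learned OR function: [0, 1, 1, 1]
```

A real network like the playground’s stacks many such neurons into layers and replaces this all-or-nothing update with gradient descent, which is what lets it learn problems no single neuron can (such as the playground’s spiral dataset).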
computation  machine_learning  neural_networks 
29 days ago
The Avery Review | Land Art in the Age of Remote Sensing or: Google Mapping the <i>Spiral Jetty</i>
Following directions printed from a website seems almost quaint in the age of Google Maps and cheap GPS devices, with the road signs reading more like warnings than travel tips. Clearly being stranded without gas and toilets is now on par with being without cell reception, or even worse, without an Internet connection. Up until the exact moment cell reception cut out, our trip had been constantly marked in both time and space by the small blue orb pointing my phone, car, and thus body toward our destination, marking us, in real time, within the vast cartographic empire of Google Maps....

Within this broad view of artistic production, the mediatic apparatuses required to consume so-called land art would therefore also be enveloped in Krauss’s newly conceived definition of sculptural practice.
The status of the image as integral to land art, earthworks, and other works of art not easily accessible for average viewers was not unrecognized before Krauss. In 1971, when Gregoire Müller asked Smithson about the importance of photography for works made in the land, he responded, “I think we are actually talking about multiple ways of locating a thing, and one way to locate a thing is to circumscribe it with a photograph. If you are flying over a piece, you can see its whole configuration in a sense contracted down to a photographic scale.” Smithson went on to say, “For me, a photograph acts as a kind of map that tells you where the piece is and I don’t see anything wrong with that … . All we can do is use our orders and systems to investigate them [earthworks], and they generally turn out to be wrong … . What is interesting is how we fail to understand remote things.”...

Smithson claims the Spiral Jetty is a resolution of the dialectic between site and nonsite, a place where the map and the physical location collapse together into a single site of displaced physical materiality that generates its own mark upon the landscape. Put another way, the Spiral Jetty is incomplete without the map. The spiral, as a form upon the land, is impossible to fully comprehend without its abstracted representation....

As with many of his other pieces, Smithson made films and photographs about the Spiral Jetty and exhibited them independently as works in their own right. Remarking about the film he made to document the Jetty, Smithson described, in detail, the process for creating the film’s cinematography: “For my film (a film is a spiral made up of frames) I would have myself filmed from a helicopter (from the Greek helix, helikos meaning spiral) directly overhead in order to get the scale in terms of erratic steps.”12 The film is not just documentary evidence of marking the landscape; it exists as one aspect of a broadly conceived work....

Now over forty years old, it has begun to settle back into the salt plain, the mud and salt and rocks slowly shifting and blurring the once definitive boundaries between ground and that just above. The film, too, is a victim of celluloid decay, and of course, the effects of global climate change on the Jetty and other earthworks throughout the American West are as yet unknown. Standing in stark contrast is the unrelenting digital memory of the Internet, where time can just as easily be played in reverse.
land_art  expanded_field  navigation  mapping  memory  decay  smithson 
4 weeks ago
City Ground
Modern humans tend to naturalize the ground; to construe the earth — terra firma — as the immutable and natural product of geological processes working over eons. This understandable tendency leads also to a predominant sense of ground as inherently horizontal — the surface of the earth stretching to and beyond the horizon. Yet this perspective underplays the importance of the vertical composition of ground. For the terrestrial material of our rapidly urbanizing species is increasingly anything but “natural”: it is the vertical accumulation of manufactured ground. The making of geological strata is an essential but often neglected component of the mass shift of humankind to urban living; it is also a crucial by-product of the industrialized technologies of construction, agriculture, mining, and warfare.

Geologists now estimate that the deliberate shift of material by humans due to construction, agriculture, and mining, as well as the generation and movement of waste, amounts to around 59 billion tons a year. 2 Rising levels of global urban development mean this figure is growing fast. 3 Little wonder that geologists are on the verge of formally declaring that we have moved from the post-glacial Holocene to the new geological epoch of the Anthropocene: the age in which human agency shapes land and soil, the very geology of the earth, as well as the atmosphere, biosphere, and oceans, more powerfully than any other force....

Geologists describe human-generated geological features like Teufelsberg as “artificial” or “manufactured” ground. Not surprisingly, such ground is densest and most complex in ancient cities that have been inhabited — rebuilt again and again — over millennia. It is particularly thick beneath old industrial cities which have experienced many cycles of construction and destruction: cycles as old as urbanization, though the scales of which have multiplied massively in the last two centuries. Fire, earthquake, war, decay, obsolescence, redevelopment, the desire for improvement — all can result in the destruction or demolition of buildings and infrastructures, or sometimes in their absorption into a higher level of ground, aided by gravity....

In artificial ground, as in most other geological formations, depth down is usually equated with temporal distance back into history. More recently, though, the aggressive processes of modern urban redevelopment — based on deep excavation, the driving of piles into rock, the construction of extensive subterranean infrastructures — have been producing highly intricate and complicated artificial ground. “Successive phases of development,” as the geologist Simon Price and his colleagues stress, “have added to, or in some cases re-used and recycled, this artificial ground, leaving a complex ‘stratigraphy’ of deposits, including drains, middens, pits, cellars, foundations and trenches among other features.” 5 Urban archaeologists have done much to explain this complex patterning of human-made ground, especially in European cities occupied more or less continuously since Roman times. 6 While occasionally entire street surfaces or discrete historic ground levels are revealed during archaeological or construction projects, it is rare for artificial ground to consist simply of the vertical accretion of historic layers, piled one upon another in situ....

A few years ago the art and media collective Smudge Studio published a couple of informed and accessible volumes that sought to broaden cultural awareness of the city’s “geological pulse.” In Geologic City: A Field Guide to the GeoArchitecture of New York, Elizabeth Ellsworth and Jamie Kruse, the duo who make up Smudge Studio, trace the distant origins of vital resources ranging from food, energy, and water to gold, steel, and concrete, and they map the geologies and landscapes created by the ceaseless cycles of construction, demolition, and waste.

A later anthology, Making the Geologic Now, contains an especially remarkable analysis of how cities have adapted to cold winters by creating their own “terminal moraines” — the landscapes that result from the debris of glaciers — by bulldozing millions of tons of dirty snow — filled with road salt, tire dust, worn brake linings, exhaust pipe chips and other detritus — into huge glacier-like mounds. These then melt in the warmer months, dumping their “moraines” and building up new layers of manufactured ground. 8 Throughout these volumes Smudge Studio provides valuable insights into the ways in which cities work to metabolize nature...

Some Scandinavian cities are already exploring techniques that will allow them to extract the valuable metals within obsolete or forgotten infrastructures — old tram lines, disused district heating pipes, abandoned power and telephone wires, 19th-century gasworks — as resources to sustain contemporary economic development. In the Swedish industrial city of Norrköping, for instance, technology scholar Björn Wallsten estimates there are 5,000 tons of iron, copper, and aluminum available to be extracted...

Which raises a pertinent question: Is manufactured ground the purview of the geologist or the archaeologist?

Well, both; the proliferation of artificial ground is drawing the two disciplines into dynamic and unprecedented collaboration. Leicester University archaeologist Matt Edgeworth has suggested that the artificial ground created by humans should be considered a hybrid domain, formed through a complex mix of natural and cultural forces; Edgeworth calls this the “archaeosphere.” 11 Historic streets, tunnels, ports, industrial sites, foundations, and religious and commercial buildings — the stuff of urban archaeology — are all understood as layered within and through the complex strata in which is contained waste, human remains, rubble, ballast, soil....

Urban landfills are not only vast concentrations of the effluvia of consumerism and capitalism; they are also a new and highly toxic geology, massive emitters of greenhouse gas emissions (though in more advanced sites these are captured for use as fuel). Not surprisingly, these modern archaeospheres of artificial ground constitute prime sites for the burgeoning field of contemporary archaeology. Little wonder cultural historian Cinzia Scarpino has described landfills as “the true archaeological sites of late modernity.”
geology  waste  anthropocene  ruins  urban_archaeology  archaeology  infrastructure  dredge 
4 weeks ago
The Avery Review | Archaeologists Wear Wetsuits
Over the next two years she and Throckmorton mapped the locations of numerous wrecks and produced preliminary drawings of various sites. Frost writes about their strange position as experts without official archaeological credentials: “We realized it was easier to find wrecks than to persuade first-rate professionals to produce rational excavation methods … Success depended on our ability to present the evidence and raise interest in the project.”5 Between Frost’s early survey drawings and Throckmorton’s underwater photographs, they had enough visual evidence to garner funding for future research and to convince experienced archaeologists to get involved. Throckmorton took their documentation back to the United States, where he got the attention of the University of Pennsylvania.6 The University Museum sent an archaeologist named George Bass (fresh from a YMCA diving class in Philadelphia) to Turkey to plan and execute a series of excavations with Frost and Throckmorton.7
Though Frost and Throckmorton identified many wrecks at Yassiada, the first one chosen for excavation in 1961 was not especially valuable. According to Bass, the decision had less to do with archaeological significance than their “desire to develop new techniques,” and the site was selected “more on the basis of its depth and condition than its date.”8 Frost called the wreck site “ideal for drawing” because of its lack of erosion and exposed galley and anchors.9 During the course of the excavation, the team systematically experimented with a wide range of survey devices and drawing techniques in order to produce drawings of the shipwreck that could be comparable to archaeological surveys done on land.10 They combined traditional methods with new machines and mapping devices that had yet to be applied to archaeological sites, as a way of dealing with the distortions inherent in looking at objects underwater (and recording data efficiently when oxygen was limited).

The rapid and heterogeneous development of experimental methods and technologies allowed this young field to congeal around representational techniques. In the 1960s, underwater archaeology was barely considered an academic discipline and had been relegated to the realm of amateur treasure hunting. The story of the waters off Yassiada, then, is also the story of the complicated relationship between exploration and archaeological preservation in the postwar period. Underwater surveys, visualized through the meticulous hand of the surveyor and through the lens of new technologies, played a crucial role in gaining the confidence of Western institutions to fund this new breed of wetsuit-wearing archaeologist. The Yassiada excavation sheds light on how the aesthetics of data and archaeological surveys, alongside their capacity to organize information, produced faith and credibility in the discipline of underwater archaeology and have come to inform our understanding of cultural artifacts. The early survey drawings from Yassiada are all that remain of the wood wreck, as much of the salvaged material has eroded, was destroyed in the process of being uncovered, or was left on the sea floor to drift away.11 Even so, the original drawings—which appear minimal by today’s standards—contained enough information to produce new interpretations of the ship throughout the ’80s and ’90s. While we are now experiencing a moment of accelerated technological development much like that of the 1960s, survey data is obtained far more easily and can be collected almost instantaneously. Yassiada offers important insights about the value of representation and technology today, as cultural artifacts are increasingly susceptible to destruction by war, climate change, and lack of preservation.

This aquatic turn for archaeology in the 1960s was enabled by and implicated in a series of technological developments across the preceding decades. The scuba (self-contained underwater breathing apparatus) was invented by Yves Le Prieur in 1926, and it allowed divers to manually control their air supply from a tank of compressed air. In the winter of 1942–43, Emile Gagnan and Jacques Cousteau improved the design with a pressure regulator that supplied the diver with air automatically and allowed more freedom of movement. They quickly patented and began selling it commercially in France in 1946, opening up a world that most people had never seen before.12 Meanwhile, underwater photography enabled images of that world to circulate in both public and academic spheres, and photogrammetry—developed first as a military imperative to survey the distances between objects at sea during the world wars—made it possible to take measurements from those photographs and make surveys. Without these three technologies, the execution of an organized underwater excavation would not have been possible.....

it became increasingly important to differentiate underwater archaeology from underwater exploration in order to find funding from universities, museums, or global institutions like UNESCO. With few rules in place and hardly any scientists qualified to make underwater observations firsthand, early underwater archaeologists struggled to gain the respect of their colleagues operating aboveground. Archaeological practice on land had been steadily codified in Europe since at least the time of the Enlightenment and had been in existence for many centuries prior. Underwater archaeology had a lot of catching up to do....

Depth measurements were the most difficult to obtain but also the most important for producing credible archaeological sections and elevations. In order to get these drawings right, the Yassiada crew invested significant time and energy developing a series of mapping machines that could measure depth in different ways....

Underwater draftsmen were also invested in developing new drawing typologies that could communicate the salient features of a shipwreck while still maintaining archaeological standards. Frost worked out a variety of underwater techniques using projective geometry that would allow her to translate a quickly drawn perspective sketch with certain crucial, known points into measured plans and sections above water. Dismissive of the idea that plans and sections should be the de facto forms of archaeological representation for shipwrecks, she described the differences between the representation of aquatic and terrestrial ruins:

Why, one wonders, is it necessary to divide the strange, collapsed machine which is a wreck into squares, and what is sacrosanct about vertically observed plans? They are essential to the understanding of a collapsed building, but are they the only convention that will explain a dislocated pile of machinery? If there is one thing that modern art taught, it is the multiplicity of visual conventions, each communicating a different type of statement and each depending on the limitations of the medium used....

Since little remained of the original ship after their excavation, they had only their drawings and photo grids to study from in the decades to come. The mapping grid frame they erected, created to accommodate the limited reach of the underwater camera, helped to extrapolate and emphasize the relative connections and relationships between artifacts. With just this information left of the ship, they were able to make accurate models and replicas because they were thoughtful and considerate of the information they did collect. Even as their representation bestows particular readings of form or even ideologies of progress onto our understanding of the wreck, it still leaves holes or gaps where new readings might be possible....

When all you have left of a cultural artifact is its representation, important questions are raised about how to document its essence, how the survey can replace the artifact, or how the survey affects our reading of history. The tools that made this history visible for the first time have also changed the way we understand it, whether that has to do with the way in which artifacts are identified and documented or the new international and technological context into which their representation places them....

Modern technology made it possible for this new wave of underwater pioneers to succeed in mapping a territory where others had failed, and this success affirms an underlying, techno-optimistic narrative of Western progress.

In this sense, the spectacle of evolving underwater technologies in the 1960s is not to be understated. Underwater archaeology differs from archaeology on land for many reasons, but the most obvious are the physical and athletic obstacles posed for those carrying out their research. The image of the historian equipped with a wetsuit, oxygen tank, and waterproof flash-camera rig represents a new kind of explorer, one capable of conquering not only the dangers of the ocean’s depths but also what remains of the conquerors and civilizations of empires long since passed. As much as archaeological exploits were intended to produce a kind of shared global history after World War II, underwater archaeology in particular can be seen as a means for the West to claim the ancient past as its own, with the survey acting as a primary agent of this reclamation....

This scattered and inventive moment in the formation of underwater archaeology is especially important to reflect on today, as archaeological preservation has been pushed into new spheres with the rapid advancement and availability of high-resolution laser scanning and virtual reality. Recently there has been a rush to scan monuments in areas that are war-torn or subject to imminent damage from climate change, to harness information about their surface conditions in clouds of points that contain millimeter-specific coordinates with photographic color overlays. But even as we lose ancient monuments in cities like Palmyra, and as we feel compelled to build duplicate versions of places like the Chauvet caves, we should question our … [more]
archaeology  visualization  drawing  photography  representation  water  mapping 
4 weeks ago
Reimagining Libraries In The Digital Era: Lessons From Data Mining The Internet Archive
In 2014 I posed the question “What if we could bring scholars, citizens and journalists together, along with computers, digitization and ‘big data’ to reimagine libraries as centers of information innovation that help us make sense of the oceans of data confronting society today?” Reflecting back on three years of collaborating with the Internet Archive, my own experiences working with such a digital-first library offer a number of reflections and insights into the future of libraries as data-driven centers of innovation....

Two years later at a Library of Congress meeting on archiving the then-growing world of citizen journalism, a key question facing the archival community was how to tractably archive the vast blogosphere given its incredible growth and the difficulty of tracking new blogs as they come and go. At the time I noted that a small set of companies accounted for the majority of hosting platforms and tools used to publish blogs and suggested collaborating with those companies to receive a streaming API of all URLs of new blog posts on those platforms as they are published. Today that provides a key data stream to the Archive’s web crawlers....

Beyond simply preserving the web for future generations, library-based web archives like the Internet Archive’s Wayback Machine offer researchers one of the few places they can work with large collections of web content. While not as extensive as the collections held by commercial search engines, library archives are accessible to scholars who lack the resources to build their own massive web-scale crawling infrastructures and uniquely allow the exploration of change over time at the scale of the web itself....

In 2013 I began my first major analytic collaboration with the Internet Archive, creating an interactive visualization of the geography of the Archive’s Knight Foundation-supported Television News Archive to explore what parts of the world Americans hear about when they turn on their televisions, followed by an interactive search tool comparing coverage of different keywords. This involved applying sophisticated data mining algorithms to the closed captioning stream of each broadcast. Anticipating the needs of data miners, the Archive had built its systems in such a way that television was treated not as a collection of MPEG files gathering digital dust on a file server, but as a machine-friendly analytic environment, with closed captioning available as standard ASCII text and audio and video streams accessible in similar analytic-friendly formats....
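A minimal sketch of the kind of caption mining described here, assuming the closed captioning has already been retrieved as plain text; the function and sample stream are hypothetical illustrations, not the Archive's actual API:

```python
from collections import Counter
import re

def keyword_mentions(captions: str, keywords: list[str]) -> Counter:
    """Count how often each keyword appears in a closed-captioning stream."""
    # Normalize the caption stream to lowercase word tokens.
    tokens = re.findall(r"[a-z']+", captions.lower())
    wanted = {k.lower() for k in keywords}
    return Counter(t for t in tokens if t in wanted)

# Captions arrive as plain ASCII text, one stream per broadcast.
stream = "TENSIONS RISE IN UKRAINE AS RUSSIA RESPONDS. RUSSIA DENIES CLAIMS."
print(keyword_mentions(stream, ["Russia", "Ukraine"]))
```

Aggregating such counts per broadcast and per station is the basic building block behind comparing coverage of different keywords over time.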

the Virtual Reading Room allows trusted researchers to run data mining code directly on the Archive’s servers; only computed metadata leaves the premises, while the original television content never does....

Using the Internet Archive’s public collection of more than 600 million pages of public domain books dating back 500 years and contributed by over 1,000 libraries worldwide, the OCR of each book was used to extract every image from every page and gradually upload them all, along with their surrounding text, to Flickr.

...often the best thing a library can do is to make its holdings as accessible as possible and just step back and get out of the way to let its community find new applications for all of that data....

In 2014 I also worked with the Archive to explore what it would look like to enable mass data mining over its more than 460 billion archived web pages. This involved overcoming a number of technical hurdles for dealing with such a large dataset and creating a technical blueprint and workflow for others to follow. The final analysis examined the more than 1.7 billion PDF files preserved by the Archive since 1996, coupling them with JSTOR, DTIC and several other collections to data mine more than 21 billion words of academic literature codifying the scholarly knowledge of Africa and the Middle East over the last 70 years. Beyond being the first large-scale socio-cultural analysis of a web archive, it has also had a very real-world impact, pioneering the application of large-scale data mining to socio-cultural research and helping incorporate understanding of non-Western cultures into development efforts....

Last year I used Google’s cloud to process more than 3.5 million books totaling the complete English-language public domain holdings of the Internet Archive and HathiTrust dating from 1800 to 2015. This was used to publish the first comprehensive comparison of the two collections, with a number of fundamental findings regarding how the data we use influence the findings we derive. Perhaps most powerfully, the animation above shows the impact of copyright on data mining. In the United States, most books published after 1922 are still protected by copyright and thus cannot be digitized, leaving them behind the digital revolution and creating a paradox whereby we have more information on what happened in 1891 than we do about 1951....

The real challenge is how to assess relevance in a temporal index and how to visualize search results that stem from how a page changes over time. Rather than try to build such systems entirely in-house, as libraries increasingly open their digital collections to access, they should reach out to the open source and research communities to think outside the box for creative solutions to these challenges.
search  archives  television  annotation  captioning  data_mining 
4 weeks ago
The Network Imaginary - Los Angeles Review of Books
Questions, then, of what a network actually looks like and what it feels like are refreshing ones, ones that are simultaneously familiar and unfamiliar. Without forcing readers to vacate the territory of interconnectedness that they’ve occupied for years (or in my case, my entire life), Patrick Jagoda’s Network Aesthetics demands that we reconsider the omnipresence of the term “network” and the seemingly concrete meanings that have come to adhere to it. It asks us to think seriously about what we mean when we talk about networks, what it means to undergird our daily life with network logic, and what possibilities exist once we start imagining networks — and the connection they enable — differently. ...

The first section of the book focuses on the linear narrative forms of the maximalist novel, the network film, and the television drama. The second engages with distributed forms like digital videogames and transmedia alternate reality games (ARGs). A series of deft readings makes clear that our supposed understanding of networks is governed, at least to some degree, by how we encounter representations of them....

This turn to what Jagoda calls a “network imaginary” is an incisive one that accounts for the impossibility of comprehending, mapping, or describing networks in any tangible way. This is not to say that networks in the 21st century lack materiality; their physical infrastructures are inextricable from the ephemeral connection they enable, and recent work from scholars like Tung-Hui Hu, Nicole Starosielski, and Allison Carruth (to name only a few) shows the wide-ranging ecological, political, and ethical stakes of that materiality. But the network, as a form, is too vast and in flux to be fathomed as whole. As Caroline Levine has noted, “At any given moment we know that we cannot grasp crucial pathways between nodes, and this points to our more generalized ignorance of networks. We cannot ever apprehend the totality of the networks that organize us.” It is no surprise that Levine resonates here, as the approach she suggests in her 2015 book Forms: Whole, Rhythm, Hierarchy, Network grants form the same primacy that Network Aesthetics develops. Its investment in aesthetics posits that our media shape the way we think about the networks we occupy. In turn, the way we imagine those networks informs how we exist within, move among, and relate to them....

There is certainly something attractive about the idea of network as totality: certainty itself. But continuing to treat our current form of networked life as a predetermined and immutable fact tricks us into also accepting as fact the uneven development of control and power that has underwritten the rise of contemporary structures of culture, economics, and politics. When we celebrate the connective power of Twitter, we miss the violent sexist and racist discourse that countless users face every day. When reserving an Uber is as simple as a few keystrokes on an iPhone, it is easy to miss not only the exploitation of labor and resources that made the iPhone possible, but also the exploitation of labor and resources that made the Uber possible. Conversely, when insisting that the success of YouTube celebrities, or the notoriously toxic comments left on their videos, is a sign that culture has reached a new low, we overlook the tight-knit and inclusive online communities that have built up around those content creators. None of these realities supersede each other, but rather exist simultaneously with each other, and a deeper understanding of our network imaginary can help us see this multiplicity of interconnectedness....

But what network aesthetics can do is teach us how, in this relatively young internet age, to slow down enough to make room, aesthetically, affectively, and otherwise, for the possibility of other kinds of connection
networks  visualization  network_imaginary  internet 
4 weeks ago
The Shrugged Atlas - Los Angeles Review of Books
This Atlas of Improbable Places joins a small but growing number of what we might call alternative or perhaps “indie” travel guides, maybe anti-travel guides, postmodern Baedekers for those wearied by (or too hip for) the conventional itineraries. The genre includes Unruly Spaces (2004) by Alastair Bonnett, subtitled “Lost Spaces, Secret Cities, and Other Inscrutable Geographies” and Tom Lutz’s And the Monkey Learned Nothing and Drinking Mare’s Milk on the Roof of the World, both published this year, subtitled “Dispatches from a Life in Transit” and “Wandering the Globe from Azerbaijan to Zanzibar.” (Full disclosure: Lutz, as you may well know, is the editor of LARB, and also, as you may well not know, the editor of this piece [I have no idea what he is getting at here, Ed.]). There’s also the recent Atlas Obscura (2016), “An Explorer’s Guide to the World’s Hidden Wonders” — yes, a lot of subtitling seems to be required in these matters.
All these books display a fascination with ambiguous or edgy or potentially dangerous places: ruins (industrial rather than classical), deranged architectural follies, environments created by outsider artists, underground or utopian or lost cities, abandoned prisons, bunkers, theme parks, relics of the Space Age and the Cold War; examples of all these appear in Elborough’s book.
mapping  cartography  ambiguity  utopia 
4 weeks ago
Time to Dump Time Zones - The New York Times
The time-zone map is a hodgepodge — a jigsaw puzzle by Dalí. Logically you might assume there are 24, one per hour. You would be wrong. There are 39, crossing and overlapping, defying the sun, some offset by 30 minutes or even 45, and fluctuating on the whims of local satraps.
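The hodgepodge is easy to see with Python's standard zoneinfo module; a quick sketch (the zone names are IANA identifiers, and the date is an arbitrary choice) rendering one UTC instant on several wall clocks, including the half- and quarter-hour offsets:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# One instant, many wall clocks. Note the half- and quarter-hour offsets.
noon_utc = datetime(2016, 11, 18, 12, 0, tzinfo=timezone.utc)
for zone in ["America/New_York", "Asia/Kolkata", "Asia/Kathmandu", "Australia/Sydney"]:
    local = noon_utc.astimezone(ZoneInfo(zone))
    print(f"{zone:20s} {local:%H:%M} (UTC{local.strftime('%z')})")
```

Kolkata sits 5:30 ahead of UTC and Kathmandu 5:45, which is why the zone count runs to 39 rather than 24.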

Let us all — wherever and whenever — live on what the world’s timekeepers call Coordinated Universal Time, or U.T.C. (though “earth time” might be less presumptuous). When it’s noon in Greenwich, Britain, let it be 12 everywhere. No more resetting the clocks. No more wondering what time it is in Peoria or Petropavlovsk. Our biological clocks can stay with the sun, as they have from the dawn of history. Only the numerals will change, and they have always been arbitrary.

Some mental adjustment will be necessary at first. Every place will learn a new relationship with the hours. New York (with its longitudinal companions) will be the place where people breakfast at noon, where the sun reaches its zenith around 4 p.m., and where people start dinner close to midnight. (“Midnight” will come to seem a quaint word for the zero hour, where the sun still shines.) In Sydney, the sun will set around 7 a.m., but the Australians can handle it; after all, their winter comes in June.

The human relationship with time changed substantially with the arrival of modernity — trains and telegraphs and wristwatches all around — and we can see it changing yet again in our globally networked era. We should synchronize our watches for real....

I’m not the first to propose this seemingly radical notion. Aviation already uses U.T.C. (called Zulu Time) — fewer collisions that way — and so do many computer folk. The visionary novelist Arthur C. Clarke suggested a single all-earth time zone when he was pondering the future of global communication as far back as 1976....

People forget how recent is the development of our whole ungainly apparatus. A century and a half ago, time zones didn’t exist. They were a consequence of the invention of railroads. At first they were neither popular nor easy to understand. When New York reset its clocks to railway time on Sunday, Nov. 18, 1883, this newspaper explained the messy affair as follows:

“When the reader of The Times consults his paper at 8 o’clock this morning at his breakfast table it will be 9 o’clock in St. John, New Brunswick, 7 o’clock in Chicago, or rather in St. Louis — for Chicago authorities have refused to adopt the standard time, perhaps because the Chicago meridian was not selected as the one on which all time must be based — 6 o’clock in Denver, Col., and 5 o’clock in San Francisco. That is the whole story in a nut-shell.”
time_zones  temporality 
4 weeks ago
How The Guggenheim And NYU Are Conserving Computer Based Art—Part 2
In what way do the computer science students participate in this collaboration?
Engel: They participate by analyzing and documenting source code for these works of art. The students typically begin by writing up notes, spreadsheets, and charts to track all of the software elements in a given work of art as well as the hardware components. In this way, students clearly document the native environment for each work of art that they study, such as the operating system, hardware details and other aspects of the computer and software needed to run the works. Many works of art use more than one programming language, for example, so the students seek to understand and clarify which role each language plays in a work of art. There are also works of art that either retrieve data from other sources (such as from the web) or track the viewers’ input and then store and manipulate the data in a database or in data files. The students also examine the works of art with respect to media: for example, if there are sound files used, which file type is used? Are the sounds created or composed as the program runs, or is there a “soundtrack” running separately from an audio file? Still images, too, are sometimes stored as GIF or other standard image file types; but other programs actually “draw” the images programmatically as directed by the software.
archives  digital_archives  preservation  software  media_archaeology 
4 weeks ago
In the next few minutes, you will be shown different visualizations of uncertainty. The data you will see is a series of points that could be generated from a GPS, for example, overlaid on top of a simple street map. You will be asked to identify the most or least uncertain of those points based on their visualization, which will indicate uncertainty in different ways.
mapping  uncertainty  ambiguity 
5 weeks ago
Gregory Zinman Digital Dialogue - Maryland Institute for Technology in the Humanities
This talk describes the discovery and significance of Etude (1967), a previously unknown work by media artist Nam June Paik identified by the author in the Smithsonian American Art Museum’s recently-acquired Paik archive. Composed at Bell Labs, in collaboration with engineers, and written in an early version of FORTRAN, Etude stands as one of the earliest works of digital art—although it is not entirely clear whether Etude was, in fact, the “computer opera” that Paik mentions elsewhere in his writings, or another artwork altogether. By exploring Etude’s uncertain status, as well as the piece’s more conceptual indeterminacies—between image and code, analog and digital, and film and music—this paper demonstrates how such indefinite artifacts allow for a rethinking of the nature of the archive, cinema’s digital past, and film’s place in computational media.
media_art  archives  digital_archives  nam_june_paik  preservation 
5 weeks ago
This Map of the World Just Won Japan’s Prestigious Design Award | Spoon & Tamago
Narukawa developed a map projection method called AuthaGraph (and founded a company of the same name in 2009) which aims to create maps that represent all land masses and seas as accurately as possible. Narukawa points out that in the past, his map probably wasn’t as relevant. Much of the 20th century was dominated by an emphasis on East and West relations. But with issues like climate change, melting glaciers in Greenland and territorial sea claims, it’s time we establish a new view of the world: one that equally perceives all interests of our planet.
maps  cartography  projection 
5 weeks ago
Control Earth
To do an experiment — any experiment — you need a control. A baseline state, a normal, an unmanipulated variable. A what-things-would-be-like-if-I-weren’t-changing-them kind of thing. In other words, a natural thing. Then you do your experiment and see what happens, ceteris paribus (i.e., all other things being equal). You change a condition or two, you manipulate, you alter, and you see how your treatment cases differ from that baseline state. You determine what the interventions do and how they matter. Whether they create something better, or worse, or just different from your control. Different from that natural thing.

We are changing Earth’s climate, in what the oceanographer Roger Revelle famously called humanity’s “great geophysical experiment.” But if it’s an experiment, what’s the control? Changing from what? How is the climate we’re creating different from what it would have been had we not dug up trillions of tons of ancient plants and animals, compressed over millions of years into soft black rock and energy-rich black goo, and sent them up in smoke, all within a couple of centuries? What would have happened instead?

To know these things, we need a control Earth. A ceteris paribus Earth. An Earth that would have existed had we not shown up with our hypertrophied brains, our energivorous technology, and our insatiable appetites for more and more and more. There are multiple ways to build such control Earths; the one that we’ll look at here is, naturally (so to speak), climate simulation....

Climate models are very similar to weather forecasting models, but their grid cells are bigger because a climate simulation must be run for at least several (simulated) decades. Some runs model time periods of 1,000 years or more. All this requires tremendous computer power.

Climate models can be operated like zoom lenses. For a spatial zoom, they can bring a particular region into sharper focus by embedding higher-resolution regional models into the main model. Or their results can be downscaled using statistical methods (though this technique remains problematic and controversial). For a temporal zoom, their parameters can be set to resemble those of some previous time period. You can recreate the Pangaea supercontinent of 200 million years ago, or conditions at the peak of the last ice age, or any time you like, including the future. That’s where those scary curves come from, some of them showing temperatures soaring by 5°C or 6°C by the end of this century. What happens if we keep on adding carbon? What happens if we stop?
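The control-versus-treatment logic can be illustrated with a toy zero-dimensional energy-balance model. This is nothing like the gridded simulations described above, just a sketch of the baseline-then-perturb workflow; the albedo perturbation is an arbitrary illustration:

```python
# Toy zero-dimensional energy-balance model (not a gridded GCM): the planet's
# equilibrium temperature is where absorbed sunlight balances emitted radiation.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # solar constant, W m^-2

def equilibrium_temp(albedo: float) -> float:
    """Blackbody equilibrium temperature in kelvin for a given planetary albedo."""
    return (S0 * (1 - albedo) / (4 * SIGMA)) ** 0.25

control = equilibrium_temp(albedo=0.30)    # the "control Earth" baseline
treatment = equilibrium_temp(albedo=0.28)  # e.g. less ice, a darker planet
print(f"control: {control:.1f} K, perturbed: {treatment:.1f} K, "
      f"delta: {treatment - control:+.2f} K")
```

The baseline comes out near 255 K, the familiar no-greenhouse textbook figure; real climate models apply the same compare-against-control logic across millions of grid cells and physical processes.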
mapping  climate_change  simulation  experimentation 
5 weeks ago
The ‘Internet of Things’ Faces Practical and Ethical Challenges - The Chronicle of Higher Education
Soon, university researchers hope to have a system in place for visually impaired people that uses Bluetooth Low Energy beacons, which can be used for indoor location-tracking. The system will help the visually impaired find the easiest way — with the fewest staircases and obstacles — to get from one building to another.

Along with enthusiasm, the concept of the Internet of Things has drawn criticism from cybersecurity experts and others for the privacy concerns it raises. "People have a justified worry about their every move being marked in some database," says Frank Pasquale, a professor at the University of Maryland’s Francis King Carey School of Law. Many big-data projects are deployed on a large scale before researchers thoroughly weigh the benefits and costs, says Mr. Pasquale, who studies the intersection of technology and the law.

The IoT can be fruitful territory for hackers. In a 2014 experiment, a group of researchers at the University of Michigan hacked into IoT infrastructure in a small town in the state and seized control of nearly 100 traffic lights.

Worried observers ask: How much information about me is being collected? Who has access to it? What is it being used for?...

A group of researchers is developing privacy tools. "In the Internet of Things, you’re just entering different spaces and interacting with different devices, and you may not be aware that sensors are present, let alone what their privacy policies might be," says Norman M. Sadeh, a professor of computer science at Carnegie Mellon and one of the privacy researchers.

He says his team has created registries that enable people’s smartphones to discover whether there are sensors in an area and to understand what data are being collected. In certain spaces, people can request that data about them not be collected. For instance, if the sensor uses facial-recognition technology, people can have their faces obscured from the sensor.
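A sketch of what such a registry lookup might look like; the data structure and function names here are hypothetical illustrations, not Sadeh's actual system:

```python
# Hypothetical sensor registry: a device queries it by location to learn
# what sensors are present, what they collect, and any opt-out on offer.
REGISTRY = {
    "library_lobby": [
        {"sensor": "camera", "collects": "video", "opt_out": "face obscuring"},
        {"sensor": "ble_beacon", "collects": "device presence", "opt_out": None},
    ],
}

def disclose(location: str) -> list[str]:
    """Summarize each sensor in a space and whether it offers an opt-out."""
    lines = []
    for s in REGISTRY.get(location, []):
        opt = s["opt_out"] or "no opt-out; avoid the space or power down devices"
        lines.append(f"{s['sensor']} collects {s['collects']} ({opt})")
    return lines

for line in disclose("library_lobby"):
    print(line)
```

The point of the design is advance notice: even when no opt-out exists, a disclosure like this lets people decide whether to enter at all.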

Sometimes you can’t opt out, Mr. Sadeh says, though he points out that the advance notice will allow people to choose not to enter a room, or to turn off any smart devices before entering. Still, for the most part, people have to take action to ensure their privacy.

The eventual goal is to minimize the number of physical sensors by teaching the IoT "how to reason about the real world" on its own, Mr. Dey says. Fewer sensors means fewer privacy concerns, he says.
security  internet_of_things  smart_cities  big_data 
5 weeks ago
CABINET // Bringing the Drugstore Home: An Interview with Deanna Day
the medicine cabinet is also one of the home’s most particularized containers—stocked with substances and technologies used in healthcare and grooming, it functions both as personal pharmacy and private salon. Indeed, the medicine cabinet emerged across the early part of the twentieth century not just in tandem with public health policy initiatives but also, importantly, with the developing consumer market for the goods and tools of personal care. Its signature aesthetic—mirror, glass, and gleaming metal—would seem to have as much in common with the presentational seductions of the department store display case as with the sanitary spaces of the physician’s examining room....

A well-stocked and carefully curated medicine cabinet conveyed care and successful home management, while an overstuffed or unconsidered one ran afoul of received ideals of motherhood....

I came across a few interesting sources on the development of the medicine cabinet, which was this truly iconic container designed to hold the types of quasi-medical objects that I was interested in: objects that were partly medical, but also that were partly cosmetic and not thought of as particularly “high tech.” For instance, people would be instructed to keep the thermometer in their medicine cabinet, but also things like razors and toothbrushes and other little technologies that were at least as much about grooming as about health. As I began to look into the history of the cabinet as a container, I discovered that it developed very much in parallel with these kinds of technologies. In fact, collecting these technologies together in the same cabinet helped to create the idea that these domestic technologies were something of a “kind,” emphasizing that both beauty work and medical care were crucial parts of literally embodying middle-class virtue in the home. And somewhat predictably, this bodily caretaking, along with the tools used for it, came to be coded as within the realm of women.

it’s the woman of the house who’s the person in charge of maintaining that space to certain standards of cleanliness, standards for what a healthy home was supposed to be like that were increasingly dictated by science. Because of that, women became the stewards of the medicine cabinet, in addition to being the stewards of the other spaces of the home that were about taking care of their family’s bodily needs.

Prior to the bathroom medicine cabinet, these kinds of medical objects and tools would most likely have been kept in the kitchen. There was a lot of overlap in terms of the tools used to prepare medicines and to prepare food, and many treatments were based on diet, on eating certain foods. As indoor home bathrooms became more common with the advent of indoor plumbing, the idea of the clean, hygienic home bathroom space emerged and quasi-medical objects began to migrate to that space. The clean indoor bathroom was a kind of larger container that helped give rise to the smaller container that held these sorts of objects.

Early catalogues that sold household items, like sinks and mirrors and fixtures, figured the space of the bathroom more as a space of luxury, with soft furnishings, lots of decoration, things like that. But as the germ theory of disease and the idea of scientific motherhood started to become more prominent, you start to see the aesthetic changes as well—shiny white tile, chrome fixtures, etc. You want to make it gleam so you can form an aesthetic of cleanliness as well as making sure that you have all the spaces germ-free.

it was a combination of government recommendation, healthcare-expert recommendation, and the “recommendation” of the companies making the medicine cabinets and, especially, the things that went in them. There were reports put out by government agencies recommending what kinds of home healthcare objects families should have on hand in cases of emergency; the idea was that you were supposed to have “a fully stocked medicine cabinet.” This idea also plugs into the idea of having a well-stocked home in general. The medicine cabinet also becomes, in a slightly different way, an aid to sales....

it’s in a room that guests are actually invited into. Then it becomes a private space that guests are actually given a private opportunity to explore, if they want to. It feels like a minor transgression to open the cabinet and see what kinds of things your hosts are using on their bodies, a relatively low-stakes form of gaining secret knowledge about them.
...purports to give you some kind of insight into the housekeeping skills of your host....

after looking at some plumbing history. I think the development of the medicine cabinet demonstrates the utility of storage in the bathroom, and other kinds of bathroom storage, such as the under-the-sink cabinet, come to follow its lead. In part this reflects a desire to have an even more private place to keep things in the bathroom than in the medicine cabinet.
medicine  domesticity  containers  sanitation  health  intellectual_furnishings  classification  secrecy 
5 weeks ago
How Getting Hopelessly Lost Inspired What Might be NYC’s Most Important Wayfinding App | Eye on Design
Based on the New Jersey Transit map, and buoyed by his exhaustive research and analysis, Schettino developed a series of floor plans in Illustrator of the station’s two levels and then used these diagrams to create three-dimensional perspective views in SketchUp. The 2D floor plans map out the space while the 3D views help travelers visualize the environment and imagine themselves within it. The dimensionality is especially helpful during peak travel times when physical obstructions and dense crowds make it difficult to see more than a few feet in front of you.

To distinguish the two interior levels, Schettino applied color-coding. The upper level, where Amtrak and New Jersey Transit are located, is identified by blue and the lower level, where Long Island Railroad is situated, is colored yellow.

The 382-page book also includes street-level maps that locate Penn Station within the neighborhood, enlarged views of floor plans that spotlight specific areas, and diagrams that identify typical pathways inside the station. On top of that, there’s a section on the theory and practice of wayfinding and lots of photographs that document the mayhem.
transportation  mapping  deep_maps 
5 weeks ago
Ten Theses In Support of Teaching and Against Learning Outcomes
Teaching does not instruct or transmit information; it embodies and exemplifies the commitment to thinking....

All successful teaching therefore results in students who love to think and never stop thinking for the rest of their lives. This result is very different from mastering a certain body of knowledge or learning to apply certain rules to well-defined situations. ...

The critically minded person is not an undisciplined skeptic, but one who can detect contradictions between principle and practice, and between principles and the values to which they purportedly lead as means. Critical thinking is not the ability to solve problems within the established parameters of social, economic, political, aesthetic, and intellectual-scientific life. Change is impossible if all that people can do is apply the given rules mindlessly. If the problem lies with the established rules (and fundamental problems in any field always concern the established rules), then confining critical thinking to “problem solving” always serves the status quo (i.e., repeats the cause of the problem as the solution)....

Learning outcomes are justified as proof of a new concern within the university with the quality of teaching and student learning. In reality, they are part of a conservative drift in higher education towards skill-programming and away from cultivation of cognitive freedom and love of thinking. Ironically, the passive, consumeristic attitude that learning outcomes encourage in students works against students becoming motivated to learn even the skills and the information that the learning outcomes prioritize.... They present the purpose of pursuing a course of study as the purchase of a defined set of skills and circumscribed body of information which can then be used as a marketing pitch to future employers. Learning outcomes submerge the love of thinking in bureaucratic objectification of the learner as a customer, a passive recipient of closed and pre-packaged material.
teaching  liberal_arts  learning_outcomes 
5 weeks ago
Ikea Strategy Ditches the Dream Home for the Daily Grind
The settings for the vignettes are workaday and unassuming: a living room where a man surrounded by file boxes and paperwork nods off on the sofa, a compact bedroom where two brothers’ very different personalities are on display, an apartment where a father tiptoes from a crib in the single bedroom to a loft bed in the living room.

For furniture ads, Ikea’s new offerings say remarkably little about furniture. Instead, they say much more about the way the American dream has evolved to fit a postrecession economic reality.

The company’s new “We Help You Make It” campaign eschews the aspirational gloss of master suites and two-story foyers in favor of embracing Americans — and their furniture needs — where they are today.....

“We started finding information that was suggesting to us that, postrecession, people’s opinions on what it means to make it had changed,” Ms. Whitehawk said, referring to a “new normal” in which possessions take a back seat to experiences.

The ads reflect this, said Kevin Lane Keller, a professor of marketing at the Dartmouth College Tuck School of Business.
intellectual_furnishings  furniture  advertising 
5 weeks ago
Data Storage on DNA Can Keep It Safe for Centuries - The New York Times
In two recent experiments, a team of computer scientists at the University of Washington and Microsoft, and a separate group at the University of Illinois, have shown that DNA molecules can be the basis for an archival storage system potentially capable of storing all of the world’s digital information in roughly nine liters of solution, about the amount of liquid in a case of wine.

The new research demonstrates that specific digital files can be retrieved from a potentially vast pool of data. The new storage technology would also be capable of keeping immense amounts of information safely for a millennium or longer, researchers said....

The raw storage capacity of DNA is staggering compared with even the most advanced electronic or magnetic storage systems. It is theoretically possible to store an exabyte of information, if it were coded into DNA, in the volume of a grain of sand. An exabyte is roughly equivalent to 200 million DVDs.

In nature, DNA molecules carry the genetic instructions that govern the development and function of living organisms. The cost of sequencing or “reading” the genetic code is falling faster than the cost of computer memory, and technologists are beginning to make progress in their ability to more rapidly synthesize strands composed of arbitrary sequences of the small organic molecules known as oligonucleotides, the basic DNA building blocks.

Computer scientists say they believe that as costs of sequencing and creating synthetic DNA continue to fall, it will soon be possible to create a new class of hybrid storage systems....

“In the last year, it suddenly hit us that this fusion of computer technology and biology will be where future advances come from,” said Douglas M. Carmean, a Microsoft researcher who had been a leading designer of microprocessor chips at Intel.

The evolution of the two fields dates back to the start of interactive computing. The first true personal computer, known as the LINC, was designed by Wesley A. Clark in 1961 for biomedical researchers.

“Information technology has helped biotech in the past,” said Luis Ceze, a University of Washington computer scientist and one of the designers of the new DNA storage system. “Now biotech has to pay back.”...

The two teams built on that work by storing information in DNA form and then retrieving a specific file from the data. The Illinois scientists were able to encode parts of the Wikipedia pages of six universities, and then select and edit parts of the text written in DNA corresponding to three of the colleges....

A digitized picture, for example, might be broken into thousands of pieces that are in turn mapped into thousands of individual strands of DNA. When they encode the information, the researchers add a unique identifier that makes it possible to later reassemble the complete picture or file, like putting together a jigsaw puzzle.

The scientists use the ability to amplify specific DNA strands rapidly and efficiently using a technique known as “polymerase chain reaction,” or P.C.R., to make it easier to find the information they wish to retrieve. Invented by the chemist Kary Mullis in 1983, P.C.R. makes it possible to amplify a single copy of a DNA molecule into millions of copies of a single sequence....

“DNA is a remarkable media for long-term storage,” said Karin Strauss, a Microsoft computer architect. “All you have to do is keep it cold and dry.”
preservation  archives  biology  biomedia  genetics 
5 weeks ago
​How Do You Back Up the Museum of Modern Art? | Motherboard
As visual art evolves on the internet, MoMA is having to adapt the way it conserves both its 630,000 square feet of gallery space and its archives. The museum is turning toward an old-school form of tape data storage in order to back up its collection of both digital-native works and images of physical objects. But as it turns out, storing the data is only the beginning. Museum staff also have to figure out how to manage the future of artworks when the digital platforms they exist on just might disappear within a decade.

MoMA’s digital collection is currently about 90 terabytes in size, but the museum expects that to grow to 1.2 petabytes (1.2 million gigabytes) by 2025. That archive will soon be stockpiled on Linear Tape-Open (LTO), a magnetic tape storage system developed in the 1990s.
One of the people responsible for the peculiar challenge of translating so much multimedia material from all eras of culture into digital bytes is Ben Fino-Radin, MoMA’s digital repository manager. …
Whether it’s film, painting, or a hacked Nintendo game (thanks Cory Arcangel), all the art has to get translated into the same medium for storage. Images are carefully scanned and analog videos ripped. MoMA also has to make sure artworks are stable—able to be recalled by the viewers of the future.
“Digitization alone is not preservation,” Fino-Radin said. “When we digitize things, it’s not like the way the Internet Archive digitizes books, ripping out the spines and scanning as quickly as possible. It’s much more delicate and considered.”…
After digitization, all those files have to be put somewhere. “The problem with digital preservation is that there’s no permanent form of storage, it just doesn’t exist,” Fino-Radin said. Formats change, companies fail, and data gets corrupted. The best current answer is the resolutely physical LTO Ultrium system, to which Fino-Radin is currently transitioning. The actual tape that holds the encoded data is stored in cases that look like zip disks, which are then stored in MoMA’s basement.
“It will be this giant hulking black box,” Fino-Radin said. “Plus a second system at the MoMA Queens art storage facility, and a third copy at the Celeste Bartos Film Preservation Center,” MoMA’s media vault in Hamlin, Pennsylvania.
The best thing about the LTO system is that as the technology improves, the storage capacity of the same amount of physical material increases exponentially in a perfect echo of Moore’s law. … As updates continue and the digital collection grows, it will still fit on the same tape.
So why not just use the cloud? MoMA’s collection just might fit on the largest Dropbox account ever. But there are other dangers to take into account when subscribing to a storage service instead of doing it yourself. “The most basic reason is cost,” Fino-Radin said. “When you store things in the cloud, you don’t purchase storage, you rent it. … it would cost upwards of $10 million more than storing the same data on our own infrastructure.”
Then there’s the issue of being locked into a particular company that’s vulnerable to cyber attacks, internal instability, bankruptcy, or anything else that could take its servers offline. “When you store things in the cloud, you are dependent on the companies you are storing them with to be around in 10, 15, 20 years. Amazon isn’t going away, but let’s say they did….
The museum’s backups are behind a firewall and only accessible via a private network. The files themselves are “write once, read many,” Fino-Radin said. “You can write data but can’t erase or change data; it’s incredibly permanent.” This guards against all-important artworks getting stolen, hacked, or altered, so they’ll stay the same for future generations—so many digital Mona Lisas….
MoMA plans a decade out with its digital storage needs, but it’s not so concerned with storage infrastructure as with what it will be allowed to do with the data it receives. “It really often is driven by rights agreements,” said Fino-Radin, who added that in the digital era, the museum has changed its acquisition contracts “to be more permissive with what we can do with the content.” That means ripping data, accessing source code, and porting pieces to platforms they might not have originally existed on.
“The really big challenge when writing an agreement is having language that says you can do all this stuff, but not having a date so specific it becomes obsolete,” Fino-Radin said. The institution can’t specify a particular platform like YouTube since “in 25 years it will probably be irrelevant.” The trick is thinking long-term. He offers some basic digital file guidelines that might go as well for a personal music stash as the world’s best collection of modern art. “Is it lossless, is it uncompressed, does it have good metadata embedded in it?”
archives  museum  memory  data  preservation  storage 
5 weeks ago
How Chemistry Is Rescuing Our Audio History from Melting - Facts So Romantic - Nautilus
Between the late 60s and the late 80s, much of our culture—from the Nixon trials on television to unreleased music from famous artists like the Beatles—was recorded on magnetic tape, and this tape is starting to disintegrate. Some of the audio and visual data has already been safely adapted to digital storage, but the majority hasn’t—and it’s a problem of massive proportions....

The Cultural Heritage Index estimates that there are 46 million magnetic tapes in museums and archives in the U.S. alone—and about 40 percent of them are of unknown quality. (The remaining 60 percent are known to be either already disintegrated or in good enough condition to be played.)

What’s more, in only about 20 years we won’t be able to digitize them, according to audio and video preservationist George Blood, in Philadelphia. This is partly because digitization machines that can handle the tapes have ceased production. On September 30, for example, Sony stopped taking orders for videotape machines, and in June 2015 the last audio reel-to-reel machine went out of production. The machines that already exist are wearing down, and parts to repair them are difficult to come by. To add to this, the tapes themselves are degrading: trying to run them through studio-grade machines clogs the player heads, wrecking the very machinery that can digitize the tapes even as stocks of those machines dwindle.

The cause of tape disintegration is something called sticky shed syndrome, a result of the hydrolysis of esters. When an ester, a compound that partly constitutes the polyurethane binder holding a tape’s magnetic particles, combines with water, it forms a carboxylic acid plus an alcohol. The acid and alcohol make the tapes sticky and unplayable...

Sticky shed tapes are not lost to the world forever, however. They can be baked in a low-temperature oven (about 100 degrees F) for eight hours or more. This often drains the water from the tape and can make it playable for a short while. However, baking tapes also makes them precariously brittle—so treating tapes of unknown quality isn’t a good idea....

Morgan and Breitung needed a way to figure out which tapes are degrading the fastest to prioritize the arduous digitization efforts.

Before them, other researchers had employed infrared spectroscopy, a non-invasive technique, to assess the damage. It works by identifying various light absorption peaks, corresponding to changes in ester, carboxylic acid, and alcohol content—each absorbs light differently. However, this approach wasn’t totally reliable: Not only were the peaks not very different between playable and nonplayable tapes, making the level of degradation difficult to determine, but the sound engineers also had difficulty working with the tool.

To overcome these hurdles, they combined a laptop-sized infrared spectrometer with an algorithm that uses multivariate statistics to pick up patterns of all the absorption peaks (this kind of analysis is called chemometrics). As the tapes go through the breakdown reaction, the chemical changes give off tiny signals in the form of compounds, which can be seen with infrared light—and when the patterns of reactions are analyzed with the model, it can predict which tapes are playable. The sound engineers could use this, says Breitung. “We couldn’t have them analyzing spectra—it would take too long and the types of changes were too subtle.” Taking spectra samples at 20 different places along the tape, the researchers get a pretty good sense of the tape’s condition....
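The multivariate approach described above can be suggested with a toy sketch. This nearest-centroid classifier over whole spectra is a simplified stand-in for the team's actual chemometric model; the spectral data, the distance-based rule, and the 80% "mostly playable" threshold are all illustrative assumptions.

```python
import numpy as np

# A minimal stand-in for a chemometric model: rather than reading individual
# absorption peaks, treat each infrared spectrum as a whole vector and label
# a sampled spot by its distance to the mean spectrum of known-playable vs.
# known-degraded reference tapes. Purely illustrative, not the published model.

def fit_centroids(playable, degraded):
    """Each input: (n_tapes, n_wavenumbers) matrix of reference spectra."""
    return np.asarray(playable).mean(axis=0), np.asarray(degraded).mean(axis=0)

def predict_playable(spectra, centroids):
    """spectra: (n_spots, n_wavenumbers), e.g. 20 spots along one tape."""
    good, bad = centroids
    d_good = np.linalg.norm(spectra - good, axis=1)
    d_bad = np.linalg.norm(spectra - bad, axis=1)
    return d_good < d_bad  # True where the spot looks playable

def tape_is_playable(spectra, centroids, threshold=0.8):
    # Call the whole tape playable only if most sampled spots look playable.
    return bool(predict_playable(spectra, centroids).mean() >= threshold)
```

Sampling many spots along the tape and voting, as the last function does, mirrors the researchers' practice of taking spectra at 20 places to judge a tape's overall condition.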

the quarter-inch audiotape—first used in 1972—was the type of media that the institutions were most worried about saving....

The problem with lacquer discs, which were used to record sound in the 1930s, first by movie studios and then by radio stations, is basically the same as with magnetic tape. Even “if they are properly stored,” says DeAnna, “they start to break down.”

One batch of lacquer discs that has records of radio dispatches from World War II has been particularly challenging to digitize. Many radio stations had switched to using discs manufactured with a glass base instead of the typical aluminum base, since aluminum was in demand for the war effort. Glass discs are even more fragile. But a physicist at Lawrence Berkeley National Laboratory, with the help of some students, developed a machine to record the dispatches without having to touch the discs. Called IRENE—an acronym for Image, Reconstruct, Erase Noise, Etc.—it takes a high-resolution digital image of the disc using a beam of light and translates it into a digital file.

DeAnna thinks this technology could transform the science of archives. Instead of making a copy of an original, and then a copy of that copy—and losing a bit of fidelity each time—IRENE offers the possibility of capturing the audio in its original state.

As opposed to making a copy and having the quality decrease, says DeAnna, “for the first time audio archivists can start thinking about preserving the object as an image of the grooves.”
archives  preservation  materiality  chemistry  tape  imaging 
5 weeks ago
MoMA | Collecting Alvin Lucier’s I Am Sitting in a Room
How does a museum acquire an experimental music performance? What does the museum actually receive? And if the museum doesn’t acquire a physical object, what is the value of adding this work to the collection? ...

In consultation with Lucier himself, the curatorial team decided that the acquisition would include both the ability for others to perform the work in the future and an archival recording of the composer performing the piece himself at MoMA. (Lucier is no stranger to MoMA; he performed the work Chambers as a part of The Machine and Nature, an audio-visual concert on January 15 and 16, 1969.) He will provide the Museum with a set of instructions guiding the re-performance of the work in the future, a practice other artists such as Kevin Beasley have used in other MoMA acquisitions....

This past December, a small group gathered at the Museum after hours on a Saturday evening to produce the recording. Having transitioned from an analog set-up to digital, Lucier worked with trusted audio engineer James Fei, substituting a laptop for two tape recorders. We sat in near silence, and after a number of sound checks, the performance began....

This recording, forever linking I Am Sitting in a Room to the walls of the Museum, is hardly definitive. There are a number of recordings of the piece, including the original and a version from 1980 released on Lovely Music, Ltd. Each recording has its own temporality—its own relationship to a time and place. Each one is an artifact, an articulation of Lucier’s presence at a given point in our continuum. But the presence of the work in MoMA’s collection has an entirely different temporality. As Stuart Comer, Chief Curator in the Department of Media and Performance Art, phrased it, the ability to perform the piece “allows it to exist in a constant state of imminence.” A collection of sound art is the endless possibility of bringing the work into fruition, a commitment to the work’s future status as having not just a past but also a present.
sound_space  sound_art  acoustics  lucier  preservation  archives  museums 
5 weeks ago
Educause 2016: Libraries and future of higher education | Feral Librarian
As fewer people “go to the library,” there has been a growing genre of literature I’ll call “how to save libraries.”

Trends like declining circulation of print books and, in some cases, declining foot traffic in physical library buildings have led to all kinds of strategies for “saving libraries”.

For academic libraries, that has usually been about turning libraries into information commons (always with coffee shops inside) and/or pumping up the role of librarians in teaching study skills and info-seeking skills, and otherwise tying the work of the library folks into student success.

These are all good things, and make for good talks and articles, but my talk today will not be part of that genre. This will not be a “save the libraries” talk.

(this talk by David Lankes, where he references a great talk by Char Booth, is a much more nuanced take on this than my soundbite intro here)

Let me go ahead and give away the punch line now: I don’t think we need to save libraries, but I do think we might need libraries to save us....

This is where libraries come in.

Libraries and librarians can and do play a crucial role in creating a more open, connected, and equitable future for higher education (and for our communities) through our support and facilitation of open access to scholarship and through our role in providing inclusive spaces that facilitate community building and formal and informal learning.

Let me talk first about openness....

This is one of the key themes in the preliminary report on the future of libraries just released by MIT on Monday:

For the MIT Libraries, the better world we seek is one in which there is abundant, equitable, meaningful access to knowledge and to the products of the full life cycle of research.

And lo and behold, it is libraries and librarians who are implementing open access policies in our research organizations and who are doing the heavy lifting to make journal articles (and some other forms of scholarship, like data and in some cases books and textbooks) openly available in meaningful, organized ways through institutional repositories and through educating authors on their rights and options.

Right now we are doing that in a hybrid environment, where much of the literature libraries provide to our communities is still not openly available; we provide it to “authorized users” only based on the contracts we sign with publishers – many of whom are for-profit entities who dabble in open access publishing, but who at the end of the day are still driven by a profit motive, not an educational or social-good motive.

Having research locked away behind corporate paywalls and/or behind our institutional authentication systems means that access to information is not only not free, but fragmented and cumbersome.

The current landscape of scholarly literature consists of multiple silos of information, accessed through library websites, journal sites, aggregator sites, Google and Google Scholar, social media sites, you name it....

Libraries are special places on campus and the Libraries and their staff occupy an essential role in the intellectual and social life of our college and university communities, perhaps especially for students.

The Libraries are a place of research and learning, and library staff are subject-matter and methodological experts who are committed to supporting student success.

One important characteristic of library staff that distinguishes them from faculty is the lack of any authoritative or evaluative role over students. This makes the Libraries places where students might be especially free and comfortable asking questions, seeking help, experimenting with nascent ideas and thoughts, and making mistakes.

Combine that with the fact that Libraries are places where intellectual freedom and privacy are deeply valued and fiercely protected, and it is quite possible that libraries will be the places our students and other community members might feel the most comfortable talking about difficult topics. Perhaps we could start to bridge some of the racial and other divides on our own campuses in and through the libraries; through formal and informal learning and dialogue in our spaces and through exposing students to an inclusive range of credible sources of information and knowledge and research.
libraries  social_justice  open_access  discourse  public_sphere 
5 weeks ago
PASIG 2016 talk: “The Voice of One Crying Out in the Wilderness: Preservation in the Anthropocene” « Eira Tansey
America’s founding mythologies revolve around making the land bend to the will of the powerful. This process was sustained by the creation of records and archives that asserted that the land was a wild place: whether wild with humans to be killed or removed through treaties that would inevitably be broken, or wild with trees and rivers to be surveyed and divided up through land claims.[5]

Indeed, archives are so strongly identified with the land in which records are created and maintained that the writers of the Declaration of Independence, among the “facts submitted to the world,” charged that King George had:

“[…]called together legislative bodies at places unusual, uncomfortable, and distant from the depository of their public Records, for the sole purpose of fatiguing them into compliance with his measures.”[6]

The implication is clear: records not only accorded legal rights, but those records had a particular spatial significance as well. To be alienated from access to one’s records is not just to have one’s rights in question, but to be divorced from the land of legal rights. However, time and again throughout history, those who create and control records effectively control the land, and records articulate who is allowed to have a legal relationship with the land — and who can exploit the land for economic gain. I can think of no better current example than what is happening with the Dakota Access Pipeline.
archives  records  colonialism 
5 weeks ago
It's Nice That | The Wellcome Collection publishes book of early infographics, charts and diagrams for organising nature
Animal. Vegetable. Mineral. is a publication from the Wellcome Collection celebrating how humans visually classified and organised the sprawling natural world in an important era of scientific research. Made to accompany the upcoming exhibition Making Nature, it features archival charts, diagrams, maps, lists and illustrations of variants of species and organism types, delightful in their intricate and informative detail.

Most were ordering systems and tools devised by pioneering European nature researchers, artists, scientists and explorers. The book features visuals from key 18th and 19th Century figures that “shaped the course of natural history” such as Charles Darwin, Carl Linnaeus, Alexander Von Humboldt, Anna Atkins and Ernst Haeckel.

It was “the original big data challenge” says Wellcome during a time when our understanding of nature was rapidly evolving and expanding, for these pioneers to document and present their findings. Many of these images, from taxonomy charts and animal distribution maps to colourful picture dictionaries, show the creativity at hand during this process.
classification  illustration  botany  nature  zoology  geology  scientific_illustration 
5 weeks ago
Liberatory Archives: Towards Belonging and Believing (Part 2) – On Archivy – Medium
If you think of an idea that you think is ahead of the curve or new in any way, be assured that a woman — often times a black woman, but not always — probably thought of the idea first. So do the research. Do the reading. Cite her work. And don’t be an oppressive, patriarchal jackass who erases and undermines the work of women and folks who don’t subscribe to the gender binary. Fellas if you aren’t finding the sources that speak to whatever idea it is you’re interested in exploring, that isn’t because those sources don’t exist or haven’t been written. It’s likely because they haven’t been cited, and they likely haven’t been cited because she’s a woman. Just my thoughts.

Okay; back to the definition! Michelle Caswell, in her book chapter “Inventing New Archival Imaginaries,” really sets a fiery foundation on which to engage this concept of a liberatory archive. Again, please read this work in full if you haven’t. If you need access to a copy of it, holla at ya boy or contact Michelle directly. Once you read the chapter in its entirety, I’m sure you’ll be struck by this line that reads:
“…through the lens of liberatory archival imaginaries, our work as community-based archivists does not end with the limits of our collection policies, but rather, it is an ongoing process of conceptualizing what we want the future to look like.”
So you see in her definition that liberatory archives are not things so much as they are processes. Understanding them, then, is not a ‘what’ question as much as a ‘how’ question. Let me now expand on the ‘how’ question of liberatory archives and focus on two processes and actions for us to consider explicitly integrating into the work of community archives....

A project that embodies believing in the context of liberatory archives is Community Futurisms: Time & Memory in North Philly. Led by two black women artists who form the collective Black Quantum Futurism, Community Futurisms is:
“a collaborative art and ethnographic research project exploring the impact of redevelopment, gentrification, and displacement within the North Philadelphia neighborhood known as Sharswood/Blumberg through the themes of oral histories, memories, alternative temporalities, and futures… BQF Collective will operate Community Futures Lab, a gallery, resource and zine library, workshop space, recording booth, and time capsule, recording oral histories/futures in North Philly.”
Did you catch that at the end of their project description? They will be recording oral histories and futures. This isn’t an archival project that exists solely to recast the past. Rather, their efforts are about impacting the future, which can only happen if one 1) believes there is such a thing as a future and 2) believes that one’s fate in the future is not fixed.
archives  community_archives  citation 
6 weeks ago
20,000 Hard Drives on a Mission | Internet Archive Blogs
Once a new item is created, automated systems quickly replicate that item across two distinct disk drives in separate servers that are (usually) in separate physical data centers. This “mirroring” of content is done both to minimize the likelihood of data loss or data corruption (due to unexpected harddrive or system failures) and to increase the efficiency of access to the content. Both of these storage locations (called “primary” and “secondary”) are immediately available to serve their copy of the content to patrons… and if one storage location becomes unavailable, the content remains available from the alternate storage location.

We refer to this overall scheme as “paired storage.” Because of the dual-storage arrangement, when we talk about “how much” data we store, we usually refer to what really matters to the patrons — the amount of unique compressed content in storage — that is, the amount prior to replication into paired-storage. So for numbers below, the amount of physical disk space (“raw” storage) is typically twice the amount stated.

As we have pursued our mission, the need for storing data has grown. In October of 2012, we held just over 10 petabytes of unique content. Today, we have archived a little over 30 petabytes, and we add between 13 and 15 terabytes of content per day (web and television are the most voluminous).

Currently, Internet Archive hosts about 20,000 individual disk drives. Each of these is housed in specialized computers (we call them “datanodes”) that have 36 data drives (plus two operating system drives) per machine. Datanodes are organized into racks of 10 machines (360 data drives), and interconnected via high-speed ethernet to form our storage cluster. Even though our content storage has tripled over the past four years, our count of disk drives has stayed about the same. This is because of disk drive technology improvements. Datanodes that were once populated with 36 individual 2-terabyte (2T) drives are today filled with 8-terabyte (8T) drives, moving single node capacity from 72 terabytes (64.8T formatted) to 288 terabytes (259.2T formatted) in the same physical space! This evolution of disk density did not happen in a single step, so we have populations of 2T, 3T, 4T, and 8T drives in our storage clusters.
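The capacity arithmetic in this passage is easy to make explicit. The 0.9 raw-to-formatted factor below is inferred from the 72 TB → 64.8 TB figures in the post, and the 2× factor reflects the paired-storage mirroring scheme described earlier; both are readings of the text, not official Internet Archive constants.

```python
# Capacity arithmetic for the storage cluster described above. The 0.9
# raw-to-formatted factor is inferred from the post's 72 TB -> 64.8 TB
# figures; the 2x factor reflects the "paired storage" mirroring scheme.

DRIVES_PER_NODE = 36   # data drives per datanode (plus two OS drives)
NODES_PER_RACK = 10

def node_capacity_tb(drive_tb, formatted=False):
    """Total data capacity of one datanode, in terabytes."""
    raw = DRIVES_PER_NODE * drive_tb
    return raw * 0.9 if formatted else raw

def drives_per_rack():
    return NODES_PER_RACK * DRIVES_PER_NODE

def raw_pb_for_unique(unique_pb):
    """Physical disk consumed: every item lives on a primary and a secondary."""
    return unique_pb * 2
```

With 2 TB drives a node holds 72 TB raw; swapping in 8 TB drives yields 288 TB raw (259.2 TB formatted), and the quoted 30 PB of unique content implies roughly 60 PB of physical disk, matching the figures in the post.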
storage  archives  data_centers  repair  maintenance 
6 weeks ago
Should I Pursue My Passion or Business? – Medium
For the next six months, I am joining YCombinator Research’s New Cities project as an Explorer. My goal? Create an open, repeatable system for rapid cityforming that maximizes human potential. It is a vast and complex challenge — and one that makes me so happy that I want to tap dance to work. Like any other epic journey, we’ll start small and learn fast: everything we learn, we will be publishing online.
I am not giving up entrepreneurship. This is just another form. I am trusting that amazing experiences will teach me to be a better entrepreneur.
I can’t do this alone. YC can’t do this alone. This is our problem to solve together. To be successful, we’ll need investors, industries, governments, charities, citizens, and critics. I know many of you have been waiting for a project like this. (If you have lots of land for a new city, let us know.)
Why now?
I’m done complaining about cities. I want to be a part of a solution. I want cities for the poor and the rich, the locals and the transplants, the freaks and the geeks, and the young and old....

Affordable, dynamic cities are a sustainable solution to a world thirsting for innovation....

And cities are resilient. Rome. Tokyo. Istanbul. Lagos. Cities often outlast kings and empires. City-states were the original superpowers. Yet, mass migration to mega-cities has only occurred in the last 50 years. Cities are young trees of life that have just started to bear fruit....

Every great city benefited from historically advantageous starting conditions that cannot be recreated. But I believe technology can seed fertile starting conditions across nations and geographies.
urban_planning  solutionism 
6 weeks ago
MIT task force releases preliminary “Future of Libraries” report | MIT News
The MIT task force arranged ideas about the MIT Libraries into four “pillars,” which structure the preliminary report. They are “Community and Relationships,” involving the library’s interactions with local and global users; “Discovery and Use,” regarding the provision of information; “Stewardship and Sustainability,” involving the management and protection of MIT’s scholarly resources; and “Research and Development,” addressing the analysis of library practices and needs. The preliminary report contains 10 general recommendations in these areas.
For the “Community and Relationships” pillar, the report notes that MIT library users may have varying relationships to the system in the future, and suggests a flexible approach simultaneously serving students, faculty, staff, alumni, cooperating scholars, participants in MITx classes, the local Cambridge and Boston community, and the global scholarly community.
The task force also recommends further study of changes to on-campus library spaces, allowing for quiet study as well as new modes of instruction and collaboration. It suggests that in an evolving information landscape, libraries must teach students how to not only access and evaluate information, but also responsibly generate new knowledge and create systems and tools that others will use to discover, share, and analyze information.
In the area of “Discovery and Use,” the report suggests that the library system enhance its ability to disseminate MIT research to the world; provide “comprehensive digital access to content in our collections”; form partnerships to “generate open, interoperable content platforms” for sharing and preserving knowledge; and review the Institute’s Faculty Open Access Policy.  
Regarding “Stewardship and Sustainability,” the task force envisions the MIT Libraries as the leading repository of the Institute’s history and as a leader in the effort to find solutions for the “preservation of digital research,” which the report describes as a “major unsolved problem.”
Finally, in the area of “Research and Development,” the report proposes the establishment of an initiative for research in information science and scholarly communication, to support both research and development on the grand challenges in the field.
6 weeks ago
Turning the inside out – discontents
Kate and other historians of Chinese Australia have noted that the administration of the White Australia Policy was not uniform or consistent. Similar cases could result in quite different outcomes depending on the location and those involved. Understanding this is important, not only for documenting the workings of the system, but for recovering the agency of those subjected to it. Non-white residents were not mere victims; they found ways of negotiating, and even manipulating, the state’s racist bureaucracy. In her work on colonial archives, Ann Laura Stoler identifies this ‘disjuncture between prescription and practice, between state mandates and the manoeuvres people made in response to them’ as part of the ‘ethnographic space’ of the archive.4

How do we explore this space? One of the things I’ve found interesting in working with the closed files is the way we can use available metadata to show us what we can’t see. It’s like creating a negative image of access. Kate and I have been thinking for a number of years now about how we might use digital tools to mine the White Australia records for traces, gaps, and shadows that together build a picture of the policy in action. Who knew who? Who was where and when? What records remain and why?...

Just like systems of racial classification, intelligence services exist within a circle of self-justification. The fact they exist proves they need to exist. We are denied information that might enable us to imagine alternatives. And yet as limited as the provisions under the Archives Act are, we do have access.

How can we use this narrow, shuttered window to reverse the gaze of state surveillance and rebuild a context that has been deliberately erased? Just as with Closed Access and the White Australia records, can we give meaning to the gaps and the absences? Can we see what’s not there?

This is one of the questions being explored by Columbia University’s History Lab. They’ve created the Declassification Engine – a huge database of previously classified government documents that they’re using to analyse the nature of official secrecy. By identifying non-redacted copies of previously redacted documents, they’ve also been able to track the words, concepts and events most likely to be censored.
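The comparison step described here can be sketched in a few lines: align a redacted release against its later-declassified copy and count what was blacked out. This is a toy reconstruction of the idea, not History Lab code; the `[REDACTED]` marker convention and the `censored_terms` name are assumptions.

```python
import re
from collections import Counter

MARKER = "[REDACTED]"

def censored_terms(redacted, unredacted):
    """Count words blacked out of `redacted`, by aligning it against
    the later-declassified `unredacted` copy of the same document."""
    r = re.findall(r"\[REDACTED\]|\w+", redacted)
    u = re.findall(r"\w+", unredacted)
    counts, ui = Counter(), 0
    for i, tok in enumerate(r):
        if tok == MARKER:
            # The word after the marker anchors realignment; everything
            # before it in the unredacted stream was behind the redaction.
            anchor = r[i + 1] if i + 1 < len(r) else None
            while ui < len(u) and u[ui] != anchor:
                counts[u[ui].lower()] += 1
                ui += 1
        elif ui < len(u) and u[ui] == tok:
            ui += 1
    return counts
```

Aggregating these counts across thousands of document pairs is what lets the words and topics most prone to censorship surface statistically.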

The History Lab’s collection of documents on foreign policy and world events is rather different to ASIO’s archive of the lives, habits and beliefs of ordinary Australians. But I’m hoping that they too can tell us something about the culture that created them....

Through trial and error I developed a computer vision script that did a pretty good job of finding redactions – despite many variations in redaction style, paper colour, and print quality. It took a couple of days to work through the 300,000 page images, but in the end I had a collection of about 300,000 redactions. Unfortunately about 20 percent of these were false positives, so I spent a number of nights manually sorting the results.
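The script itself isn't shared, but the core idea of a redaction finder can be sketched: threshold the page image and look for long horizontal runs of dark pixels, the signature of a black redaction bar. The thresholds and function name below are illustrative; the real script evidently handled far more variation in redaction style, paper colour, and print quality.

```python
import numpy as np

def redaction_rows(page, dark_thresh=50, min_run=40):
    """Return row indices of a grayscale page image (0=black, 255=white)
    that contain a long horizontal run of dark pixels."""
    rows = []
    for y, row in enumerate(np.asarray(page) < dark_thresh):
        run = best = 0
        for dark in row:
            run = run + 1 if dark else 0
            best = max(best, run)
        if best >= min_run:
            rows.append(y)
    return rows
```

Grouping flagged rows into boxes and filtering on size and aspect ratio would be the obvious next step; the 20 percent false-positive rate mentioned above suggests why manual sorting was still needed afterwards.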
archives  classification  secrecy  redaction  machine_vision 
6 weeks ago
RHUNHATTAN: A TALE OF TWO ISLANDS | NYU Center for the Humanities
I strive to uncover invisible, suppressed stories that lie in the geopolitical shadows of colonialism and migration. As the 2016-17 Artist-in-Residence at the Asian/Pacific/American Institute at NYU, I will research the social history of plants via spice routes and botanical expeditions to create a multiplatform project, Rhunhattan, that will include psychogeographic and immersive tech experiences, as well as object and olfactory work to bring forth the historical and contemporary relationship between the islands of Rhun (located in present-day Banda Island Archipelago of Indonesia) and Manaháhtaan (original Lenape name of Manhattan).

During the 17th-century Spice Wars, Dutch Nieuw Amsterdam was captured by the British and renamed “New York.” By 1667, the Dutch relinquished their claim to the colony in exchange for Rhun, the sole British colony in the Banda Islands of present-day Indonesia, thereby gaining monopoly of the lucrative nutmeg and mace trade. This pivotal moment came at a bloody cost for Indigenous peoples: both for the Bandanese and the Lenape people of Manaháhtaan. Over the centuries, as the spice trade faded, Rhun also settled into the background while Manaháhtaan rose to unprecedented financial success. The remaining colonial landmarks that continue to link these islands are the present-day National Museum of the American Indian at Bowling Green, which occupies the original site of Fort Amsterdam, and Fort Nassau of the Banda Islands; both forts share the same diamond-shaped architectural structure. In the visual narrative that I will be developing, I see the identical forts acting as portals between the two contested sites, collapsing the time and distance of these two islands.

To tell this story of two islands with intertwined fates of land dispossession and erasure during the birthing of imperial globalization propelled forward by countless caravans and ships transporting spice, sugar, and silk, I am reeducating myself about the broken human relationship with land and waters. We are living in debt to our future generations and must learn how the Lenape sustainably managed the island for the sake of futurity over millennia. In a time when massive glaciers the size of lower Manhattan crashing into the ocean doesn’t make a media splash, we have a great responsibility to fight apathy. We are living in urgent times and there is a need to revitalize indigenous cultures and knowledge for environmental stewardship. We need a paradigm shift from falsely believing that human beings are landlords of Earth to seeing humans as being part of the ecosystem.
smell  taste  colonialism  trade  globalization  botany 
6 weeks ago
Amazon as an ISP Isn’t Bonkers—It Makes Perfect Sense | WIRED
AMAZON THE ISP. It sounds strange when you first hear it. Amazon, you think, is an online store. It lets you buy stuff over the Internet. Comcast and Verizon and Orange and Vodafone are the ISPs. They provide the Internet service to the world’s homes and phones.

But if you step back, just a little, you realize that Amazon is a natural ISP. One day, it could compete with the Comcasts and the AT&Ts—or at least try to. You can see this in the way Amazon has already built its business. And you can see it in the ambitions of other Internet giants like Google and Facebook....

The online news site says Amazon may sell Internet service directly to consumers alongside its streaming media offering, Prime, which delivers movies and TV shows via the Internet....

As it stands, Amazon is beholden to Comcast. But if it ran an ISP, it wouldn’t be. “One of the big challenges for companies providing over-the-top video services like Netflix and Amazon Prime Video is that they are still reliant on broadband providers, many of whom are also TV providers and so have an inherent conflict of interest in helping them reach customers with high-quality video services,” says Jackdaw Research analyst Jan Dawson, who has studied telecoms regulation and carrier strategy....

We’ve already seen much the same moves from Google and Facebook. Most notably, in 2011, Google started building an ultra-high-speed ISP, Google Fiber, in select American cities. In the beginning, it characterized this as an experiment meant to push other ISPs toward similar high-speed services. But as Google moves more and more into video and other digital media, Google Fiber has morphed into a full-fledged business. In fact, it’s now its own company, one of the business units spun out of Google under the new umbrella operation called Alphabet.

At the same time, Google is offering its own wireless Internet service, Project Fi. This service is built on existing networks from entrenched mobile ISPs like Sprint and T-Mobile. But it’s a way of working around the limitations of even bigger mobile ISPs like Verizon and AT&T....

Facebook has taken a slightly different route. Through its project, it has partnered with ISPs in the developing world to offer free Internet service on mobile phones. This is a way of expanding the Internet into new areas, and it includes access to Facebook....

The world’s largest online retailer now controls so much of its own supply chain, from the massive fulfillment centers it operates across the globe to the brick-and-mortar stores that are popping up in places like New York and Seattle. This is just what Amazon does. It builds and operates its own infrastructure.
amazon  isp  internet  connectivity  infrastructure 
6 weeks ago
This New Code Ensures Buildings Designs are Internet Optimized | ArchDaily
When looking at a building, the quality of its internet is probably not one’s first thought. But for the tenants and companies inside, it’s a key building service that they rely on daily.

As Arie Barendrecht explains, “it’s vital to tenants of buildings and critical to attracting and retaining new tenants – it’s a non-negotiable design component.”

Barendrecht is the co-founder and CEO of WiredScore, a company that ranks commercial buildings on their connectivity. Beginning in New York, the company has provided wired certification to over 300 buildings in the city, with further operations across several other US cities as well as London and Manchester in the UK. The company’s work is instrumental in showing architects how their designs need to prepare for the 21st century and acknowledging those that already do....

Space allocation, in particular, is a critical factor. It’s not unusual for tenants wanting to upgrade their connectivity to discover they can’t, simply because there is no room for it. A common example of this seen by WiredScore is not having the floor space for wireless equipment like DAS or small cells. The space for wireless is simply not included in a lot of current building designs, but increasingly needed by tenants given the rise of the mobile workforce.

It’s also important for spaces to be flexible, not just for the potential to free up more floor area, but also to support the installation of new technologies regardless of what sort of wired or wireless infrastructure is required. This is especially relevant for new buildings where technological requirements can easily change between the time of planning and its completion....

Nowadays, many companies depend on having connectivity 100% of the time, making a single point of failure especially risky. Diverse conduit pathways provide a backup if one path suffers fire, flood, or other physical damage. This involves having at least two different internet providers running their cables vertically through, and horizontally out of, different sides of the building.

Resiliency focuses on the protection of the equipment itself, such as placement above grade – a lesson many New Yorkers learnt following the flooding caused by Hurricane Sandy. It also covers placing telecom equipment in a way that prevents day-to-day damage; the best-designed buildings for connectivity separate equipment from areas of the building where users could accidentally damage it...

Materials also come into play, especially their effect on wireless coverage. Energy-efficient glass, in particular, blocks external cellular signals from entering buildings. So for developers aiming for LEED certification, Arie suggests having wireless strategies in place to compensate for the typically worse cellular coverage caused by low-e glass.
media_architecture  internet  infrastructure  wires  connectivity 
6 weeks ago
Field notes for 'What We Left Unfinished' | Ibraaz
It is not simple to work with an archive in a country like Afghanistan, where books, films and monuments are all subject to burning; stupas are looted and statues shattered; and sites sacred for one reason or another are eroded by both natural and human disasters. Understandably, Afghans are wary of anyone who proposes to 'mine' any cultural resource they still possess.
If you want to work with an Afghan archive, therefore, you cannot address your desires to it directly. You must sidle up to it sideways, as if approaching a horse with an uncertain temper. ...

This indexing of the archive is critical, because your first approach to any archive is always through its metadata: not the content, but its descriptors. When you are engaged in a slantwise, shuffling sort of appeal to the archive – three steps forward, two steps back – your approach may be even more removed. First you must address the people who possess or are creating the descriptions, and then you must sort through their often conflicting and overlapping accounts. In short, you must perform some of the functions of the archive, or archivist, yourself. This performance, you hope, can be your contribution to the archive: a history of sorts, which you write as you find it and leave behind when you go....

Some performances alter the archive irrevocably, slashing and burning as they go, like the literal burning of film prints in the Afghan Films courtyard in 1996. Others are delicate insinuations, or daily rituals, whose effects are not visible until viewed from a distance – leaving out certain details while labelling a film canister, for example, because everyone knows those details, until no one is left who knows and the data transmutes from omitted to lost. Or the use of a cheap brand of tape to splice film in lean years, which decades later means that each time those reels run through a projector or telecine apparatus, they may break at the splice point. Or a particular method of cleaning prints with rags, which over the years accumulates as a fretwork of scratches on celluloid....

Some parts of the archive are always more visible than others. The archive has two faces: its public narrative and its private holdings. The public narrative, which is designed to be visible, is usually constructed from only a small portion of the private holdings, which remain largely invisible. The public narrative can be adapted by the archive's performers to meet the moment, by sampling from different parts of the private holdings to construct the order most likely to match present interests. In every archive, there exists a literal or metaphorical dusty drawer where past archivists have filed the private holdings deemed least likely to ever be of interest to anyone anywhere: unfinished projects, failed experiments, institutional embarrassments. If you are an artist, you probably want to find that drawer and rifle through it...

The Afghan Films archive is, however, a special case, where the entire negative archive and large portions of the print archive were hidden from 1996-2002, with the door to the negative archive completely bricked up and disguised behind a poster of Mullah Omar. In some ways, the whole archive was temporarily filed in the invisible dusty drawer, and only very gradually did it emerge from this position of retreat over the subsequent decade (2002-12)....

many of the prints are literally covered in dust, and need cleaning and checking to see whether they are still viable. The negatives, in their closed chamber, remained more pristine, but are plagued by the aforementioned splices, which need to be marked before any kind of large-scale telecine project can be undertaken. Multiple handwritten catalogues of prints and negatives exist but they are often contradictory or overlapping, and the handwritten labels on the film canisters also sometimes contradict the catalogues, or are inaccurate. This surplus of unreliable indices has produced some uncertainty about which films now (post-bonfire) may exist only as negatives, which may exist only as prints, and which may exist as both negatives and prints. Re-cataloguing will resolve this uncertainty. It also serves to discover which prints may still be useful for soundtrack digitization or circulation of films on film....

In the research project What we left unfinished, I will be looking for some of these unfinished films and the people who made them, trying to decipher, from the gaps between what was finished and unfinished, some clue to the gaps between how the Afghan Left imagined its re-invention of the state and how that project went so terribly wrong – the gaps between revolution, reconciliation and dissolution....

Archives often presume or present themselves to be keepers of facts, and, moreover, keepers of facts that serve as anchor points for the larger historical record. Artists prospecting in archives are sometimes suspected by archivists of taking these facts only to weave fictions around them. While this suspicion is not entirely unjustified, it also overlooks the multiple layers of constructed narrative that already surround most archival records – from provenance records, to finding aids, to placement and classification within the archive, to metadata tags, descriptions and annotations. Each of these layers has an individual author and thus allows subjective interpretations, human errors, fictions and inventions to accumulate around, and influence perceptions of, the original records....

Is it possible, however, to imagine some kind of ethics of archival research?

The media archive collective suggested in Ten theses on the archive that we approach the archive with intellectual propriety, rather than rigid notions of intellectual property.[2] I interpret this to mean that, as researchers, we should be sensitive to the origins and contexts of archival material, especially when considering how to deploy it within a new artwork. 'Fair use' is a legal doctrine but also an apt phrase: is your use of an existing work fair to the original creator?...

Intellectual propriety, however, would require that the original creators be sought out and consulted about their original intentions for the films, not only as a matter of intellectual curiosity, but also as an ethical prerequisite for taking their unfinished work and re-contextualizing it within a new artwork – especially when the work being appropriated was never made public in its original form. Ultimately, intellectual propriety might even require that the new artwork become a work of facilitation rather than a work of appropriation – that is, after consulting the original creators, it may appear more appropriate or desirable to create a system whereby they can finish their own unfinished work (with the interest coming from the gap between the moment of making and the moment of finishing), rather than subsuming their unfinished work into a new artwork....

In a country like Afghanistan, where iconoclasm is a very real and seemingly perpetual threat, preservation of cultural resources like the Afghan Films archive may best be achieved not by panicked moves to protect assets, but rather by a move to project those assets. That is, locking the films away for another decade in another dusty drawer would be less effective than digitizing the archive as quickly as possible and disseminating films as widely as possible, including placing copies of master files on servers both inside and outside the country. Broad dissemination would also allow a critical discourse to grow around the films, ultimately making an even stronger argument for their preservation....

When a collection becomes an archive, the linguistic shift registers a transformation from a group of objects that are, to a group of objects that were (the same, connected, part of a set, parts of a whole). In this sense, the archive is founded on a moment of passing into the past, a kind of death, and the impulse to archive is connected (as Derrida said, following Freud) to the death drive....

At the same time, the archive constantly engages in attempts to resuscitate its holdings, bringing them back to life in the present: translations to new formats; circulation to new audiences; new interpretations, orders, edits, narratives. If the archive is both founded on and pledged against disaster, we can interpret that founding moment as the archive's original attempt to preserve something that might otherwise be lost, and that pledge as the archive's continuing efforts to countermand the static nature of preservation by projecting its past memories into the present and the future.
archives  metadata  afghanistan  nationalism  remixing  preservation  projection 
6 weeks ago
New Columbia class aims to contextualize data in history, society - Columbia Daily Spectator
Columbia will offer a new course on how to interpret and evaluate the impact of data next semester in the hopes of facilitating greater understanding of how data is used.

History professor Matt Jones and applied math and physics professor Chris Wiggins announced the course at an event on the role of data science on Monday. The class will begin as a small discussion section under both the history and applied math departments, and the professors plan to eventually expand the course to lecture size.

Both professors said that the idea stemmed from a fear that while governments and large corporations are gathering more data, the public’s understanding of the way in which that data is used is not sufficient....

“What we’re seeing today is a real transition in the ability of data to impact the world,” he said. “We’ve done a great job over the last 100 years with thinking about what every citizen should know about the Greeks, but in the next century I think there’s a need for somebody to think through what every citizen needs to know about data.”

Jones explained that the course will accomplish its goal of providing a general education on data issues by including students from all disciplines in the same class....

One goal of the course is to teach students to evaluate claims based on data that has been interpreted by algorithms.

“If somebody says this algorithm exists, therefore you should believe in it, you should be critical of it,” Wiggins said. “Rhetorical literacy is recognizing that if somebody says this algorithm is true, somebody says that to you because they want something.”

Jones said he also hopes that the course will explore more political questions, explaining how expertise in history is useful to understanding why approaches to data collection and interpretation were established in their original forms.

Jones noted the original purpose for the introduction of modern statistical methods as an example of interesting background.

“We’re going to begin with classical statistics and teach all of the technical rigors that go along with that, but never neglect that the key context for that work was eugenics,” Jones said. “The science can be made independent of that context, but it’s important.”
data_science  data_literacy  liberal_arts 
6 weeks ago
Toward a Constructive Technology Criticism - Columbia Journalism Review
Journalism about technology looks like: reporting, facts, the fourth estate, agenda setting. This kind of writing is constrained by PR embargoes and exclusive access. It can suffer from regurgitating Silicon Valley jargon and from telling seductive stories, as in the case of Theranos being judged as a startup rather than a medical company. Producer and freelance writer Rose Eveleth points to the problem: “There’s so much glittery, breathless writing about technology that fails to slow down and think about why we’re making these things, who we’re making them for, and who we’re leaving out when we make them.”11 Dave Lee, tech reporter for the BBC, further asks if the role of technology journalism is meant to be “reporting every concocted venture capital investment, or being the first draft of our digital history.” 12

On the other hand, criticism about technology looks like: analysis, interpretation, commentary, judging merits, and unfavorable opinions. In the best cases, criticism offers the opportunity for context setting, and for asking questions beyond the tick-tock of technical development and into the how’s and why’s of a larger cultural shift. Criticism leaves room for interpretation, analysis, assessment, and more systematic inquiry. Popular criticism seeks to question established and unexamined knowledge—the assumptions and positions taken for granted. As author and New York Times contributor Virginia Heffernan reflects, criticism should “‘familiarize the unfamiliar’ and ‘de-familiarize the familiar.’”...

Style and Tactic Trap: Missing People

More than just missing the social and political factors that bring a technology into existence, Critics of technology often fail to address the people for whom the technology is made. In his review of Morozov’s To Save Everything, Alexis Madrigal points to the missing users: “Without a functioning account of how people actually use self-tracking technologies, it is difficult to know how well their behaviors match up with Morozov’s accounts of their supposed ideology.”79

Critics also tend to write in the idiomatic royal “we” without representing real users’ interests or perspectives. Madrigal again articulates the importance of talking to people: “It is in using things that users discover and transform what those things are. Examining ideology is important. But so is understanding practice.”...

Style and Tactic Trap: Generalizing Personal Gripes

Another common mode in mainstream technology criticism is for the Critic to generalize personal gripes about technology into blanket judgments about technological progress. This is the mode used by Franzen when he complains about Twitter, a technology that threatens his livelihood by distracting him from his writing practice and changing the way his readers consume media. It can also be seen in Morozov’s description of the safe in which he locks his internet router so he can write his damning screeds without distraction. ...

Style and Tactic Trap: Cults of Personality, Bullying, and Misrepresenting Ideas

Though it is important to understand the ideological positions of the titans of the tech industry, some technology Critics unduly focus attention on individual personalities in isolation from their contexts. Profiles and takedowns of Silicon Valley moguls like Elon Musk, Peter Thiel, Mark Zuckerberg, and Tim O’Reilly make for compelling (anti-)hero narratives, but they often miss the details of the larger system and the labor that surrounds them. These profiles also perpetuate the mystique of ownership and power attributed to these Silicon Valley leaders.

Morozov, in particular, is guilty of personal, vindictive, intellectual bullying of his targets, no matter what side of the argument they represent...

Style and Tactic Trap: Deconstruction Without Alternatives

One of the most widely recognized Critics of technology has made it his mission to destroy the industry and everyone associated with it. Writing against what he calls “solutionist” thinking (the idea that all problems are potentially solvable, and often with technology), Morozov facilely avoids offering alternative solutions.
technology  criticism 
6 weeks ago
The Mission to Save Vanishing Internet Art - The New York Times
Now the digital art organization Rhizome is setting out to bring some stability to this evanescent medium. At a symposium to be held Thursday, Oct. 27, at the New Museum, its longtime partner and backer, Rhizome plans to start an ambitious archiving project. Called Net Art Anthology, it is to provide a permanent home online for 100 important artworks, many of which have long since disappeared from view. With a $201,000 grant from the Chicago-based Carl & Marilynn Thoma Art Foundation, Rhizome will release a newly refurbished work once a week for the next two years, starting with the 1991 “Cyberfeminist Manifesto.” By 2018, Rhizome will be presenting works by artists such as Cory Arcangel and Petra Cortright.

In addition to salvaging the past, the aim is to tell the story of Internet-based art in an online gallery that serves much the same narrative function as the galleries in the Museum of Modern Art. “There’s a sense of amnesia about the history these things have,” Michael Connor, Rhizome’s artistic director, said as he sat in the New Museum’s ground-floor cafe. “This is an opportunity to really be rigorous.”...

Net Art’s political posture was characteristic of the feverish, techno-utopian excitement shared by netheads in general. “There was this radical idea that the internet was going to change the way art is made and shared,” said Lauren Cornell, who was Rhizome’s executive director from 2005 to 2012 and who has since moved to the New Museum as a curator and associate director of technology initiatives. “That it might even do away with traditional institutions and gatekeepers” — that is, museums and curators.

Instead, it was Net Art that started to disappear. Rhizome began trying to preserve it in 1999 with the creation of ArtBase, an online archive that has since grown to more than 2,000 works. The organization became an affiliate of the New Museum in 2003, saving the group from almost-certain oblivion. But even then it was apparent that to keep Net Art from vanishing into the ether, something drastic would have to be done.

Preserving this work is not just a matter of uploading old computer files. “The files don’t mean anything without the browser,” Mr. Connor, 38, said. “And the browser doesn’t mean anything without the computer” it runs on. Yet browsers from 15 or 20 years ago won’t work on today’s computers, and computers from that era are hard to come by and even harder to keep working.

Dragan Espenschied, Rhizome’s preservation director, has been working with the University of Freiburg in Germany to develop a sophisticated software framework that emulates outdated computing environments on current machines.

Another iteration of this approach is Oldweb, which Rhizome launched in December as a free service. Oldweb lets you time-travel online, viewing archived web pages from sources such as the Library of Congress in a window that mimics an early browser. A second Rhizome initiative is Webrecorder, a free program that lets users build their own archives of currently available web pages. That can help preserve online works being created today.
archive  net_art  media_archaeology  flow 
6 weeks ago
Why Being A City Geek Is So Cool Today | Co.Design | business + design
"I was really surprised that anyone was interested in the stuff I was doing around internet infrastructure, but I think a lot of the appeal has to do with a growing public anxiety over the opacity of networked systems," Burrington says. "More and more of everyday life is tied into networked systems that most people interface with via a scrying mirror, which tends to obscure all the algorithmic spells and hexes going on behind the scenes. Looking at data centers and cables and microwave towers doesn't really make those hexes any more legible, but it grounds this increasingly incomprehensible system in something real, something made by humans, something that could hypothetically be destroyed by humans. It's comforting, kind of."...

"There has been a tremendous interest in urban infrastructure over the past decade, and I believe it is connected to the fact that residents feel more empowered and informed about transit decision-making than ever before," Michelle Young says. "It's not that long ago that we were in the era of Robert Moses; now he's vilified for the type of [top-down] decisions he took."

Now, planners advocate participatory design, like letting residents vote on how to spend public funds to improve infrastructure.

Indeed, bottom-up planning has yielded some of the most influential urban design projects that involve infrastructure. The High Line—an elevated park on a formerly abandoned elevated railway—was the product of a grassroots organization, Friends of the High Line. What was once a blighted stretch of track is now one of the most popular destinations in Manhattan—for better or worse—and cities across the country are clamoring to create their own "X Lines."...

We have products that celebrate the beauty of infrastructure, celebrities who endorse infrastructural adaptive reuse, and infrastructure communicating with the public in 140 characters or less. But one of the most compelling pieces of evidence about infrastructure's resurgence has to do with how we define "infrastructure" in the first place.

"One thing I've noticed rising in tandem with the appeal of an infrastructural aesthetic is a massive expansion of the use of the term 'infrastructure' to describe lots of things that aren't manholes or bridges or railroads," Burrington says. "Software is infrastructure, social media is infrastructure, UX is infrastructure—that sort of thing. I've seen artists who would have called their work 'social practice' five years ago now describe it as 'making infrastructure.' And it seems like a really strategic choice—because infrastructure is also sort of assumed to be indispensable. Defining one's work as infrastructure valorizes it, elevates its importance in a system as something crucial and in need of attention, care, maintenance, and support."
infrastructure  infrastructural_tourism  infrastructure_art 
6 weeks ago
Michael Kreil: An Honest Picture of Metadata | Exposing the Invisible
I have a problem with the term “metadata.” I don't think that this term is precise, because, simply put, the basic idea of metadata is that it's data about data. For example, if I take a photo, I can add data like the camera model, time and geolocation, so, the additional information about when and where the photo was shot is called metadata. But, for example, if I take a lot of photos, I can use the metadata contained in these photos to connect the location in which I took them with the time I took them. The metadata can be used to track me. So, from that point of view, metadata is the data itself, and that’s the interesting aspect, not the photos themselves.

I think that only people who add data to the data can use the term metadata. But, in general, from a public point of view, everything is data, which is usually about persons. So let's stop calling it metadata.

Have you got an alternative name? 


6 weeks ago
