
Ed-Tech's Monsters #ALTC
[video here: https://www.youtube.com/watch?v=Kiotl4G6fMw ]

"No doubt, we have witnessed in the last few years an explosion in the ed-tech industry and a growing, a renewed interest in ed-tech. Those here at ALT-C know that ed-tech is not new by any means; but there is this sense from many of its newest proponents (particularly in the States) that ed-tech has no history; there is only now and the future.

Ed-tech now, particularly that which is intertwined with venture capital, is boosted by powerful forms of storytelling: a disruptive innovation mythology, entrepreneurs' hagiography, design fiction, fantasy.

A fantasy that wants to extend its reach into the material world.

Society has been handed a map, if you will, by the technology industry in which we are shown how these brave ed-tech explorers have and will conquer and carve up virtual and physical space.

Fantasy.

We are warned of the dragons in dangerous places, the unexplored places, the over-explored places, the stagnant, the lands of outmoded ideas — all the places where we should no longer venture.

Hic Sunt Dracones. Here be dragons.

Instead, I’d argue, we need to face our dragons. We need to face our monsters. We need to face the giants. They aren’t simply on the margins; they are, in many ways, central to the narrative."



"I’m in the middle of writing a book called Teaching Machines, a cultural history of the science and politics of ed-tech. An anthropology of ed-tech even, a book that looks at knowledge and power and practices, learning and politics and pedagogy. My book explores the push for efficiency and automation in education: “intelligent tutoring systems,” “artificially intelligent textbooks,” “robo-graders,” and “robo-readers.”

This involves, of course, a nod to “the father of computer science” Alan Turing, who worked at Bletchley Park, and his profoundly significant question “Can a machine think?”

I want to ask in turn, “Can a machine teach?”

Then too: What will happen to humans when (if) machines do “think”? What will happen to humans when (if) machines “teach”? What will happen to labor and what happens to learning?

And, what exactly do we mean by those verbs, “think” and “teach”? When we see signs of thinking or teaching in machines, what does that really signal? Is it that our machines are becoming more “intelligent,” more human? Or is it that humans are becoming more mechanical?

Rather than speculate about the future, I want to talk a bit about the past."



"To oppose technology or to fear automation, some like The Economist or venture capitalist Marc Andreessen argue, is to misunderstand how the economy works. (I’d suggest perhaps Luddites understand how the economy works quite well, thank you very much, particularly when it comes to questions of “who owns the machinery” we now must work on. And yes, the economy works well for Marc Andreessen, that’s for sure.)"



"But even without machines, Frankenstein is still read as a cautionary tale about science and about technology; and Shelley’s story has left an indelible impression on us. Its references are scattered throughout popular culture and popular discourse. We frequently use part of the title — “Franken” — to invoke a frightening image of scientific experimentation gone wrong. Frankenfood. Frankenfish. The monster, a monstrosity — a technological crime against nature.

It is telling, very telling, that we often confuse the scientist, Victor Frankenstein, with his creation. We often call the monster Frankenstein.

As the sociologist Bruno Latour has argued, we don’t merely mistake the identity of Frankenstein; we also mistake his crime. It “was not that he invented a creature through some combination of hubris and high technology,” writes Latour, “but rather that he abandoned the creature to itself.”

The creature — again, a giant — insists in the novel that he was not born a monster, but he became monstrous after Frankenstein fled the laboratory in horror when the creature opened his “dull yellow eye,” breathed hard, and convulsed to life.

"Remember that I am thy creature,” he says when he confronts Frankenstein, "I ought to be thy Adam; but I am rather the fallen angel, whom thou drivest from joy for no misdeed. Everywhere I see bliss, from which I alone am irrevocably excluded. I was benevolent and good— misery made me a fiend.”

As Latour observes, “Written at the dawn of the great technological revolutions that would define the 19th and 20th centuries, Frankenstein foresees that the gigantic sins that were to be committed would hide a much greater sin. It is not the case that we have failed to care for Creation, but that we have failed to care for our technological creations. We confuse the monster for its creator and blame our sins against Nature upon our creations. But our sin is not that we created technologies but that we failed to love and care for them. It is as if we decided that we were unable to follow through with the education of our children.”

Our “gigantic sin”: we failed to love and care for our technological creations. We must love and educate our children. We must love and care for our machines, lest they become monsters.

Indeed, Frankenstein is also a novel about education. The novel is structured as a series of narratives — Captain Walton’s story — a letter he sends to his sister as he explores the Arctic — which then tells Victor Frankenstein’s story, through which we hear the creature tell his own story, along with that of the De Lacey family and the arrival of Safie, “the lovely Arabian.” All of these are stories about education: some self-directed learning, some through formal schooling.

While typically Frankenstein is interpreted as a condemnation of science gone awry, the novel can also be read as a condemnation of education gone awry. The novel highlights the dangerous consequences of scientific knowledge, sure, but it also explores how knowledge — gained inadvertently, perhaps, gained surreptitiously, gained without guidance — might be disastrous. Victor Frankenstein, stumbling across the alchemists and then having their work dismissed outright by his father, stoking his curiosity. The creature, learning to speak by watching the De Lacey family, learning to read by watching Safie do the same, finding and reading Volney’s Ruins of Empires and Milton’s Paradise Lost."



"To be clear, my nod to the Luddites or to Frankenstein isn’t about rejecting technology; but it is about rejecting exploitation. It is about rejecting an uncritical and unexamined belief in progress. The problem isn’t that science gives us monsters, it's that we have pretended like it is truth and divorced from responsibility, from love, from politics, from care. The problem isn’t that science gives us monsters, it’s that it does not, despite its insistence, give us “the answer."

And that is the problem with ed-tech’s monsters. That is the problem with teaching machines.

In order to automate education, must we see knowledge in a certain way, as certain: atomistic, programmable, deliverable, hierarchical, fixed, measurable, non-negotiable? In order to automate that knowledge, what happens to care?"



"I’ll leave you with one final quotation, from Hannah Arendt who wrote,
"Education is the point at which we decide whether we love the world enough to assume responsibility for it and by the same token save it from that ruin which, except for renewal, except for the coming of the new and young, would be inevitable. And education, too, is where we decide whether we love our children enough not to expel them from our world and leave them to their own devices, nor to strike from their hands their chance of undertaking something new, something unforeseen by us, but to prepare them in advance for the task of renewing a common world.”

Our task, I believe, is to tell the stories and build the society that would place education technology in that same light: “renewing a common world.”

We in ed-tech must face the monsters we have created, I think. These are the monsters in the technologies of war and surveillance à la Bletchley Park. These are the monsters in the technologies of mass production and standardization. These are the monsters in the technologies of behavior modification à la B. F. Skinner.

These are the monsters ed-tech must face. And we must all consider what we need to do so that we do not create more of them."
audreywatters  edtech  technology  education  schools  data  monsters  dragons  frankenstein  luddites  luddism  neoluddism  alanturing  thomaspynchon  society  bfskinner  standardization  surveillance  massproduction  labor  hannaharendt  brunolatour  work  kevinkelly  technosolutionism  erikbrynjolfsson  lordbyron  maryshelley  ethics  hierarchy  children  responsibility  love  howwelearn  howweteach  teaching  learning  politics  policy  democracy  exploitation  hierarchies  progress  science  scientism  markets  aynrand  liberarianism  projectpigeon  teachingmachines  personalization  individualization  behavior  behaviorism  economics  capitalism  siliconvalley 
september 2014 by robertogreco
