
HEWN, No. 324: Sorry, you can’t get to Sesame Street from here any more
"I’ve been thinking this week about a 1985 interview with MIT computer science professor Joseph Weizenbaum. The interview opens with a question about the role of computers in education — a question that Weizenbaum dismisses in part because it assumes that computers are good and useful and necessarily have something to offer. “The computer has almost since its beginning,” he says, “been basically a solution looking for a problem.”

It’s a “solution” that, in its search for a problem, has come “to use entire generations of schoolchildren as experimental subjects.” (Related: “Psychodata: disassembling the psychological, economic, and statistical infrastructure of ‘social-emotional learning’” by Ben Williamson.)

Even if we can demonstrate that introducing computers into classrooms helps students improve their test scores (or what have you), Weizenbaum argues there are still many questions to be asked about why students struggle in the first place. “The question, ‘Why can’t Johnny read?’ must still be asked.”
There is a very good reason that questions of that kind are uncomfortable. When we ask this question, we may discover that Johnny is hungry when he comes to school, or that Johnny comes from a milieu in which reading is irrelevant to concrete problems or survival on the street — that is, there is no chance to read, it is a violent milieu, and so on.

You might discover that, and then you might ask the next question: “Why is it that Johnny comes to school hungry? Don’t we have school breakfast programs and lunch programs?” The answer to that might be, yes, we used to, but we don’t any more.

Why is there so much poverty in our world, in the United States, especially in the large cities? Why is it that classes are so large? Why is it that fully half the science and math teachers in the United States are underqualified and are operating on emergency certificates?

When you ask questions like that, you come upon some very important and very tragic facts about America. One of the things you would discover is that education has a very much lower priority in the United States than do a great many other things, most particularly the military.

It is much nicer, it is much more comfortable, to have some device, say the computer, with which to flood the schools, and then to sit back and say, “You see, we are doing something about it, we are helping,” than to confront ugly social realities.

Sesame Street was one education technology that, from its outset, did try to confront those social realities. The show, first pitched by Joan Ganz Cooney in a 1966 report to the Carnegie Foundation, recognized that “the assumption that disadvantaged children are inadequately stimulated and motivated during the preschool years and the belief that the right kind of early intervention can provide adequate compensation have done much to create the present ferment in cognitive development research and preschool education.” The initial mission: create a show for public (not commercial) television that would develop the school readiness of viewers aged 3 to 5, with particular emphasis on the needs of low-income children and children of color.

Sesame Street is the most researched television show in history — not only in its reception but in its development. From the outset, there was attention to repetition and sequencing, for example. There was careful consideration of when to use straightforwardness and when to use fantasy; consideration of how dramatic tension and humor affected comprehension. With a mission of reaching preschoolers of color, Sesame Street cast actors of color. The curriculum was relevant and meaningful and age-appropriate. Sesame Street knew the problem; it hoped to be a solution.

And then in 2015, it moved from PBS to HBO. It started a venture capital arm the following year (to invest in “data-driven education products that promote personalized learning and educational technology”). This week, it moved from HBO to HBO Max, an even more exclusive streaming service.

What made Sesame Street decide to abandon its mission? What made it think that ed-tech investment was the solution to early childhood education, to childhood poverty, to racism and discrimination? Not research, that’s for damn sure. Not a commitment to social justice. Somewhere along the way, Sesame Street decided that the brand, the computer, trumped social change.

Weizenbaum again:
I think the computer has from the beginning been a fundamentally conservative force. It has made possible the saving of institutions pretty much as they were, which otherwise might have had to be changed. For example, banking. Superficially, it looks as if banking has been revolutionized by the computer. But only very superficially. Consider that, say 20, 25 years ago, the banks were faced with the fact that the population was growing at a very rapid rate, many more checks would be written than before, and so on. Their response was to bring in the computer. …

Now if it had not been for the computer, if the computer had not been invented, what would the banks have had to do? They might have had to decentralize, or they might have had to regionalize in some way. In other words, it might have been necessary to introduce a social invention, as opposed to the technical invention.

What the coming of the computer did, “just in time,” was to make it unnecessary to create social inventions, to change the system in any way. So in that sense, the computer has acted as fundamentally a conservative force, a force which kept power or even solidified power where it already existed.

They might have blocked off Sesame Street, but we can’t let the bankers and the tech CEOs win."
audreywatters  2019  edtech  sesamestreet  josephweizenbaum  education  computing  computers  compsci  learning  benwilliamson  history  conservatism 
8 weeks ago by robertogreco
Optimize What? • Commune
"Silicon Valley is full of the stupidest geniuses you’ll ever meet. The problem begins in the classrooms where computer science is taught."

"In higher education and research, the situation is similar, if further removed from the harsh realities of technocapitalism. Computer science in the academy is a minefield of contradictions: a Stanford undergraduate may attend class and learn how to extract information from users during the day, then later attend an evening meeting of the student organization CS+Social Good, where they will build a website for a local nonprofit. Meanwhile, a researcher who attended last year’s Conference on Economics and Computation would have sat through a talk on maximizing ad revenue, then perhaps participated the next morning in a new workshop on “mechanism design for social good.”

It is in this climate that we, too, must construct our vision for computer science and its applications. We might as well start from scratch: in a recent article for Tribune, Wendy Liu calls to “abolish Silicon Valley.” By this she means not the naive rejection of high technology, but the transformation of the industry into one funded, owned, and controlled by workers and the broader society—a people’s technology sector.

Silicon Valley, however, does not exist in an intellectual vacuum; it depends on a certain type of computer science discipline. Therefore, a people’s remake of the Valley will require a people’s computer science. Can we envision this? Today, computer science departments don’t just generate capitalist realism—they are themselves ruled by it. Only those research topics that carry implications for profit extraction or military applications are deemed worthy of investigation. There is no escaping the reach of this intellectual-cultural regime; even the most aloof theoreticians feel the need to justify their work by lining their paper introductions and grant proposals with spurious connections to the latest industry fads. Those who are more idealistic or indignant (or tenured) insist that the academy carve out space for “useless” research as well. However, this dichotomy between “industry applications” and “pure research” ignores the material reality that research funding comes largely from corporate behemoths and defense agencies, and that contemporary computer science is a political enterprise regardless of its wishful apolitical intentions.

In place of this suffocating ideological fog, what we must construct is a notion of communist realism in science: that only projects in direct or indirect service to people and planet will have any hope of being funded, of receiving the esteem of the research community, or even of being considered intellectually interesting. What would a communist computer science look like? Can we imagine researchers devising algorithms for participatory economic planning? Machine learning for estimating socially necessary labor time? Decentralized protocols for coordinating supply chains between communes?

Allin Cottrell and Paul Cockshott, two of the few contemporary academics who tackle problems of computational economics in non-market settings, had this to say in a 1993 paper:
Our investigations enable us to identify one component of the problem (with economic planning): the material conditions (computational technology) for effective socialist planning of a complex peacetime economy were not realized before, say, the mid-1980s. If we are right, the most notorious features of the Soviet economy (chronically incoherent plans, recurrent shortages and surpluses, lack of responsiveness to consumer demand), while in part the result of misguided policies, were to some degree inevitable consequences of the attempt to operate a system of central planning before its time. The irony is obvious: socialism was being rejected at the very moment when it was becoming a real possibility.


Politically, much has changed since these words were written. The takeaway for contemporary readers is not necessarily that we should devote ourselves to central planning once more; rather, it’s that our moment carries a unique mixture of ideological impasse and emancipatory potential, ironically both driven in large part by technological development. The cold science of computation seems to declare that social progress is over—there can only be technological progress. Yet if we manage to wrest control of technology from Silicon Valley and the Ivory Tower, the possibilities for postcapitalist society are seemingly endless. The twenty-first-century tech workers’ movement, a hopeful vehicle for delivering us towards such prospects, is nascent—but it is increasingly a force to be reckoned with, and, at the risk of getting carried away, we should start imagining the future we wish to inhabit. It’s time we began conceptualizing, and perhaps prototyping, computation and information in a workers’ world. It’s time to start conceiving of a new left-wing science."
engineering  problemsolving  capitalism  computers  politics  technology  jimmywu  2019  optimization  efficiency  allincottrell  paulcockshott  siliconvalley  techosolutionism  technocapitalism  computation  wendyliu  compsci  ideology 
april 2019 by robertogreco
getting a new Mac up and running – Snakes and Ladders
"Things I do when I get a new Mac, more or less in order:

• install Homebrew [https://brew.sh/ ]
• use Homebrew to install pandoc [https://pandoc.org/ * ]
• install BBEdit
• install MacTeX
• type this into the terminal: defaults write com.barebones.bbedit FullScreenWindowsHogScreen -bool NO
• type this into the terminal: defaults write com.apple.dock single-app -bool true (followed by killall Dock)
• enable Night Shift
• install TextExpander
• install Alfred
• install Hazeover
• install Hazel

Everything else can wait; once I have the above in place — plus of course syncing all my existing TextExpander snippets — I can do almost everything I really need to do on a computer, with maximum focus and speed."
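[The scriptable parts of that list can be strung together in the terminal. A rough sketch only: the Homebrew install one-liner is the one published at brew.sh, and the cask names for the GUI apps are guesses worth checking with brew search before running anything.

/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"   # install Homebrew
brew install pandoc                                                    # use Homebrew to install pandoc
brew install --cask bbedit mactex textexpander alfred hazeover hazel   # cask names assumed, not verified
defaults write com.barebones.bbedit FullScreenWindowsHogScreen -bool NO
defaults write com.apple.dock single-app -bool true
killall Dock

Everything in the list that is a GUI preference — Night Shift, say, or syncing TextExpander snippets — still has to be done by hand.]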

[*"About pandoc

If you need to convert files from one markup format into another, pandoc is your swiss-army knife. Pandoc can convert documents in (several dialects of) Markdown, reStructuredText, textile, HTML, DocBook, LaTeX, MediaWiki markup, TWiki markup, TikiWiki markup, DokuWiki markup, Creole 1.0, Vimwiki markup, roff man, OPML, Emacs Org-Mode, Emacs Muse, txt2tags, Microsoft Word docx, LibreOffice ODT, EPUB, Jupyter notebooks ipynb, or Haddock markup to

HTML formats
XHTML, HTML5, and HTML slide shows using Slidy, reveal.js, Slideous, S5, or DZSlides

Word processor formats
Microsoft Word docx, OpenOffice/LibreOffice ODT, OpenDocument XML, Microsoft PowerPoint.

Ebooks
EPUB version 2 or 3, FictionBook2

Documentation formats
DocBook version 4 or 5, TEI Simple, GNU TexInfo, roff man, roff ms, Haddock markup

Archival formats
JATS

Page layout formats
InDesign ICML

Outline formats
OPML

TeX formats
LaTeX, ConTeXt, LaTeX Beamer slides

PDF
via pdflatex, xelatex, lualatex, pdfroff, wkhtmltopdf, prince, or weasyprint.

Lightweight markup formats
Markdown (including CommonMark and GitHub-flavored Markdown), reStructuredText, AsciiDoc, Emacs Org-Mode, Emacs Muse, Textile, txt2tags, MediaWiki markup, DokuWiki markup, TikiWiki markup, TWiki markup, Vimwiki markup, and ZimWiki markup.

Interactive notebook formats
Jupyter notebook (ipynb)

Custom formats
custom writers can be written in lua.

Pandoc understands a number of useful markdown syntax extensions, including document metadata (title, author, date); footnotes; tables; definition lists; superscript and subscript; strikeout; enhanced ordered lists (start number and numbering style are significant); running example lists; delimited code blocks with syntax highlighting; smart quotes, dashes, and ellipses; markdown inside HTML blocks; and inline LaTeX. If strict markdown compatibility is desired, all of these extensions can be turned off.

LaTeX math (and even macros) can be used in markdown documents. Several different methods of rendering math in HTML are provided, including MathJax and translation to MathML. LaTeX math is converted (as needed by the output format) to unicode, native Word equation objects, MathML, or roff eqn."]
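[A few hedged examples of pandoc in use, with placeholder file names; the flags shown (-f, -t, -o, --standalone) are pandoc's own:

pandoc notes.md -o notes.html                        # formats guessed from the file extensions
pandoc -f markdown -t latex notes.md -o notes.tex    # formats named explicitly
pandoc --standalone notes.md -o notes.html           # full page rather than an HTML fragment
pandoc notes.md -o notes.pdf                         # routed through a LaTeX engine such as pdflatex
]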
mac  alanjacobs  computers  osx  macos  via:lukeneff  homebrew  pandoc  files  filetype  conversion  text  plaintext  markup  html  epub  latex  setup 
march 2019 by robertogreco
I Embraced Screen Time With My Daughter—and I Love It | WIRED
I often turn to my sister, Mimi Ito, for advice on these issues. She has raised two well-adjusted kids and directs the Connected Learning Lab at UC Irvine, where researchers conduct extensive research on children and technology. Her opinion is that “most tech-privileged parents should be less concerned with controlling their kids’ tech use and more about being connected to their digital lives.” Mimi is glad that the American Academy of Pediatrics (AAP) dropped its famous 2x2 rule—no screens for the first two years, and no more than two hours a day until a child hits 18. She argues that this rule fed into stigma and parent-shaming around screen time at the expense of what she calls “connected parenting”—guiding and engaging in kids’ digital interests.

One example of my attempt at connected parenting is watching YouTube together with Kio, singing along with Elmo as Kio shows off the new dance moves she’s learned. Every day, Kio has more new videos and favorite characters that she is excited to share when I come home, and the songs and activities follow us into our ritual of goofing off in bed as a family before she goes to sleep. Her grandmother in Japan is usually part of this ritual in a surreal situation where she is participating via FaceTime on my wife’s iPhone, watching Kio watching videos and singing along and cheering her on. I can’t imagine depriving us of these ways of connecting with her.

The (Unfounded) War on Screens

The anti-screen narrative can sometimes read like the War on Drugs. Perhaps the best example is Glow Kids, in which Nicholas Kardaras tells us that screens deliver a dopamine rush rather like sex. He calls screens “digital heroin” and uses the term “addiction” when referring to children unable to self-regulate their time online.

More sober (and less breathlessly alarmist) assessments by child psychologists and data analysts offer a more balanced view of the impact of technology on our kids. Psychologist and baby observer Alison Gopnik, for instance, notes: “There are plenty of mindless things that you could be doing on a screen. But there are also interactive, exploratory things that you could be doing.” Gopnik highlights how feeling good about digital connections is a normal part of psychology and child development. “If your friends give you a like, well, it would be bad if you didn’t produce dopamine,” she says.

Other research has found that the impact of screens on kids is relatively small, and even the conservative AAP says that cases of children who have trouble regulating their screen time are not the norm, representing just 4 percent to 8.5 percent of US children. This year, Andrew Przybylski and Amy Orben conducted a rigorous analysis of data on more than 350,000 adolescents and found a nearly negligible effect on psychological well-being at the aggregate level.

In their research on digital parenting, Sonia Livingstone and Alicia Blum-Ross found widespread concern among parents about screen time. They posit, however, that “screen time” is an unhelpful catchall term and recommend that parents focus instead on quality and joint engagement rather than just quantity. The Connected Learning Lab’s Candice Odgers, a professor of psychological sciences, reviewed the research on adolescents and devices and found as many positive as negative effects. She points to the consequences of unbalanced attention on the negative ones. “The real threat isn’t smartphones. It’s this campaign of misinformation and the generation of fear among parents and educators.”

We need to immediately begin rigorous, longitudinal studies on the effects of devices and the underlying algorithms that guide their interfaces and their interactions with and recommendations for children. Then we can make evidence-based decisions about how these systems should be designed, optimized for, and deployed among children, and not put all the burden on parents to do the monitoring and regulation.

My guess is that for most kids, this issue of screen time is statistically insignificant in the context of all the other issues we face as parents—education, health, day care—and for those outside my elite tech circles even more so. Parents like me, and other tech leaders profiled in a recent New York Times series about tech elites keeping their kids off devices, can afford to hire nannies to keep their kids off screens. Our kids are the least likely to suffer the harms of excessive screen time. We are also the ones least qualified to be judgmental about other families who may need to rely on screens in different ways. We should be creating technology that makes screen entertainment healthier and fun for all families, especially those who don’t have nannies.

I’m not ignoring the kids and families for whom digital devices are a real problem, but I believe that even in those cases, focusing on relationships may be more important than focusing on controlling access to screens.

Keep It Positive

One metaphor for screen time that my sister uses is sugar. We know sugar is generally bad for you and has many side effects and can be addictive to kids. However, the occasional bonding ritual over milk and cookies might have more benefit to a family than an outright ban on sugar. Bans can also backfire, fueling binges and shame as well as mistrust and secrecy between parents and kids.

When parents allow kids to use computers, they often use spying tools, and many teens feel parental surveillance is invasive to their privacy. One study showed that using screen time to punish or reward behavior actually increased net screen time use by kids. Another study by Common Sense Media shows what seems intuitively obvious: Parents use screens as much as kids. Kids model their parents—and have a laserlike focus on parental hypocrisy.

In Alone Together, Sherry Turkle describes the fracturing of family cohesion because of the attention that devices get and how this has disintegrated family interaction. While I agree that there are situations where devices are a distraction—I often declare “laptops closed” in class, and I feel that texting during dinner is generally rude—I do not feel that iPhones necessarily draw families apart.

In the days before the proliferation of screens, I ran away from kindergarten every day until they kicked me out. I missed more classes than any other student in my high school and barely managed to graduate. I also started more extracurricular clubs in high school than any other student. My mother actively supported my inability to follow rules and my obsessive tendency to pursue my interests and hobbies over those things I was supposed to do. In the process, she fostered a highly supportive trust relationship that allowed me to learn through failure and sometimes get lost without feeling abandoned or ashamed.

It turns out my mother intuitively knew that it’s more important to stay grounded in the fundamentals of positive parenting. “Research consistently finds that children benefit from parents who are sensitive, responsive, affectionate, consistent, and communicative,” says education professor Stephanie Reich, another member of the Connected Learning Lab who specializes in parenting, media, and early childhood. One study shows measurable cognitive benefits from warm and less restrictive parenting.

When I watch my little girl learning dance moves from every earworm video that YouTube serves up, I imagine my mother looking at me while I spent every waking hour playing games online, which was my pathway to developing my global network of colleagues and exploring the internet and its potential early on. I wonder what wonderful as well as awful things will have happened by the time my daughter is my age, and I hope a good relationship with screens and the world beyond them can prepare her for this future."
joiito  parenting  screentime  mimiito  techology  screens  children  alisongopnik  2019  computers  computing  tablets  phones  smartphones  mobile  nicholaskardaras  addiction  prohibition  andrewprzybylski  aliciablum-ross  sonialvingstone  amyorben  adolescence  psychology  candiceodgers  research  stephaniereich  connectedlearning  learning  schools  sherryturkle  trust 
march 2019 by robertogreco
Bay Area Disrupted: Fred Turner on Vimeo
"Interview with Fred Turner in his office at Stanford University.

http://bayareadisrupted.com/

https://fredturner.stanford.edu

Graphics: Magda Tu
Editing: Michael Krömer
Concept: Andreas Bick"
fredturner  counterculture  california  opensource  bayarea  google  softare  web  internet  history  sanfrancisco  anarchism  siliconvalley  creativity  freedom  individualism  libertarianism  2014  social  sociability  governance  myth  government  infrastructure  research  online  burningman  culture  style  ideology  philosophy  apolitical  individuality  apple  facebook  startups  precarity  informal  bureaucracy  prejudice  1960s  1970s  bias  racism  classism  exclusion  inclusivity  inclusion  communes  hippies  charism  cultofpersonality  whiteness  youth  ageism  inequality  poverty  technology  sharingeconomy  gigeconomy  capitalism  economics  neoliberalism  henryford  ford  empowerment  virtue  us  labor  ork  disruption  responsibility  citizenship  purpose  extraction  egalitarianism  society  edtech  military  1940s  1950s  collaboration  sharedconsciousness  lsd  music  computers  computing  utopia  tools  techculture  location  stanford  sociology  manufacturing  values  socialchange  communalism  technosolutionism  business  entrepreneurship  open  liberalism  commons  peerproduction  product 
december 2018 by robertogreco
iPad Pro (2018) Review: Two weeks later! - YouTube
[at 7:40, Rene Ritchie lists the problems with iOS on the iPad Pro as-is that keep it from being a laptop replacement]

"1. Import/export more than just photo/video [using USB drive, hard drive, etc]

2. Navigate with the keyboard [or trackpad/mouse]

3. 'Desktop Sites' in Safari [Why not a desktop browser (maybe in addition to Safari, something like a "pro" Safari with developer tools and extensions?]

4. Audio recording [system-wide like the screen recording for capturing conversations from Skype/Facetime/etc]

5. Develop for iPad on iPad

6. Multi-user for everyone [like on a Chromebook]"

[I'd be happy with just 1, 2, and 3. 6 would also be nice. 4 and 5 are not very important to me, but also make sense.]

[Some of my notes regarding the state of the tablet-as-laptop replacement in 2018, much overlap with what is above:

iOS tablets
no mouse/trackpad support, file system is still a work in progress, no desktop browser equivalents, Pro models are super expensive given these tradeoffs, especially with additional keyboard and pen costs

Microsoft Surface
tablet experience is lacking, Go (closest to meeting my needs and price) seems a little overpriced for the top model (entry model needs more RAM and faster storage), also given the extra cost of keyboard and pen

Android tablets
going nowhere, missing desktop browser

ChromeOS tablets
underpowered (Acer Chromebook Tab 10) or very expensive (Google Pixel Slate) or I don’t like it enough (mostly the imbalance between screen and keyboard, and the keyboard feel) for the cost (HP x2), but ChromeOS tablets seem as promising as iPads as laptop replacements at this point

ChromeOS convertibles
strange having the keyboard in the back while using as a tablet (Samsung Chromebook Plus/Pro, ASUS Chromebook Flip C302CA, Google Pixelbook (expensive)) -- I used a Chromebook Pro for a year (as work laptop) and generally it was a great experience, but they are ~1.5 years old now and haven’t been refreshed. Also, the Samsung Chromebook Plus (daughter has one of these, used it for school and was happy with it until new college provided a MacBook Pro) refresh seems like a step back because of the lesser screen, the increase in weight, and a few other things.

Additional note:
Interesting how Microsoft led the way in this regard (tablet as laptop replacement), but again didn't get it right enough and is now being passed by the others, at least around me]

[finally, some additional discussion and comparison:

The Verge: "Is this a computer?" (Apr 11, 2018)
https://www.youtube.com/watch?v=K7imG4DYXlM

Apple's "What's a Computer?" iPad ad (Jan 23, 2018, no longer available directly from Apple)
https://www.youtube.com/watch?v=llZys3xg6sU

Apple's "iPad Pro — 5 Reasons iPad Pro can be your next computer — Apple" (Nov 19, 2018)
https://www.youtube.com/watch?v=tUQK7DMys54

The Verge: "Google Pixel Slate Review: half-baked" (Nov 27, 2018)
https://www.youtube.com/watch?v=BOa6HU_he2A
https://www.theverge.com/2018/11/27/18113447/google-pixel-slate-review-tablet-chrome-os-android-chromebook-slapdash

Unbox Therapy: "Can The Google Pixel Slate Beat The iPad Pro?" (Nov 28, 2018)
https://www.youtube.com/watch?v=lccvHF4ODNY

The Verge: "Google keeps failing to understand tablets" (Nov 29, 2018)
https://www.theverge.com/2018/11/29/18117520/google-tablet-android-chrome-os-pixel-slate-failure

The Verge: "Chrome OS isn't ready for tablets yet" (Jul 18, 2018)
https://www.youtube.com/watch?v=Eu9JBj7HNmM

The Verge: "New iPad Pro review: can it replace your laptop?" (Nov 5, 2018)
https://www.youtube.com/watch?v=LykS0TRSHLY
https://www.theverge.com/2018/11/5/18062612/apple-ipad-pro-review-2018-screen-usb-c-pencil-price-features

Navneet Alang: "The misguided attempts to take down the iPad Pro" (Nov 9, 2018)
https://theweek.com/articles/806270/misguided-attempts-take-down-ipad-pro

Navneet Alang: "Apple is trying to kill the laptop" (Oct 31, 2018)
https://theweek.com/articles/804670/apple-trying-kill-laptop

The Verge: "Microsoft Surface Go review: surprisingly good" (Aug 7, 2018)
https://www.youtube.com/watch?v=N7N2xunvO68
https://www.theverge.com/2018/8/7/17657174/microsoft-surface-go-review-tablet-windows-10

The Verge: "The Surface Go Is Microsoft's Hybrid PC Dream Made Real: It’s time to think of Surface as Surface, and not an iPad competitor" (Aug 8, 2018)
https://www.theverge.com/2018/8/8/17663494/microsoft-surface-go-review-specs-performance

The Verge: "Microsoft Surface Go hands-on" (Aug 2, 2018)
https://www.youtube.com/watch?v=dmENZqKPfws

Navneet Alang: "Is Microsoft's Surface Go doomed to fail?" (Jul 12, 2018)
https://theweek.com/articles/784014/microsofts-surface-doomed-fail

Chrome Unboxed: "Google Pixel Slate: Impressions After A Week" (Nov 27, 2018)
https://www.youtube.com/watch?v=ZfriNj2Ek68
https://chromeunboxed.com/news/google-pixel-slate-first-impressions/

Unbox Therapy: "I'm Quitting Computers" (Nov 18, 2018)
https://www.youtube.com/watch?v=w3oRJeReP8g

Unbox Therapy: "The Truth About The iPad Pro..." (Dec 5, 2018)
https://www.youtube.com/watch?v=JXqou3SVbMw

The Verge: "Tablet vs laptop" (Mar 22, 2018)
https://www.youtube.com/watch?v=Rm_zQP9JIJI

Marques Brownlee: "iPad Pro Review: The Best Ever... Still an iPad!" (Nov 14, 2018)
https://www.youtube.com/watch?v=N1e_voQvHYk

Engadget: "iPad Pro 2018 Review: Almost a laptop replacement" (Nov 6, 2018)
https://www.youtube.com/watch?v=jZzmMpP2BNw

Matthew Moniz: "iPad Pro 2018 - Overpowered Netflix Machine or Laptop Replacement?" (Nov 8, 2018)
https://www.youtube.com/watch?v=P0ZFlFG67kY

WSJ: "Can the New iPad Pro Be Your Only Computer?" (Nov 16, 2018)
https://www.youtube.com/watch?v=kMCyI-ymKfo
https://www.wsj.com/articles/apples-new-ipad-pro-great-tablet-still-cant-replace-your-laptop-1541415600

Ali Abdaal: "iPad vs Macbook for Students (2018) - Can a tablet replace your laptop?" (Oct 10, 2018)
https://www.youtube.com/watch?v=xIx2OQ6E6Mc

Washington Post: "Nope, Apple’s new iPad Pro still isn’t a laptop" (Nov 5, 2018)
https://www.washingtonpost.com/technology/2018/11/05/nope-apples-new-ipad-pro-still-isnt-laptop/

Canoopsy: "iPad Pro 2018 Review - My Student Perspective" (Nov 19, 2018)
https://www.youtube.com/watch?v=q4dgHuWBv14

Greg' Gadgets: "The iPad Pro (2018) CAN Replace Your Laptop!" (Nov 24, 2018)
https://www.youtube.com/watch?v=Y3SyXd04Q1E

Apple World: "iPad Pro has REPLACED my MacBook (my experience)" (May 9, 2018)
https://www.youtube.com/watch?v=vEu9Zf6AENU

Dave Lee: "iPad Pro 2018 - SUPER Fast, But Why?" (Nov 11, 2018)
https://www.youtube.com/watch?v=Aj6vXhN-g6k

Shahazad Bagwan: "A Week With iPad Pro // Yes It Replaced A Laptop!" (Oct 20, 2017)
https://www.youtube.com/watch?v=jhHwv9QsoP0

Apple's "Homework (Full Version)" iPad ad (Mar 27, 2018)
https://www.youtube.com/watch?v=IprmiOa2zH8

The Verge: "Intel's future computers have two screens" (Oct 18, 2018)
https://www.youtube.com/watch?v=deymf9CoY_M

"The Surface Book 2 is everything the MacBook Pro should be" (Jun 26, 208)
https://char.gd/blog/2018/the-surface-book-2-is-everything-the-macbook-pro-should-be-and-then-some

"Surface Go: the future PC that the iPad Pro failed to deliver" (Aug 27, 2018)
https://char.gd/blog/2018/surface-go-a-better-future-pc-than-the-ipad-pro

"Microsoft now has the best device lineup in the industry" (Oct 3, 2018)
https://char.gd/blog/2018/microsoft-has-the-best-device-lineup-in-the-industry ]
ipadpro  ipad  ios  computing  reneritchie  2018  computers  laptops  chromebooks  pixelslate  surfacego  microsoft  google  apple  android  microoftsurface  surface 
november 2018 by robertogreco
Silicon Valley Nannies Are Phone Police for Kids - The New York Times
[This is one of three connected articles:]

"Silicon Valley Nannies Are Phone Police for Kids
Child care contracts now demand that nannies hide phones, tablets, computers and TVs from their charges."
https://www.nytimes.com/2018/10/26/style/silicon-valley-nannies.html

"The Digital Gap Between Rich and Poor Kids Is Not What We Expected
America’s public schools are still promoting devices with screens — even offering digital-only preschools. The rich are banning screens from class altogether."
https://www.nytimes.com/2018/10/26/style/digital-divide-screens-schools.html

"A Dark Consensus About Screens and Kids Begins to Emerge in Silicon Valley
“I am convinced the devil lives in our phones.”"
https://www.nytimes.com/2018/10/26/style/phones-children-silicon-valley.html

[See also:
"What the Times got wrong about kids and phones"
https://www.cjr.org/criticism/times-silicon-valley-kids.php

https://twitter.com/edifiedlistener/status/1058438953299333120
"Now that I've had a chance to read this article [specifically: "The Digital Gap Between Rich and Poor Kids Is Not What We Expected"] and some others related to children and screen time and the wealthy and the poor, I have some thoughts. 1/

First, this article on the unexpected digital divide between rich and poor seems entirely incomplete. There is an early reference to racial differences in screen usage but in the article there are no voices of black or brown folks that I could detect. 2/

We are told a number of things: Wealthy parents are shunning screens in their children's lives, psychologists underscore the addictive nature of screen time on kids, and of course, whatever the short end of the stick is - poor kids get that. 3/

We hear "It could happen that the children of poorer and middle-class parents will be raised by screens," while wealthy kids will perhaps enjoy "wooden toys and the luxury of human interaction." 4/

Think about that and think about the stories that have long been told about poor families, about single parents, about poor parents of color - They aren't as involved in their kids' education, they are too busy working. Familiar stereotypes. 5/

Many of these judgments often don't hold up under scrutiny. So much depends upon who gets to tell those stories and how those stories are marketed, sold and reproduced. 6/

In this particular story about the privilege of being able to withdraw from or reduce screen time, we get to fall back into familiar narratives especially about the poor and non-elite. 7/

Of course those with less will be told after a time by those with much more - "You're doing it wrong." And "My child will be distinguished by the fact that he/she/they is not dependent on a device for entertainment or diversion." 8/

My point is not that I doubt the risks and challenges of excessive screen time for kids and adults. Our dependence on tech *is* a huge social experiment and the outcomes are looking scarier by the day. 9/

I do, however, resist the consistent need of the wealthy elite to seek ways to maintain their distance to the mainstream. To be the ones who tell us what's "hot, or not" - 10/

Chris Anderson points out "“The digital divide was about access to technology, and now that everyone has access, the new digital divide is limiting access to technology,” - 11/

This article and its recent close cousins about spying nannies in SV & more elite parent hand wringing over screen in the NYT feel like their own category of expensive PR work - again allowing SV to set the tone. 12/

It's not really about screens or damage to children's imaginations - it's about maintaining divides, about insuring that we know what the rich do (and must be correct) vs what the rest of us must manage (sad, bad). 13/fin]
siliconvalley  edtech  children  technology  parenting  2018  nelliebowles  addiction  psychology  hypocrisy  digitaldivide  income  inequality  ipads  smartphones  screentime  schools  education  politics  policy  rules  childcare  policing  surveillance  tracking  computers  television  tv  tablets  phones  mobile  teaching  learning  howwelearn  howweteach  anyakamenetz  sherrispelic  ipad 
october 2018 by robertogreco
Avery Trufelman - 99pi (Oakland) - YouTube
"The Way Things Live

As one of the staff producers for the design podcast 99% Invisible, Avery Trufelman spends most of her time considering the intentions behind inanimate objects. She finds stories hidden in products we encounter every day, like fire escapes and neon signs, as well as oddities and architectural outliers around the world, from art schools in Havana to garbage trucks in Taipei.

Her talk, "The Way Things Live," is a meditation of sorts—a reconsidering of the overlaps in some of the episodes she has made in the past three years. Design stories are human stories: the objects that we make are reflections of us, and they live existences parallel to ours. They fall in and out of favor with changing tastes and mores, in rich, changing narratives, until eventually, some outlive us all."

[See also:
"The Fancy Shape"
https://99percentinvisible.org/episode/the-fancy-shape/

"Octothorpe"
https://99percentinvisible.org/episode/octothorpe/ ]
averytrufelman  2016  design  symbols  shapes  iconographicdrift  architecture  history  99pi  hashtags  technology  telephones  computers  chrismessina  dougkerr  belllabs 
april 2018 by robertogreco
Tom Simonite on Twitter: "The cooling/support structure for IBM's 50 qubit quantum computer. Top level cools to 4 kelvin (-269 celsius); the shiny bit at the bottom w the chip inside reaches 10 millikelvin, colder than outer space. The superconducting dat
"The cooling/support structure for IBM's 50 qubit quantum computer. Top level cools to 4 kelvin (-269 celsius); the shiny bit at the bottom w the chip inside reaches 10 millikelvin, colder than outer space. The superconducting data lines have loops so they can shrink when cooled."
computers  quantumcomputers  classideas  2018  cooling 
february 2018 by robertogreco
No one’s coming. It’s up to us. – Dan Hon – Medium
"Getting from here to there

This is all very well and good. But what can we do? And more precisely, what “we”? There’s increasing acceptance of the reality that the world we live in is intersectional and we all play different and simultaneous roles in our lives. The society of “we” includes technologists who have a chance of affecting the products and services, it includes customers and users, it includes residents and citizens.

I’ve made this case above, but I feel it’s important enough to make again: at a high level, I believe that we need to:

1. Clearly decide what kind of society we want; and then

2. Design and deliver the technologies that forever get us closer to achieving that desired society.

This work is hard and, arguably, will never be completed. It necessarily involves compromise. Attitudes, beliefs and what’s considered just changes over time.

That said, the above are two high level goals, but what can people do right now? What can we do tactically?

What we can do now

I have two questions that I think can be helpful in guiding our present actions, in whatever capacity we might find ourselves.

For all of us: What would it look like, and how might our societies be different, if technology were better aligned to society’s interests?

At the most general level, we are all members of a society, embedded in existing governing structures. It certainly feels like in the recent past, those governing structures are coming under increasing strain, and part of the blame is being laid at the feet of technology.

One of the most important things we can do collectively is to produce clarity and prioritization where we can. Only by being clearer and more intentional about the kind of society we want and accepting what that means, can our societies and their institutions provide guidance and leadership to technology.

These are questions that cannot and should not be left to technologists alone. Advances in technology mean that encryption is a societal issue. Content moderation and censorship are a societal issue. Ultimately, it should be for governments (of the people, by the people) to set expectations and standards at the societal level, not organizations accountable only to a board of directors and shareholders.

But to do this, our governing institutions will need to evolve and improve. It is easier, and faster, for platforms now to react to changing social mores. For example, platforms are responding in reaction to society’s reaction to “AI-generated fake porn” faster than governing and enforcing institutions.

Prioritizations may necessarily involve compromise, too: the world is not so simple, and we are not so lucky, that it can be easily and always divided into A or B, or good or not-good.

Some of my perspective in this area is reflective of the schism American politics is currently experiencing. In a very real way, America, my adoptive country of residence, is having to grapple with revisiting the idea of what America is for. The same is happening in my country of birth with the decision to leave the European Union.

These are fundamental issues. Technologists, as members of society, have a point of view on them. But in the way that post-enlightenment governing institutions were set up to protect against asymmetric distribution of power, technology leaders must recognize that their platforms are now an undeniable, powerful influence on society.

As a society, we must do the work to have a point of view. What does responsible technology look like?

For technologists: How can we be humane and advance the goals of our society?

As technologists, we can be excited about re-inventing approaches from first principles. We must resist that impulse here, because there are things that we can do now, that we can learn now, from other professions, industries and areas to apply to our own. For example:

* We are better and stronger when we are together than when we are apart. If you’re a technologist, consider this question: what are the pros and cons of unionizing? As the product of a linked network, consider the question: what is gained and who gains from preventing humans from linking up in this way?

* Just as we create design patterns that are best practices, there are also those that represent undesired patterns from our society’s point of view known as dark patterns. We should familiarise ourselves with them and each work to understand why and when they’re used and why their usage is contrary to the ideals of our society.

* We can do a better job of advocating for and doing research to better understand the problems we seek to solve, the context in which those problems exist and the impact of those problems. Only through disciplines like research can we discover in the design phase — instead of in production, when our work can affect millions — negative externalities or unintended consequences that we genuinely and unintentionally may have missed.

* We must compassionately accept the reality that our work has real effects, good and bad. We can wish that bad outcomes don’t happen, but bad outcomes will always happen because life is unpredictable. The question is what we do when bad things happen, and whether and how we take responsibility for those results. For example, Twitter’s leadership must make clear what behaviour it considers acceptable, and do the work to be clear and consistent without dodging the issue.

* In America especially, technologists must face the issue of free speech head-on without avoiding its necessary implications. I suggest that one of the problems culturally American technology companies (i.e., companies that seek to emulate American culture) face can be explained in software terms. To use agile user story terminology, the problem may be due to focusing on a specific requirement (“free speech”) rather than the full user story (“As a user, I need freedom of speech, so that I can pursue life, liberty and happiness”). Free speech is a means to an end, not an end, and accepting that free speech is a means involves the hard work of considering and taking a clear, understandable position as to what ends.

* We have been warned. Academics — in particular, sociologists, philosophers, historians, psychologists and anthropologists — have been warning of issues such as large-scale societal effects for years. Those warnings have, bluntly, been ignored. In the worst cases, those same academics have been accused of not helping to solve the problem. Moving on from the past, is there not something that we technologists can learn? My intuition is that post the 2016 American election, middle-class technologists are now afraid. We’re all in this together. Academics are reaching out, have been reaching out. We have nothing to lose but our own shame.

* Repeat to ourselves: some problems don’t have fully technological solutions. Some problems can’t just be solved by changing infrastructure. Who else might help with a problem? What other approaches might be needed as well?

There’s no one coming. It’s up to us.

My final point is this: no one will tell us or give us permission to do these things. There is no higher organizing power working to put systemic changes in place. There is no top-down way of nudging the arc of technology toward one better aligned with humanity.

It starts with all of us.

Afterword

I’ve been working on the bigger themes behind this talk since …, and an invitation to 2017’s Foo Camp was a good opportunity to try to clarify and improve my thinking so that it could fit into a five minute lightning talk. It also helped that Foo Camp has the kind of (small, hand-picked — again, for good and ill) influential audience who would be a good litmus test for the quality of my argument, and would be instrumental in taking on and spreading the ideas.

In the end, though, I nearly didn’t do this talk at all.

Around 6:15pm on Saturday night, just over an hour before the lightning talks were due to start, after the unconference’s sessions had finished and just before dinner, I burst into tears talking to a friend.

While I won’t break the societal convention of confidentiality that helps an event like Foo Camp be productive, I’ll share this: the world felt too broken.

Specifically, the world felt broken like this: I had the benefit of growing up as a middle-class educated individual (albeit, not white) who believed he could trust that institutions were a) capable and b) would do the right thing. I now live in a country where a) the capability of those institutions has consistently eroded over time, and b) those institutions are now being systematically dismantled, to add insult to injury.

In other words, I was left with the feeling that there’s nothing left but ourselves.

Do you want the poisonous lead removed from your water supply? Your best bet is to try to do it yourself.

Do you want a better school for your children? Your best bet is to start it.

Do you want a policing policy that genuinely rehabilitates rather than punishes? Your best bet is to…

And it’s just. Too. Much.

Over the course of the next few days, I managed to turn my outlook around.

The answer, of course, is that it is too much for one person.

But it isn’t too much for all of us."
danhon  technology  2018  2017  johnperrybarlow  ethics  society  calltoaction  politics  policy  purpose  economics  inequality  internet  web  online  computers  computing  future  design  debchachra  ingridburrington  fredscharmen  maciejceglowski  timcarmody  rachelcoldicutt  stacy-marieishmael  sarahjeong  alexismadrigal  ericmeyer  timmaughan  mimionuoha  jayowens  jayspringett  stacktivism  georginavoss  damienwilliams  rickwebb  sarawachter-boettcher  jamebridle  adamgreenfield  foocamp  timoreilly  kaitlyntiffany  fredturner  tomcarden  blainecook  warrenellis  danhill  cydharrell  jenpahljka  robinray  noraryan  mattwebb  mattjones  danachisnell  heathercamp  farrahbostic  negativeexternalities  collectivism  zeyneptufekci  maciejcegłowski 
february 2018 by robertogreco
Children are tech addicts – and schools are the pushers | Eliane Glaser | Opinion | The Guardian
"As a culture, we are finally waking up to the dark side of new technology. “The internet is broken”, declares the current issue of Wired, the tech insiders’ bible. Last month Rick Webb, an early digital investor, posted a blog titled “My internet mea culpa”. “I was wrong,” he wrote. “We all were.” He called on the architects of the web to admit that new technology had brought more harm than good.

Yet while geeks, the public and politicians – including Theresa May – grow disenchanted, schools, and those responsible for the national curriculum, seem stuck in an earlier wide-eyed era. My instinct tells me that this innocence is perverse. As a friend memorably described it, when he gave his three-year-old his phone to play with, it was as if a worm had found its way into her head.

I flinch internally when my five-year-old tells me she plays computer games in what primary schools call “golden time” rather than enjoying some other more wholesome reward; and when my eight-year-old says that he’s learned to send an email when I sent my first email aged 20, and email has since taken over my life and that of every other adult I know.

Our kids don’t use computers at home. They watch a bit of television, but we don’t own a tablet. Their school is by no means evangelical about technology, but I nonetheless feel like it is playing the role of pusher, and I’m watching my children get hooked. When they went suspiciously quiet the other day, I found them under the kitchen table trying to explore my phone. Unfortunately for them, it’s a brick.

I’m wary of sounding sanctimonious, and corroding much-needed solidarity between busy parents with different views on screen use. But when I see an infant jabbing and swiping, I can’t help experiencing what the writer James Bridle calls in a disturbing recent essay a “Luddite twinge”; and the research suggests I should trust it.

Earlier this month the children’s commissioner for England warned that children starting secondary school were facing a social media “cliff edge” as they entered an online world of cyber-bullying and pornography. According to Public Health England, extended screen use correlates to emotional distress, anxiety and depression in children. The American College of Paediatricians associates it with sleep problems, obesity, increased aggression and low self-esteem.

And not only is screen technology harmful to children per se, there’s little evidence that it helps them to learn. A 2015 OECD report found that the impact of computers on pupil performance was “mixed, at best”, and in most cases computers were “hurting learning”. The journal Frontiers in Psychology identifies “an absence of research supporting the enthusiastic claims that iPads will ‘revolutionise education’”. Researchers at Durham University found that “technology-based interventions tend to produce just slightly lower levels of improvement” compared with other approaches. Even for the head of the e-Learning Foundation, proving technology improves results remains the “holy grail”.

Education technology is often justified on the grounds that it boosts disadvantaged children, yet research shows it widens rather than bridges socioeconomic divides. The One Laptop per Child programme, which distributed 25m low-cost computers with learning software to children in the developing world, failed to improve language or maths results.

Such evidence does not dent the faith of ed tech’s proselytisers. Children need to be prepared for the future, we are told. But companies don’t want children who learned PowerPoint aged 10, they want employees who know how to think from first principles. All those mind-numbing software programs will soon be obsolete anyway. Most coding classes only teach children to assemble pre-made building blocks. Silicon Valley executives restrict their own social media use and send their own kids to tech-free schools.

Technology does not evolve naturally; programs and devices are promoted by those with a commercial interest in selling them. Ed tech is projected to be worth £129bn by 2020. This week, the world’s biggest ed tech convention, Bett, is in London, “Creating a better future by transforming education”. Google, Microsoft and Facebook are flogging expensive kit to cash-strapped schools using buzzwords such as “engagement” and “interactivity”. The traditional teacher-pupil hierarchy must be “flipped”, they say, “empowering” pupils to direct their own learning.

In reality, children tap on tablets whose inner workings are as arcane and mystical to them as any authoritarian deity – and stare, blinds down, at the giant interactive whiteboard. Children may be temporarily gripped, but their attention spans will shrink in the long term.

Cyber-utopianism promises magic bullets for poverty and the crooked timber of humanity. But it’s old-school solutions that really work in the classroom: good teachers, plenty of fresh air and exercise, and hands-on exploration of the real, physical world. This is even what “digital natives” themselves actually want: a Canadian study of e-learning in universities revealed that students preferred “ordinary, real-life lessons” and “a smart person at the front of the room”.

I don’t want my kids fed into the sausage machine of standardised testing and the bureaucratic “information economy”. I don’t want them to become robotic competitors to the robots we are told are taking their future jobs. I can opt my children out of RE, but where technology is concerned, I feel bound by a blind determinism. Surely we have a choice, as humans, over the direction technology is taking us, and education is the perfect illustration of this capacity. Our children turn up as blank slates, and learn to design the future. It’s time for schools to join the backlash. It’s time to think again."
technology  edtech  schools  education  policy  addiction  computers  tablets  curriculum  2018  elianeglaser  standardizedtesting  standardization  digitalnatives  digital  humanism  siliconvalley 
january 2018 by robertogreco
Bat, Bean, Beam: Inside the Personal Computer
"The inside of a computer looks a bit like a city, its memory banks and I/O devices rising like buildings over the avenues of soldered circuits. But then so do modern cities resemble motherboards, especially at night, when the cars sparkle like point-to-point signal carriers travelling along the grid. It is a well-worn visual metaphor in films and advertising, suggesting that the nerve centres of business and finance have come to resemble the information infrastructure that sustains them. Besides, isn’t the city at the sharp edge of the late capitalist era above all a generator of symbols?

And yet this technology with which we are so intimate, and that more than any other since the invention of writing has extended us, remains mostly opaque to us. Why would anyone bother to learn what digital machines look like on the inside? What difference would it make, when the uses we make of them are so incommensurate with this trivial knowledge?

I like pop-up books, and early pop-up books about the inner workings of computers have become obsolete in an interesting way. They are the last thing we would think to use to demonstrate such knowledge nowadays. They are so prone to jamming or coming apart. They have none of the grace and smoothness that our devices aspire to.

The centre piece of Sharon Gallagher’s Inside the Personal Computer – An illustrated Introduction in 3 Dimensions (1984) is the machine itself, complete with keyboard and floppy disk drive.

If you push the disk inside its unit and lower the flap, a Roman blind-like mechanism changes the message on the screen from INSERT DISK AND CLOSE DOWN to HELLO: THIS BOOK EXPLAINS WHAT I AM AND HOW I WORK. BY THE END YOU’LL KNOW ME INSIDE OUT.

It’s a neat trick. But the book is at its best when it gets into the basics of how transistors work, or uses wheels to explain how to translate a number into binary code, or a typed character first into ASCII, then into its binary equivalent.

Or simply what happens when you type “M”.

There is the mechanical action that alienates us from the digital word. Writing technologized language but still allowed us to write in our own hand, whereas there is simply no way of typing gracefully. Any M is like any other M, and even if we choose a fancy font the translation from the essential M (ASCII code 77) to the fancy M happens inside the computer and in code. This is not a ‘bad thing’. It’s just the state of the tools of our culture, which require a different kind of practice.
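[The translation the book demonstrates with cardboard wheels is easy to replay in a terminal; two one-liners, offered only as a sketch:

printf '%d\n' "'M"          # character to ASCII code: prints 77
echo 'obase=2; 77' | bc     # decimal to binary: prints 1001101
]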

The other thing that this book makes clear is that the personal computer hasn’t changed very much at all since 1984. Its component parts are largely unchanged: a motherboard, a central processing unit, RAM and ROM, I/O ports. Floppy disks have become USB sticks, while hard drives – which boasted at the time ‘between 5 and 50 megabytes of information – the equivalent of between 3,000 and 30,000 typewritten pages' – have fewer moving parts. But their function is the same as in the early models. Ditto the monitors, which have become flatter, and in colour. Even the mouse already existed, although back then its name still commanded inverted commas. Today’s computers, then, are a great deal more powerful, but otherwise fairly similar to what they were like three and a half decades ago. What makes them unrecognisable is that they’re all connected. And for that – for the internet – it makes even less sense to ‘take a look inside’. Inside what? Does the internet reside in the telephone exchange, or at the headquarters of ICANN, or where else?

The inside of a computer looks a bit like a city, but it’s an alien city. None of its buildings have doors or windows. The roads are made not of stone or asphalt but of plastic and metal.

The pictures above, by the way, show the guts of mine, which I recently upgraded. It’s what I used to write this blog and everything else from 2010 to June of this year, but I feel no attachment to it – it would be silly to.

There are guides on the web to help you mine your old computer for gold using household chemicals. They come with bold-type warnings about how toxic the process is. But in fact computers are hazardous both to manufacture and to dismantle. Waste materials from all the PCs and assorted electronic devices discarded since 1984 have created massively polluted districts and cities in the global south. Places like the Agbogbloshie district of Accra, Ghana, and countless others. Vast dumping sites that are mined for scraps of precious metals as much as for the personal information left on the hard drives, while leaching chemicals into the local water supply.

This would be a more meaningful inside into which to peer if we want to understand how computers work, and their effect on the world’s societies. One effect of globalisation has been to displace human labour. Not eliminate it, far from it, but rather create the illusion in the most advanced nations that manufacturing jobs have disappeared, and meaningful work consists in either farming the land or providing services. Automation has claimed many of those jobs, of course, but others have simply shifted away from the centres where most of the consumption takes place. This is another way in which the computer has become a mysterious machine: because no-one you know makes them.

Inside the Personal Computer was written 33 years ago in an effort to demystify an object that would soon become a feature in every household, and change everyone’s life. On the last page, it is no longer the book that ‘speaks’ to the reader, as in the first pop-up, but the computer itself. Its message is perfectly friendly but in hindsight more than a little eerie."
giovannitiso  computers  computing  2017  globalization  labor  hardware  geopolitics  economics  pop-upbooks  1984  sharongallagher  writing  technology  digital  physical  icann  ascii  accra  ghana  objects  environment  sustainability  ecology
november 2017 by robertogreco
James Ryan on Twitter: "Happenthing On Travel On (1975) is a novel that integrates prose, source code, computer-generated text, and glitch art, to rhetorical effect https://t.co/Ex9zItG3xt"
"Happenthing On Travel On (1975) is a novel that integrates prose, source code, computer-generated text, and glitch art, to rhetorical effect"

[via: https://twitter.com/tealtan/status/892523355794001920 ]

"instead of making exaggerated claims about the creative (or even collaborative) role of the computer, she describes it as an expressive tool"
https://twitter.com/xfoml/status/892169553806901249

"Carole Spearin McCauley should be better recognized as a major innovator in the early period of expressive computing"
https://twitter.com/xfoml/status/892170816623751168
novels  writing  computing  computers  prose  code  coding  computer-generatedtext  text  glitchart  1975  carolespearinmccauley  collaboration  cyborgs 
august 2017 by robertogreco
Doug Engelbart, transcontextualist | Gardner Writes
"I’ve been mulling over this next post for far too long, and the results will be brief and rushed (such bad food, and such small portions!). You have been warned.

The three strands, or claims I’m engaging with (EDIT: I’ve tried to make things clearer and more parallel in the list below):

1. The computer is “just a tool.” This part’s in partial response to the comments on my previous post. [http://www.gardnercampbell.net/blog1/?p=2158 ]

2. Doug Engelbart’s “Augmenting Human Intellect: A Conceptual Framework” [http://www.dougengelbart.org/pubs/augment-3906.html ] is “difficult to understand” or “poorly written.” This one’s a perpetual reply. 🙂 It was most recently triggered by an especially perplexing Twitter exchange shared with me by Jon Becker.

3. Engelbart’s ideas regarding the augmentation of human intellect aim for an inhuman and inhumane parsing of thought and imagination, an “efficiency expert” reduction of the richness of human cognition. This one tries to think about some points raised in the VCU New Media Seminar this fall.

These are the strands. The weave will be loose. (Food, textiles, textures, text.)

1. There is no such thing as “just a tool.” McLuhan wisely notes that tools are not inert things to be used by human beings, but extensions of human capabilities that redefine both the tool and the user. A “tooler” results, or perhaps a “tuser” (pronounced “TOO-zer”). I believe those two words are neologisms but I’ll leave the googling as an exercise for the tuser. The way I used to explain this in my new media classes was to ask students to imagine a hammer lying on the ground and a person standing above the hammer. The person picks up the hammer. What results? The usual answers are something like “a person with a hammer in his or her hand.” I don’t hold much with the elicit-a-wrong-answer-then-spring-the-right-one-on-them school of “Socratic” instruction, but in this case it was irresistible and I tried to make a game of it so folks would feel excited, not tricked. “No!” I would cry. “The result is a HammerHand!” This answer was particularly easy to imagine inside Second Life, where metaphors become real within the irreality of a virtual landscape. In fact, I first came up with the game while leading a class in Second Life–but that’s for another time.

So no “just a tool,” since a HammerHand is something quite different from a hammer or a hand, or a hammer in a hand. It’s one of those small but powerful points that can make one see the designed built world, a world full of builders and designers (i.e., human beings), as something much less inert and “external” than it might otherwise appear. It can also make one feel slightly deranged, perhaps usefully so, when one proceeds through the quotidian details (so-called) of a life full of tasks and taskings.

To complicate matters further, the computer is an unusual tool, a meta-tool, a machine that simulates any other machine, a universal machine with properties unlike any other machine. Earlier in the seminar this semester a sentence popped out of my mouth as we talked about one of the essays–“As We May Think”? I can’t remember now: “This is your brain on brain.” What Papert and Turkle refer to as computers’ “holding power” is not just the addictive cat videos (not that there’s anything wrong with that, I imagine), but something weirdly mindlike and reflective about the computer-human symbiosis. One of my goals continues to be to raise that uncanny holding power into a fuller (and freer) (and more metaphorical) (and more practical in the sense of able-to-be-practiced) mode of awareness so that we can be more mindful of the environment’s potential for good and, yes, for ill. (Some days, it seems to me that the “for ill” part is almost as poorly understood as the “for good” part, pace Morozov.)

George Dyson writes, “The stored-program computer, as conceived by Alan Turing and delivered by John von Neumann, broke the distinction between numbers that mean things and numbers that do things. Our universe would never be the same” (Turing’s Cathedral: The Origins of the Digital Universe). This is a very bold statement. I’ve connected it with everything from the myth of Orpheus to synaesthetic environments like the one @rovinglibrarian shared with me in which one can listen to, and visualize, Wikipedia being edited. Thought vectors in concept space, indeed. The closest analogies I can find are with language itself, particularly the phonetic alphabet.
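
[A toy illustration of Dyson’s point, added in Python by the editor: in a stored-program machine one memory holds both numbers that mean things (values) and numbers that do things (instructions). The two opcodes below are invented for the sketch, not any real instruction set.]

```python
# One shared store: some numbers are instructions, some are data.
memory = [1, 5, 1, 7, 0]   # opcode 1 = "add next number"; opcode 0 = "halt"

def run(mem):
    acc, pc = 0, 0
    while mem[pc] != 0:        # this number *does* something: it halts
        op, arg = mem[pc], mem[pc + 1]
        if op == 1:
            acc += arg         # this number *means* something: a value
        pc += 2
    return acc

print(run(memory))  # 12
```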

The larger point is now at the ready: in fullest practice and perhaps even for best results, particularly when it comes to deeper learning, it may well be that nothing is just anything. Bateson describes the moment in which “just a” thing becomes far more than “just a” thing as a “double take.” For Bateson, the double take bears a thrilling and uneasy relationship to the double bind, as well as to some kinds of derangement that are not at all beneficial. (This is the double-edged sword of human intellect, a sword that sometimes has ten edges or more–but I digress.) This double take (the kids call it, or used to call it, “wait what?”) indicates a moment of what Bateson calls “transcontextualism,” a paradoxical level-crossing moment (micro to macro, instance to meta, territory to map, or vice-versa) that initiates or indicates (hard to tell) deeper learning.
It seems that both those whose life is enriched by transcontextual gifts and those who are impoverished by transcontextual confusions are alike in one respect: for them there is always or often a “double take.” A falling leaf, the greeting of a friend, or a “primrose by the river’s brim” is not “just that and nothing more.” Exogenous experience may be framed in the contexts of dream, and internal thought may be projected into the contexts of the external world. And so on. For all this, we seek a partial explanation in learning and experience. (“Double Bind, 1969,” in Steps to an Ecology of Mind, U Chicago Press, 2000, p. 272). (EDIT: I had originally typed “eternal world,” but Bateson writes “external.” It’s an interesting typo, though, so I remember it here.)


It does seem to me, very often, that we do our best to purge our learning environments of opportunities for transcontextual gifts to emerge. This is understandable, given how bad and indeed “unproductive” (by certain lights) the transcontextual confusions can be. No one enjoys the feeling of falling, unless there are environments and guides that can make the falling feel like flying–more matter for another conversation, and a difficult art indeed, and one that like all art has no guarantees (pace Madame Tussaud).

2. So now the second strand, regarding Engelbart’s “Augmenting Human Intellect: A Conceptual Framework.” Much of this essay, it seems to me, is about identifying and fostering transcontextualism (transcontextualization?) as a networked activity in which both the individual and the networked community recognize the potential for “bootstrapping” themselves into greater learning through the kind of level-crossing Bateson imagines (Douglas Hofstadter explores these ideas too, particularly in I Am A Strange Loop and, it appears, in a book Tom Woodward is exploring and brought to my attention yesterday, Surfaces and Essences: Analogy as the Fuel and Fire of Thinking. That title alone makes the recursive point very neatly). So when Engelbart switches modes from engineering-style-specification to the story of bricks-on-pens to the dialogue with “Joe,” he seems to me not to be willful or even prohibitively difficult (though some of the ideas are undeniably complex). He seems to me to be experimenting with transcontextualism as an expressive device, an analytical strategy, and a kind of self-directed learning, a true essay: an attempt:

And by “complex situations” we include the professional problems of diplomats, executives, social scientists, life scientists, physical scientists, attorneys, designers–whether the problem situation exists for twenty minutes or twenty years.

A list worthy of Walt Whitman, and one that explicitly (and for me, thrillingly) crosses levels and enacts transcontextualism.

Here’s another list, one in which Engelbart tallies the range of “thought kernels” he wants to track in his formulative thinking (one might also say, his “research”):

The “unit records” here, unlike those in the Memex example, are generally scraps of typed or handwritten text on IBM-card-sized edge-notchable cards. These represent little “kernels” of data, thought, fact, consideration, concepts, ideas, worries, etc., that are relevant to a given problem area in my professional life.

Again, the listing enacts a principle: we map a problem space, a sphere of inquiry, along many dimensions–or we should. Those dimensions cross contexts–or they should. To think about this in terms of language for a moment, Engelbart’s idea seems to be that we should track our “kernels” across the indicative, the imperative, the subjunctive, the interrogative. To put it another way, we should be mindful of, and somehow make available for mindful building, many varieties of cognitive activity, including affect (which can be distinguished but not divided from cognition).

3. I don’t think this activity increases efficiency, if efficiency means “getting more done in less time.” (A “cognitive Taylorism,” as one seminarian put it.) More what is always the question. For me, Engelbart’s transcontextual gifts (and I’ll concede that there are likely transcontextual confusions in there too–it’s the price of transcontextualism, clearly) are such that the emphasis lands squarely on effectiveness, which in his essay means more work with positive potential (understanding there’s some disagreement but not total disagreement about… [more]
dougengelbart  transcontextualism  gardnercampbell  2013  gregorybateson  marshallmcluhan  socraticmethod  education  teaching  howweteach  howwelearn  learning  hammerhand  technology  computers  computing  georgedyson  food  textiles  texture  text  understanding  tools  secondlife  seymourpapert  sherryturkle  alanturing  johnvonneumann  doublebind  waltwhitman  memex  taylorism  efficiency  cognition  transcontextualization 
july 2017 by robertogreco
The Edgeless & Ever-Shifting Gradient: An Encyclopaedic and Evolving Spectrum of Gradient Knowledge
"A gradient, without restriction, is edgeless and ever-shifting. A gradient moves, transitions, progresses, defies being defined as one thing. It formalizes difference across a distance. It’s a spectrum. It’s a spectral smearing. It’s an optical phenomenon occurring in nature. It can be the gradual process of acquiring knowledge. It can be a concept. It can be a graphic expression. It can be all of the above, but likely it’s somewhere in between.

A gradient, in all of its varied forms, becomes a catalyst in its ability to seamlessly blend one distinct thing/idea/color, to the next distinct thing/idea/color, to the next, etc.
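
[One hedged way to read that in code, added by the editor: a short Python sketch that “formalizes difference across a distance” by linearly interpolating from one color to the next.]

```python
# Blend one distinct color into the next over a fixed number of steps.

def lerp(a: float, b: float, t: float) -> float:
    return a + (b - a) * t

def gradient(c1, c2, steps):
    """Yield RGB triples moving smoothly from c1 to c2 (steps >= 2)."""
    for i in range(steps):
        t = i / (steps - 1)
        yield tuple(round(lerp(a, b, t)) for a, b in zip(c1, c2))

print(list(gradient((255, 0, 0), (0, 0, 255), 5)))
# [(255, 0, 0), (191, 0, 64), (128, 0, 128), (64, 0, 191), (0, 0, 255)]
```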

In this sense, it is the gradient and the way it performs that has become a model and an underlying ethos, naturally, for this online publishing initiative that we call The Gradient.

Similarly, it’s our hope that this post—an attempt to survey gradients of all forms and to expand our own understanding of gradients—will also be edgeless and ever-shifting. This post will evolve and be progressively added to in an effort to create, as the subtitle says, an encyclopaedic and evolving spectrum of gradient knowledge."
gradients  art  2017  ryangeraldnelson  color  blending  spectrums  nature  design  gender  genderfluidity  computers  music  photography  graphics  graphicdesign  thermography  iridescence  birds  animals  insects  snakes  cephalopods  reptiles  chameleons  rainbows  sky  math  mathematics  taubaauerbach  science  tomássaraceno  vision  brycewilner  alruppersberg  germansermičs  glass  ignazschiffermüller  lizwest  markhagen  ombré  rawcolor  samfall
july 2017 by robertogreco
The History of Ed-Tech: What Went Wrong?
"There’s a popular origin story about education technology: that, it was first developed and adopted by progressive educators, those interested in “learning by doing” and committed to schools as democratic institutions. Then, something changed in the 1980s (or so): computers became commonplace, and ed-tech became commodified – built and sold by corporations, not by professors or by universities. Thus the responsibility for acquiring classroom technology and for determining how it would be used shifted from a handful of innovative educators (often buying hardware and software with their own money) to school administration; once computers were networked, the responsibility shifted to IT. The purpose of ed-tech shifted as well – from creative computing to keyboarding, from projects to “productivity.” (And I’ll admit. I’m guilty of having repeated some form of this narrative myself.)

[tweet: "What if the decentralized, open web was a historical aberration, an accident between broadcast models, not an ideal that was won then lost?"
https://twitter.com/ibogost/status/644994975797805056 ]

But what if, to borrow from Ian Bogost, “progressive education technology” – the work of Seymour Papert, for example – was a historical aberration, an accident between broadcast models, not an ideal that was won then lost?

There’s always a danger in nostalgia, when one invents a romanticized past – in this case, a once-upon-a-time when education technology was oriented towards justice and inquiry before it was re-oriented towards test scores and flash cards. But rather than think about “what went wrong,” it might be useful to think about what was wrong all along.

Although Papert was no doubt a pioneer, he wasn’t the first person to recognize the potential for computers in education. And he was hardly alone in the 1960s and 1970s in theorizing or developing educational technologies. There was Patrick Suppes at Stanford, for example, who developed math instruction software for IBM mainframes and who popularized what became known as “computer-assisted instruction.” (Arguably, Papert refers to Suppes’ work in Mindstorms when he refers to “the computer being used to program the child” rather than his own vision of the child programming the computer.)

Indeed, as I’ve argued repeatedly, the history of ed-tech dates at least as far back as the turn of the twentieth century and the foundation of the field of educational psychology. Much of what we see in ed-tech today reflects those origins – the work of psychologist Sidney Pressey, the work of psychologist B. F. Skinner, the work of psychologist Edward Thorndike. It reflects those origins because, as historian Ellen Condliffe Lagemann has astutely observed, “One cannot understand the history of education in the United States during the twentieth century unless one realizes that Edward L. Thorndike won and John Dewey lost.”

Ed-tech has always been more Thorndike than Dewey because education has been more Thorndike than Dewey. That means more instructivism than constructionism. That means more multiple choice tests than projects. That means more surveillance than justice.
(How Thorndike's ed-tech is now being rebranded as “personalization” (and by extension, as progressive education) – now that's an interesting story...)"

[via: ""Edward L. Thorndike won and John Dewey lost" is pretty much the perfect tl;dr version of the history of education."
https://twitter.com/jonbecker/status/884460561584594944

See also: "Or David Snedden won. People forget about him."
https://twitter.com/doxtdatorb/status/884520604287860736 ]
audreywatters  ianbogost  johndewey  seymourpapert  edtech  computers  technology  education  ellencondliffe  edwardthorndike  bfskinner  sidneypressey  psychology  management  administration  it  patricksuppes  constructivism  constructionism  progressive  mindstorms  progressiveeducation  standardization  personalization  instructivism  testing  davidsnedden  history 
july 2017 by robertogreco
Why I am NOT Going to Buy a Computer - Wendell Berry
"Like almost everybody else, I am hooked to the energy corporations, which I do not admire. I hope to become less hooked to them. In my work, I try to be as little hooked to them as possible. As a farmer, I do almost all of my work with horses. As a writer, I work with a pencil or a pen and a piece of paper.

My wife types my work on a Royal standard typewriter bought new in 1956 and as good now as it was then. As she types, she sees things that are wrong and marks them with small checks in the margins. She is my best critic because she is the one most familiar with my habitual errors and weaknesses. She also understands, sometimes better than I do, what ought to be said. We have, I think, a literary cottage industry that works well and pleasantly. I do not see anything wrong with it.

A number of people, by now, have told me that I could greatly improve things by buying a computer. My answer is that I am not going to do it. I have several reasons, and they are good ones.

The first is the one I mentioned at the beginning. I would hate to think that my work as a writer could not be done without a direct dependence on strip-mined coal. How could I write conscientiously against the rape of nature if I were, in the act of writing, implicated in the rape? For the same reason, it matters to me that my writing is done in the daytime, without electric light.

I do not admire the computer manufacturers a great deal more than I admire the energy industries. I have seen their advertisements, attempting to seduce struggling or failing farmers into the belief that they can solve their problems by buying yet another piece of expensive equipment. I am familiar with their propaganda campaigns that have put computers into public schools in need of books. That computers are expected to become as common as TV sets in "the future" does not impress me or matter to me. I do not own a TV set. I do not see that computers are bringing us one step nearer to anything that does matter to me: peace, economic justice, ecological health, political honesty, family and community stability, good work.

What would a computer cost me? More money, for one thing, than I can afford, and more than I wish to pay to people whom I do not admire. But the cost would not be just monetary. It is well understood that technological innovation always requires the discarding of the "old model"—the "old model" in this case being not just our old Royal standard, but my wife, my critic, closest reader, my fellow worker. Thus (and I think this is typical of present-day technological innovation), what would be superseded would be not only something, but somebody. In order to be technologically up-to-date as a writer, I would have to sacrifice an association that I am dependent upon and that I treasure.

My final and perhaps my best reason for not owning a computer is that I do not wish to fool myself. I disbelieve, and therefore strongly resent, the assertion that I or anybody else could write better or more easily with a computer than with a pencil. I do not see why I should not be as scientific about this as the next fellow: when somebody has used a computer to write work that is demonstrably better than Dante's, and when this better is demonstrably attributable to the use of a computer, then I will speak of computers with a more respectful tone of voice, though I still will not buy one.

To make myself as plain as I can, I should give my standards for technological innovation in my own work. They are as follows:

1. The new tool should be cheaper than the one it replaces.
2. It should be at least as small in scale as the one it replaces.
3. It should do work that is clearly and demonstrably better than the one it replaces.
4. It should use less energy than the one it replaces.
5. If possible, it should use some form of solar energy, such as that of the body.
6. It should be repairable by a person of ordinary intelligence, provided that he or she has the necessary tools.
7. It should be purchasable and repairable as near to home as possible.
8. It should come from a small, privately owned shop or store that will take it back for maintenance and repair.
9. It should not replace or disrupt anything good that already exists, and this includes family and community relationships."
computers  ethics  politics  technology  tools  wendellberry  via:austinkleon  1987 
february 2017 by robertogreco
Robin Hunicke Wants to Change Video Games, But She Can’t Do It Alone | VICE | United Kingdom
""As a developer, it's my job to evangelise the games that I think are different, that are doing new things. And when they come out, I want everyone I know to know about them. But it'd be really awesome if we could somehow give away space, or create platforms of promotion, that were just about innovation."

Robin Hunicke knows a thing or two about going against what the gaming public might perceive as the stylistic grain, the marketable middle-ground, sales-numbers safe spaces of play. Having worked on MySims and Boom Blox at Electronic Arts, the San Francisco-based game designer (and professor of game design, at the University of California, Santa Cruz) moved to thatgamecompany, where she produced Journey. Perhaps you heard of it, as it was kind of a big deal.

Journey was a critical and commercial success that arrived without much in the way of how-it-works precedent, playing like nothing most who picked it up had seen before. A multiplayer game in which human-to-human interactions were all but stripped away. A short experience, coming in at under 90 minutes from start to finish, but with lasting, memory-making resonance. A story told only one way, yet left open to all manner of individual interpretation. Journey earned rave reviews and collected all manner of industry awards (dominating the 2013 Game Developers Choice Awards), and broke PlayStation Network sales records."



"Quite where Woorld fits in the wider gaming landscape, though, is hard to get a handle on. This is a new tool, a new toy, for new technology, coming through at a time when augmented reality is enjoying a spell of popularity courtesy of Pokémon Go; but without that kind of massive IP to drive its marketing, merely the wonderfully colourful and somewhat surreal visuals that Takahashi's fans have come to expect, it's not like the game is about to follow in said record-setter's monstrous footsteps. And Robin sees this problem of visibility, of public accessibility and appetite for something left of the expected, not just as a headache for Funomena, but everyone making games outside of the triple-A sphere.

"There are a lot of very different games out there, but as an industry we're not so good at presenting that in our marketing, and our PR, and the stories that we write. And that's what really carries this medium forward – how people perceive it. How many games get covered that are radically different from whatever else is around? How many of them are featured on the front of the online stores? How many are appearing in top ten lists? When your top ten lists are based on sales, and sales are influenced by marketing budgets, you're only rarely going to see a Papers, Please or That Dragon, Cancer, or Firewatch or even The Witness, right up there amongst the most popular games. And I think it's on us to change that.

"Wouldn't it be great if anyone could make the next Minecraft, or the next Journey, or the next Papers, Please? If independently made games keep growing in popularity, and we keep on expanding the marketplace – because there are a lot of people right now who don't play indie games at all – then that'd be amazing. Let's do that."

One way of doing that would be for distribution channels to place greater emphasis on highlighting experiences that are so far from the norm. "Why not have an innovation tab in online stores?" she asks, rhetorically. "Maybe the labels we're using are out of date. What are they, like, twenty, thirty years old?"

Those labels don't just mean "action", "adventure", "puzzle" or "sports"; Robin's talking about the language that flows through every way that the gaming industry presents itself, how it reflects and addresses issues that eat away at its insides.

"I've been to the White House on this initiative called Computer Science For All, and I try to volunteer for it whenever I can. That's full of fantastic people, and the last time I was there I met with the Chief Technology Officer of the United States, Megan Smith, and she was talking about Maria Klawe's work at Harvey Mudd, and she's been promoting an idea that when people come into the college, as programmers, they get divided into two groups: people who've already programmed a lot, and people who haven't at all.

"So you have experts, and beginners – two safe communities. It's not about gender, or race, or class – it's about how much experience you have. Then, in both of those groups, unconscious bias is being removed from the learning cycle. In this exercise, you will help the robot move rocks into a pile. That's one version. In this exercise, you will help the robot move the groceries from the kart into the boot of the car. The same exercise, essentially. Then in this exercise, you'll help the girls from Frozen move these snow bricks over here so they can build a castle. Same exact programming. Separate the frame from the exercise, separate the communities into beginner and expert, and they're at parity in six years.

"So let's do that across gaming – separate the gender and the cultural status of developers from their work, and from the way we write about it, and the financing, and the relationship of scale from our evaluation of its innovative qualities, and use better vocabulary. We do all that, and in ten years, we'll have moved very far away from the problems of having to discuss things like gender imbalance in the industry, and towards a situation where our values – the things that we value – are reflected in the things that we write, and the way that we give awards, and the way that we promote.

"I think it's just about living the values that we want to live, and saying the things that we want to focus on, rather than reacting to older labels that may or may not be appropriate. There's always going to be room for great art, and room for new experiences. And, if you invest in those experiences, the chances are that one in ten, or one in twenty, will deliver a really big return, on a level with a game like The Witness, or Journey. But it's impossible for one, small person to really know how we proceed."

Impossible at an individual level, maybe, but Robin's words are essential food for thought for what could, or should, happen on a united front. During our time together I ask for her opinion on how to bring more women into the making and marketing of video games – but as she so neatly elucidates in her answer to me, I'm really speaking to the wrong person. And in many ways, video gaming is constantly asking the wrong questions to the wrong people.

Want to get more women into games? Go and speak to the guys that hold the keys to those positions, not the women knocking on the doors. Want to see more innovative, independent games being played alongside the big-budget shooters and sports sims? Consider why those cover-grabbing games are enjoying such heightened visibility, and if you need to add to their oxygen of hype at the expense of something genuinely new. We could all be better at supporting games-makers who want to progress this medium in all the right ways – through sharing, through inventing, through fun, rather than rinsing and repeating what's known to "work". Robin Hunicke is just one of many people wanting to encourage change in the way we work within and relate to video games, but it's exciting to imagine what'll happen when all of the voices around hers, singing equally inspiring songs, do band together."
robinhunicke  games  gaming  videogames  gamedev  2016  funomena  thatgamecompany  jenovachen  keitatakahashi  computers  compsci  education  learning  play  gender 
august 2016 by robertogreco
Robin Hunicke's extraordinary journey • Eurogamer.net
"Hunicke's path to this moment was unorthodox and unexpected. She grew up near the mountains in Saratoga Springs, New York, close to Vermont. Her mother taught maths and weaved. Her father was a nuclear engineer. They lived on a street alongside 20 or so other families, all with children of similar ages. In the summer Hunicke and her friends would build forts in the forest, and race twig boats in the frothing river. In the winter there were board-games and NES. It was a playful, often idyllic childhood, she recalls. Each summer during high school, Hunicke would be sent to art camp, where she'd paint and build.

One year Hunicke and her father built a grandfather clock. It had been, rather befittingly, her grandfather's project originally. He built the base from African red hardwood then, upon realising the scale of the job, shipped the materials to his son and granddaughter to finish. Hunicke's father ordered the clock mechanism from Germany. The finished clock still lives at her father's house. Every time she returns home she listens to the rounded tock of the mechanism. "It fills me with joy," she says. "I love the experience of seeing something you've made come to life."

Video games were a natural fit for Hunicke's nascent interests, combining her mathematical talent ("At night, when I couldn't get to sleep, I'd count the leaves on a branch outside of my bedroom window," she recalls. "I'd multiply that by the number of branches I estimated the tree to have, and that figure by the number of trees in our yard, then our road, then our town, then our State") and her artistic sensibilities. But she has a magpie temperament. "I was interested in everything", she says. "So when I went to college I made up my major, a combination of fine art, film studies, women's studies and computer science." While at the University of Chicago Hunicke's aptitude for computers earned her a job at a police station where she would schedule the officers on a database system. "I soon discovered that was less fun than working on the computer lab at college so I got a new job managing the Mac lab there." As Hunicke learned more and more computer languages her focus narrowed. She began studying for a doctorate in artificial intelligence.

Video games had, up to this point, played only a supporting role in Hunicke's life. Her first love was M.U.L.E., the Commodore 64 game in which players compete against each other and the computer in a bid for survival, which she'd play at a friend's house when she was 12. "I loved trying to outwit each other and the game at the same time," she recalls. It was only when Hunicke started her PhD in AI, and became interested in adaptive difficulty in video games, specifically Half-Life, that she began hanging out with game-makers."



""I needed a break," she says. Perhaps, but the peak she faced in Bhutan mirrored other towering questions in her life. Was she going to stay in Los Angeles? Was she going to stay in her current relationship? Was she going to continue making games? While ascending the mountain Hunicke met other English-speaking climbers who were also taking a step back to examine their goals and challenges. "It made me realise we are all on a similar journey," she says. "That helped me with imposter syndrome." As she came over the top of the mountain, a burden lifted, she says. Then, when she arrived in Los Angeles, she received a message from an old friend, Jenova Chen: would she like to be lead designer on his new project, a game about a pilgrimage to a mountain where, en route, you meet people who fleetingly join you. "It felt right," she says.

When Hunicke joined thatgamecompany she was the sixth employee. The team was working from a "closet-sized room" and had just completed a prototype of the game, which would later be named Journey, in Flash, in which players were represented as coloured dots. "I supervised the first four player playtest," Hunicke recalls. "We brought people in through different doors so nobody knew it was a multiplayer game. Then we brought them together to discuss what they'd seen. These were just coloured dots that could only move or, if the player hit the space bar, say 'hey!', but people immediately would project emotions and personalities onto the dots they were playing with, calling them the 'mean' one, or the 'helpful' one. That's when I knew the idea was special.""



"Takashi and Hunicke make for a harmonious paring. Both designers have a background in arts and crafts and Hunicke's new studio Funomena, founded in 2012 with her former colleague Martin Middleton, is filled with sand, clay, pipe-cleaners, wire toys and so on. "We often model stuff by hand before putting them in our games," she says. Funomena is currently working on three games, one of which, Wattam, is being directed by Takahashi. Wattam, for which a release date has not yet been announced, reflects the foundations of Hunicke's childhood: playfulness, creativity, collaboration. "Keita's view on childhood and play is similar to mine," she says. "People should make things. We want this game to be more about you as a player than about us as the designers, artists and musicians behind it."

Games that encourage this kind of playfulness rather than seek to force a specific message are at the core of Hunicke's interest. While she holds up Papers, Please and Cart Life as prime examples of what games can achieve she wants her work to have looser interpretations. "I know people who make films who feel their film has a single interpretation," she says. "But they are rare. The majority of people who write or make films and art are just trying to get something out of them. When it's in the world it's there for everybody to draw what they will from the work. Besides, you can't control the context. For example, art that's made in this moment around Brexit could have a very different interpretation in ten years depending on what happens to Britain's fortunes."

For today's context with its climate of fear and uncertainty, of strongmen on the rise, of nations baring teeth, playfulness is, Hunicke believes, a necessity. "We've been spending a lot of time thinking about mechanics and systems as an industry," she says. "Doing something a little more open is important right now for this difficult, sad and important time. We're having conversations about all of the forces that inform how we behave, and how we sustain our planet. They are crucial conversations. But that needs the counterbalance of playfulness. All of the games I'm working on at the moment are about interacting with others in purely playful ways. I hope they encourage people to help one another, not for what they can get out of it, but just for the sake of it.""
robinhunicke  games  gaming  videogames  gamedev  2016  funomena  thatgamecompany  jenovachen  keitatakahashi  childhood  computers  compsci  education  learning  play 
august 2016 by robertogreco
Earth-friendly EOMA68 Computing Devices | Crowd Supply
[via: https://boingboing.net/2016/08/04/a-freeopen-computer-on-a-card.html ]

"Have you ever had to replace an expensive laptop because it was “unfixable” or the cost of getting it repaired was ridiculously high? That really stings, doesn’t it?

Now imagine if you owned a computing device that you could easily fix yourself and inexpensively upgrade as needed. So, instead of having to shell out for a completely new computer, you could simply spend around US$50 to upgrade — which, by the way, you could easily do in SECONDS, by pushing a button on the side of your device and just popping in a new computer card. Doesn’t that sound like the way it should be?

We think so, too! That’s why we spent several years developing the easy-to-maintain, easy-on-your-pocket, easy-on-Mother Earth, EOMA68 line of computing devices.

Read on, because it gets even better. Now, let’s say you accidentally dropped your laptop and a corner gets cracked. Instead of swearing or weeping over the loss, you simply PRINT OUT REPLACEMENT PARTS with a 3D printer. With the EOMA68 line of computers, you have the freedom to make your own laptop housing parts and can download the CAD files to have replacement PCBs made. Heck, you don’t necessarily have to break anything to have a bit of fun with your laptop: maybe you would like the freedom of being able to CHANGE THE COLOR from silver to aqua to bright orange.

A great deal of thought and ingenuity has been put into the design of the EOMA68 line of computing devices to make them money-saving and convenient. For example, you can connect the computer card to your TV set to continue working if your monitor fails.

Security is also a major concern. We have taken measures to ensure the integrity of your computer data that exceed anything being sold in North America, Europe (or most parts of the world). And, because we have the complete set of sources, there is an opportunity to weed out the back doors that have been slowly making their way into our computing devices. There is no security without a strong foundation and understanding of what is running on your computing devices. For the first time, the EOMA68 is a standard to work off for building freedom-friendly, privacy-respecting, and secure computing devices.

Lastly, being kind to Mother Earth has to be a priority. It goes without saying that we don’t like seeing electronic goods continue to stack up in landfills around the world, and we know you don’t like it either. We envisage a thriving community developing around the re-use of older computer cards: people using them to set up ultra-low power servers, routers, entertainment centers or just passing them on to a friend.

The EOMA68 Standard
The goal of this project is to introduce the idea of being ethically responsible about both the ecological and the financial resources required to design, manufacture, acquire and maintain our personal computing devices. This campaign therefore introduces the world’s first devices built around the EOMA68 standard, a freely-accessible, royalty-free, unencumbered hardware standard formulated and tested over the last five years around the ultra-simple philosophy of “just plug it in: it will work.”

Key Aspects
• Truly Free: Everything is freely licensed
• Modular: Use the same Computer Card across many devices
• Money-saving: Upgrade by replacing Computer Cards, not the whole device
• Long-lived: Designed to be relevant and useful for at least a decade, if not longer
• Ecologically Responsible: Keeps parts out of landfill by repurposing them

Some of you might recognise the form-factor of EOMA68 Computer Cards: it’s the legacy PCMCIA from the 1990s. The EOMA68 standard therefore re-uses legacy PCMCIA cases and housings, because that’s an environmentally responsible thing to do (and saves hugely on development costs).

Read more on the ecological implications of electronics waste in the white paper.

First Offerings
The first of the available devices will be a Micro-Desktop Housing, a 15.6” Laptop Housing, and two types of Computer Cards based on a highly efficient Allwinner A20 Dual-core ARM Cortex A7 processor."
plannedobsolescence  computers  hardware  computing 
august 2016 by robertogreco
The Minecraft Generation - The New York Times
"Seth Frey, a postdoctoral fellow in computational social science at Dartmouth College, has studied the behavior of thousands of youths on Minecraft servers, and he argues that their interactions are, essentially, teaching civic literacy. “You’ve got these kids, and they’re creating these worlds, and they think they’re just playing a game, but they have to solve some of the hardest problems facing humanity,” Frey says. “They have to solve the tragedy of the commons.” What’s more, they’re often anonymous teenagers who, studies suggest, are almost 90 percent male (online play attracts far fewer girls and women than single-­player mode). That makes them “what I like to think of as possibly the worst human beings around,” Frey adds, only half-­jokingly. “So this shouldn’t work. And the fact that this works is astonishing.”

Frey is an admirer of Elinor Ostrom, the Nobel Prize-­winning political economist who analyzed the often-­unexpected ways that everyday people govern themselves and manage resources. He sees a reflection of her work in Minecraft: Running a server becomes a crash course in how to compromise, balance one another’s demands and resolve conflict.

Three years ago, the public library in Darien, Conn., decided to host its own Minecraft server. To play, kids must acquire a library card. More than 900 kids have signed up, according to John Blyberg, the library’s assistant director for innovation and user experience. “The kids are really a community,” he told me. To prevent conflict, the library installed plug-ins that give players a chunk of land in the game that only they can access, unless they explicitly allow someone else to do so. Even so, conflict arises. “I’ll get a call saying, ‘This is Dasher80, and someone has come in and destroyed my house,’ ” Blyberg says. Sometimes library administrators will step in to adjudicate the dispute. But this is increasingly rare, Blyberg says. “Generally, the self-­governing takes over. I’ll log in, and there’ll be 10 or 15 messages, and it’ll start with, ‘So-and-so stole this,’ and each message is more of this,” he says. “And at the end, it’ll be: ‘It’s O.K., we worked it out! Disregard this message!’ ”
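
[A conceptual sketch of the land-protection plug-in idea, added by the editor in plain Python rather than against any real Minecraft server API: each claim maps a region to an allow-list, and every attempted edit is checked against it.]

```python
# Claims: a rectangular chunk of land only its listed players may edit.
claims = {
    (0, 0, 63, 63): {"Dasher80"},   # (x1, z1, x2, z2) -> allowed players
}

def may_edit(player: str, x: int, z: int) -> bool:
    """Permit edits on unclaimed land, or on claims that list the player."""
    for (x1, z1, x2, z2), allowed in claims.items():
        if x1 <= x <= x2 and z1 <= z <= z2:
            return player in allowed
    return True                      # unclaimed land stays open to all

claims[(0, 0, 63, 63)].add("friend42")  # the owner explicitly grants access
print(may_edit("friend42", 10, 10))     # True
print(may_edit("griefer", 10, 10))      # False
```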

Several parents and academics I interviewed think Minecraft servers offer children a crucial “third place” to mature, where they can gather together outside the scrutiny and authority at home and school. Kids have been using social networks like Instagram or Snapchat as a digital third place for some time, but Minecraft imposes different social demands, because kids have to figure out how to respect one another’s virtual space and how to collaborate on real projects.

“We’re increasingly constraining youth’s ability to move through the world around them,” says Barry Joseph, the associate director for digital learning at the American Museum of Natural History. Joseph is in his 40s. When he was young, he and his friends roamed the neighborhood unattended, where they learned to manage themselves socially. Today’s fearful parents often restrict their children’s wanderings, Joseph notes (himself included, he adds). Minecraft serves as a new free-­ranging realm.

Joseph’s son, Akiva, is 9, and before and after school he and his school friend Eliana will meet on a Minecraft server to talk and play. His son, Joseph says, is “at home but still getting to be with a friend using technology, going to a place where they get to use pickaxes and they get to use shovels and they get to do that kind of building. I wonder how much Minecraft is meeting that need — that need that all children have.” In some respects, Minecraft can be as much social network as game.

Just as Minecraft propels kids to master Photoshop or video-­editing, server life often requires kids to acquire complex technical skills. One 13-year-old girl I interviewed, Lea, was a regular on a server called Total Freedom but became annoyed that its administrators weren’t clamping down on griefing. So she asked if she could become an administrator, and the owners said yes.

For a few months, Lea worked as a kind of cop on that beat. A software tool called “command spy” let her observe records of what players had done in the game; she teleported miscreants to a sort of virtual “time out” zone. She was eventually promoted to the next rank — “telnet admin,” which allowed her to log directly into the server via telnet, a command-­line tool often used by professionals to manage servers. Being deeply involved in the social world of Minecraft turned Lea into something rather like a professional systems administrator. “I’m supposed to take charge of anybody who’s breaking the rules,” she told me at the time.

Not everyone has found the online world of Minecraft so hospitable. One afternoon while visiting the offices of Mouse, a nonprofit organization in Manhattan that runs high-tech programs for kids, I spoke with Tori. She’s a quiet, dry-­witted 17-year-old who has been playing Minecraft for two years, mostly in single-­player mode; a recent castle-­building competition with her younger sister prompted some bickering after Tori won. But when she decided to try an online server one day, other players — after discovering she was a girl — spelled out “BITCH” in blocks.

She hasn’t gone back. A group of friends sitting with her in the Mouse offices, all boys, shook their heads in sympathy; they’ve seen this behavior “everywhere,” one said. I have been unable to find solid statistics on how frequently harassment happens in Minecraft. In the broader world of online games, though, there is more evidence: An academic study of online players of Halo, a shoot-’em-up game, found that women were harassed twice as often as men, and in an unscientific poll of 874 self-­described online gamers, 63 percent of women reported “sex-­based taunting, harassment or threats.” Parents are sometimes more fretful than the players; a few told me they didn’t let their daughters play online. Not all girls experience harassment in Minecraft, of course — Lea, for one, told me it has never happened to her — and it is easy to play online without disclosing your gender, age or name. In-game avatars can even be animals.

How long will Minecraft’s popularity endure? It depends very much on Microsoft’s stewardship of the game. Company executives have thus far kept a reasonably light hand on the game; they have left major decisions about the game’s development to Mojang and let the team remain in Sweden. But you can imagine how the game’s rich grass-roots culture might fray. Microsoft could, for example, try to broaden the game’s appeal by making it more user-­friendly — which might attenuate its rich tradition of information-­sharing among fans, who enjoy the opacity and mystery. Or a future update could tilt the game in a direction kids don’t like. (The introduction of a new style of combat this spring led to lively debate on forums — some enjoyed the new layer of strategy; others thought it made Minecraft too much like a typical hack-and-slash game.) Or an altogether new game could emerge, out-­Minecrafting Minecraft.

But for now, its grip is strong. And some are trying to strengthen it further by making it more accessible to lower-­income children. Mimi Ito has found that the kids who acquire real-world skills from the game — learning logic, administering servers, making YouTube channels — tend to be upper middle class. Their parents and after-­school programs help them shift from playing with virtual blocks to, say, writing code. So educators have begun trying to do something similar, bringing Minecraft into the classroom to create lessons on everything from math to history. Many libraries are installing Minecraft on their computers."
2016  clivethompson  education  videogames  games  minecraft  digitalculture  gaming  mimiito  robinsloan  coding  computationalthinking  stem  programming  commandline  ianbogost  walterbenjamin  children  learning  resilience  colinfanning  toys  lego  wood  friedrichfroebel  johnlocke  rebeccamir  mariamontessori  montessori  carltheodorsorensen  guilds  mentoring  mentorship  sloyd  denmark  construction  building  woodcrafting  woodcraft  adventureplaygrounds  material  logic  basic  mojang  microsoft  markuspersson  notch  modding  photoshop  texturepacks  elinorostrom  collaboration  sethfrey  civics  youtube  networkedlearning  digitalliteracy  hacking  computers  screentime  creativity  howwelearn  computing  froebel 
april 2016 by robertogreco
Rule of Three and other ideas
"and other handy thoughts: so many folks have asked me for a "quick start" set of rules for the design of 3rd Millennium learning spaces...
... this Rule of Three section and some of the other ideas here (see top of this page) have all been well received in conferences and seminars and, most importantly, adopted / shared with success by practitioners. These are proven, working ideas, so I thought it was time to park some of them on a web page:

***

rule of three - physical

I guess rule one is really that there is no absolutely right way to make learning better - schools are all different, their communities and contexts vary, and, as I have often observed, on a windy day they become different places again. So you build your local recipe for great learning from the trusted and tested ingredients of others, adding a bit of local flair too. But this rule of three helps:

one: never more than three walls

two: no fewer than three points of focus

three: always able to accommodate at least three teachers, three activities (for the larger spaces three full "classes" too)

make no mistake - this is not a plea for those ghastly open plan spaces of the 1960s with their thermoplastic floors under high alumina concrete beams - with the consequent cacophony that deafened their teachers. Today's third millennium learning spaces are multi-faceted, agile (and thus easily re-configured by users as they use them), but allow all effective teaching and learning approaches, now and in the future, to be incorporated: collaborative work, mentoring, one-on-one, quiet reading, presentation, large group team taught groups... and more.

***

rule of three - pedagogic

one: ask three then me

A simple way to encourage peer support, especially in a larger mixed-age, stage-not-age space, but it even works fine in a small 'traditional' closed single-class classroom. Put simply, the students should ask 3 of their peers before approaching the teacher for help. I've watched, amused, in classes where a student approaches the teacher, who simply holds up 3 fingers with a quizzical expression; the student pauses, turns and looks for help from her peers first. Works on so many levels...

two: three heads are better than one

Everyone engaging in team teaching reports that, once you get over the trust-wall of being confident that your colleagues will do their bit (see Superclasses), the experience of working with others, the professional gains, and the reduction in workloads are real and worthwhile. You really do learn rapidly from other teachers, the children's behaviour defaults to the expectations of the teacher in the room with the highest expectations, and so on. Remarkably, schools especially report on the rapid progress of newly qualified teachers, who move forward so quickly that people forget they are still NQTs. And older teachers at career end become rejuvenated by a heady mix of new ideas and of self-esteem as they see that their "teaching craft" skills are valued and valuable.

three: three periods a day or fewer

Particularly in 2ndary schools, a fragmented timetable of 5 or 6 lessons a day wastes so much time stopping and starting. Children arrive and spend, say, 3 minutes getting unpacked, briefed and started, then end 2 minutes before the "bell" and have 5 minutes travelling time between classes. On a 5 period day that is (3+2+5) x 5 = 50 minutes "lost" each day, 50 x 5 = 250 lost each week, which is effectively throwing away a day a week. Longer blocks of immersion - solid blocks of a day or more, with some schools even adopting a week - get students truly engaged, and serve as a clear barrier to Dick Turpin teaching ("Stand and Deliver!"), which simply cannot be sustained for long blocks of time - thank goodness. This doesn't mean that the occasional "rapid fire" day (a bit like pedagogic Speed Dating!) can't be used to add variety. But mainly, longer blocks of time work better.

***

rule of three - BYOD / UMOD

some schools adopting Bring Your Own Device (BYOD), or more recently Use My Own Device (UMOD - somehow, bringing them wasn't enough!) initially adopted really comprehensive "acceptable use policies" - bulging folders of policy that were neither understood nor adhered to (see for example the "sacrificial phones" mention under "What young people say" in the 2011 Nominet-funded Cloudlearn research project).

Today though (2015) schools around the world, from Scandinavia to Australasia, are simplifying all this with three simple rules.

one: phones out, on the desk, screen up

Not everyone has a "desk" anymore of course, but the point here is that a device hidden under a work surface is more likely to be a problem than one on the worksurface, screen up. This makes it quick and easy to use, where appropriate, and simple to monitor by teachers or peers.

two: if you bring it, be prepared to share sometimes

This is more complex than it looks. Obviously handing your phone or tablet over to just anyone isn't going to happen, but the expectation that friends, or project collaborators, might simply pick up "your" device and chat to Siri, Google for resources, or whatever, means that bullying, inappropriate texts / images, or general misdemeanours are always likely to be discovered. Transparency is your friend here, secrecy masks mischief - and the expectation of occasional sharing is transparency enough. It also helps students develop simple safety / security habits - like logging out of social media to prevent Frapping or similar.

three: if you bring it, the school might notice and respond positively

If you've brought your own device along, the least you might expect is that the school gives you useful things to do that you could not otherwise do, or couldn't do so well, without that device.

This requires a bit of imagination all round! A simple example would be the many schools that now do outdoor maths project tasks using the device's GPS trace capability (the device is sealed in a box during the exercise), like the children below, tasked with drawing a Christmas tree in the park next to their school: estimating skills, geometry, measurement, scale, collaboration.... and really jolly hard to do with a pencil!

[image of a GPS traced tree]
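
[A hedged sketch - mine, not Heppell's - of the geometry such GPS tasks rely on: an equirectangular approximation that turns latitude/longitude fixes into local metres, accurate enough over a park-sized area:]

    import math

    EARTH_RADIUS = 6_371_000  # metres

    def to_local_xy(points, origin):
        """points and origin are (lat, lon) in degrees; returns (x, y) offsets in metres."""
        lat0, lon0 = map(math.radians, origin)
        xy = []
        for lat, lon in points:
            lat, lon = math.radians(lat), math.radians(lon)
            xy.append(((lon - lon0) * math.cos(lat0) * EARTH_RADIUS,   # east-west distance
                       (lat - lat0) * EARTH_RADIUS))                    # north-south distance
        return xy  # plot these to see the walked "drawing" to scale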

***

knowing the 3rd millennium ABCs

A

ambition: how good might your children be?

agility: how quickly can we reconfigure to catch the wave - in a moment, only over a year, or, at worst, across a generation?

astonishment: we want people to be astonished by what these children, and teachers, might achieve - how do we showcase this? how do we respond to it ourselves?

B

brave: what are others doing, what tested ideas can we borrow, how can we feed our own ideas to others? Brave is not foolhardy or reckless!

breadth: learning reaches out to whom? embraces what? what support do you give your school's grandparents, for example?

blockers: you will need help with beating the blockers - if you run at the front, you need resources that win arguments: what is the evidence that...? why doesn't everyone do this...? where can I see it in action...? why should I change, ever...? all this exists of course (see top of page for example), but you need to organise it and be ready with it. A direct example is this workshop manual we developed for the new science spaces at Perth's Wesley College in Australia.

C

collegiality: that sense of belonging, of us-ness, sense of family, sharing, co-exploring, research. Also a sense of us (the team working on this innovation) being learners too - able to show that we are trying cool stuff ourselves - you won't win hearts and minds by saying but not doing;

communication: how does a learning space / building communicate what happens within? and this is about symmetry: how does the school listen to what happens outside school? how do we share and exchange all this with others?

collaboration: we don't want to be told, but we want to do this with others. How do we share what we learn as we do it? Who do we share with? How do we learn from them?"
tcsnmy  lcproject  openstudioproject  edtech  technology  schooldesign  stephenheppell  via:sebastienmarion  pedagogy  howweteach  howwelearn  education  teaching  learning  schools  collaboration  byod  umod  sharing  ambition  agility  astonishment  bravery  breadth  blockers  collegiality  communication  simplicity  mobile  phones  desks  furniture  computers  laptops  etiquette  conviviality  scheduling  teams  interdependence  canon  sfsh 
march 2016 by robertogreco
From AI to IA: How AI and architecture created interactivity - YouTube
"The architecture of digital systems isn't just a metaphor. It developed out of a 50-year collaborative relationship between architects and designers, on one side, and technologists in AI, cybernetics, and computer science, on the other. In this talk at the O'Reilly Design Conference in 2016, Molly Steenson traces that history of interaction, tying it to contemporary lessons aimed at designing for a complex world."
mollysteenson  2016  ai  artificialintelligence  douglasenglebart  symbiosis  augmentation  christopheralexander  nicholasnegroponte  richardsaulwurman  architecture  physical  digital  mitmedialab  history  mitarchitecturemachinegroup  technology  compsci  computerscience  cybernetics  interaction  structures  computing  design  complexity  frederickbrooks  computers  interactivity  activity  metaphor  marvinminsky  heuristics  problemsolving  kent  wardcunningham  gangoffour  objectorientedprogramming  apatternlanguage  wikis  agilesoftwaredevelopment  software  patterns  users  digitalspace  interactiondesign  terrywinograd  xeroxparc  petermccolough  medialab 
february 2016 by robertogreco
The Jacob’s Ladder of coding — Medium
"Anecdotes and questions about climbing up and down the ladder of abstraction: Atari, ARM, demoscene, education, creative coding, community, seeking lightness, enlightenment & strange languages"



"With only an hour or two of computer time a week, our learning and progress was largely down to intensive trial & error, daily homework and learning to code and debug with only pencil and paper, whilst trying to be the machine yourself: Playing every step through in our heads (and on paper) over and over until we were confident, the code did as we’d expect, yet, often still failing because of wrong intuitions. Learning this analytical thinking is essential to successful debugging, even today, specifically in languages / environments where no GUI debugger is available. In the late 90s, John Maeda did similar exercises at MIT Media Lab, with students role-playing different parts of a CPU or a whole computer executing a simple process. Later at college, my own CS prof too would often quote Alan Perlis:
“To understand a program you must become both the machine and the program.” — Alan Perlis

Initially we’d only be using the machine to verify our ideas prepared at home (spending the majority of the time typing in / correcting numbers from paper). Through this monastic style of working, we also learned the importance of having the right tools and balance of skills within the group, and were responsible for creating them ourselves in order to achieve our vision. This important lesson stayed with me throughout (maybe even became) my career so far… Most projects I worked on, especially in the past 15 years, almost exclusively relied on custom-made tooling, which was as much part of the final outcome as the main deliverable to clients. Oftentimes it even was the main deliverable. On the other hand, I’ve also had to learn the hard way that being a largely self-sufficient generalist is often undesired in the modern workplace, which frequently still encourages narrow expertise above all else…

After a few months of convincing my parents to invest all of their saved-up and invaluable West German money to purchase a piece of “Power Without the Price” (a much-beloved Atari 800XL) a year before the Wall came down in Berlin, I finally gained daily access to a computer, but was still in a similar situation as before: with no more hard West money left to buy a tape or disk drive from the Intershop, I wasn’t able to save any work (apart from creating paper copies), and so the Atari was largely kept switched on until November 10, 1989, the day after the Berlin Wall was opened and I could buy an XC-12 tape recorder. I too had to choose whether to go the usual route of working with the built-in BASIC language or stick with what I’d learned / taught myself so far, Assembly… In hindsight, am glad I chose the latter, since it proved to be far more useful and transportable knowledge, even today!"



"Lesson learned: Language skills, natural and coded ones, are gateways, opening paths not just for more expression, but also to paths in life.

As is the case today, so it was back then: people tend to organize around specific technological interests, languages and platforms and then stick with them for a long time, for better or worse. Over the years I’ve been part of many such tool-based communities (chronologically: Asm, C, TurboPascal, Director, JS, Flash, Java, Processing, Clojure) and have somewhat turned into a nomad, never quite able to find a true home in most of them. This might sound judgemental and negative, but really isn’t meant to be, and these travels through the land of languages and toolkits have given me much food for thought. Having slowly climbed up the ladder of abstraction and spent many years with both low- and high-level languages has shown me how much each side of the spectrum can inform and learn from the other (and they really should do so more!). It’s an experience I can highly recommend to anyone attempting to better understand these machines some of us are working with for many hours a day and which impact so much of all our lives. So am extremely grateful to all the kind souls & learning encountered on the way!"



"In the vastly larger open source creative computing demographic of today, the by far biggest groups are tight-knit communities around individual frameworks and languages. There is much these platforms have achieved in terms of output, increasing overall code literacy and turning thousands of people from mere computer users into authors. This is a feat not be underestimated and a Good Thing™! Yet my issue with this siloed general state of affairs is that, apart from a few notable exceptions (especially the more recent arrivals), there’s unfortunately a) not much cross-fertilizing with fundamentally different and/or new ideas in computing going on and b) over time only incremental progress is happening, business as usual, rather than a will to continuously challenge core assumptions among these largest communities about how we talk to machines and how we can do so better. I find it truly sad that many of these popular frameworks rely only on the same old imperative programming language family, philosophy and process, which has been pre-dominant and largely unchanged for the past 30+ years, and their communities also happily avoid or actively reject alternative solutions, which might require fundamental changes to their tools, but which actually could be more suitable and/or powerful to their aims and reach. Some of these platforms have become and act as institutions in their own right and as such also tend to espouse an inward looking approach & philosophy to further cement their status (as owners or pillars?) in their field. This often includes a no-skills-neccessary, we-cater-all-problems promise to their new users, with each community re-inventing the same old wheels in their own image along the way. It’s Not-Invented-Here on a community level: A reliance on insular support ecosystems, libraries & tooling is typical, reducing overall code re-use (at least between communities sharing the same underlying language) and increasing fragmentation. More often than not these platforms equate simplicity with ease (go watch Rich Hickey taking this argument eloquently apart!). The popular prioritization of no pre-requisite knowledge, super shallow learning curves and quick results eventually becomes the main obstacle to later achieve systemic changes, not just in these tools themselves, but also for (creative) coding as discipline at large. Bloatware emerges. Please do forgive if that all sounds harsh, but I simply do believe we can do better!

Every time I talk with others about this topic, I can’t help but think about Snow Crash’s idea of “Language is a virus”. I sometimes do wonder what makes us modern humans, especially those working with computing technology, so fundamentalist and brand-loyal to these often flawed platforms we happen to use? Is it really that we believe there’s no better way? Are we really always only pressed for time? Are we mostly content with Good Enough? Are we just doing what everyone else seems to be doing? Is it status anxiety, a feeling we have to use X to make a living? Are we afraid of unlearning? Is it that learning tech/coding is (still) too hard, too much of an effort, which can only be justified a few times per lifetime? For people who have been in the game long enough and maybe made a name for themselves in their community, is it pride, sentimentality or fear of becoming a complete beginner again? Is it maybe a sign that the way we teach computing, focusing on concrete tools too early in order to obtain quick, unrealistically complex results rather than fundamental (“boring”) knowledge, is somewhat flawed? Is it our addiction to documenting and celebrating every minor learning step as a public achievement? This is no stab at educators — much of this systemic behavior is driven by the sheer explosion of (too often similar) choices, and by demands made by students and policy makers. But I do think we should ask ourselves these questions more often."

[author's tweet: https://twitter.com/toxi/status/676578816572067840 ]
coding  via:tealtan  2015  abstraction  demoscene  education  creativecoding  math  mathematics  howwelearn  typography  design  dennocoil  alanperlis  johnmaeda  criticalthinking  analyticalthinking  basic  programming  assembly  hexcode  georgedyson  computing  computers  atari  amiga  commodore  sinclair  identity  opensource  insularity  simplicity  ease  language  languages  community  communities  processing  flexibility  unschooling  deschooling  pedagogy  teaching  howweteach  understanding  bottomup  topdown  karstenschmidt 
december 2015 by robertogreco
Is It Time to Give Up on Computers in Schools?
"This is a version of the talk I gave at ISTE today on a panel titled "Is It Time to Give Up on Computers in Schools?" with Gary Stager, Will Richardson, Martin Levins, David Thornburg, and Wayne D'Orio. It was pretty damn fun.

Take one step into that massive shit-show called the Expo Hall and it’s hard not to agree: “yes, it is time to give up on computers in schools.”

Perhaps, once upon a time, we could believe ed-tech would change things. But as Seymour Papert noted in The Children’s Machine,
Little by little the subversive features of the computer were eroded away: … the computer was now used to reinforce School’s ways. What had started as a subversive instrument of change was neutralized by the system and converted into an instrument of consolidation.

I think we were naive when we ever thought otherwise.

Sure, there are subversive features, but I think the computers also involve neoliberalism, imperialism, libertarianism, and environmental destruction. They now involve high stakes investment by the global 1% – it’s going to be a $60 billion market by 2018, we’re told. Computers are implicated in the systematic de-funding and dismantling of a public school system and a devaluation of human labor. They involve the consolidation of corporate and governmental power. They involve scientific management. They are designed by white men for white men. They re-inscribe inequality.

And so I think it’s time now to recognize that if we want education that is more just and more equitable and more sustainable, that we need to get the ideologies that are hardwired into computers out of the classroom.

In the early days of educational computing, it was often up to innovative, progressive teachers to put a personal computer in their classroom, even paying for the computer out of their own pocket. These were days of experimentation, and as Seymour teaches us, a re-imagining of what these powerful machines could enable students to do.

And then came the network and, again, the mainframe.

You’ll often hear the Internet hailed as one of the greatest inventions of mankind – something that connects us all and that has, thanks to the World Wide Web, enabled the publishing and sharing of ideas at an unprecedented pace and scale.

What “the network” introduced in educational technology was also a more centralized control of computers. No longer was it up to the individual teacher to have a computer in her classroom. It was up to the district, the Central Office, IT. The sorts of hardware and software that were purchased had to meet those needs – the needs and the desires of the administration, not the needs and the desires of innovative educators, and certainly not the needs and desires of students.

The mainframe never went away. And now, virtualized, we call it “the cloud.”

Computers and mainframes and networks are points of control. They are tools of surveillance. Databases and data are how we are disciplined and punished. Quite to the contrary of Seymour’s hopes that computers will liberate learners, this will be how we are monitored and managed. Teachers. Students. Principals. Citizens. All of us.

If we look at the history of computers, we shouldn’t be that surprised. The computers’ origins are as weapons of war: Alan Turing, Bletchley Park, code-breakers and cryptography. IBM in Germany and its development of machines and databases that it sold to the Nazis in order to efficiently collect the identity and whereabouts of Jews.

The latter should give us great pause as we tout programs and policies that collect massive amounts of data – “big data.” The algorithms that computers facilitate drive more and more of our lives. We live in what law professor Frank Pasquale calls “the black box society.” We are tracked by technology; we are tracked by companies; we are tracked by our employers; we are tracked by the government, and “we have no clear idea of just how far much of this information can travel, how it is used, or its consequences.” When we compel the use of ed-tech, we are doing this to our students.

Our access to information is constrained by these algorithms. Our choices, our students’ choices are constrained by these algorithms – and we do not even recognize it, let alone challenge it.

We have convinced ourselves, for example, that we can trust Google with its mission: “To organize the world’s information and make it universally accessible and useful.” I call “bullshit.”

Google is at the heart of two things that computer-using educators should care deeply and think much more critically about: the collection of massive amounts of our personal data and the control over our access to knowledge.

Neither of these are neutral. Again, these are driven by ideology and by algorithms.

You’ll hear the ed-tech industry gleefully call this “personalization.” More data collection and analysis, they contend, will mean that the software bends to the student. To the contrary, as Seymour pointed out long ago, instead we find the computer programming the child. If we do not unpack the ideology, if the algorithms are all black-boxed, then “personalization” will be discriminatory. As Tressie McMillan Cottom has argued “a ‘personalized’ platform can never be democratizing when the platform operates in a society defined by inequalities.”

If we want schools to be democratizing, then we need to stop and consider how computers are likely to entrench the very opposite. Unless we stop them.

In the 1960s, the punchcard – an older piece of “ed-tech” – had become a symbol of our dehumanization by computers and by a system – an educational system – that was inflexible, impersonal. We were being reduced to numbers. We were becoming alienated. These new machines were increasing the efficiency of a system that was setting us up for a life of drudgery and that was sending us off to war. We could not be trusted with our data or with our freedoms or with the machines themselves, we were told, as the punchcards cautioned: “Do not fold, spindle, or mutilate.”

Students fought back.

Let me quote here from Mario Savio, speaking on the stairs of Sproul Hall at UC Berkeley in 1964 – over fifty years ago, yes, but I think still one of the most relevant messages for us as we consider the state and the ideology of education technology:
We’re human beings!

There is a time when the operation of the machine becomes so odious, makes you so sick at heart, that you can’t take part; you can’t even passively take part, and you’ve got to put your bodies upon the gears and upon the wheels, upon the levers, upon all the apparatus, and you’ve got to make it stop. And you’ve got to indicate to the people who run it, to the people who own it, that unless you’re free, the machine will be prevented from working at all!

We’ve upgraded from punchcards to iPads. But underneath, a dangerous ideology – a reduction to 1s and 0s – remains. And so we need to stop this ed-tech machine."
edtech  education  audreywatters  bias  mariosavio  politics  schools  learning  tressuemcmillancottom  algorithms  seymourpapert  personalization  data  security  privacy  howwteach  howwelearn  subversion  computers  computing  lms  neoliberalism  imperialism  environment  labor  publicschools  funding  networks  cloud  bigdata  google  history 
july 2015 by robertogreco
The Internet of Things You Don’t Really Need - The Atlantic
"We already chose to forego a future of unconnected software. All of your devices talk constantly to servers, and your data lives in the Cloud because there’s increasingly no other choice. Eventually, we won’t have unconnected things, either. We’ve made that choice too, we just don’t know it yet. For the moment, you can still buy toasters and refrigerators and thermostats that don’t talk to the Internet, but try to find a new television that doesn’t do so. All new TVs are smart TVs, asking you to agree to murky terms and conditions in the process of connecting to Netflix or Hulu. Soon enough, everything will be like Nest. If the last decade was one of making software require connectivity, the next will be one of making everything else require it. Why? For Silicon Valley, the answer is clear: to turn every industry into the computer industry. To make things talk to the computers in giant, secured, air-conditioned warehouses owned by (or hoping to be owned by) a handful of big technology companies.

But at what cost? What improvements to our lives do we not get because we focused on “smart” things? Writing in The Baffler last year, David Graeber asked where the flying cars, force fields, teleportation pods, space colonies, and all the other dreams of the recent past’s future have gone. His answer: Technological development was re-focused so that it wouldn’t threaten existing seats of power and authority. The Internet of Things exists to build a market around new data about your toasting and grilling and refrigeration habits, while duping you into thinking smart devices are making your lives better than you could have made them otherwise, with materials other than computers. Innovation and disruption are foils meant to distract you from the fact that the present is remarkably similar to the past, with you working even harder for it.

But it sure feels like it makes things easier, doesn’t it? The automated bike locks and thermostats all doing your bidding so you can finally be free to get things done. But what will you do, exactly, once you can monitor your propane tank level from the comfort of the toilet or the garage or the liquor store? Check your Gmail, probably, or type into a Google Doc on your smartphone, maybe. Or perhaps, if you’re really lucky, tap some ideas into Evernote for your Internet of Things startup’s crowdfunding campaign. “It’s gonna be huge,” you’ll tell your cookout guests as you saw into a freshly grilled steak in the cool comfort of your Nest-controlled dining room. “This is the future.”"
2015  ianbogost  iot  internetofthings  design  davidgraeber  labor  siliconvalley  technology  power  authority  innovation  disruption  work  future  past  present  marketing  propaganda  google  cloud  cloudcomputing  computers  code  googledocs  ubicomp  ubiquitouscomputing  everyware  adamgreenfield  amazon  dropbox  kickstarter 
june 2015 by robertogreco
Bat, Bean, Beam: The broken book
"The book weighs only 170 grams but has a potentially very large – although not infinite – number of pages. It is made of plastic and rubber, and a translucent sheet at the front that acts like a window for reading its contents.

The book is portable, durable and robust, but not robust enough that you should sit on it. Which unfortunately is what I did with mine. It bent under my weight and something inside made a crunching sound. When I looked again, the black case of plastic and rubber looked intact but I could tell that the book had been damaged. The bottom half of the page I was reading when I put the book down was badly smudged, as if the text had been drawn in pencil and someone had hastily rubbed it with an eraser. Otherwise, the book was fine. I could still turn the pages and view the top half of each one.

Given the very low energy consumption and lack of significant moving parts, I could preserve the book in this state for quite a long time, there to uselessly collect the top half of a few dozen books and many more articles and essays.

What I chose to do instead was open the book and look inside. This proved a surprisingly difficult task, as the back rubber panel of my damaged Amazon Kindle was held in place by eight very tight clips and took a lot of prying. I wasn’t just driven by curiosity: seeing as I possess an older keyboard model with the screen still intact, I thought I could carry out a little transplant, on the off chance that parts were compatible. I found websites dedicated to replacing a screen on those older models, but nothing for my relatively more recent Kindle 5.

Once I finally removed the back cover, the book looked like this.

[…]

Those marks are a concrete reminder that there is something very particular about these book machines.

Words can be rearranged on a computer screen at will, but they remain virtual, and when I turn the screen off they vanish as if they had never existed. To bring them into the analogue world of inert objects, I need to print them on paper, and then they behave in every way like the old technology. Electronic books straddle those two worlds, typesetting at each turn the ordinary page of a book, only on a special plastic instead of paper. And if the book machine breaks, as it could do at any moment (and eventually will, since the battery cannot be replaced), that last page will become permanent, as if out of your whole library you had chosen to print that one alone.

I enjoyed tinkering with my broken book, although I am not sure what I learned from the experience. It seems likely to me, as it does to many historians and scholars, that the form of the technologies in which our words are written and read affects our psychology as writers and readers, and therefore the character that textuality takes in any given epoch. It’s just too early to say exactly what those effects will be for ours. All the same, I occasionally worry that books without physical dimensions will entail a loss; that their ghost materiality will make them mean less. As I peer within the layers of the screen of my dead Kindle I am reminded that this is not quite so, and that aspects of that history survive – for history is always the hardest to die."
kindle  giovannitiso  2015  electronics  eink  ebooks  publishing  digital  technology  computers  screens  computing  displays 
may 2015 by robertogreco
Eyeo 2014 - Leah Buechley on Vimeo
"Thinking About Making – An examination of what we mean by making (MAKEing) these days. What gets made? Who makes? Why does making matter?"



[uninclusive covers of Make Magazine and composition of Google employment]

“Meet the new boss, same as the old boss”

"I'm really tired of setting up structures where we tell young women and young brown and black kids that they should aspire to be like rich white guys."

[RTd these back then, but never watched the video. Thanks, Sara, for bringing it back up.

https://twitter.com/arikan/status/477546169329938432
https://twitter.com/arikan/status/477549826498764801 ]

[Talk with some of the same content from Leah Buechley (and a lot of defensive comments from the crowd that Buechley addresses well):
http://edstream.stanford.edu/Video/Play/883b61dd951d4d3f90abeec65eead2911d
https://www.edsurge.com/n/2013-10-29-make-ing-more-diverse-makers ]
leahbuechley  making  makermovement  critique  equality  gender  race  2014  via:ablerism  privilege  wealth  glvo  openstudioproject  lcproject  democratization  inequality  makemagazine  money  age  education  electronics  robots  robotics  rockets  technology  compsci  computerscience  computing  computers  canon  language  work  inclusivity  funding  google  intel  macarthurfoundation  opportunity  power  influence  movements  engineering  lowriders  pottery  craft  culture  universality  marketing  inclusion 
may 2015 by robertogreco
Kardashian Krypt - Chrome Web Store
"Covertly send messages to friends, family, paramours & more by hiding messages in pictures of Kim Kardashian!!!!!

Leverage Kim Kardashian's visual omnipresence thru KARDASHIAN KRYPT, a steganography Chrome extension that hides your messages in pictures of Kim Kardashian.

Easy to use, optional passwords for XTRA protection!!"
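
[The extension itself is JavaScript and its source isn't quoted here; purely as a hedged illustration of the underlying idea, a minimal least-significant-bit steganography sketch in Python (using Pillow) - each message bit overwrites the lowest bit of a pixel channel, which is why the carrier image must be saved losslessly:]

    from PIL import Image  # pip install pillow

    def hide(src, message, dst):
        img = Image.open(src).convert("RGB")
        bits = "".join(f"{b:08b}" for b in message.encode("utf-8")) + "0" * 8  # NUL terminator
        flat = [c for px in img.getdata() for c in px]       # one int per colour channel
        if len(bits) > len(flat):
            raise ValueError("message too long for this image")
        for i, bit in enumerate(bits):
            flat[i] = (flat[i] & ~1) | int(bit)              # overwrite the channel's lowest bit
        img.putdata([tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)])
        img.save(dst)  # must be lossless (e.g. PNG); JPEG compression would destroy the bits

    def reveal(src):
        flat = [c for px in Image.open(src).convert("RGB").getdata() for c in px]
        out = bytearray()
        for i in range(0, len(flat) - 7, 8):
            byte = 0
            for ch in flat[i:i + 8]:
                byte = (byte << 1) | (ch & 1)                # reassemble bits, MSB first
            if byte == 0:                                    # hit the terminator
                break
            out.append(byte)
        return out.decode("utf-8")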

[See also:
http://fffff.at/kardashian-krypt/
http://motherboard.vice.com/read/finally-a-way-to-send-secret-messages-inside-pictures-of-kim-kardashian

and

http://fffff.at/kanyefy-your-dock/
http://www.avclub.com/article/heres-how-kanye-fy-your-apple-dock-206030 ]
maddyvarner  encryption  chrome  extensions  kimkardashian  kanyewest  computing  computers  data  imagery  mac  osx 
may 2015 by robertogreco
Winning Isn’t Everything — Matter — Medium
"I used to think that games would be the dominant medium of the 21st century. The reality? They’re too big, too complex, and too smart for that to be true."



"Despite all the aspirational chatter, a decade and a half into the 21st century a ludic century seems unlikely. Impossible, even. Perhaps it’s time to take a step back from grand proclamations about the past or the future of media, and instead treat it with the attention to detail systems thinking supposedly offers.

There’s a paradox at work in systems literacy. For games to embrace a role as windows onto complexity, as depictions of interconnected systems, they must also reject the very idea of dramatic, revolutionary, disruptive change that drives so much of our contemporary understanding about technology — or about anything whatsoever.

Real systems thinking assumes simple answers are always wrong. Yet when we talk about the future—even the future of games or of systems literacy—we tend to assume that they will unleash their transformative powers in a straightforward way, through ideas like a century with a dominant medium. We are meant to speak like Pollyannas about “changing the world,” rather than admitting that the very notion of changing the world is anathema to the fundamental promise of systems literacy, namely a rejection of simplicity and a distrust of singular answers.

After all, it’s not clear at all that the 20th century is best summarized as a century of the moving image, anyway. Too much happened to pin down a single influence or media form as dominant. Systems thinking would force us to admit that any singular innovation is caught up in a web of others. We could just as easily call the last century the “electric century,” because so many of its inventions and innovations were bound up in the rollout and use of electric power. Or perhaps the “recorded century,” because photography, phonography, and other methods of analog capture and preservation rose to prominence (eventually fusing into film) — not to mention digital information storage. Cinema itself relied on the rise of leisure and the desire for escape, facilitated by two decades of economic catastrophe and war during the Great Depression and World War II. Those features were only further amplified by the rise of suburbanism and automobile culture of the 1950s, where cinema coupled to youth, desire, and freedom.

As the media theorist Marshall McLuhan put it (in 1964, I might add), “a new medium is never an addition to an old one, nor does it leave the old one in peace. It never ceases to oppress the older media until it finds new shapes and positions for them.” McLuhan thinks about media in relation to one another, as a media ecosystem subject to analysis through media ecology. There are just too many elements at work in a medium’s development and decay to single one of them out for special treatment.

When we think about a ludic century or an age of systems literacy, we do so by putting games at the center of the media ecosystem and pondering their influences on our senses and our communities. But such an idea is a fantasy. And there’s no better way of revealing that fantasy than asking instead what conditions would have to exist in order to produce the kind of age that Zimmerman, Spector, Gee, or I have imagined.

A ludic century wouldn’t just be one in which games, play, process, and systems thinking are enhanced, to use one of McLuhan’s terms. It would also be one in which the purportedly non-systemic, non-ludic formats that have reigned in the age of information — namely speech, writing, image, and the moving image — are made obsolete. For systems thinking to reign, linear and narrative thinking would have to wane.

But just the opposite has happened. We’ve never been more surrounded with text and pictures and moving images than we are in the digital era. Over half a century ago, the MIT computer scientist Alan J. Perlis imagined an age of “procedural literacy” brought about by new computational expertise — an early version of the dream of the ludic century. But instead, digital technology has accelerated the rate of production and consumption of “legacy” media formats like writing and photography.

Mostly we use computers to read, write, and look at things — not to build or experience models of complex worlds, real or imagined. It’s as if the horse still pulled the automobile rather than being displaced by it, or if the phone booth had enjoyed a sustained new fashion as a venue to make private calls, texts, or Snapchats from your smartphone."



"Games are ancient, and they are not going anywhere anytime soon. But their stock is not rising at the rate that their fans’ Twitter streams and Web forums might suggest. Instead of a ludic age, perhaps we have entered an era of shredded media. Some forms persist more than others, but more than any one medium, we are surrounded by the rough-edged bits and pieces of too many media to enumerate. Writing, images, aphorisms, formal abstraction, collage, travesty. Photography, cinema, books, music, dance, games, tacos, cats, car services. If anything, there has never been a weirder, more disorienting, and more lively time to be a creator and a fanatic of media in all their varieties. Why ruin the moment by being the one trying to get everyone to play a game while we’re letting the flowers blossom? A ludic century need not be a century of games. Instead, it can just be a century. With games in it."
ianbogost  2014  games  gaming  systemsthinking  disruption  culture  systemsliteracy  videogames  media  theory  marshallmcluhan  play  film  linear  linearity  photography  video  narrative  alanjperlis  proceduralliteracy  computation  computers  digital  consumption  writing  complexity  ericzimmerman  tomchatfield  warrenspector  austinwintory  jamespaulgee 
march 2015 by robertogreco
The Humane Representation of Thought on Vimeo
"Closing keynote at the UIST and SPLASH conferences, October 2014.
Preface: http://worrydream.com/TheHumaneRepresentationOfThought/note.html

References to baby-steps towards some of the concepts mentioned:

Dynamic reality (physical responsiveness):
- The primary work here is Hiroshi Ishii's "Radical Atoms": http://tangible.media.mit.edu/project/inform/
- but also relevant are the "Soft Robotics" projects at Harvard: http://softroboticstoolkit.com
- and at Otherlab: http://youtube.com/watch?v=gyMowPAJwqo
- and some of the more avant-garde corners of material science and 3D printing

Dynamic conversations and presentations:
- Ken Perlin's "Chalktalk" changes daily; here's a recent demo: http://bit.ly/1x5eCOX

Context-sensitive reading material:
- http://worrydream.com/MagicInk/

"Explore-the-model" reading material:
- http://worrydream.com/ExplorableExplanations/
- http://worrydream.com/LadderOfAbstraction/
- http://ncase.me/polygons/
- http://redblobgames.com/pathfinding/a-star/introduction.html
- http://earthprimer.com/

Evidence-backed models:
- http://worrydream.com/TenBrighterIdeas/

Direct-manipulation dynamic authoring:
- http://worrydream.com/StopDrawingDeadFish/
- http://worrydream.com/DrawingDynamicVisualizationsTalk/
- http://tobyschachman.com/Shadershop/

Modes of understanding:
- Jerome Bruner: http://amazon.com/dp/0674897013
- Howard Gardner: http://amazon.com/dp/0465024335
- Kieran Egan: http://amazon.com/dp/0226190390

Embodied thinking:
- Edwin Hutchins: http://amazon.com/dp/0262581469
- Andy Clark: http://amazon.com/dp/0262531569
- George Lakoff: http://amazon.com/dp/0465037712
- JJ Gibson: http://amazon.com/dp/0898599598
- among others: http://en.wikipedia.org/wiki/Embodied_cognition

I don't know what this is all about:
- http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesign/
- http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesign/responses.html

---

Abstract:

New representations of thought — written language, mathematical notation, information graphics, etc — have been responsible for some of the most significant leaps in the progress of civilization, by expanding humanity’s collectively-thinkable territory.

But at debilitating cost. These representations, having been invented for static media such as paper, tap into a small subset of human capabilities and neglect the rest. Knowledge work means sitting at a desk, interpreting and manipulating symbols. The human body is reduced to an eye staring at tiny rectangles and fingers on a pen or keyboard.

Like any severely unbalanced way of living, this is crippling to mind and body. But it is also enormously wasteful of the vast human potential. Human beings naturally have many powerful modes of thinking and understanding.

Most are incompatible with static media. In a culture that has contorted itself around the limitations of marks on paper, these modes are undeveloped, unrecognized, or scorned.

We are now seeing the start of a dynamic medium. To a large extent, people today are using this medium merely to emulate and extend static representations from the era of paper, and to further constrain the ways in which the human body can interact with external representations of thought.

But the dynamic medium offers the opportunity to deliberately invent a humane and empowering form of knowledge work. We can design dynamic representations which draw on the entire range of human capabilities — all senses, all forms of movement, all forms of understanding — instead of straining a few and atrophying the rest.

This talk suggests how each of the human activities in which thought is externalized (conversing, presenting, reading, writing, etc) can be redesigned around such representations.

---

Art by David Hellman.
Bret Victor -- http://worrydream.com "

[Some notes from Boris Anthony:

"Those of you who know my "book hack", Bret talks about exactly what motivates my explorations starting at 20:45 in https://vimeo.com/115154289 "
https://twitter.com/Bopuc/status/574339495274876928

"From a different angle, btwn 20:00-29:00 Bret explains how "IoT" is totally changing everything
https://vimeo.com/115154289
@timoreilly @moia"
https://twitter.com/Bopuc/status/574341875836043265 ]
bretvictor  towatch  interactiondesign  davidhellman  hiroshiishii  softrobotics  robots  robotics  kenperlin  jeromebruner  howardgardner  kieranegan  edwinhutchins  andyclark  jjgibson  embodiedcognition  cognition  writing  math  mathematics  infographic  visualization  communication  graphics  graphicdesign  design  representation  humans  understanding  howwelearn  howwethink  media  digital  dynamism  movement  conversation  presentation  reading  howweread  howwewrite  chalktalk  otherlab  3dprinting  3d  materials  physical  tangibility  depth  learning  canon  ui  informationdesign  infographics  maps  mapping  data  thinking  thoughts  numbers  algebra  arithmetic  notation  williamplayfair  cartography  gestures  placevalue  periodictable  michaelfaraday  jamesclerkmaxell  ideas  print  printing  leibniz  humanism  humanerepresentation  icons  visual  aural  kinesthetic  spatial  tactile  symbols  iot  internetofthings  programming  computers  screens  computation  computing  coding  modeling  exploration  via:robertogreco  reasoning  rhetoric  gerrysussman  environments  scale  virtualization 
march 2015 by robertogreco
26 Bit Driver Kit - iFixit
"Repair on the go made easy.

• Ultra-portable and rugged, this driver kit includes a comfortable handle and 26 specially selected bits to help you overcome most common repair challenges.
• This bit set is the base of our very popular Essential Electronics Toolkit—a great value for simple electronics repairs.
• If you're looking for a more comprehensive selection, check out our end-all 54 Bit Driver Kit.

Note that this kit does not include Pentalobe bits, which are required for iPhone 4/4S/5/5s/5c, and newer models of MacBook Air/Pro. For these bits, check out our more advanced 54 Bit Driver Kit.

Kit Contents:
• 4 mm Driver Handle - rubberized for a sturdy grip and magnetized to hold bits and screws
• 60 mm Driver Extension - increase your reach into smaller devices
• Metal Tweezers - grab hold of small screws and components
• 26 bits in the following sizes:
• Flathead sizes 1.5, 2, 2.5, 3 mm
• Phillips sizes #000, #00, #0, #1, #2
• Torx sizes T4, T5, T6
• Torx Security sizes TR7, TR8, TR9, TR10, TR15, TR20 (compatible with non-security)
• Hex sizes 1.5, 2, 2.5, 3, 4 mm
• Tri-wing sizes #0, #1
• Spanner size U3.0"
tools  via:andrewjanke  repair  computers  repairing 
march 2015 by robertogreco
9 Facts About Computer Security That Experts Wish You Knew
"1. Having a strong password actually can prevent most attacks… [see the sketch after this list]
2. Just because a device is new does not mean it's safe…
3. Even the very best software has security vulnerabilities…
4. Every website and app should use HTTPS…
5. The cloud is not safe — it just creates new security problems…
6. Software updates are crucial for your protection…
7. Hackers are not criminals…
8. Cyberattacks and cyberterrorism are exceedingly rare…
9. Darknet and Deepweb are not the same thing… "
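
[On point 1, an illustration of mine rather than the article's: Python's secrets module is designed for exactly this kind of cryptographically strong randomness:]

    import secrets

    # tiny stand-in wordlist; in practice use a large one, such as EFF's ~7,776-word diceware list
    WORDS = ["correct", "horse", "battery", "staple", "orbit", "velvet", "quartz", "lantern"]

    def passphrase(n_words=5):
        return "-".join(secrets.choice(WORDS) for _ in range(n_words))

    print(passphrase())  # e.g. "quartz-orbit-staple-velvet-horse"
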
2015  anneleenewitz  security  computers  passwords  updates  software  hackers  https  web  internet  online  cloud  darkweb  deepnet  cyberattacks  cyberterrorism 
march 2015 by robertogreco
prosthetic knowledge — Intel Compute Stick Announced today - a fully...
"Intel Compute Stick

Announced today - a fully working computer the size of a USB stick which just plugs into an HDMI port of a display:

The Intel® Compute Stick is a new generation compute-on-a-stick device that’s ready-to-go out-of–the-box and offers the performance, quality, and value you expect from Intel. Pre-installed with Windows 8.1* or Linux, get a complete experience on an ultra-small, power-efficient device that is just four inches long, yet packs the power and reliability of a quad-core Intel® Atom™ processor, with built-in wireless connectivity, on-board storage, and a micro SD card slot for additional storage. It’s everything you love about your desktop computer in a device that fits in the palm of your hand.

Computers are cheaper and smaller now - whilst it doesn't appear to feature any dedicated graphics card for media capabilities, I'm sure there could be useful applications for tech arts (removing the need for a laptop)

More Here [http://www.intel.com/content/www/us/en/compute-stick/intel-compute-stick.html ]"

[See also: http://www.engadget.com/2015/01/07/intel-compute-stick/

"Your Chromecast may be able to play Netflix, but can it play Crysis? Intel's HDMI Compute Stick probably can't either, but the tiny device does have enough power to run Windows 8.1 apps on your TV. Intel has rather impressively crammed in a quad-core Atom CPU, 32GB of storage and 2GB of RAM, along with a USB port, WiFi and Bluetooth 4.0 support and a mini-USB connector for power (HDMI power will come later). "But why?" you might ask. Intel sees it as a low-priced computer or (pricey) media stick, or even a thin-client device for companies. To up the crazy factor, it may eventually launch a much zippier Core M version. The Windows version will run $149, and if that seems a bit much, a 1GB RAM/8GB memory Linux version is priced at $89. Both will arrive in March."]
intel  computers  hotswapping  windows8  computing  2015  thinclients 
january 2015 by robertogreco
Convivial Tools in an Age of Surveillance
"What would convivial ed-tech look like?

The answer can’t simply be “like the Web” as the Web is not some sort of safe and open and reliable and accessible and durable place. The answer can’t simply be “like the Web” as though the move from institutions to networks magically scrubs away the accumulation of history and power. The answer can’t simply be “like the Web” as though posting resources, reference services, peer-matching, and skill exchanges — what Illich identified as the core of his “learning webs” — are sufficient tools in the service of equity, freedom, justice, or hell, learning.

“Like the Web” is perhaps a good place to start, don’t get me wrong, particularly if this means students are in control of their own online spaces — its content, its data, its availability, its publicness. “Like the Web” is convivial, or close to it, if students are in control of their privacy, their agency, their networks, their learning. We all need to own our learning — and the analog and the digital representations or exhaust from that. Convivial tools do not reduce that to a transaction — reduce our learning to a transaction, reduce our social interactions to a transaction.

I'm not sure the phrase "safe space" is quite the right one to build alternate, progressive education technologies around, although I do think convivial tools do have to be “safe” insofar as we recognize the importance of each other’s health and well-being. Safe spaces where vulnerability isn’t a weakness for others to exploit. Safe spaces where we are free to explore, but not to the detriment of those around us. As Illich writes, "A convivial society would be the result of social arrangements that guarantee for each member the most ample and free access to the tools of the community and limit this freedom only in favor of another member’s equal freedom.”

We can’t really privilege “safe” as the crux of “convivial” if we want to push our own boundaries when it comes to curiosity, exploration, and learning. There is risk associated with learning. There’s fear and failure (although I do hate how those are being fetishized in a lot of education discussions these days, I should note.)

Perhaps what we need to build are more compassionate spaces, so that education technology isn’t in the service of surveillance, standardization, assessment, control.

Perhaps we need more brave spaces. Or at least many educators need to be braver in open, public spaces -- not brave to promote their own "brands" but brave in standing with their students. Not "protecting them” from education technology or from the open Web but not leaving them alone, and not opening them to exploitation.

Perhaps what we need to build are more consensus-building, not consensus-demanding, tools. Mike Caulfield gets at this in a recent keynote about “federated education.” He argues that "Wiki, as it currently stands, is a consensus *engine*. And while that’s great in the later stages of an idea, it can be deadly in those first stages.” Caulfield relates the story of the Wikipedia entry on Kate Middleton’s wedding dress, which, 16 minutes after it was created, "someone – and in this case it probably matters that it was a dude – came and marked the page for deletion as trivial, or as they put it 'A non-notable article incapable of being expanded beyond a stub.’” Debate ensues on the entry’s “talk” page, until finally Jimmy Wales steps in with his vote: a “strong keep,” adding "I hope someone will create lots of articles about lots of famous dresses. I believe that our systemic bias caused by being a predominantly male geek community is worth some reflection in this context.”

Mike Caulfield has recently been exploring a different sort of wiki, also by Ward Cunningham. This one — called the Smallest Federated Wiki — doesn’t demand consensus like Wikipedia does. Not off the bat. Instead, entries — and this can be any sort of text or image or video, it doesn’t have to “look like” an encyclopedia — live on federated servers. Instead of everyone collaborating in one space on one server like a “traditional” wiki, the work is distributed. It can be copied and forked. Ideas can be shared and linked; it can be co-developed and co-edited. But there isn’t one “vote” or one official entry that is necessarily canonical.
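
[A rough sketch of the federated idea - loosely inspired by SFW's page JSON (title / story / journal) but simplified and hypothetical, not its actual schema: forking copies a page to your own site and records provenance in its journal, so variants coexist instead of competing for one canonical entry:]

    import copy

    def fork_page(page, from_site, my_site):
        """page: {'title': ..., 'story': [...], 'journal': [...]}; my_site maps titles to pages."""
        forked = copy.deepcopy(page)
        forked["journal"].append({"type": "fork", "site": from_site})  # remember where it came from
        my_site[forked["title"]] = forked   # my copy lives on my server; the original is untouched
        return forked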

Rather than centralized control, conviviality. This distinction between Wikipedia and Smallest Federated Wiki echoes too what Illich argued: that we need to be able to identify when our technologies become manipulative. We need "to provide guidelines for detecting the incipient stages of murderous logic in a tool; and to devise tools and tool systems that optimize the balance of life, thereby maximizing liberty for all."

Of course, we need to recognize, those of us that work in ed-tech and adopt ed-tech and talk about ed-tech and tech writ large, that convivial tools and a convivial society must go hand-in-hand. There isn’t any sort of technological fix to make education better. It’s a political problem, that is, not a technological one. We cannot come up with technologies that address systematic inequalities — those created by and reinscribed by education— unless we are willing to confront those inequalities head on. Those radical education writers of the Sixties and Seventies offered powerful diagnoses about what was wrong with schooling. The progressive education technologists of the Sixties and Seventies imagined ways in which ed-tech could work in the service of dismantling some of the drudgery and exploitation.

But where are we now? Instead we find ourselves with technologies working to make that exploitation and centralization of power even more entrenched. There must be alternatives — both within and without technology, both within and without institutions. Those of us who talk and write and teach ed-tech need to be pursuing those things, and not promoting consumption and furthering institutional and industrial control. In Illich’s words: "The crisis I have described confronts people with a choice between convivial tools and being crushed by machines.""
toolforconviviality  ivanillich  audreywatters  edtech  technology  education  2014  seymourpapert  logo  alankay  dynabook  mikecaufield  wardcunningham  web  internet  online  schools  teaching  progressive  wikipedia  smallestfederatedwiki  wikis  society  politics  policy  decentralization  surveillance  doxxing  gamergate  drm  startups  venturecapital  bigdata  neilpostman  paulofreire  paulgoodman  datapalooza  knewton  computers  computing  mindstorms  control  readwrite  everettreimer  1960s  1970s  jonathankozol  disruption  revolution  consensus  safety  bravery  courage  equity  freedom  justice  learning 
november 2014 by robertogreco
The Sixth Stage of Grief is Retro-Computing — The Message — Medium
"Imagine having, in your confused adolescence, the friendship of an older, avuncular man who is into computers, a world-traveling photographer who would occasionally head out to, like, videotape the Dalai Lama for a few weeks, then come back and and listen to every word you said while you sat on his porch. A generous, kind person who spoke openly about love and faith and treated people with respect."



"A year after the Amiga showed up—I was 13—my life started to go backwards. Not forever, just for a while. My dad left, money was tight. My clothes were the ones my dad left behind, old blouse-like Oxfords in the days of Hobie Cat surfwear. I was already big and weird, and now I was something else. I think my slide perplexed my peers; if anything they bullied me less. I heard them murmuring as I wandered down the hall.

I was a ghost and I had haunts: I vanished into the computer. I had that box of BBS floppies. One after another I’d insert them into the computer and examine every file, thousands of files all told. That was how I pieced together the world. Second-hand books and BBS disks and trips to the library. I felt very alone but I’ve since learned that it was a normal American childhood, one millions of people experienced.

Often—how often I don’t remember—I’d go over to Tom’s. I’d share my techniques for rotating text in Deluxe Paint, show him what I’d gleaned from my disks. He always had a few spare computers around for generating title sequences in videos, and later for editing, and he’d let me practice with his videocameras. And he would listen to me.

Like I said: Avuncular. He wasn’t a father figure. Or a mother figure. He was just a kind ear when I needed as many kind ears as I could find. I don’t remember what I said; I just remember being heard. That’s the secret to building a network. People want to be heard. God, life, history, science, books, computers. The regular conversations of anxious kids. His students would show up, impossibly sophisticated 19-year-old men and women, and I’d listen to them talk as the sun went down. For years. A world passed over that porch and I got to watch and participate even though I was still a boy.

I constantly apologized for being there, for being so young and probably annoying, and people would just laugh at me. But no one put me in my place. People touched me, hugged me, told me about books to read and movies to watch. I was not a ghost.

When I graduated from high school I went by to sit on the porch and Tom gave me a little brown teddy bear. You need to remember, he said, to be a kid. To stay in touch with that part of yourself.

I did not do this."



"Technology is What We Share

Technology is what we share. I don’t mean “we share the experience of technology.” I mean: By my lights, people very often share technologies with each other when they talk. Strategies. Ideas for living our lives. We do it all the time. Parenting email lists share strategies about breastfeeding and bedtime. Quotes from the Dalai Lama. We talk neckties, etiquette, and Minecraft, and tell stories that give us guidance as to how to live. A tremendous part of daily life regards the exchange of technologies. We are good at it. It’s so simple as to be invisible. Can I borrow your scissors? Do you want tickets? I know guacamole is extra. The world of technology isn’t separate from regular life. It’s made to seem that way because of, well…capitalism. Tribal dynamics. Territoriality. Because there is a need to sell technology, to package it, to recoup the terrible investment. So it becomes this thing that is separate from culture. A product.

I went looking for the teddy bear that Tom had given me, the reminder to be a child sometimes, and found it atop a bookshelf. When I pulled it down I was surprised to find that it was in a tiny diaper.

I stood there, ridiculous, a 40-year-old man with a diapered 22-year-old teddy bear in my hand. It stared back at me with root-beer eyes.

This is what I remembered right then: That before my wife got pregnant we had been trying for kids for years without success. We had considered giving up.

That was when I said to my wife: If we do not have children, we will move somewhere where there is a porch. The children who need love will find the porch. They will know how to find it. We will be as much parents as we want to be.

And when she got pregnant with twins we needed the right-sized doll to rehearse diapering. I went and found that bear in an old box.

I was handed that toy, sitting on Tom’s porch, in 1992. A person offering another person a piece of advice. Life passed through that object as well, through the teddy bear as much as through the operating systems of yore.

Now that I have children I can see how tuned they are to the world. Living crystals tuned to all manner of frequencies. And how urgently they need to be heard. They look up and they say, look at me. And I put my phone away.

And when they go to bed, protesting and screaming, I go to mess with my computers, my old weird imaginary emulated computers. System after system. I open up these time capsules and look at the thousands of old applications, millions of dollars of software, but now it can be downloaded in a few minutes and takes up a tiny portion of a hard drive. It’s all comically antiquated.

When you read histories of technology, whether of successes or failures, you sense the yearning of people who want to get back into those rooms for a minute, back to solving the old problems. How should a window open? How should the mouse look? What will people want to do, when we give them these machines? Who wouldn’t want to go back 20 years—to drive again into the office, to sit before the whiteboard in a beanbag chair, in a place of warmth and clarity, and give it another try?

Such a strange way to say goodbye. So here I am. Imaginary disks whirring and screens blinking as I visit my old haunts. Wandering through lost computer worlds for an hour or two, taking screenshots like a tourist. Shutting one virtual machine down with a sigh, then starting up another one. But while these machines run, I am a kid. A boy on a porch, back among his friends."
paulford  memory  memories  childhood  neoteny  play  wonder  sharing  obituaries  technology  history  squeak  amiga  textcraft  plan9  smalltalk-80  smalltalk  mac  1980s  1990s  1970s  xerox  xeroxalto  texteditors  wordprocessors  software  emulators  emulations  2014  computers  computing  adolescence  listening  parenting  adults  children  mentors  macwrite  howwelearn  relationships  canon  caring  love  amigaworkbench  commodore  aegisanimator  jimkent  vic-20  commodore64  1985  andywarhol  debbieharry  1987  networks  porches  kindness  humility  lisp  windows3.1  microsoft  microsoftpaint  capitalism  next  openstep  1997  1992  stevejobs  objectivec  belllabs  xeroxparc  inria  doom  macos9  interfacebuilder 
november 2014 by robertogreco
Why the Landline Telephone Was the Perfect Tool - Suzanne Fischer - The Atlantic
"Illich's achievement was a reframing of human relationships to systems and society, in everyday, accessible language. He advocated for the reintegration of community decisionmaking and personal autonomy into all the systems that had become oppressive: school, work, law, religion, technology, medicine, economics. His ideas were influential for 1970s technologists and the appropriate technology movement -- can they be useful today?

In 1971, Illich published what is still his most famous book, Deschooling Society. He argued that the commodification and specialization of learning had created a harmful education system that had become an end in itself. In other words, "the right to learn is curtailed by the obligation to attend school." For Illich, language often pointed to how toxic ideas had poisoned the ways we relate to each other. "I want to learn," he said, had been transmuted by industrial capitalism into "I want to get an education," transforming a basic human need for learning into something transactional and coercive. He proposed a restructuring of schooling, replacing the manipulative system of qualifications with self-determined, community-supported, hands-on learning. One of his suggestions was for "learning webs," where a computer could help match up learners and those who had knowledge to share. This skillshare model was popular in many radical communities.

With Tools for Conviviality (1973), Illich extended his analysis of education to a broader critique of the technologies of Western capitalism. The major inflection point in the history of technology, he asserts, is when, in the life of each tool or system, the means overtake the ends. "Tools can rule men sooner than they expect; the plow makes man the lord of the garden but also the refugee from the dust bowl." Often this effect is accompanied by the rise in power of a managerial class of experts; Illich saw technocracy as a step toward fascism. Tools for Conviviality points out the ways in which a helpful tool can evolve into a destructive one, and offers suggestions for how communities can escape the trap.

So what makes a tool "convivial?" For Illich, "tools foster conviviality to the extent to which they can be easily used, by anybody, as often or as seldom as desired, for the accomplishment of a purpose chosen by the user." That is, convivial technologies are accessible, flexible, and noncoercive. Many tools are neutral, but some promote conviviality and some choke it off. Hand tools, for Illich, are neutral. Illich offers the telephone as an example of a tool that is "structurally convivial" (remember, this is in the days of the ubiquitous public pay phone): anyone who can afford a coin can use it to say whatever they want. "The telephone lets anybody say what he wants to the person of his choice; he can conduct business, express love, or pick a quarrel. It is impossible for bureaucrats to define what people say to each other on the phone, even though they can interfere with -- or protect -- the privacy of their exchange."

A "manipulatory" tool, on the other hand, blocks off other choices. The automobile and the highway system it spawned are, for Illich, prime examples of this process. Licensure systems that devalue people who have not received them, such as compulsory schooling, are another example. But these kinds of tools, that is, large-scale industrial production, would not be prohibited in a convivial society. "What is fundamental to a convivial society is not the total absence of manipulative institutions and addictive goods and services, but the balance between those tools which create the specific demands they are specialized to satisfy and those complementary, enabling tools which foster self-realization."

To foster convivial tools, Illich proposes a program of research with "two major tasks: to provide guidelines for detecting the incipient stages of murderous logic in a tool; and to devise tools and tool systems that optimize the balance of life, thereby maximizing liberty for all." He also suggests that pioneers of a convivial society work through the legal and political systems and reclaim them for justice. Change is possible, Illich argues. There are decision points. We cannot abdicate our right to self-determination, and to decide how far is far enough. "The crisis I have described," says Illich, "confronts people with a choice between convivial tools and being crushed by machines."

Illich's ideas on technology, like his ideas on schooling, were influential among those who spent the 1970s thinking that we might be on the cusp of another world. Some of those utopians included early computer innovators, who saw the culture of sharing, self-determination, and DIY that they lived as something that should be baked into tools.

Computing pioneer Lee Felsenstein has spoken about the direct influence of Tools for Conviviality on his work. For him, Illich's description of radio as a convivial tool in Central America was a model for computer development: "The technology itself was sufficiently inviting and accessible to them that it catalyzed their inherent tendencies to learn. In other words, if you tried to mess around with it, it didn't just burn out right away. The tube might overheat, but it would survive and give you some warning that you had done something wrong. The possible set of interactions, between the person who was trying to discover the secrets of the technology and the technology itself, was quite different from the standard industrial interactive model, which could be summed up as 'If you do the wrong thing, this will break, and God help you.' ... And this showed me the direction to go in. You could do the same thing with computers as far as I was concerned." Felsenstein described the first meeting of the legendary Homebrew Computer Club, where 30 or so people tried to understand the Altair together, as "the moment at which the personal computer became a convivial technology."

In 1978, Valentina Borremans of CIDOC prepared a Reference Guide to Convivial Tools. This guide to resources listed many of the new ideas in 1970s appropriate technology -- food self-sufficiency, earth-friendly home construction, new energy sources. But our contemporary convivial tools are mostly in the realm of communications. At their best, personal computers, the web, mobile technology, the open source movement, and the maker movement are contemporary convivial tools. What other convivial technologies do we use today? What tools do we need to make more convivial? Ivan Illich would exhort us to think carefully about the tools we use and what kind of world they are making."
ivanillich  2012  suzannefischer  technology  technocracy  conviviality  unschooling  deschooling  education  philosophy  history  society  valentinaborremans  leefelsenstein  telephone  landlines  radio  self-determination  diy  grassroots  democracy  computing  computers  internet  web  tools  justice  flexibility  coercion  schools  schooling  openstudioproject  lcproject  learningwebs  credentials  credentialism  learning  howwelearn  commodification  business  capitalism  toolsforconviviality 
july 2014 by robertogreco
Kano
"It's a computer — and you make it yourself"
raspberrypi  kids  children  computers  hardware  kano  kits  classideas 
may 2014 by robertogreco
Everything Is Broken — The Message — Medium
"It was my exasperated acknowledgement that looking for good software to count on has been a losing battle. Written by people with either no time or no money, most software gets shipped the moment it works well enough to let someone go home and see their family. What we get is mostly terrible.

Software is so bad because it’s so complex, and because it’s trying to talk to other programs on the same computer, or over connections to other computers. Even your computer is kind of more than one computer, boxes within boxes, and each one of those computers is full of little programs trying to coordinate their actions and talk to each other. Computers have gotten incredibly complex, while people have remained the same gray mud with pretensions of godhood.

Your average piece-of-shit Windows desktop is so complex that no one person on Earth really knows what all of it is doing, or how.

Now imagine billions of little unknowable boxes within boxes constantly trying to talk and coordinate tasks at around the same time, sharing bits of data and passing commands around from the smallest little program to something huge, like a browser — that’s the internet. All of that has to happen nearly simultaneously and smoothly, or you throw a hissy fit because the shopping cart forgot about your movie tickets.

We often point out that the phone you mostly play casual games on and keep dropping in the toilet at bars is more powerful than all the computing we used to go to space for decades.

NASA had a huge staff of geniuses to understand and care for their software. Your phone has you.

Plus a system of automatic updates you keep putting off because you’re in the middle of Candy Crush Saga every time it asks.

Because of all this, security is terrible. Besides being riddled with annoying bugs and impossible dialogs, programs often have a special kind of hackable flaw called 0days by the security scene. No one can protect themselves from 0days. It’s their defining feature — 0 is the number of days you’ve had to deal with this form of attack. There are meh, not-so-terrible 0days, there are very bad 0days, and there are catastrophic 0days that hand the keys to the house to whoever strolls by. I promise that right now you are reading this on a device with all three types of 0days. “But, Quinn,” I can hear you say, “If no one knows about them, how do you know I have them?” Because even okay software has to work with terrible software. The number of people whose job it is to make software secure can practically fit in a large bar, and I’ve watched them drink. It’s not comforting. It isn’t a matter of if you get owned, only a matter of when.

Look at it this way — every time you get a security update (seems almost daily on my Linux box), whatever is getting updated has been broken, lying there vulnerable, for who-knows-how-long. Sometimes days, sometimes years. Nobody really advertises that part of updates. People say “You should apply this, it’s a critical patch!” and leave off the “…because the developers fucked up so badly your children’s identities are probably being sold to the Estonian Mafia by smack addicted script kiddies right now.”



Recently an anonymous hacker wrote a script that took over embedded Linux devices. These owned computers scanned the whole rest of the internet and created a survey that told us more than we’d ever known about the shape of the internet. The little hacked boxes reported their data back (a full 10 TBs) and quietly deactivated the hack. It was a sweet and useful example of someone who hacked the planet to shit. If that malware had actually been malicious, we would have been so fucked.

This is because all computers are reliably this bad: the ones in hospitals and governments and banks, the ones in your phone, the ones that control light switches and smart meters and air traffic control systems. Industrial computers that maintain infrastructure and manufacturing are even worse. I don’t know all the details, but those who do are the most alcoholic and nihilistic people in computer security. Another friend of mine accidentally shut down a factory with a malformed ping at the beginning of a pen test. For those of you who don’t know, a ping is just about the smallest request you can send to another computer on the network. It took them a day to turn everything back on.
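
[To make “the smallest request” concrete: a minimal Python sketch that sends a single ordinary ping by shelling out to the standard ping utility. It assumes a Unix-like system whose ping accepts -c (Windows uses -n instead), and it does not reproduce the malformed packet from the anecdote.]

    import subprocess

    def ping_once(host: str) -> bool:
        """Send one ICMP echo request; report whether a reply came back."""
        result = subprocess.run(
            ["ping", "-c", "1", host],  # -c 1: send exactly one packet
            capture_output=True,        # keep the chatter off the terminal
        )
        return result.returncode == 0   # 0 means the host answered

    print(ping_once("example.com"))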

Computer experts like to pretend they use a whole different, more awesome class of software that they understand, that is made of shiny mathematical perfection and whose interfaces happen to have been shat out of the business end of a choleric donkey. This is a lie. The main form of security this offers is through obscurity — so few people can use this software that there’s no point in building tools to attack it. Unless, like the NSA, you want to take over sysadmins.



"When we tell you to apply updates we are not telling you to mend your ship. We are telling you to keep bailing before the water gets to your neck.

To step back a bit from this scene of horror and mayhem, let me say that things are better than they used to be. We have tools that we didn’t in the 1990s, like sandboxing, that keep the idiotically written programs where they can’t do as much harm. (Sandboxing keeps a program in an artificially small part of the computer, cutting it off from all the other little programs, or cleaning up anything it tries to do before anything else sees it.)
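
[A toy sketch of the sandboxing idea, assuming a Unix-like system: the parent asks the kernel to cap the child process's CPU time and memory before it runs. It illustrates the principle of confining a program to a small part of the machine; real sandboxes (seccomp, containers, virtual machines) do far more.]

    import resource
    import subprocess

    def run_confined(cmd):
        """Run cmd as a child process under crude, kernel-enforced caps."""
        def limit():
            # Applied in the child just before exec: the kernel kills it
            # past 2 seconds of CPU or 256 MB of address space.
            resource.setrlimit(resource.RLIMIT_CPU, (2, 2))
            resource.setrlimit(resource.RLIMIT_AS, (256 * 2**20, 256 * 2**20))
        return subprocess.run(cmd, preexec_fn=limit, capture_output=True)

    print(run_confined(["python3", "-c", "print('hello from the box')"]).stdout)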

Certain whole classes of terrible bugs have been sent the way of smallpox. Security is taken more seriously than ever before, and there’s a network of people responding to malware around the clock. But they can’t really keep up. The ecosystem of these problems is so much bigger than it was even ten years ago that it’s hard to feel like we’re making progress.

People, as well, are broken.

“I trust you…” was my least favorite thing to hear from my sources in Anonymous. Inevitably it was followed by some piece of information they shouldn’t have been telling me. It is the most natural and human thing to share something personal with someone you are learning to trust. But in exasperation I kept trying to remind Anons they were connecting to a computer, relaying through countless servers, switches, routers, cables, wireless links, and finally to my highly targeted computer, before they were connecting to another human being. All of this was happening in the time it takes one person to draw in a deep, committal breath. It’s obvious to say, but bears repeating: humans were not built to think this way.

Everyone fails to use software correctly. Absolutely everyone fucks up. OTR doesn’t encrypt until after the first message, a fact that leading security professionals and hackers subject to 20-country manhunts consistently forget. Managing all the encryption and decryption keys you need to keep your data safe across multiple devices, sites, and accounts is theoretically possible, in the same way performing an appendectomy on yourself is theoretically possible. This one guy did it once in Antarctica, why can’t you?

Every malware expert I know has lost track of what some file is, clicked on it to see, and then realized they’d executed some malware they were supposed to be examining. I know this because I did it once with a PDF I knew had something bad in it. My friends laughed at me, then all quietly confessed they’d done the same thing. If some of the best malware reversers around can’t keep track of their malicious files, what hope do your parents have against that e-card that is allegedly from you?"



Security and privacy experts harangue the public about metadata and networked sharing, but keeping track of these things is about as natural as doing blood panels on yourself every morning, and about as easy. The risks on a societal level from giving up our privacy are terrible. Yet the consequences of not doing so on an individual basis are immediately crippling. The whole thing is a shitty battle of attrition between what we all want for ourselves and our families and the ways we need community to survive as humans — a Mexican standoff monetized by corporations and monitored by governments.

I live in this stuff, and I’m no better. Once I had to step through a process to verify myself to a secretive source. I had to take a series of pictures showing my location and the date. I uploaded them, and was allowed to proceed with my interview. It turns out none of my verification had come through, because I’d failed to let the upload complete before nervously shutting down my computer. “Why did you let me through?” I asked the source. “Because only you would have been that stupid,” my source told me.

Touché.

But if I can’t do this, as a relatively well trained adult who pays attention to these issues all the damn time, what chance do people with real jobs and real lives have?

In the end, it’s culture that’s broken.

A few years ago, I went to several well respected people who work in privacy and security software and asked them a question.

First, I had to explain something:

“Most of the world does not have install privileges on the computer they are using.”
That is, most people using a computer in the world don’t own the computer they are using. Whether it’s in a cafe, or school, or work, for a huge portion of the world, installing a desktop application isn’t a straightforward option. Every week or two, I was being contacted by people desperate for better security and privacy options, and I would try to help them. I’d start, “Download th…” and then we’d stop. The next thing people would tell me was that they couldn’t install software on their computers. Usually this was because an IT department somewhere was limiting their rights as a part of managing a network. These people needed tools that worked with what they had access to, mostly a browser.

So the question I put to hackers… [more]
quinnnorton  privacy  security  software  2014  heartbleed  otr  libpurple  malware  computers  computing  networks  nsa  fbi 
may 2014 by robertogreco
dy/dan » Blog Archive » Adaptive Learning Is An Infinite iPod That Only Plays Neil Diamond
"If all you've ever heard in your life is Neil Diamond's music, you might think we've invented something quite amazing there. Your iPod contains the entire universe of music. If you've heard any other music at all, you might still be impressed by this infinite iPod. Neil wrote a lot of music after all, some of it good. But you'll know we're missing out on quite a lot also.

So it is with the futurists, many of whom have never been in a class where math was anything but watching someone lecture about a procedure and then replicating that procedure twenty times on a piece of paper. That entire universe fits neatly within a computer-adaptive model of learning.

But for math educators who have experienced math as a social process where students conjecture and argue with each other about their conjectures, where one student's messy handwritten work offers another student a revelation about her own work, a process which by definition can't be individualized or self-paced, computer-adaptive mathematics starts to seem rather limited.

Lectures and procedural fluency are an important aspect of a student's mathematics education but they are to the universe of math experiences as Neil Diamond is to all the other amazing artists who aren't Neil Diamond.

If I could somehow convince the futurists to see math the same way, I imagine our conversations would become a lot more productive.

BTW. While I'm here, Justin Reich wrote an extremely thoughtful series of posts on adaptive learning last month that I can't recommend enough:

Blended Learning, But The Data Are Useless
http://blogs.edweek.org/edweek/edtechresearcher/2014/04/blended_learning_but_the_data_are_useless.html

Nudging, Priming, and Motivating in Blended Learning
http://blogs.edweek.org/edweek/edtechresearcher/2014/04/nudging_priming_and_motivating_in_blended_learning.html

Computers Can Assess What Computers Do Best
http://blogs.edweek.org/edweek/edtechresearcher/2014/04/computers_can_assess_what_computers_do_best.html "
danmeyer  edtech  adaptivelearning  education  2014  blendedlearning  lectures  neildiamond  computing  computers  closedsystems  transcontextualization  via:lukeneff  transcontextualism 
may 2014 by robertogreco
No, Tech Adoption Is Not Speeding Up
"Well, what do you know? The graph doesn't show a progressively faster rate of technology adoption by the American public. What was once a clean graph that fit convenient and largely unquestioned ideas about exponential growth in tech suddenly becomes more complex.P

But please don't go passing around this new graph either. Because it's nearly as worthless as Vox's graph as a way to understand the history of technology. Why would it matter how long a technology took to go from "invention" (a really messy and complex concept) to 25 percent adoption?

Fun With Arbitrary Numbers

If we really want to play this game, perhaps we can look at a different measure of adoption: from about 5 percent to 50 percent. To be clear, this is just as arbitrary as trying to pin down an invention date and seeing how many years it took to reach 25 percent adoption. But it feels like a slightly more honest way to measure tech growth.

When a technology is in about 5% of American households, this means it's still in the hands of early adopters, tinkerers, and the wealthy. Breaching 50 percent usually means that it's within the reach of the middle class. So what if we look at TV technology through this lens?"
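
[Novak's 5-to-50 measure is easy to operationalize; a sketch using linear interpolation. The television figures below are invented placeholders for illustration, not his data.]

    def year_at(series, level):
        """Interpolate the year an adoption series first crosses `level` percent."""
        for (y0, a0), (y1, a1) in zip(series, series[1:]):
            if a0 <= level <= a1:
                return y0 + (level - a0) / (a1 - a0) * (y1 - y0)
        raise ValueError(f"series never crosses {level}%")

    # (year, % of US households) -- hypothetical numbers, for illustration only
    tv = [(1946, 0.5), (1948, 2), (1950, 9), (1952, 34), (1954, 56)]
    print(f"5% to 50% took about {year_at(tv, 50) - year_at(tv, 5):.1f} years")
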
data  mattnovak  2014  technology  radio  television  internet  electricity  statistics  adoption  mobile  phones  cellphones  telephones  computers  pcs 
april 2014 by robertogreco
Should We Automate Education? | EdTech Magazine
"In 1962, Raymond Callahan published Education and the Cult of Efficiency, a historical account of the influence that “scientific management” (also known as “Taylorism,” after its developer, Frederick Taylor) had on American schools in the early 20th century — that is, the push to run schools more like factories, where the productivity of workers was measured, controlled and refined.

Callahan’s main argument was that the pressures on the education system to adopt Taylorism resulted neither in more refined ways to teach nor in better ways to learn, but rather, in an emphasis on cost cutting. Efficiency, he argued, “amounted to an analysis of the budget. … Decisions on what should be taught were not made on educational, but on financial grounds.”

Fifty years later, we remain obsessed with creating a more “efficient” educational system (although ironically, we object to schools based on that very “factory model”). Indeed, this might be one of the major promises that educational technologies make: to deliver a more efficient way to teach and learn, and a more efficient way to manage schooling.

Deciding What We Want From Education

Adaptive learning — computer-based instruction and assessment that allows each student to move at her or his pace — is perhaps the latest in a series of technologies that promise more efficient education. The efficiency here comes, in part, from the focus on the individual — personalization — instead of on an entire classroom of students.
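
[The mechanism being critiqued can be caricatured in a few lines: a staircase rule that raises item difficulty after a correct answer and lowers it after a miss. A hypothetical sketch, not any vendor's actual system; Watters's point is precisely what such loops leave out.]

    import random

    def adaptive_session(skill, items=20):
        """Simulate a student moving through difficulty-adjusted items."""
        difficulty, history = 5, []          # start mid-scale on a 1-10 ladder
        for _ in range(items):
            # Success gets less likely as difficulty outpaces skill.
            correct = random.random() < 1 / (1 + 2 ** (difficulty - skill))
            difficulty = min(10, difficulty + 1) if correct else max(1, difficulty - 1)
            history.append(difficulty)
        return history                       # settles near the student's level

    print(adaptive_session(skill=7))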

But it’s worth noting that adaptive learning isn’t new. “Intelligent tutoring systems” have been under development for decades now. The term “intelligent tutoring” was coined in the 1980s; research into computer-assisted instruction dates to the 1960s; and programmed instruction predates the computer altogether, with Sidney Pressey’s and B. F. Skinner’s “teaching machines” of the 1920s and 1950s, respectively.

“Education must become more efficient,” Skinner insisted. “To this end, curricula must be revised and simplified, and textbooks and classroom techniques improved.”

Rarely do we ask what exactly “efficiency” in education or ed tech entails. Does it mean a reduction in errors? Faster learning? Reshaping the curriculum based on market demands? Does it mean cutting labor costs — larger classroom sizes, perhaps, or teachers replaced by machines?

We also often fail to ask why efficiency would be something we would value in education at all. Schools shouldn’t be factories. Students aren’t algorithms.

What happens if we prioritize efficiency in education? By doing so, are we simply upgrading the factory model of schooling with newer technologies? What happens to spontaneity and messiness? What happens to contemplation and curiosity?

There’s danger, I’d argue, in relying on teaching machines — on a push for more automation in education. We forget that we’re teaching humans."
audreywatters  automation  education  edtech  learning  children  humanism  humans  efficiency  2014  1962  raymondcallahan  management  taylorism  factoryschools  schools  industrialeducation  schooling  adaptivelearning  bfskinner  sidneypressey  computers  computing  technology  curiosity  messiness  spontaneity  unschooling  deschooling 
april 2014 by robertogreco
How one college went from 10% female computer-science majors to 40% – Quartz
"Yes, we know there aren’t enough women in tech. Yes, we know we need to change the ratio.

One college has found the answer.

With a three-step method, Harvey Mudd College in California quadrupled its female computer science majors. The experiment started in 2006 when Maria Klawe, a computer scientist and mathematician herself, was appointed college president. That year only 10% of Harvey Mudd’s CS majors were women. The department’s professors devised a plan.

They no longer wanted to weed out the weakest students during the first week of the semester. The new goal was to lure in female students and make sure they actually enjoyed their computer science initiation in the hopes of converting them to majors. This is what they did, in three steps.

1. Semantics count

They renamed the course previously called “Introduction to programming in Java” to “Creative approaches to problem solving in science and engineering using Python.” Using words like “creative” and “problem solving” just sounded more approachable. Plus, as Klawe describes it, the coding language Python is more forgiving and practical.

As part of this first step, the professors divided the class into groups—Gold for those with no coding experience and Black, for those with some coding experience. Then they implemented Operation Eliminate the Macho Effect: guys who showed off in class were taken aside and told, "You're so passionate about the material and you're so well prepared. I'd love to continue our conversations but let's just do it one on one."

Literally overnight, Harvey Mudd’s introductory CS course went from being the most despised required course to the absolute favorite, says Klawe.

But that was just the beginning.

2. Visualize success

After successfully completing the introductory class, how to ensure female students voluntarily signed up for another CS class? The female professors packed up the students and took them to the annual Grace Hopper Conference, which bills itself as a celebration of women in technology. Klawe says the conference is a place for students to visualize women in technology; humans who happened to be female who love computers. Not everyone looks like the dudes in the trailer for HBO’s Silicon Valley.

3. Make it matter

Finally, the college offered a summer of research between freshman and sophomore years so female students could apply their new skills and make something. “We had students working on things like educational games and a version of Dance Dance Revolution for the elderly. They could use computer technology to actually work on something that mattered,” says Klawe.

The three-step strategy resulted in a domino effect. Female students loved the CS introductory course. They loved going to the conference. So they took “just one more course” and they loved that.

Before they knew it, women were saying, “‘I could be a computer science major, I guess.’ And so they are!” says Klawe.

By the time the first four-year experiment was over the college had gone from 10% female computer science majors to 40% female. UC Berkeley, Duke, Northwestern have had some success with similar tactics."
education  gender  women  girls  programming  coding  compsci  computers  computerscience  harveymuddcollege  semantics  support  learning  mariaklawe  manoushzomorodi  2014  via:sha 
march 2014 by robertogreco
Before Minecraft or Snapchat, there was MicroMUSE – Robin Sloan – Aeon
"As kids, we make secret worlds – in trees, in our imaginations, even online – but can we go back to them when we’re grown?"



"If you explore MicroMUSE today, you’ll get a preview of the fate that awaits all of our social systems. The streets are empty, but it’s more than that: there is a palpable sense of entropy. You can query the system for a list of commands, but many of them no longer work. It’s half glitchy video game, half haunted house. Sometimes it falls offline entirely, only to return days later.

The system still speaks. You are welcomed by the transporter attendant, who gives directions to all newcomers to this space city. It cautions you: Clear communication is very important in a text-based environment…

When I logged in again after many years away – connected directly, no Gopher required, using the Terminal program on my MacBook, sleek descendant of that old Mac Plus – the first thing I did was look for Nib’s Knoll. In truth, I wasn’t sure where to begin. I had long forgotten the path through the holodeck. There were ways to teleport but, to teleport, you need to know where you’re going, and MicroMUSE wouldn’t, or couldn’t, reveal the location of my old home.

It is very likely that it no longer exists, swept away in a database purge sometime during the past 15 years. I mean, really very likely. Ninety-five percent likely.

And yet, the ghostliness of present-day MicroMUSE – the inability of the system to deliver a definitive yea or nay – leaves space for a dim hope. I wander the empty streets, and I see familiar places: structures and descriptions I remember from the mid-1990s. I remember the things I built with Hacker VII, and the feeling that followed when they actually worked. I remember the scrum of users; there would be five or six of us gathered in a room, and it would seem like a crowd, a veritable riot of life.

Hacker VII’s real name was Joe VanDeventer, and today Joe is a web developer in Chicago. Nib Noals’s real name was Robin Sloan, and today I am a writer in San Francisco.

Both of these paths were prefigured almost perfectly on MicroMUSE. All we did there – all we could do – was program and write. Build and describe. Every additional feature called for more words: words to tell a user what he or she was doing, words to show everyone else. It was a whole world made of words. It was the web before the web; it was a novel that could stand up and speak.

I don’t mean to mythologise a crusty old system; its innocence and simplicity were handicaps as much as they were virtues. But even so, I’m grateful that MicroMUSE, of all places, was my training ground. Social systems have values – arguments baked into their design. For example, Twitter’s core argument seems to be: everything should be public, and messages should find the largest audience possible. Snapchat’s might be: communication should be private and ephemeral. The video game Counter-Strike’s is almost certainly: aim for the head. Back in 1994, MicroMUSE’s core argument was: language is all you need. If you can write, it can be real.

I left the holodeck, but I never abandoned that notion.

It is, frankly, miraculous that MicroMUSE still runs at all. It’s not hosted by MIT anymore; the system has migrated to a server called MuseNet. If you can get yourself to a command prompt, you can type ‘telnet micromuse.musenet.org 4201’ and walk the empty streets yourself."
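
[If no telnet client is handy, a few lines of standard-library Python do the same job. The host and port come straight from Sloan's essay; the server may well be offline by the time you try.]

    import socket

    def visit_muse(host="micromuse.musenet.org", port=4201):
        """Connect to the MUSE and print whatever banner it still sends."""
        with socket.create_connection((host, port), timeout=10) as conn:
            print(conn.recv(4096).decode("latin-1", errors="replace"))

    visit_muse()
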
robinsloan  2014  minecraft  muse  micromuse  play  childhood  worldbuilding  imagination  computers  creativity  online  internet  degradation  disappearance  digitalartifacts 
march 2014 by robertogreco
Philip Guo - Silent Technical Privilege
"Okay that entire paragraph was a lie. Did you believe me? If so, why? Was it because I looked like a kid programming whiz?

When that photo was taken, I didn't even know how to touch-type. My parents were just like, “Quick, pose in front of our new computer!” (Look closely. My fingers aren't even in the right position.) My parents were both humanities majors, and there wasn't a single programming book in my house. In 6th grade, I tried teaching myself BASIC for a few weeks but quit because it was too hard. The only real exposure I had to programming prior to college was taking AP Computer Science in 11th grade, taught by a math teacher who had learned the material only a month before class started. Despite its shortcomings, that class inspired me to major in Computer Science in college. But when I started freshman year at MIT, I felt a bit anxious because many of my classmates actually did have over ten years of childhood programming experience; I had less than one.

SILENT TECHNICAL PRIVILEGE
Even though I didn't grow up in a tech-savvy household and couldn't code my way out of a paper bag, I had one big thing going for me: I looked like I was good at programming. Here's me during freshman year of college:



As an Asian male student at MIT, I fit society's image of a young programmer. Thus, throughout college, nobody ever said to me:

• “Well, you only got into MIT because you're an Asian boy.”

• (while struggling with a problem set) “Well, not everyone is cut out for Computer Science; have you considered majoring in bio?”

• (after being assigned to a class project team) “How about you just design the graphics while we handle the backend? It'll be easier for everyone that way.”

• “Are you sure you know how to do this?”

Although I started off as a complete novice (like everyone once was), I never faced any micro-inequities to impede my intellectual growth. Throughout college and grad school, I gradually learned more and more via classes, research, and internships, incrementally taking on harder and harder projects, and getting better and better at programming while falling deeper and deeper in love with it. Instead of doing my ten years of deliberate practice from ages 8 to 18, I did mine from ages 18 to 28. And nobody ever got in the way of my learning – not even inadvertently – because I looked like the sort of person who would be good at such things.

Instead of facing implicit bias or stereotype threat, I had the privilege of implicit endorsement. For instance, whenever I attended technical meetings, people would assume that I knew what I was doing (regardless of whether I did or not) and treat me accordingly. If I stared at someone in silence and nodded as they were talking, they would usually assume that I understood, not that I was clueless. Nobody ever talked down to me, and I always got the benefit of the doubt in technical settings.

As a result, I was able to fake it till I made it, often landing jobs whose postings required skills I hadn't yet learned but knew that I could pick up on the spot. Most of my interviews for research assistantships and summer internships were quite casual – I looked and sounded like I knew what I was doing, so people just gave me the chance to try. And after enough rounds of practice, I actually did start knowing what I was doing. As I gained experience, I was able to land more meaningful programming jobs, which led to a virtuous cycle of further improvement.

This kind of privilege that I – and other people who looked like me – possessed was silent, manifested not in what people said, but rather in what they didn't say. We had the privilege to spend enormous amounts of time developing technical expertise without anyone's interference or implicit discouragement. Sure, we worked really hard, but our efforts directly translated into skill improvements without much loss due to interpersonal friction. Because we looked the part."
programming  technology  privilege  gender  culture  compsci  computers  2014  philipguo  bias  micro-inequities  stereotypethreat 
january 2014 by robertogreco
Tools | LettError
"Once an alert designer has become familiar with the software, it is to be hoped that questions will arise which the software is incapable of solving. This can be frustrating. You think of an image or a solution that requires a specific combination of functions, and then it turns out not to exist. Or you want to repeat an action a large number of times, while the program does not offer any way of doing it automatically. The toolhorizon comes into view. Should you begin to have doubts about yourself as a designer? On the contrary. It simply means that the people who devised the program did not take your idea into account, so it is a relatively new idea. And it is no bad thing for a designer to have new ideas. All the same, good advice is a rare commodity when you run up against the limits of the tool-kit in the middle of the thinking process. Should designers slow down and adjust their ideas to what the computer can handle? As we know, to design is to make images within given limitations. But not all limitations are the same. Limitations and demands imposed by a client are easier to accept than the arbitrary limitations of your digital tools."



"The critical outsider will note that this method also has its disadvantages. After all, sometimes designing proceeds faster and more securely if nothing is left to chance, if work starts straight away as on the computer with a precision of a hundredth of a millimeter (‘exactly one cm’ is also possible). Is it really handy to generate the layout of a calendar with a program that can shift parameters endlessly? You have to write a program like that first, and that takes a lot of time. Of course not, will be the answer, the first time naturally takes more time and trouble, but that is what makes it so much fun. Design is hardly a challenge any more, but programming is. The paradox of designing like Just and Erik is that a lot of individually written tools are only efficient (in the sense of saving time) if the same sort of design is repeated a large number of times; but that is a very rare occurrence. Explorers, and that is what they are, do not want to do the same thing twice. They prefer to leave that up to ordinary designers, and studios. Which brings us to the second paradox: such designers may never get around to programming. They hope that the scripts and programs of LettError will simply be available on the internet one day. Ready to use."
design  type  computers  toolmaking  making  digitaltoolkit  onlinetoolkit  janmiddendorp  2000  via:tealtan  tools 
december 2013 by robertogreco
In Defense of Messiness: David Weinberger and the iPad Summit - EdTech Researcher - Education Week
[via: http://willrichardson.com/post/67746828029/the-limitations-of-the-ipad ]

"We were very lucky today to have David Weinberger give the opening address at our iPad Summit in Boston yesterday. We've started a tradition at the iPad Summit that our opening keynote speaker should know, basically, nothing about teaching with iPads. We don't want to lead our conversation with technology, we want to lead with big ideas about how the world is changing and how we can prepare people for that changing world.

Dave spoke, drawing on research from his most recent book, Too Big to Know: Rethinking Knowledge Now That the Facts Aren't the Facts, Experts Are Everywhere, and the Smartest Person in the Room Is the Room.

It's hard to summarize a set of complex ideas, but at the core of Dave's argument is the idea that our framing of "knowledge," the metaphysics of knowledge (pause: yes, we start our iPad Summit with discussions of the metaphysics of knowledge), is deeply intertwined with the technology we have used for centuries to collect and organize knowledge: the book. So we think of things that are known as those that are agreed upon and fixed--placed on a page that cannot be changed; we think of them as stopping places--places for chapters to end; we think of them as bounded--literally bounded in the pages of a book; we think of them as organized in a single taxonomy--because each library has to choose a single place for the physical location of each book. The limitations of atoms constrained our metaphysics of knowledge.

We then encoded knowledge into bits, and we began to discover a new metaphysics of knowledge. Knowledge is not bound, but networked. It is not agreed, but debated. It is not ordered, but messy.

A changing shape of knowledge demands that we look seriously at changes in educational practice. For many educators at the iPad Summit, the messiness that David sees as generative in the emerging shape of knowledge reflects the messiness that they see in their classrooms. As Holly Clark said in her presentation, "I used to want my administrators to drop in when my students were quiet, orderly, and working alone. See, we're learning! Now I want them to drop in when we are active, engaged, collaborative, loud, messy, and chaotic. See, we're learning!"

These linkages are exactly what we hope can happen when we start our conversations about teaching with technology by leading with our ambitions for our students rather than leading with the affordances of a device.

I want to engage David a little further on one point. When I invited David to speak, he said "I can come, but I have some real issues with iPads in education." We talked about it some, and I said, "Great, those sound like serious concerns. Air them. Help us confront them."

David warned us again this morning, "I have one curmudgeonly old man slide against iPads," and Tom Daccord (EdTechTeacher co-founder) and I both said "Great." The iPad Summit is not an Apple fanboygirl event. At the very beginning, Apple's staff, people like Paul Facteau, were very clear that iPads were never meant to be computer replacements--that some things were much better done on laptops or computers. Any educator using a technology in their classroom should be having an open conversation about the limitations of their tools.

Tom then gave some opening remarks where he said something to the effect of "The iPad is not a repository of apps, but a portable, media creation device." If you talk to most EdTechTeacher staff, we'll tell you that with an iPad, you get a camera, microphone, connection to the Internet, scratchpad, and keyboard--and a few useful apps that let you use those things. (Apparently, there are all kinds of people madly trying to shove "content" on the iPad, but we're not that interested. For the most part, they've done a terrible job.)

Dave took the podium and said in his introductory remarks, "There is one slide that I already regret." He followed up with this blog post, No More Magic Knowledge [http://www.hyperorg.com/blogger/2013/11/14/2b2k-no-more-magic-knowledge/ ]:
I gave a talk at the EdTechTeacher iPad Summit this morning, and felt compelled to throw in an Angry Old Man slide about why iPads annoy me, especially as education devices. Here's my List of Grievances:
• Apple censors apps
• iPads are designed for consumers. [This is false for these educators, however. They are using iPad apps to enable creativity.]
• They are closed systems and thus lock users in
• Apps generally don't link out
That last point was the one that meant the most in the context of the talk, since I was stressing the social obligation we all have to add to the Commons of ideas, data, knowledge, arguments, discussion, etc.
I was sorry I brought the whole thing up, though. None of the points I raised is new, and this particular audience is using iPads in creative ways, to engage students, to let them explore in depth, to create, and to make learning mobile.

I, for one, was not sorry that Dave brought these issues up. There are real issues with our ability as educators to add to the Commons through iPads. It's hard to share what you are doing inside a walled garden. In fact, one of the central motivations for the iPad Summit is to bring educators together to share their ideas and to encourage them to take that extra step to share their practice with the wider world; it pains me to think of all of the wheels being reinvented in the zillions of schools that have bought iPads. We're going to have to hack the garden walls of the iPad to bring our ideas together to the Commons.

The issue of the "closedness" of iPads is also critical. Dave went on to say that one limitation of the iPad is that you can't view source from a browser. (It's not strictly true, but it's a nuisance of a hack--see here or here.) From Dave again:

"Even though very few of us ever do peek beneath the hood -- why would we? -- the fact that we know there's an openable hood changes things. It tells us that what we see on screen, no matter how slick, is the product of human hands. And that is the first lesson I'd like students to learn about knowledge: it often looks like something that's handed to us finished and perfect, but it's always something that we built together. And it's all the cooler because of that."

I'd go further than you can't view source: there is no command line. You can't get under the hood of the operating system, either. You can't unscrew the back. Now don't get me wrong, when you want to make a video, I'm very happy to declare that you won't need to update your codecs in order to get things to compress properly. Simplicity is good in some circumstances. But we are captive to the slickness that Dave describes. Let's talk about that.

A quick tangent: Educators come up to me all the time with concerns that students can't word process on an iPad--I have pretty much zero concern about this. Kids can write papers using Swype on a smartphone with a cracked glass. Just because old people can't type on digitized keyboards doesn't mean kids can't (and you probably haven't been teaching them touch-typing anyway).

I'm not concerned that kids can't learn to write English on an iPad, I'm concerned they can't learn to write Python. If you believe that learning to code is a vital skill for young people, then the iPad is not the device for you. The block programming languages basically don't work. There is no Terminal or Putty or iPython Notebook. To teach kids to code, they need a real computer. (If someone has a robust counter-argument to that assertion, I'm all ears.) We should be very, very clear that if we are putting all of our financial eggs in the iPad basket, there are real opportunities that we are foreclosing.

Some of the issues that Dave raises we can hack around. Some we can't. The iPad Summit, all technology-based professional development, needs to be a place where we talk about what technology can't do, along with what it can.

Dave's keynote about the power of open systems reminds us that knowledge is networked and messy. Our classrooms, and the technologies we use to support learning in our classrooms, should be the same. To the extent that the technologies we choose are closed and overly-neat, we should be talking about that.

Many thanks again to Dave for a provocative morning, and many thanks to the attendees of the iPad Summit for joining in and enriching the conversation."
justinreich  ipad  2013  ipadsummit  davidweinberger  messiness  learning  constructionism  howthingswork  edtech  computers  computing  coding  python  scratch  knowledge  fluidity  flux  tools  open  closed  walledgardens  cv  teaching  pedagogy  curriculum  tomdaccord  apple  ios  closedness  viewsource  web  internet  commons  paulfacteau  schools  education  mutability  plasticity 
november 2013 by robertogreco
Identify Yourself
"At its core function, the Internet is a tool for the communication of information, whether factual or fictional. It has allowed us access to knowledge we would have otherwise never known, at a rate that we could have never achieved with printed materials. Each tool that we have developed to spread information has exponentially increased the speed at which it travels, leading to bursts of creativity and collaboration that have accelerated human development and accomplishment. The wired Internet at broadband speeds allows us to consume content so fast that any delay causes us to balk and whine. Wireless Internet made this information network portable and extended our range of knowledge beyond the boundaries of offices and libraries and into the world. Mobile devices have completely transformed our consumption of information, putting tiny computers in our pockets and letting us petition the wishing well of the infoverse.

Many people say this access has made us impatient, and I agree. But I also believe it reveals an innate hunger. We are now so dependent on access to knowledge at these rapid speeds that any lull in our consumption feels like a wasted moment. The currency of the information appears at all levels of society. From seeing new television shows to enjoying free, immediate access to new scientific publications that could impact your life’s work, this rapid transmission model has meaning and changes lives. We have access to information when we are waiting for an oil change and in line for coffee. While we can choose to consume web junk, as many often will, there is also a wealth of human understanding and opinions, academic texts, online courses, and library archives that can be accessed day and night, often for free."



While many seem to experience their Internet lives as a separate space of reality, I have always felt that the two were inextricable. I don’t go on the Internet; I am in the Internet and I am always online. I have extended myself into the machines I carry with me at all times. This space is continually shifting and I veer to adjust, applying myself to new media, continually gathering and recording data about myself, my relationships, my thoughts. I am an immaterial database of memory and hypertext, with invisible links in and out between the Internet and myself.

THE TEXT OBJECT
I would sit for as long as I could and devour information. It was not uncommon for me to devour a book in a single day, limiting all bodily movement except for page-turning, absolutely rapt by whatever I was reading. I was honored to be literate and sure that my dedication to knowledge would lead to great things. I was addicted to the consumption and processing of that information. It frustrated me that I could not read faster and process more. The form of the book provided me structured, linear access to information, with the reward for my attention being a complete and coherent story or idea.

Access to computers and the Internet completely changed the way that I consumed information and organized ideas in my head. I saw information stacked on top of itself in simultaneity, no longer confined to spatiotemporal dimensions of the book. This information was editable, and I could copy, paste, and cut text and images from one place to the next, squirreling away bits that felt important to me. I suddenly understood how much of myself I was finding through digital information."



"There is a system, and there are people within this system. I am only one of them, but I value deeply the opportunities this space grants me, and the wealth contained within it. We must fight to keep the Internet safe and open. Though it has already lost the magical freedom and democracy that existed in the days of the early web, we must continue to put our best minds to work using this extensive network of machines to aid us. Technology gives us so much, and we put so much of ourselves back into it, but we must always remember that we made the web and it will always be tied to us as humans, with our vast range of beauty and ugliness.

I only know my stories, my perspective, but it feels important to take note during this new technical Renaissance, to try and capture the spirit of this shift. I am vastly inspired by the capabilities of my tiny iPhone, my laptop, and all the software contained therein. This feeling is empowerment. The empowerment to learn, to create, and to communicate is something I’ve always felt is at the core of art-making, to be able to translate a complex idea or feeling into some contained or open form. Even the most simple or ethereal works have some form; the body, the image, the object. The file, the machine, the URL, these are all just new vessels for this spirit to be contained.

The files are beautiful, but I move to nominate the Internet as “sublime,” because when I stare into the glass precipice of my screen, I am in awe of the vastness contained within it, the micro and macro, simultaneously hard and technical and soft and human. Most importantly, it feels alive—with constant newness and deepening history, with endless activity and variety. May we keep this spirit intact and continue to explore new vessels into which we can pour ourselves, and reform our identities, shifting into a new world of Internet natives."

[Available as book: http://www.lulu.com/shop/krystal-south/identify-yourself/paperback/product-21189499.html ]
[About page: http://idyrself.com/about.html ]
internet  online  krystalsouth  howweread  howwewrite  atemporality  simultaneity  text  books  internetasliterature  reading  writing  computing  impatience  information  learning  unbook  copypasteculture  mutability  change  sharing  editing  levmanovich  computers  software  technology  sorting  files  taxonomy  instagram  flickr  tagging  folksonomy  facebook  presence  identity  web2.0  language  communication  internetasfavoritebook 
november 2013 by robertogreco
Yeah, I'm free-thinking
"It's tempting to conclude that the computer is the magical ingredient here: just add computers and children can learn anything. But if the story of Sergio Juárez Correa's fifth-grade class is any indication, the secret is the kids organizing themselves to learn."
kottke  self-organizedlearning  holeinthewall  sugatamitra  learning  unschooling  deschooling  2013  computers  edtech 
november 2013 by robertogreco
Q. & A. | Brian Eno on the Best Use of a Television, Why Art Students Make Good Pop Stars and the Meaning of 'Visual Music' - NYTimes.com
"How have computers altered the way you work?

When I first started making ambient music, I was setting up systems using synthesizers that generated pulses more or less randomly. The end result is a kind of music that continuously changes. Of course, until computers came along, all I could actually present of that work was a piece of its output. “Music for Airports,” for instance — the first track is 17 minutes out of a theoretically infinite piece of music. What I really wanted was to present people with the system so that anytime they switched this piece on they would hear a new version of it. That was very difficult to imagine until computers came along. The problem is that listening to a piece of music made by a computer is cumbersome and kind of unattractive. It wasn’t until the iPhone appeared that I thought, “O.K., now everyone has a computer in their pockets.” The apps I have done with my friend Peter Chilvers — Bloom, Trope and Scape — are attempts to explore that possibility of a generative music system that you could use the way you could have used a CD in the past."
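
[A toy version of the generative system Eno describes: every run renders a different stretch of sparse sine-tone pulses to a WAV file, so no two listenings are identical. A standard-library sketch; the scale and envelope are arbitrary choices, nothing like the actual Bloom, Trope, or Scape apps.]

    import math, random, struct, wave

    RATE = 44100
    SCALE = [220.0, 261.63, 293.66, 329.63, 392.0]   # arbitrary pentatonic-ish palette

    def tone(freq, seconds):
        """One linearly decaying sine pulse as 16-bit samples."""
        n = int(RATE * seconds)
        return [int(12000 * math.sin(2 * math.pi * freq * i / RATE) * (1 - i / n))
                for i in range(n)]

    samples = []
    for _ in range(24):                               # two dozen random pulses
        samples += [0] * int(RATE * random.uniform(0.2, 1.0))   # silence between
        samples += tone(random.choice(SCALE), random.uniform(0.5, 1.5))

    with wave.open("ambient.wav", "wb") as w:         # a new piece on every run
        w.setnchannels(1)
        w.setsampwidth(2)                             # 16-bit mono
        w.setframerate(RATE)
        w.writeframes(struct.pack(f"<{len(samples)}h", *samples))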



"Did you learn anything about yourself in the process?

Nearly all of the works that I’ve made over all the years derive from making a system. Rather than specifying a piece of work in all its details, I wanted to make things that, when you finally switched them on, started to unfold in ways that you hadn’t anticipated. I want them to keep surprising me.

What changed over the years?

There’s an obvious change in scale. A lot of the early light work started with using TV monitors. This was in the late ’70s. I started to think about video as a source of light, rather than a source of image. For me, as a longtime hater of television, this was a very good use of the medium. Then I started looking into slide projectors. I was using between 5 and 10, all projecting on the same surface, so they were all overlaying, in much the same way as different instruments in a piece of music would overlay each other. I’d been making music that was intended to be like painting, in the sense that it’s environmental, without the customary narrative and episodic quality that music normally has. I called this ambient music. But at the same time I was trying to make visual art become more like music, in that it changed the way that music changes. I think that’s what my installations are, really. They’re what the title says — visual music."



"You studied art, not music. Why do you think art schools have produced so many innovative musicians?

Art students by definition are people who are looking at how a medium works, and thinking about what you can do with a medium. They’re different from folk musicians, who in general are accepting of a tradition. That kind of slightly-outside-looking-in approach that art students brought to music meant that they were completely able to accept a lot of new possibilities, whereas music students were not interested in them at all. It’s very conspicuous that there were a lot of art students involved in pop music in the ’60s and ’70s, and very few music students.

There’s another reason for this. By the mid-’60s, recorded music was much more like painting than it was like traditional music. When you went into the studio, you could put a sound down, then you could squeeze it around, spread it all around the canvas. Once you’re working in a multitrack studio, you stop thinking of the music as performance and you start thinking of it as sound painting. After Phil Spector and George Martin and Joe Meek, this new role called the producer had started to become an important creative role. When art students really started flooding into music, it was at exactly that point where recorded music had become more like painting. So it was a natural transition for art students. They knew how to work within a medium that required continual revisiting, where the elements were mutable, could be scraped off and replaced the next day.

Much of your work seems to encourage quiet contemplation, which has spiritual undertones. Is there anything spiritual about what you do?

Your nervous system has two major sectors, the sympathetic and the parasympathetic. The first one is the fight-and-flight zone. I think most popular art is directed towards that. The other part, which is also called the rest-and-digest or breed-and-feed, is what you’re using when you relax. My theory is that what I’ve been doing is more directed at that second part. And I think that also is the part of the nervous system people are using when they say they’re having a spiritual experience. Now I want to make clear that I slightly shrink from the word “spiritual,” because I don’t like anything occultish, and I’m not religious."
brianeno  music  systems  art  systemsthinking  interviews  2013  light  video  computing  computers 
november 2013 by robertogreco
The Writer as Meme Machine: How Has the Internet Altered Poetry? : The New Yorker
"It’s not uncommon to see blogs that recount someone’s every sneeze since 2007, or of a man who shoots exactly one second of video every day and strings the clips together in time-lapsed mashups. There is guy who secretly taped all of his conversations for three years and a woman who documents every morsel of food that she puts into her mouth. While some of these people aren’t consciously framing their activities as works of art, Wershler argues that what they’re doing is so close to the practices of sixties conceptualism that the connection between the two can’t be ignored."
kennethgoldsmith  2013  internet  web  memes  conceptualism  conceptualart  lawrenceweschler  writing  computers  art 
october 2013 by robertogreco
Bat, Bean, Beam: Sixteen tales of information technology in education, 1991-2013
"1.
It was not compulsory. My father, a technician and audio engineer, belonged to an Apple Computer Users’ Group and read print publications – magazines – about computing. The resource closet adjacent to his workroom was stocked floor to ceiling with used audiocassettes, loosely classified by course code."



"4.
It was not compulsory. The new students used it differently; those who came from abroad were willing to spend their home currency on things teachers considered wasteful and expensive, like international mobile phone calls.

One student faced off a test supervisor in mutual bewilderment after he left the room to take a business call and was not allowed back in."



"8.
It was obligatory. A widespread rumour was that a colleague whose role was made redundant had been targeted because of a refusal to use email, or any technology other than the photocopier.

Another colleague brought long handwritten essays to meetings from which to read counterarguments to whatever was under discussion. There was only ever one copy available."



"12.
It was fragmentary. A student, young and perpetually dazed, came into the office to ask for weeks-old course materials, explanations of content, assignment extensions. Haven’t you read the weekly emails on what you have to do? I asked. Oh, I don’t really check my email, said the student. Too many messages."



"14.
It was breaking into bits, even while it was new.

You can give course notices on your phone.

I only use my phone for emergencies, like in the earthquake.

The hard shell of the open laptop, raised like a drawbridge to deflect, to disconnect.

I don’t want to put a comment in the learning forum because it might be wrong and then I’ll feel dumb.

Is this for homeworks, teacher, on the Internet? Will you give us a grade?"



"16.
It was breaking into bits, even while it was new.

The contact hours in the classroom and the sporadic access in between, the logs that show who has completed the readings and who is offline.

The copyright notices at the photocopier and the ghost-stacks of extracts that chafe at the ten percent limit.

The professional futurists whose utopias will not be mocked, except through the limits of budget proposals.

The noise, the compliance, the surveillance.

The light in the cracks."
edtech  meganclayton  2013  technology  education  schools  teaching  email  mobile  phones  surveillance  compliance  control  bureaucracy  professionaldevelopment  change  computing  computers  internet  web  twitter 
october 2013 by robertogreco
Computers are for people
"Markets are gonna market, and specs are gonna spec, but it often feels like companies are forgetting that computers are for people, first. And people have bodies, those bodies have limitations, and all of us have limitations in specific situations.

We're all disabled sometimes. If I turn off the lights in your room, you can't see. If I fill the room with enough noise, you can't hear. If your hands are full, you can't use them to do anything else.

But as Sara Hendren writes, "all technology is assistive technology." When it's working right, technology helps people of every ability overcome these limitations. It doesn't throw us back into the world of assumptions that expects us all to be fully capable all of the time.

That's not what good technology does. That's not what good design does. That's what assholes do.

I think often about Jason's post on one-handed computing because I'm in the story. He wrote it for his wife, and he wrote it for me. I'd badly broken my right arm in an accident, snapping my radius in half and shooting it out of my body."



"The thing that tech companies forget -- that journalists forget, that Wall Street never knew, that commenters who root for tech companies like sports fans for their teams could never formulate -- that technology is for people -- is obvious to Jason. Technology is for us. All of us. People who carry things.

People. Us. These stupid, stubborn, spectacular machines made of meat and electricity, friends and laughter, genes and dreams."

[Update: see also (via @ablerism):
"It’s a Man’s Phone: My female hands meant I couldn’t use my Google Nexus to document tear gas misuse"
https://medium.com/technology-and-society/its-a-mans-phone-a26c6bee1b69 ]
technology  timcarmody  2013  assistivetechnology  sarahendren  humans  vulnerability  ability  disability  iphone  limitations  computing  computers  accessibility  computersareforpeople  disabilities  zeyneptufekci 
october 2013 by robertogreco
Media Archaeology Lab
"Founded in 2009 and based at the University of Colorado at Boulder, the motto of the Media Archaeology Lab (MAL) is that “the past must be lived so that the present can be seen.” Nearly all digital media labs are conceived of as a place for experimental research using the most up-to-date, cutting-edge tools available. However, the MAL—which is the largest of its kind in North America—is a place for cross-disciplinary experimental research and teaching using obsolete tools, hardware, software and platforms, from the past. The MAL is propelled equally by the need to both preserve and maintain access to historically important media of all kinds – from magic lanterns, projectors, typewriters to personal computers from the 1970s through the 1990s – as well as early works of digital literature/art which were created on the outdated hardware/software housed in the lab."



"What the MAL does best is that it provides direct access to defining moments in the history of computing and e-literature. In addition to landmark computers such as the Commodore 64 from 1982, the Vectrex Gaming Console also from 1982, the Compaq III portable laptop from 1987, the NeXT Cube from 1990, the lab also houses working Apple IIe’s and a rare Apple Lisa. These last two computers are particularly important for understanding the history of personal computing and computer-mediated writing; while they were both released in 1983, the shift in interface from the one to the other, and therefore the shift in the limits and possibilities for what one could create, is remarkable. The Apple II series of computers all used the command-line interface and they were also the first affordable, user-friendly, and so most popular personal computers ever while the Apple Lisa was the first commercial computer to use a Graphical User Interface."
collaboration  computers  medialab  technology  mediaarchaeology  loriemerson  archives  archiving  archaeology  mitmedialab 
july 2013 by robertogreco
Body-Technology Interfaces | Sternlab
"Laptop Compubody Sock for privacy, warmth, and concentration in public spaces

Learn to make your own on Instructables.

See more pictures on Flickr.

Cell Phone Ski Mask

Ski Mask for Eating a Sandwich

Keyboard Interface for Computer Programming"
compubody  wearables  knitting  glvo  computing  computers  mobile  phones  typing  eating  sandwiches  privacy  warmth  gaming  craft  design  technology  via:meetar  chatroulette  wearable 
june 2013 by robertogreco
Syllabus | Technologies for Creative Learning
"This course explores how new technologies can engage people in creative learning experiences – and transform the ways we think about learning. Students will experiment with new learning technologies, discuss educational ideas underlying the technologies, analyze design strategies for creating new technologies, and examine how and what people learn as they use these technologies."

[Wayback: http://web.archive.org/web/20120808072239/http://mas714.media.mit.edu/syllabus ]
syllabus  learning  creativity  mit  constructivism  coding  children  technology  computing  computers  scratch  mindstorms  ivanillich  davidresnick  seymourpapert  mimiito  henryjenkins  barbararogoff  alfiekohn  caroldweck  mihalycsikszentmihalyi  sherryturkle  jamespaulgee  via:dianakimball  readinglists  education  teaching  programming  syllabi 
february 2013 by robertogreco
thinking / about dongles | cooper-hewitt labs
"Think of everything you’ve ever known about formal design and aesthetics multiplied by automated manufacturing and distributed openly available databases of designs (and gotchas) and then multiplied again by the steady, plodding march of technology.

And there’s the rub: The VGA dongle is made even more fascinating in that light. All VGA dongles are the same at one end. The end with the VGA adapter. The end with the weight of a black hole that the computer industry, despite all its best efforts and advances, can’t seem to escape.

In fairness we might just barely be starting to see a world beyond VGA in that fewer and fewer devices are using it as their default input standard but I suspect it will still be another five (probably ten) years before it will be unnecessary to ask whether there’s a VGA-to-whatever adapter.

And that’s the other end of the adapter. That whole other world of trying to improve or re-imagine video display. That whole other world of computers and other…"
computers  computing  history  googleartproject  storytelling  posterity  change  vga  dongles  context  museums  design  cooper-hewitt  2013  aaronstraupcope  from delicious
january 2013 by robertogreco
Douglas Rushkoff's Present Shock: The End Of Time Is Not The End Of The World - Forbes
"Narrative Collapse… In remix culture and contemporary activism, he sees the potential for us to seize the narrative frame and use them in new ways to invent innovative story forms and flexible agendas.

Digiphrenia… Knowing when to be in “the now,” and when to insulate yourself from it can help you reclaim control of your time and attention.

Overwinding… The “shock” part of future shock really comes from how much time we have “springloaded” into the present. …But we can also use this fact in more constructive ways to “springload” time into things, like the example Rushkoff cites of the fully functional “pop-up” hospital that Israel sent to Japan after the Tsunami.

Fractalnoia… Computers, operating out of human time, can in fact discern patterns in that noise, but it is up to us humans to put those patterns in the correct context.

Rushkoff suggests that young people have reacted to the loss of storytellers by realizing they have to become the storyteller."
present  future  singularity  apocalypto  context  patternrecognition  computers  computing  storytelling  linearthinking  linearity  narrativecollapse  digiphrenia  overwinding  fractalnoia  time  presentshock  2012  douglasrushkoff  linear 
december 2012 by robertogreco
Ethiopian kids hack OLPCs in 5 months with zero instruction | DVICE
"Just to give you a sense of what these villages in Ethiopia are like, the kids (and most of the adults) there have never seen a word. No books, no newspapers, no street signs, no labels on packaged foods or goods. Nothing. And these villages aren't unique in that respect; there are many of them in Africa where the literacy rate is close to zero. So you might think that if you're going to give out fancy tablet computers, it would be helpful to have someone along to show these people how to use them, right?

But that's not what OLPC did."

"Within five days, they were using 47 apps per child per day. Within two weeks, they were singing ABC songs [in English] in the village. And within five months, they had hacked Android. Some idiot in our organization or in the Media Lab had disabled the camera! And they figured out it had a camera, and they hacked Android."

[See also: http://www.technologyreview.com/news/506466/given-tablets-but-no-teachers-ethiopian-children-teach-themselves/ ]
motorolazoom  motorola  android  learning  2012  autodidacts  autodidactism  curiosity  literacy  deschooling  unschooling  education  computers  holeinthewall  ethiopia  africa  olpc  autodidacticism  from delicious
october 2012 by robertogreco
Main Page - Educate
"[M]ost of the people in Silicon Valley who are starting ed-tech companies (or funding them) don't actually know much about teaching and learning. The goal of this project is to fix that by producing a short guide to the core concepts that someone really must be familiar with to take part in a grown-up conversation about education and technology…someone who has read this guide should have a comprehensive (albeit shallow) knowledge of the educational landscape---there shouldn't be any glaring "unknown unknowns" in their understanding of the subject.

Our target is 50 topics of 1000 words each…possible to read the whole guide on a flight from NY to SF…

…we want to cover five areas:
* education in general
* educational technology
* computing education
* real-world constraints on large-scale change
* illustrative examples

Our prototypical reader is Tim, a 30-year-old programmer in Silicon Valley. He has a degree in Comp Sci from Euphoric State University, has been working for a tech…"
books  technology  computers  computing  collaboration  wiki  2012  theory  learning  amybrown  audreywatters  gregwilson  education  edtech  from delicious
august 2012 by robertogreco
Of Bears, Bats, and Bees: Making Sense of the Internet of Things | Blog | design mind
"Bears are the old guard of computation but are assimilating much of the communication attributes of IoT. Bats are an entirely new category of devices, starting off as solo beasts but slowly, haltingly, turning into an interoperable swarm. Bees on the other hand, are a fascinating flip on the entire problem, virtualizing even the computation within each device. What is clear from this exploration is that the old school capitalism of monopoly economics is not going to see us through. If every company wants to act like a bear, they win in the short run, but we all lose in the long run. We need to remember that the web is not the internet. The web tends to think in terms of winner take all systems like Facebook. The internet, on the other hand, was a fairly humble and simple means of discovery and access: the plumbing of the digital world that allowed the web, and eventually Facebook, to be built. We have to start thinking in layers. It’s perfectly fine if the very top layers are proprietary, that is not the problem. It’s when companies try to own every layer that things go wrong. We have to break up the concept of the internet of things from a proprietary play into a shared play: one where everyone can enter the playground. If we don’t get our head around this, we’ll be spending the next decade spinning from one tiny playground to the next."
internetofthings  internet  2012  capitalism  web  computing  computers  computation  winnertakeall  distributed  hived  swarms  via:Preoccupations  iot 
august 2012 by robertogreco
Fraser Speirs - Blog - The Web Kids' Kids
"The people Czserski describes are not today's pupils but the parents of today's pupils. Those who were teenagers coming of age when the consumer Internet arrived in the mid-90s are today's thirtysomethings whose five-year-olds are enrolling in your schools right now. The transition to teaching 'digital children' is long since past. In a sense, they're our missed generation. The children whose first baby photos were digital are about to enter university. We have five to seven years - maybe ten - until Czserski's Web Kids are the majority of parents. This is a trend that will never reverse itself, so we had better figure out how to meet these parents' aspirations for their children. These parents who grew up fast and online; who adopted laptops and mobile phones, then smartphones and who are now embracing iPad and Kindle. Computers aren't an afterthought for these post-digital Web Parents. They're not even a thought – they just are."
Fraser_Speirs  2012  Piotr_Czerski  technology  computers  via:Preoccupations 
july 2012 by robertogreco
The Great Pretender
"This year, the centennial of Turing's birth, we rightly celebrate Turing's life and accomplishments, the impact of which is difficult to measure sufficiently. But as we do so, we should also take a lesson from the major cultural figure whose centennial we marked last year: Marshall McLuhan. McLuhan teaches us to look beyond the content and application of inventions and discoveries in search of their structures, the logics that motivate them. For McLuhan, television was a collective nervous system pervading every sense, not a dead device for entertainment, education, or moral corruption. If we look at Alan Turing's legacy through McLuhan's lens, a pattern emerges: that of feigning, of deception and interchangeability. If we had to summarize Turing's diverse work and influence, both intentional and inadvertent, we might say he is an engineer of pretenses, as much as a philosopher of them."
Ian_Bogost  Alan_Turing  Marshall_McLuhan  2012  computers  history  networks  via:Preoccupations 
july 2012 by robertogreco
The future will be confusing. Fasten your seat belts. - Do Lectures
"Chris, a designer and computer programmer, asks how computers will change your life, and what happens when technology and genetics collide. The answers are complex and  we may not want to know them. His talk created more debate in the canteen than almost any other."
technology  change  complexity  dolectures  computers  computing  future  chrisheathcote  23&me  taste  supertasters  senses  genetics  science  alfrednorthwhitehead  from delicious
july 2012 by robertogreco
Casey A. Gollan: Notes + Links: Weeks 12, 13, and almost 14
"Nelson and Bush seem to get pretty hung up on technical (or even mechanical) hurdles rather than conceptual ones. There’s a lot of fussing about, in Bush’s case, how to shuffle microfilm around quickly, or in Nelson’s case, complicated server configurations. It reminds me of how characters in sci-fi movies park their hovercars to go use a payphone. These inventors are willing to imagine radically different worlds but can’t let go of the most banal limitations. And the things they lamented not having are no longer pipe dreams! Reading their texts in 2012, there appears to be no reason why a Memex or Xanadu can’t exist, other than that they just don’t. It seems like Nelson specficially, who I guess is still working, is too smart for his own good. Too wrapped up in the details of his obsessions. “It seemed so simple and clear to me then. It still does,” he writes, “But…I mistook a clear view for a short distance.” If perfectionism can be said to plague Nelson’s projects, it must also be acknowledged that it’s his philosophy of choice. I was shocked to read his justification for why Xanadu must be built from scratch, completely and perfectly: “Existing systems do not combine well; hooking them together creates something like the New York subway system.” … Perhaps the problems that bogged Nelson down indefinitely only reveal themselves in time, but I wonder if somebody with more distance or a less stubborn idea of the right way to build things could actually build the thing — even if it isn’t perfect. I also never realized that Bush thought a lot more about interfaces than Nelson, who basically rejected them entirely (at least as far as I’ve read): "How you will look at this world when it is spreadeagled on your screen is your own business: you control it by your choice of screen hardware, by your choice of viewing program, by what you do as you watch, but the structure of the world—the system of interconnections of its stored materials—is the same from screen to screen, no matter how a given screen may show it." … Nelson’s decoupling of backend and frontend is pretty profound. It underscores the base-ness of his ideas: he’s talking about different structures for writing and thinking, not just presenting plain old content in a style that evokes structure. There is not necessarily a visual difference between these two things but conceptually it is huge. Even if the real problems lie in data structures, I can’t help but gravitate towards the descriptive aspects and imagine tools I’d want to use. I love Nelson’s vision of computers as “a waterworks for the mind”: "Your computer screen will be the spigot—or shower nozzle—that dispenses what you need when you turn the handle. But that system must be based on the fluidity of thought—not just its crystallized and static form, which, like water’s, is hard and cold and goes nowhere.""
tednelson  vannevarbush  computers  computing  design  2012  caseygollan  literarymachines  aswemaythink 
may 2012 by robertogreco