robertogreco + privacy   483

Are.na Blog / Reimagining Privacy Online Through A Spectrum of Intimacy
“Our privacy and intimacy metaphors:

The town hall is a digital gathering place that is the most public, somewhat like Twitter. This is where we shout our thoughts or share things we don’t mind thousands of people seeing. The town hall is a public square for speaking loudly and deliberately. Your thoughts can spread virally; they will be heard, amplified and sometimes misinterpreted.

The park bench is semi-public. It’s like walking down the street and engaging in conversation with a coworker or friend, or having a discussion on the tube or in a pub. It is a space where anyone can have a conversation between two or a few people, but that conversation takes place in public. Those in the conversation can control who hears it by lowering their voice or walking to a less populated area. This setting is like Facebook: the content you put on Facebook cannot be accessed outside of Facebook, unlike Twitter, Sina Weibo, and others. This little bit of friction creates a higher level of intimacy than the town square, and the result is that it feels slightly more private. Depending upon a user’s settings, content or conversations can be accessed only by people on Facebook (quite a large number), only by a user’s friends, or only by their friends’ friends.

The next metaphor, the living room, highlights a shift into the “privacy” end of the spectrum, with the town hall and park bench being “public” entities. It’s semi-private, but can also host large groups and conversations that are designed to be public, private, or in-between. This setting allows for more intimacy because it allows for a smaller group. This design functions much like a salon or a group gathered for lively debate. The living room is a metaphor for a closed Facebook group or a WhatsApp chat group.

The loo is the most private of the intimacy metaphors, and the most intimate place for conversations and activities. This is like a private DM or a text message between one or two friends or family members. It is a space to share your thoughts. Secrets are welcomed, and comfortably kept. One can also think of this metaphor as the “bedroom,” an equally intimate space where only a few people are invited in.

**

Our metaphors will not work as a literal guidepost for solving every problem within digital conversations, but we offer these as provocations for looking at how the form and design of a space creates the affordances and functions in that space.

[image: "Install shot of the Tate museum’s Higher Resolution exhibition by Hyphen Labs and Caroline Sinders."]

To start creating solutions for online harassment, tracking, and targeting in social networks, and to create better protections for users online, communication apps and online technology need to think of privacy not just as a security protocol, but as an intimate setting—and something that is already an organic part of our lives. This privacy needs to be designed into how conversations unfold. In practice, this could mean better privacy filters to create small and large groups easily, the ability to turn off comments or replies, the ability to easily share posts or content with a handful of people, and security protocols that protect users’ data and online behavior.

We need town halls, park benches, living rooms, and loos online in every platform and every piece of technology that hosts social interactions.”
carolinesinders  are.na  harrassment  ai  intimacy  privacy  technology  small  slow  online  internet  web  socialmedia  interaction  interactiondesign  socialnetworking  socialnetworks  communication  conversation  hyphen-labs  michellecortese  andreazeller 
5 days ago by robertogreco
Renata Ávila: "The Internet of creation disappeared. Now we have the Internet of surveillance and control” | CCCB LAB
““At the start of the 21st century, one of the questions that excited me most about access to the Internet was the possibility of producing infinite copies of books and sharing knowledge. That idea of an Internet that was going to be a tool for integration and access to knowledge has shattered into smithereens. It was a booby trap. We are working as the unpaid slaves of the new digital world. I feel that it’s like when the Spanish colonisers reached Latin America. We believed the story of ‘a new world’. And we were in a box, controlled by the most powerful country in the world. We should have regulated a long time before. And we should have said: ‘I will share my photo, but how are you benefitting and how am I?’ Because what we are doing today is work for free; with our time, creativity and energy we are paying these empires. We are giving them everything”.”



“We move into the field of ethics and ask Renata Ávila about three concepts that have modified their meaning in the last decade, precisely due to the acceleration with which we have adopted technology. They are trust, privacy and transparency and how these influence the new generations. We cannot divorce these three questions from the concepts of austerity, precarity and the institutional corruption crisis”, she argues. “Letting strangers into your home to spend the night, is that an excess of trust or the need to seek resources?”.”



“After all that has been discussed, some might think that this Guatemalan activist is so realistic that she leaves no room for optimism. But Renata Ávila does not like being negative and she is convinced that the human race is capable of finding resources to emerge from any “mess”, even at the most critical moments. “We have a perfect cocktail” – she says with a half-smile of worry. “A democratic crisis caused by some terrible leaders in power, with a climate-change and technological crisis. This can only lead to a collective reflection and make us reconsider what planet we want to live on in the future”.”
renataávila  2019  internet  history  surveillance  latinamerica  knowledge  labor  work  colonization  regulation  creativity  capitalism  web  online  activism  democracy  crisis  power  politics  technology  reflection  climatechange  transparency  privacy  corruption  precarity  austerity  trust  influence 
5 weeks ago by robertogreco
Say No to the “Cashless Future” — and to Cashless Stores | American Civil Liberties Union
"It is great to see this pushback against the supposed cashless future because this is a trend that should very much be nipped in the bud. There are several reasons why cashless stores, and a cashless society more broadly, are a bad idea. Such stores are:

Bad for privacy. When you pay cash, there is no middleman; you pay, you receive goods or services — end of story. When a middleman becomes part of the transaction, that middleman often gets to learn about the transaction — and under our weak privacy laws, has a lot of leeway to use that information as it sees fit. (Cash transactions of more than $10,000 must be reported to the government, however.) More on privacy and payment systems in a follow-up post.

Bad for low-income communities. Participation in a cashless society presumes a level of financial stability and enmeshment in bureaucratic financial systems that many people simply do not possess. Opening a bank account requires an ID, which many poor and elderly people lack, as well as other documents such as a utility bill or other proof of address, which the homeless lack, and which generally create bureaucratic barriers to participating in electronic payment networks. Banks also charge fees that can be significant for people living on the economic margins. According to government data from 2017, about one in 15 U.S. households (6.5%) were “unbanked” (had no checking or savings account), while almost one in five (18.7%) were “underbanked” (had a bank account but resorted to using money orders, check cashing, or payday loans). Finally, because merchants usually pass along the cost of credit card fees to all their customers through their prices, the current credit card system effectively serves to transfer money from poor households to high-income households, according to a study by the Federal Reserve.

Bad for people of color. The burden of lack of access to banking services such as credit cards does not fall equally. While 84% of white people in 2017 were what the Federal Reserve calls “fully banked,” only 52% of Black and 63% of Hispanic people were.

Bad for the undocumented. Facing a lack of official identity documents, not to mention all the other obstacles mentioned above, undocumented immigrants can have an even harder time accessing banking services.

Bad for many merchants. Merchants pay roughly 2-3% of every transaction to the credit card companies, which can be a significant “tax,” especially on low-margin businesses. With the credit card sector dominated by an oligopoly of 2-3 companies, there is not enough competition to keep these “swipe fees” low. Big companies have the leverage to negotiate lower fees, but small merchants are out of luck, and the amount that they pay to the credit card companies is often greater than their profit. If cashless stores are allowed to become widespread, that will harm the many merchants who either discourage or flat-out refuse to accept credit cards due to these fees.

Less resilient. The nationwide outage of electronic cash registers at Target stores several weeks ago left customers unable to make purchases — except those who had cash. That’s a reminder that electronic payments systems can mean centralized points of failure — not just technical failures like Target’s, but also security failures. A cashless society would also leave people more susceptible to economic failure on an individual basis: if a hacker, bureaucratic error, or natural disaster shuts a consumer out of their account, the lack of a cash option would leave them few alternatives.

The issue goes beyond restaurants and retail stores; other services that are built around electronic payments should also offer cash options (or cash-like anonymous stored value cards). Those include ride-share services like Uber and Lyft, bike and scooter share systems, and transit systems. In San Francisco, for example, the city’s bike-share program is providing an option to pay with cash. In DC, where I live, the Metro requires a smart card to use — but riders have the option to either register their card so that they can cancel it if it’s lost or stolen, or buy it with cash and not register it to keep it more private."

...

"What to do

So what should you do if you walk into a store and are told: “your cash is no good here”?

Register your objection. Say to the staff, “I know this isn’t your policy personally, but I think it’s a bad one, and I hope you’ll pass that along to your management. Not accepting cash is bad for privacy, bad for poor people, and bad for the undocumented.”

Refuse to provide a credit card. If you haven’t been given very clear advance notice that cash is not accepted, tell them you don’t have a credit card with you and see what they propose. There’s no law that a person has to possess a credit card or furnish one on demand. This may tie up their line, require the calling of a manager, create abandoned food that has already been prepared, and generally create inefficiencies that, if repeated among enough customers, will start to erode the advantages of going cashless for merchants.

Walk out. If you can do without, leave the establishment without buying anything after registering your objection to a staff person so they are aware they’ve lost your business over it.

Understand why some stores charge fees for credit card use. If you visit a store or restaurant that charges a higher price for credit card purchases, understand that this is a socially beneficial policy and be supportive. Merchants are explicitly permitted to pass swipe fees (also known as “interchange fees”) along to customers, which among other things is fairer to low-income customers who don’t have credit cards and shouldn’t have to absorb the costs of those cards. If you are a business, consider passing along those fees to increase fairness as well as customer awareness of how the current system works.

Contact your elected representatives. We have already seen some cities and states ban cashless stores. Your state or city can do so as well.

The bottom line is that the technocratic “dream” of a cashless society is a vision in which we discard what is left of the anonymity that has characterized urban life since the dawn of modernity, and our freedom from the power of centralized companies like banks. Doing without cash may be convenient at times, but if we lose cash as an option we’re going to regret it later."
money  privacy  technology  privilege  cashless  currency  2019  aclu  inequality  resilience  bias  business  economics  policy  siliconvalley  creditcards  cash  technocracy  technosolutionism 
august 2019 by robertogreco
Digital Life Collective
"The Digital Life Collective researches, develops, funds and supports Tech We Trust: technologies that prioritize our autonomy, privacy and dignity. Our tech, not their tech.

Forming a co-operative of like-minded people is the best and perhaps only way to nurture Tech We Trust.

We are a co-operative.

Member funded. Member owned. Member controlled. Our rules are published here, and all our member rights and responsibilities can be found here.

Co-operation requires trust, and Tech We Trust requires co-operation.

Why we're here

The norm today is Tech We Don’t Trust. Every time we engage with digital technologies, we should ask – what’s actually going on here? – and the answer is too typically elusive.

We cannot wait for the invisible hand of the market to help solve this. If anything, it seems to guide many companies in the opposite direction for now. We cannot wait for governments to solve this either. In short, we need to co-operate. And that’s why we exist, why we are incorporated as a co-operative, and why we need you to join us.
JOIN US

Why are you incorporated as a UK co-op?

When we got started, we had several core team members in the UK who were willing to put in the effort and take on the responsibilities of becoming directors and founding members, and some with significant knowledge of UK Co-op structures and their legal requirements. Once we are better resourced/bigger, we anticipate having entities in multiple countries subject to our review of the respective technical and legal environment."
accessibility  collectives  equality  technology  privacy  autonomy  cooperatives 
july 2019 by robertogreco
Search Results for “ Toxic Philanthropy” – Wrench in the Gears
[from “A Skeptical Parent’s Thoughts on Digital Curriculum” (via comments here: https://larrycuban.wordpress.com/2019/07/08/goodbye-altschool-hello-altitude-learning/ )

“Toxic Philanthropy Part Three: The Silicon Valley Community Foundation”
https://wrenchinthegears.com/2019/01/04/toxic-philanthropy-part-three-the-silicon-valley-community-foundation/

“Toxic Philanthropy Part 2: Hewlett Packard Re-Engineers the Social Sector”
https://wrenchinthegears.com/2018/11/25/toxic-philanthropy-part-2-hewlett-packard-re-engineers-the-social-sector/

“Toxic Philanthropy Part 1: Surveillance”
https://wrenchinthegears.com/2018/11/18/toxic-philanthropy-part-1-surveillance/

“Philanthropy’s lesser known weapons: PRIs, MRIs and DAFs”
https://wrenchinthegears.com/2019/01/04/philanthropys-lesser-known-weapons-pris-mris-and-dafs/

“Hewlett Packard And The Pitfalls Of “Deeper Learning” In An Internet Of Things World”
https://wrenchinthegears.com/2019/07/07/hewlett-packard-and-the-pitfalls-of-deeper-learning-in-an-internet-of-things-world/

“Pay for Success Finance Preys Upon The Poor: Presentation at Left Forum 6/29/19”
https://wrenchinthegears.com/2019/06/26/pay-for-success-finance-preys-upon-the-poor-presentation-at-left-forum-6-29-19/

“Alice & Automated Poverty Management”
https://wrenchinthegears.com/2019/06/19/alice-automated-poverty-management/

“What About Alice? The United Way, Collective Impact & Libertarian “Charity””
https://wrenchinthegears.com/2019/06/09/what-about-alice-the-united-way-collective-impact-libertarian-charity/

“Home Visit Legislation: A Sales Pitch For Family Surveillance?”
https://wrenchinthegears.com/2019/02/17/home-visit-legislation-a-sales-pitch-for-family-surveillance/

“Stanley Druckenmiller and Paul Tudor Jones: The Billionaire Networks Behind Harlem’s Human Capital Lab”
https://wrenchinthegears.com/2019/01/26/stanley-druckenmiller-and-paul-tudor-jones-the-billionaire-networks-behind-harlems-human-capital-lab/

“Charter, Public Health, and Catholic Charity Interests Help Launch “Disruptive” Pay for Success Program”
https://wrenchinthegears.com/2019/01/04/charter-public-health-and-catholic-charity-interests-help-launch-disruptive-pay-for-success-program/

“When “Community Foundations” Go Global (Or Coastal)”
https://wrenchinthegears.com/2019/01/04/when-community-foundations-go-global-or-coastal/

“To Serve Man: It’s A Cookbook!”
https://wrenchinthegears.com/2019/01/04/to-serve-man-its-a-cookbook/

“Silicon Valley’s Social Impact Deal Maker”
https://wrenchinthegears.com/2019/01/04/silicon-valleys-social-impact-deal-maker/

“New Governors Pritzker and Newsom Set Up For Their ReadyNation Gold Rush”
https://wrenchinthegears.com/2018/11/11/readynation-pritzker-and-newsom-get-ready-for-the-next-gold-rush/

“Too big to map, but I tried.”
https://wrenchinthegears.com/2018/03/18/too-big-to-map-but-i-tried/

“Who Is Pulling The Muppet Strings?”
https://wrenchinthegears.com/2018/01/14/who-is-pulling-the-muppet-strings/

“When someone shows you who they are, believe them the first time.”
https://wrenchinthegears.com/2017/09/20/when-someone-shows-you-who-they-are-believe-them-the-first-time/

“Smart Cities & Social Impact Bonds: Public Education’s Hostile Takeover Part II”
https://wrenchinthegears.com/2017/07/13/smart-cities-social-impact-bonds-public-educations-hostile-takeover-part-ii/ ]
education  edtech  philanthropicindustrialcomplex  philanthropy  charterschools  charity  siliconvalley  californianideology  surveillance  schools  hewlettpackard  internetofthings  data  privacy  children  poverty  policy  unitedway  libertarianism  stanleydruckenmiller  paultudorjones  disruption  socialimpact  gavinnewsom  governance  government  readynation  smartcities  privatization  schooling  publicschools  inequality  charitableindustrialcomplex  dianeravitch 
july 2019 by robertogreco
David F. Noble: A Wrench in the Gears - 1/8 - YouTube
davidnoble  power  education  progressive  corporatism  highered  highereducation  documentary  rules  schools  schooling  deschooling  unschooling  cv  learning  howwelearn  howweteach  teaching  activism  authority  abuse  academia  resistance  canada  us  lobbying  israel  criticalthinking  capitalism  experience  life  living  hierarchy  oppression  collegiality  unions  self-respect  organizing  humanrights  corporatization  luddism  automation  technology  luddites  distancelearning  correspondencecourses  history  creditcards  privacy  criticaltheory  criticalpedagogy  attendance  grades  grading  assessment  experientialeducation  training  knowledge  self  self-directed  self-directedlearning  pedagogy  radicalpedagogy  alienation  authoritarianism  anxiety  instrinsicmotivation  motivation  parenting  relationships  love  canon  defiance  freedom  purpose  compulsory  liberation 
july 2019 by robertogreco
Schools Are Deploying Massive Digital Surveillance Systems. The Results Are Alarming - Education Week
"Sometimes students with a concern simply email themselves, with the expectation that algorithms will flag the message for adults, said Jessica Mays, an instructional technology specialist for Texas’s Temple Independent School District, another Gaggle client.

One student “opened a Google Doc, wrote down concerns about a boy in class acting strange, then typed every bad word they could think of,” Mays said. At the end of the note, the student apologized for the foul language, but wrote that they wanted to make sure the message tripped alarms.

For proponents, it’s evidence that students appreciate having new ways of seeking help.

But for Levinson-Waldman, the lawyer for the Brennan Center for Justice, it raises a bigger question.

“Are we training children from a young age to accept constant surveillance?” she asked."
privacy  surveillance  schools  2019  algorithms  policy  data  information  technology  edtech 
june 2019 by robertogreco
Fear-based social media Nextdoor, Citizen, Amazon’s Neighbors is getting more popular - Vox
"Why people are socializing more about crime even as it becomes rarer."



"These apps have become popular because of — and have aggravated — the false sense that danger is on the rise. Americans seem to think crime is getting worse, according to data from both Gallup and Pew Research Center. In fact, crime has fallen steeply in the last 25 years according to both the FBI and the Bureau of Justice Statistics.

Of course, unjustified fear, nosy neighbors, and the neighborhood watch are nothing new. But the proliferation of smart homes and smart devices is putting tools like cameras and sensors in doorbells, porches, and hallways across America.

And as with all things technology, the reporting and sharing of the information these devices gather is easier than it used to be and its reach is wider.

These apps foment fear around crime, which feeds into existing biases and racism and largely reinforces stereotypes around skin color, according to David Ewoldsen, professor of media and information at Michigan State University.

“There’s very deep research saying if we hear about or read a crime story, we’re much more likely to identify a black person than a white person [as the perpetrator],” Ewoldsen said, regardless of who actually committed the crime.

As Steven Renderos, senior campaigns director at the Center for Media Justice, put it, “These apps are not the definitive guides to crime in a neighborhood — it is merely a reflection of people’s own bias, which criminalizes people of color, the unhoused, and other marginalized communities.”

Examples abound of racism on these types of apps, usually in the form of who is identified as criminal.

A recent Motherboard article found that the majority of people posted as “suspicious” on Neighbors in a gentrified Brooklyn neighborhood were people of color.

Nextdoor has been plagued by this sort of stereotyping.

Citizen is full of comments speculating on the race of people in 9-1-1 alerts.

While being called “suspicious” isn’t of itself immediately harmful, the repercussions of that designation can be. People of color are not only more likely to be presumed criminals, they are also more likely to be arrested, abused, or killed by law enforcement, which in turn reinforces the idea that these people are criminals in the first place.

“These apps can lead to actual contact between people of color and the police, leading to arrests, incarceration and other violent interactions that build on biased policing practices by law enforcement agencies across the country,” Renderos said. “And in the digital age, as police departments shift towards ‘data-driven policing’ programs, the data generated from these interactions including 9-1-1 calls and arrests are parts of the historic crime data often used by predictive policing algorithms. So the biases baked into the decisions around who is suspicious and who is arrested for a crime end up informing future policing priorities and continuing the cycle of discrimination.”

Apps didn’t create bias or unfair policing, but they can exacerbate it

“To me, the danger with these apps is it puts the power in the hands of the individual to decide who does and doesn’t belong in a community,” Renderos said. “That increases the potential for communities of color to come in contact with police. Those types of interactions have wielded deadly results in the past.

“Look what happened to Trayvon Martin. George Zimmerman was the watchdog. He saw someone who looked out of place and decided to do something about it.”

These apps can also be psychologically detrimental to the people who use them.

It’s natural for people to want to know more about the world around them in order to decrease their uncertainty and increase their ability to cope with danger, Ewoldsen said, so people turn to these apps.

“You go on because you’re afraid and you want to feel more competent, but now you’re seeing crime you didn’t know about,” Ewoldsen said. “The long-term implication is heightened fear and less of a sense of competence. ... It’s a negative spiral.”

“Focusing on these things you’re interpreting as danger can change your perception of your overall safety,” Pamela Rutledge, director of the Media Psychology Research Center, told Recode. “Essentially you’re elevating your stress level. There’s buckets of research that talks about the dangers of stress, from high blood pressure to decreased mental health.”

These apps are particularly scary since they’re discussing crime nearby, within your neighborhood or Zip code.

“Because it’s so close, my guess is it has a bigger impact on fear,” Ewoldsen said."
fear  nextdoor  crime  citizenapp  amazonneighbors  neighborhoods  2019  surveillance  sousveillance  safety  race  racism  privacy  bias  vigilantism  news  media 
may 2019 by robertogreco
Shade
[via: https://twitter.com/shannonmattern/status/1122670547777871874

who concludes…
https://twitter.com/shannonmattern/status/1122685558688485376
"🌴Imagine what LA could do if it tied street enhancement to a comprehensive program of shade creation: widening the sidewalks, undergrounding powerlines, cutting bigger tree wells, planting leafy, drought-resistant trees, + making room for arcades, galleries, + bus shelters.🌳"]

"All you have to do is scoot across a satellite map of the Los Angeles Basin to see the tremendous shade disparity. Leafy neighborhoods are tucked in hillside canyons and built around golf courses. High modernist homes embrace the sun as it flickers through labor-intensive thickets of eucalyptus. Awnings, paseos, and mature ficus trees shade high-end shopping districts. In the oceanfront city of Santa Monica, which has a dedicated municipal tree plan and a staff of public foresters, all 302 bus stops have been outfitted with fixed steel parasols (“blue spots”) that block the sun. 9 Meanwhile, in the Los Angeles flats, there are vast gray expanses — playgrounds, parking lots, and wide roads — with almost no trees. Transit riders bake at unsheltered bus stops. The homeless take refuge in tunnels and under highway overpasses; some chain their tarps and tents to fences on Skid Row and wait out the day in the shadows of buildings across the street.

Shade is often understood as a luxury amenity, lending calm to courtyards and tree-lined boulevards, cooling and obscuring jewel boxes and glass cubes. But as deadly, hundred-degree heatwaves become commonplace, we have to learn to see shade as a civic resource that is shared by all. In the shade, overheated bodies return to equilibrium. Blood circulation improves. People think clearly. They see better. In a physiological sense, they are themselves again. For people vulnerable to heat stress and exhaustion — outdoor workers, the elderly, the homeless — that can be the difference between life and death. Shade is thus an index of inequality, a requirement for public health, and a mandate for urban planners and designers.

A few years back, Los Angeles passed sweeping revisions to the general plan meant to encourage residents to walk, bike, and take more buses and trains. But as Angelenos step out of their cars, they are discovering that many streets offer little relief from the oppressive sunshine. Not everyone has the stamina to wait out the heat at an unprotected bus stop, or the money to duck into an air-conditioned cafe. 11 When we understand shade as a public resource — a kind of infrastructure, even — we can have better discussions about how to create it and distribute it fairly.

Yet cultural values complicate the provision of shade. Los Angeles is a low-rise city whose residents prize open air and sunshine. 12 They show up at planning meetings to protest tall buildings that would block views or darken sunbathing decks, and police urge residents in high-crime neighborhoods to cut down trees that hide drug dealing and prostitution. Shade trees are designed out of parks to discourage loitering and turf wars, and designed off streets where traffic engineers demand wide lanes and high visibility. Diffuse sunlight is rare in many parts of Los Angeles. You might trace this back to a cultural obsession with shadows and spotlights, drawing a line from Hollywood noir — in which long shadows and unlit corners represent the criminal underworld — to the contemporary politics of surveillance. 13 The light reveals what hides in the dark.

When I think of Los Angeles, I picture Glendale Boulevard in Atwater Village, a streetcar suburb converted into a ten-lane automobile moonscape. People say they like this street for its wall of low-slung, pre-war storefronts, home to record stores and restaurants. To me, it’s a never-ending, vertiginous tunnel of light. I squint to avoid the glare from the white stucco walls, bare pavement, and car windows. From a climate perspective, bright surfaces are good; they absorb fewer sun rays and lessen the urban heat-island effect. But on an unshaded street they can also concentrate and intensify local sunlight."



"At one time, they did. “Shade was integral, and incorporated into the urban design of southern California up until the 1930s,” Davis said. “If you go to most of the older agricultural towns … the downtown streets were arcaded. They had the equivalent of awnings over the sidewalk.” Rancho homes had sleeping porches and shade trees, and buildings were oriented to keep their occupants cool. The original settlement of Los Angeles conformed roughly to the Law of the Indies, a royal ordinance that required streets to be laid out at a 45-degree angle, ensuring access to sun in the winter and shade in the summer. Spanish adobes were built around a central courtyard cooled by awnings and plants. 15 As the city grew, the California bungalow — a low, rectangular house, with wide eaves, inspired by British Indian hill stations — became popular with the middle class. “During the 1920s, they were actually prefabricated in factories,” Davis said. “There are tens of thousands of bungalows, particularly along the Alameda corridor … that were manufactured by Pacific Ready-Cut Homes, which advertised itself as the Henry Ford of home construction.” 16

All that changed with the advent of cheap electricity. In 1936, the Los Angeles Bureau of Power and Light completed a 266-mile high-voltage transmission line from Boulder Dam (now Hoover Dam), which could supply 70 percent of the city’s power at low cost. Southern Californians bought mass-produced housing with electric heating and air conditioning. By the end of World War II, there were nearly 4 million people living in Los Angeles County, and the new neighborhoods were organized around driveways and parking lots. Parts of the city, Davis said, became “virtually treeless deserts.”"



"It’s easy to see how this hostile design reflected the values of the peak automobile era, but there is more going on here. The destruction of urban refuge was part of a long-term strategy to discourage gay cruising, drug use, and other “shady” activities downtown. In 1964, business owners sponsored another redesign that was intended, in the hyperbolic words of the Los Angeles Times, to finally clear out the “deviates and criminals.” The city removed the perimeter benches and culled even more palms and shade trees, so that office workers and shoppers could move through the park without being “accosted by derelicts and ‘bums.’” Sunlight was weaponized. “Before long, pedestrians will be walking through, instead of avoiding, Pershing Square,” the Times declared. “And that is why parks are built.” 19"



"High-concept architecture is one way to transform the shadescape of Los Angeles. Street trees are another. Unfortunately, the city’s most ubiquitous tree — the iconic Washingtonia robusta, or Mexican fan palm — is about as useful in that respect as a telephone pole.

Palm trees have been identified with southern California since 1893, when Canary Island date palms — the fatter, stouter cousins — were displayed at the Chicago World’s Fair. On the trunk of one of those palms, boosters posted the daily temperatures at a San Diego beach, and the tree itself came to stand for “sunshine and soft air.” In his indispensable history, Trees in Paradise, Jared Farmer traces the palm’s transformation from a symbol of a healthy climate to a symbol of glamour, via its association with Hollywood. 26

Despite that early fame, palm trees did not really take over Los Angeles until the 1930s, when a citywide program set tens of thousands of palms along new or recently expanded roads. They were the ideal tree for an automobile landscape. Hardy, cheap, and able to grow anywhere, palm trees are basically weeds. Their shallow roots curl up into a ball, so they can be plugged into small pavement cuts without entangling underground sewer and water mains or buckling sidewalks. As Farmer puts it, palms are “symbiotic infrastructure,” beautifying the city without making a mess. Plus, as Mary Pickford once pointed out, the slender trunks don’t block the view of storefronts, which makes them ideal for window-shopping from the driver’s seat. The city’s first forester, L. Glenn Hall, planted more than 25,000 palm trees in 1931 alone. 27

Hall’s vision, though, was more ambitious than that. He planned to landscape all of Los Angeles’s roads with 1.2 million street trees. Tall palms, like Washingtonia robusta, would go on major thoroughfares, and side streets would be lined with elm, pine, red maple, liquidambar, ash, and sycamore. A Depression-era stimulus package provided enough funds to employ 400 men for six months. But the forestry department put the burden of watering and maintenance on property owners, and soon it charged for cutting new tree wells, too. Owners weren’t interested. So Hall concentrated his efforts on the 28 major boulevards that would serve the 1932 Olympics — including the now-iconic Ventura, Wilshire, Figueroa, Vermont, Western, and Crenshaw — and committed the city to pay for five years of tree maintenance. That may well have bankrupted the tree planting program, and before long the city was urging property owners to take on all costs, including the trees themselves.

This history partly explains the shade disparity in Los Angeles today. Consider the physical dimensions of a major city street in Hall’s time. Between the expanding road and narrowing sidewalks was an open strip of grass, three to ten feet wide, known as the parkway. Having rejected a comprehensive parks system, Los Angeles relied on these roadside strips to plant its urban forest, but over time the parkways were diminished by various agencies in the name of civic improvements — chiefly, road widening. 29 And the stewardship of these spaces was always ambiguous. The parkways are public land, owned and regulated by the … [more]
losangeles  trees  shade  history  palmtrees  urbanplanning  electricity  inequality  2019  sambloch  mikedavis  urban  urbanism  cars  transportation  disparity  streets  values  culture  pedestrians  walking  heat  light  socal  california  design  landscape  wealth  sidewalks  publictransit  transit  privacy  reynerbanham  surveillance  sun  sunshine  climatechange  sustainability  energy  ericgarcetti  antoniovillaraigosa  environment  realestate  law  legal  cities  civics 
april 2019 by robertogreco
San Francisco; or, How to Destroy a City | Public Books
"As New York City and Greater Washington, DC, prepared for the arrival of Amazon’s new secondary headquarters, Torontonians opened a section of their waterfront to Alphabet’s Sidewalk Labs, which plans to prototype a new neighborhood “from the internet up.” Fervent resistance arose in all three locations, particularly as citizens and even some elected officials discovered that many of the terms of these public-private partnerships were hashed out in closed-door deals, secreted by nondisclosure agreements. Critics raised questions about the generous tax incentives and other subsidies granted to these multibillion-dollar corporations, their plans for data privacy and digital governance, what kind of jobs they’d create and housing they’d provide, and how their arrival could impact local infrastructures, economies, and cultures. While such questioning led Amazon to cancel their plans for Long Island City in mid-February, other initiatives press forward. What does it mean when Silicon Valley—a geographic region that’s become shorthand for an integrated ideology and management style usually equated with libertarian techno-utopianism—serves as landlord, utility provider, urban developer, (unelected) city official, and employer, all rolled into one?1

We can look to Alphabet’s and Amazon’s home cities for clues. Both the San Francisco Bay Area and Seattle have been dramatically remade by their local tech powerhouses: Amazon and Microsoft in Seattle; and Google, Facebook, and Apple (along with countless other firms) around the Bay. As Jennifer Light, Louise Mozingo, Margaret O’Mara, and Fred Turner have demonstrated, technology companies have been reprogramming urban and suburban landscapes for decades.2 And “company towns” have long sprung up around mills, mines, and factories.3 But over the past few years, as development has boomed and income inequality has dramatically increased in the Bay Area, we’ve witnessed the arrival of several new books reflecting on the region’s transformation.

These titles, while focusing on the Bay, offer lessons to New York, DC, Toronto, and the countless other cities around the globe hoping to spur growth and economic development by hosting and ingesting tech—by fostering the growth of technology companies, boosting STEM education, and integrating new sensors and screens into their streetscapes and city halls. For years, other municipalities, fashioning themselves as “the Silicon Valley of [elsewhere],” have sought to reverse-engineer the Bay’s blueprint for success. As we’ll see, that blueprint, drafted to optimize the habits and habitats of a privileged few, commonly elides the material needs of marginalized populations and fragile ecosystems. It prioritizes efficiency and growth over the maintenance of community and the messiness of public life. Yet perhaps we can still redraw those plans, modeling cities that aren’t only made by powerbrokers, and that thrive when they prioritize the stewardship of civic resources over the relentless pursuit of innovation and growth."



"We must also recognize the ferment and diversity inherent in Bay Area urban historiography, even in the chronicles of its large-scale development projects. Isenberg reminds us that even within the institutions and companies responsible for redevelopment, which are often vilified for exacerbating urban ills, we find pockets of heterogeneity and progressivism. Isenberg seeks to supplement the dominant East Coast narratives, which tend to frame urban renewal as a battle between development and preservation.

In surveying a variety of Bay Area projects, from Ghirardelli Square to The Sea Ranch to the Transamerica Pyramid, Isenberg shifts our attention from star architects and planners to less prominent, but no less important, contributors in allied design fields: architectural illustration, model-making, publicity, journalism, property management, retail planning, the arts, and activism. “People who are elsewhere peripheral and invisible in the history of urban design are,” in her book, “networked through the center”; they play critical roles in shaping not only the urban landscape, but also the discourses and processes through which that landscape takes shape.

For instance, debates over public art in Ghirardelli Square—particularly Ruth Asawa’s mermaid sculpture, which featured breastfeeding lesbian mermaids—“provoked debates about gender, sexuality, and the role of urban open space in San Francisco.” Property manager Caree Rose, who worked alongside her husband, Stuart, coordinated with designers to master-plan the Square, acknowledging that retail, restaurants, and parking are also vital ingredients of successful public space. Publicist Marion Conrad and graphic designer Bobbie Stauffacher were key members of many San Francisco design teams, including that for The Sea Ranch community, in Sonoma County. Illustrators and model-makers, many of them women, created objects that mediated design concepts for clients and typically sat at the center of public debates.

These creative collaborators “had the capacity to swing urban design decisions, structure competition for land, and generally set in motion the fate of neighborhoods.” We see the rhetorical power of diverse visualization strategies reflected across these four books, too: Solnit’s offers dozens of photographs, by Susan Schwartzenberg—of renovations, construction sites, protests, dot-com workplaces, SRO hotels, artists’ studios—while Walker’s dense text is supplemented with charts, graphs, and clinical maps. McClelland’s book, with its relatively large typeface and extra-wide leading, makes space for his interviewees’ words to resonate, while Isenberg generously illustrates her pages with archival photos, plans, and design renderings, many reproduced in evocative technicolor.

By decentering the star designer and master planner, Isenberg reframes urban (re)development as a collaborative enterprise involving participants with diverse identities, skills, and values. And in elevating the work of “allied” practitioners, Isenberg also aims to shift the focus from design to land: public awareness of land ownership and commitment to responsible public land stewardship. She introduces us to several mid-century alternative publications—weekly newspapers, Black periodicals, activists’ manuals, and books that never made it to the best-seller list … or never even made it to press—that advocated for a focus on land ownership and politics. Yet the discursive power of Jacobs and Caro, which framed the debate in terms of urban development vs. preservation, pushed these other texts off the shelf—and, along with them, the “moral questions of land stewardship” they highlighted.

These alternative tales and supporting casts serve as reminders that the modern city need not succumb to Haussmannization or Moses-ification or, now, Googlization. Mid-century urban development wasn’t necessarily the monolithic, patriarchal, hegemonic force we imagined it to be—a realization that should steel us to expect more and better of our contemporary city-building projects. Today, New York, Washington, DC, and Toronto—and other cities around the world—are being reshaped not only by architects, planners, and municipal administrators, but also by technologists, programmers, data scientists, “user experience” experts and logistics engineers. These are urbanism’s new “allied” professions, and their work deals not only with land and buildings, but also, increasingly, with data and algorithms.

Some critics have argued that the real reason behind Amazon’s nationwide HQ2 search was to gather data from hundreds of cities—both quantitative and qualitative data that “could guide it in its expansion of the physical footprint, in the kinds of services it rolls out next, and in future negotiations and lobbying with states and municipalities.”5 This “trove of information” could ultimately be much more valuable than all those tax incentives and grants. If this is the future of urban development, our city officials and citizens must attend to the ownership and stewardship not only of their public land, but also of their public data. The mismanagement of either could—to paraphrase our four books’ titles—elongate the dark shadows cast by growing inequality, abet the siege of exploitation and displacement, “hollow out” our already homogenizing neighborhoods, and expedite the departure of an already “gone” city.

As Beat poet Lawrence Ferlinghetti muses in his “Pictures of the Gone World 11,” which inspired Walker’s title: “The world is a beautiful place / to be born into / if you don’t mind some people dying / all the time / or maybe only starving / some of the time / which isn’t half so bad / if it isn’t you.” This is precisely the sort of solipsism and stratification that tech-libertarianism and capitalist development promotes—and that responsible planning, design, and public stewardship must prevent."
cities  shannonmattern  2019  sanfrancisco  siliconvalley  nyc  washingtondc  seattle  amazon  google  apple  facebook  technology  inequality  governance  libertarianism  urban  urbanism  microsoft  jenniferlight  louisemozingo  margareto'mara  fredturner  efficiency  growth  marginalization  publicgood  civics  innovation  rebeccasolnit  gentrification  privatization  homogenization  susanschwartzenberg  carymcclelland  economics  policy  politics  richardwalker  bayarea  lisonisenberg  janejacobs  robertmoses  diversity  society  inclusivity  inclusion  exclusion  counterculture  cybercultue  culture  progressive  progressivism  wealth  corporatism  labor  alexkaufman  imperialism  colonization  californianideology  california  neoliberalism  privacy  technosolutionism  urbanization  socialjustice  environment  history  historiography  redevelopment  urbanplanning  design  activism  landscape  ruthasawa  gender  sexuality  openspace  publicspace  searanch  toronto  larenceferlinghetti  susanschartzenberg  bobbiestauffacher  careerose  stuartrose  ghirardellisqure  marionconrad  illustration  a 
march 2019 by robertogreco
My Gender Is: Mind Your Business - them.
"My privacy matters more to me than being seen 'correctly' in a space inhospitable to empathy, where nonbinary people are already subject to abuse and violence on a daily basis."



"I’m entitled to my privacy as much as I am my identity. I want to be respected, not known. I want to live in a world where private knowledge is a privilege, not a right. I’m in no rush to define myself. I don’t even think that is possible; that I could be so sure of who I am I could write it all down. The road to proving personhood is a harsh one, I know. An unfriendly reader is already listing ways my explanation is incomplete and my reasoning faulty. That my being genderqueer requires an explanation and description to be believed. In that case, think of it as “and then this is true, too.” That’s what I am, that feeling. That is what my gender is."




"I hold space in the possibilities of who I am as a person, as a role, and I do this without needing the participation or validation of other people. To be known — really known — is a practice, not an event. It is a gift, but not a requirement, for personhood."

[via:
https://twitter.com/hautepop/status/1100885383544475648

"“I’m entitled to my privacy as much as I am my identity. I want to be respected, not known. I’m in no rush to define myself. I don’t even think that is possible.”

My Gender Is: Mind Your Business - by ⁦@arabellesicardi⁩
https://www.them.us/story/my-gender-is-mind-your-business

The one thing woker than pronoun stickers at your conference is structuring matters such that they aren’t compulsory.

This other @them essay about Kondoing one’s way to a gender that sparks joy is also a lovely way of talking about the time & space that this unfurling process may take.

Give people that liminal space.
https://www.them.us/story/marie-kondo-gender "
gender  identity  privacy  arabellesicardi  2018  personhood 
march 2019 by robertogreco
The Stories We Were Told about Education Technology (2018)
"It’s been quite a year for education news, not that you’d know that by listening to much of the ed-tech industry (press). Subsidized by the Chan Zuckerberg Initiative, some publications have repeatedly run overtly and covertly sponsored articles that hawk the future of learning as “personalized,” as focused on “the whole child.” Some of these attempt to stretch a contemporary high-tech vision of social emotional surveillance so it can map onto a strange vision of progressive education, overlooking no doubt how the history of progressive education has so often been intertwined with race science and eugenics.

Meanwhile this year, immigrant, refugee children at the United States border were separated from their parents and kept in cages, deprived of legal counsel, deprived of access to education, deprived in some cases of water.

“Whole child” and cages – it’s hardly the only jarring juxtaposition I could point to.

2018 was another year of #MeToo, when revelations about sexual assault and sexual harassment shook almost every section of society – the media and the tech industries, unsurprisingly, but the education sector as well – higher ed, K–12, and non-profits alike, as well as school sports, all saw major and devastating reports about cultures and patterns of sexual violence. These behaviors were, once again, part of the hearings and debates about a Supreme Court Justice nominee – a sickening deja vu not only for those of us that remember Anita Hill’s testimony decades ago but for those of us who have experienced something similar at the hands of powerful people. And on and on and on.

And yet the education/technology industry (press) kept up with its rosy repetition that social equality is surely its priority, a product feature even – that VR, for example, a technology it has for so long promised is “on the horizon,” is poised to help everyone, particularly teachers and students, become more empathetic. Meanwhile, the founder of Oculus Rift is now selling surveillance technology for a virtual border wall between the US and Mexico.

2018 was a year in which public school teachers all over the US rose up in protest over pay, working conditions, and funding, striking in red states like West Virginia, Kentucky, and Oklahoma despite an anti-union ruling by the Supreme Court.

And yet the education/technology industry (press) was wowed by teacher influencers and teacher PD on Instagram, touting the promise for more income via a side-hustle like tutoring rather than by structural or institutional agitation. Don’t worry, teachers. Robots won’t replace you, the press repeatedly said. Unsaid: robots will just de-professionalize, outsource, or privatize the work. Or, as the AI makers like to say, robots will make us all work harder (and no doubt, with no unions, cheaper).

2018 was a year of ongoing and increased hate speech and bullying – racism and anti-Semitism – on campuses and online.

And yet the education/technology industry (press) still maintained that blockchain would surely revolutionize the transcript and help ensure that no one lies about who they are or what they know. Blockchain would enhance “smart spending” and teach financial literacy, the ed-tech industry (press) insisted, never once mentioning the deep entanglements between anti-Semitism and the alt-right and blockchain (specifically Bitcoin) backers.

2018 was a year in which hate and misinformation, magnified and spread by technology giants, continued to plague the world. Their algorithmic recommendation engines peddled conspiracy theories (to kids, to teens, to adults). “YouTube, the Great Radicalizer” as sociologist Zeynep Tufekci put it in a NYT op-ed.

And yet the education/technology industry (press) still talked about YouTube as the future of education, cheerfully highlighting (that is, spreading) its viral bullshit. Folks still retyped the press releases Google issued and retyped the press releases Facebook issued, lauding these companies’ (and their founders’) efforts to reshape the curriculum and reshape the classroom.

This is the ninth year that I’ve reviewed the stories we’re being told about education technology. Typically, this has been a ten (or more) part series. But I just can’t do it any more. Some people think it’s hilarious that I’m ed-tech’s Cassandra, but it’s not funny at all. It’s depressing, and it’s painful. And no one fucking listens.

If I look back at what I’ve written in previous years, I feel like I’ve already covered everything I could say about 2018. Hell, I’ve already written about the whole notion of the “zombie idea” in ed-tech – that bad ideas never seem to go away, that they just get rebranded and repackaged. I’ve written about misinformation and ed-tech (and ed-tech as misinformation). I’ve written about the innovation gospel that makes people pitch dangerously bad ideas like “Uber for education” or “Alexa for babysitting.” I’ve written about the tech industry’s attempts to reshape the school system as its personal job training provider. I’ve written about the promise to “rethink the transcript” and to “revolutionize credentialing.” I’ve written about outsourcing and online education. I’ve written about coding bootcamps as the “new” for-profit higher ed, with all the exploitation that entails. I’ve written about the dangers of data collection and data analysis, about the loss of privacy and the lack of security.

And yet here we are, with Mark Zuckerberg – education philanthropist and investor – blinking before Congress, promising that AI will fix everything, while the biased algorithms keep churning out bias, while the education/technology industry (press) continues to be so blinded by “disruption” it doesn’t notice (or care) what’s happened to desegregation, and with so many data breaches and privacy gaffes that they barely make headlines anymore.

Folks. I’m done.

I’m also writing a book, and frankly that’s where my time and energy is going.

There is some delicious irony, I suppose, in the fact that there isn’t much that’s interesting or “innovative” to talk about in ed-tech, particularly since industry folks want to sell us on the story that tech is moving faster than it’s ever moved before, so fast in fact that the ol’ factory model school system simply cannot keep up.

I’ve always considered these year-in-review articles to be mini-histories of sorts – history of the very, very recent past. Now, instead, I plan to spend my time taking a longer, deeper look at the history of education technology, with particular attention for the next few months, as the title of my book suggests, to teaching machines – to the promises that machines will augment, automate, standardize, and individualize instruction. My focus is on the teaching machines of the mid-twentieth century, but clearly there are echoes – echoes of behaviorism and personalization, namely – still today.

In his 1954 book La Technique (published in English a decade later as The Technological Society), the sociologist Jacques Ellul observes how education had become oriented towards creating technicians, less interested in intellectual development than in personality development – a new “psychopedagogy” that he links to Maria Montessori. “The human brain must be made to conform to the much more advanced brain of the machine,” Ellul writes. “And education will no longer be an unpredictable and exciting adventure in human enlightenment, but an exercise in conformity and apprenticeship to whatever gadgetry is useful in a technical world.” I believe today we call this “social emotional learning” and once again (and so insistently by the ed-tech press and its billionaire backers), Montessori’s name is invoked as the key to preparing students for their place in the technological society.

Despite scant evidence in support of the psychopedagogies of mindsets, mindfulness, wellness, and grit, the ed-tech industry (press) markets these as solutions to racial and gender inequality (among other things), as the psychotechnologies of personalization are now increasingly intertwined not just with surveillance and with behavioral data analytics, but with genomics as well. “Why Progressives Should Embrace the Genetics of Education,” a NYT op-ed piece argued in July, perhaps forgetting that education’s progressives (including Montessori) have been down this path before.

This is the only good grit:

[image of Gritty]

If I were writing a lengthier series on the year in ed-tech, I’d spend much more time talking about the promises made about personalization and social emotional learning. I’ll just note here that the most important “innovator” in this area this year (other than Gritty) was surely the e-cigarette maker Juul, which offered a mindfulness curriculum to schools – offered them the curriculum and $20,000, that is – to talk about vaping. “‘The message: Our thoughts are powerful and can set action in motion,’ the lesson plan states.”

The most important event in ed-tech this year might have occurred on February 14, when a gunman opened fire on his former classmates at Marjory Stoneman Douglas High School in Parkland, Florida, killing 17 students and staff and injuring 17 others. (I chose this particular school shooting because of the student activism it unleashed.)

Oh, I know, I know – school shootings and school security aren’t ed-tech, ed-tech evangelists have long tried to insist, an argument I’ve heard far too often. But this year – the worst year on record for school shootings (according to some calculations) – I think that argument started to shift a bit. Perhaps because there’s clearly a lot of money to be made in selling schools “security” products and services: shooting simulation software, facial recognition technology, metal detectors, cameras, social media surveillance software, panic buttons, clear backpacks, bulletproof backpacks, … [more]
audreywatters  education  technology  edtech  2018  surveillance  privacy  personalization  progressive  schools  quantification  gamification  wholechild  montessori  mariamontessori  eugenics  psychology  siliconvalley  history  venturecapital  highereducation  highered  guns  gunviolence  children  youth  teens  shootings  money  influence  policy  politics  society  economics  capitalism  mindfulness  juul  marketing  gritty  innovation  genetics  psychotechnologies  gender  race  racism  sexism  research  socialemotional  psychopedagogy  pedagogy  teaching  howweteach  learning  howwelearn  teachingmachines  nonprofits  nonprofit  media  journalism  access  donaldtrump  bias  algorithms  facebook  amazon  disruption  data  bigdata  security  jacquesellul  sociology  activism  sel  socialemotionallearning 
december 2018 by robertogreco
🅃🄸🄼 on Twitter: "1/ I grew up in the service industry. Great products and great service are the same."
1/ I grew up in the service industry. Great products and great service are the same.

2/ Know your audience: there’s a difference between a Michelin Star restaurant and greasy spoon. You would rightfully be annoyed if someone came and folded your napkin between slices of pizza. You build a restaurant for your customers, not for yourself.

3/ You learn how to listen to customers. If you ask “How is everything?” no one ever says things were terrible—and if they do they are probably taking out something else in their lives on you. *How* they said “everything is fine” is what matters.

4/ If a restaurant has perfect food, perfect service, perfect decor—it becomes perfectly forgettable. People expect to pay for an experience not just with their wallets but with their own effort. The lines, the waits make everything worth it. Effortless=forgettable.

5/ Don’t talk shop in front of house. Customers don’t care that a server missed their shift or that the cook is in a bad mood today. Customers literally don’t want to know how the sausage is made—they just want to eat it.

6/ Finally, churn matters. There’s only so many people who will try you once, let alone come back. If no one comes back, you’re done.

[See also: "The Internet Needs More Friction: Tech companies’ obsession with moving data across the internet as fast as possible has made it less safe."
https://motherboard.vice.com/en_us/article/3k9q33/the-internet-needs-more-friction ]

[See also:
https://twitter.com/hypervisible/status/1073649771905204224

Stifling your cough so "smart" devices don't report that you are sickly and thus unemployable is now part of the nightmarish (near) future. https://cacm.acm.org/news/233329-smarter-voice-assistants-recognize-your-favorite-brandsand-health/fulltext

[image with starred part highlighted: "Yet the new sound detection capabilities also offer the potential for controversy, as the speakers now collect low-level health data. Snoring and yawning a lot, for instance, could be signs of obstructive sleep apnea, so leaked data might impact somebody's health insurance, or even car insurance rates. **A lot of coughing and sneezing might impact employability, too, if somebody seems too sickly too often.**"]

"[Smart speaker] users express few privacy concerns, but their rationalizations indicate an incomplete understanding of privacy risks, a complicated trust relationship with speaker companies, and a reliance on the socio-technical context in which smart speakers reside."

Here's the link to that study on smart speakers if you want it: https://dl.acm.org/citation.cfm?id=3274371

TFW you realize that Black Mirror is actually too optimistic.

[image with starred part highlighted: "Mitchell says **Audio Analytic is pursuing a number of avenues for its technology, such as designing drink cans so that when opened, they make different, distinctive kinds of sounds that precisely identify the drink "and so drive some kind of interaction."** However, the drink does not have to be identified; simply knowing you're drinking from a can could be valuable, says Mitchell, and might spark a verbal request from the smart speaker to recycle the can when you're finished."]

Tech bros' obsession w/ eliminating "friction" is really just trying to eliminate the messiness of dealing with humans w/ the messiness of interacting with machines, which they can better monetize. Opening a can will initiate an interaction? FFS. 🤦🏿‍♂️"]
friction  technology  surveillance  timfrietas  effort  memory  experience  2018  educationmetaphors  education  seamlessness  effortlessness  forgettability  blackmirror  chrisgilliard  insurance  service  restaurants  smartdevices  internetofthings  internetofshit  health  healthinsurance  employment  illness  audioanalytic  privacy 
december 2018 by robertogreco
Surveillance Kills Freedom By Killing Experimentation | WIRED
"In my book Data and Goliath, I write about the value of privacy. I talk about how it is essential for political liberty and justice, and for commercial fairness and equality. I talk about how it increases personal freedom and individual autonomy, and how the lack of it makes us all less secure. But this is probably the most important argument as to why society as a whole must protect privacy: it allows society to progress.

We know that surveillance has a chilling effect on freedom. People change their behavior when they live their lives under surveillance. They are less likely to speak freely and act individually. They self-censor. They become conformist. This is obviously true for government surveillance, but is true for corporate surveillance as well. We simply aren’t as willing to be our individual selves when others are watching.

Let’s take an example: hearing that parents and children are being separated as they cross the U.S. border, you want to learn more. You visit the website of an international immigrants’ rights group, a fact that is available to the government through mass internet surveillance. You sign up for the group’s mailing list, another fact that is potentially available to the government. The group then calls or emails to invite you to a local meeting. Same. Your license plates can be collected as you drive to the meeting; your face can be scanned and identified as you walk into and out of the meeting. If instead of visiting the website you visit the group’s Facebook page, Facebook knows that you did and that feeds into its profile of you, available to advertisers and political activists alike. Ditto if you like their page, share a link with your friends, or just post about the issue.

Maybe you are an immigrant yourself, documented or not. Or maybe some of your family is. Or maybe you have friends or coworkers who are. How likely are you to get involved if you know that your interest and concern can be gathered and used by government and corporate actors? What if the issue you are interested in is pro- or anti-gun control, anti-police violence or in support of the police? Does that make a difference?

Maybe the issue doesn’t matter, and you would never be afraid to be identified and tracked based on your political or social interests. But even if you are so fearless, you probably know someone who has more to lose, and thus more to fear, from their personal, sexual, or political beliefs being exposed.

This isn’t just hypothetical. In the months and years after the 9/11 terrorist attacks, many of us censored what we spoke about on social media or what we searched on the internet. We know from a 2013 PEN study that writers in the United States self-censored their browsing habits out of fear the government was watching. And this isn’t exclusively an American event; internet self-censorship is prevalent across the globe, China being a prime example.

Ultimately, this fear stagnates society in two ways. The first is that the presence of surveillance means society cannot experiment with new things without fear of reprisal, and that means those experiments—if found to be inoffensive or even essential to society—cannot slowly become commonplace, moral, and then legal. If surveillance nips that process in the bud, change never happens. All social progress—from ending slavery to fighting for women’s rights—began as ideas that were, quite literally, dangerous to assert. Yet without the ability to safely develop, discuss, and eventually act on those assertions, our society would not have been able to further its democratic values in the way that it has.

Consider the decades-long fight for gay rights around the world. Within our lifetimes we have made enormous strides to combat homophobia and increase acceptance of queer folks’ right to marry. Queer relationships slowly progressed from being viewed as immoral and illegal, to being viewed as somewhat moral and tolerated, to finally being accepted as moral and legal.

In the end it was the public nature of those activities that eventually slayed the bigoted beast, but the ability to act in private was essential in the beginning for the early experimentation, community building, and organizing.

Marijuana legalization is going through the same process: it’s currently sitting between somewhat moral, and—depending on the state or country in question—tolerated and legal. But, again, for this to have happened, someone decades ago had to try pot and realize that it wasn’t really harmful, either to themselves or to those around them. Then it had to become a counterculture, and finally a social and political movement. If pervasive surveillance meant that those early pot smokers would have been arrested for doing something illegal, the movement would have been squashed before inception. Of course the story is more complicated than that, but the ability for members of society to privately smoke weed was essential for putting it on the path to legalization.

We don’t yet know which subversive ideas and illegal acts of today will become political causes and positive social change tomorrow, but they’re around. And they require privacy to germinate. Take away that privacy, and we’ll have a much harder time breaking down our inherited moral assumptions.

The second way surveillance hurts our democratic values is that it encourages society to make more things illegal. Consider the things you do—the different things each of us does—that portions of society find immoral. Not just recreational drugs and gay sex, but gambling, dancing, public displays of affection. All of us do things that are deemed immoral by some groups, but are not illegal because they don’t harm anyone. But it’s important that these things can be done out of the disapproving gaze of those who would otherwise rally against such practices.

If there is no privacy, there will be pressure to change. Some people will recognize that their morality isn’t necessarily the morality of everyone—and that that’s okay. But others will start demanding legislative change, or using less legal and more violent means, to force others to match their idea of morality.

It’s easy to imagine the more conservative (in the small-c sense, not in the sense of the named political party) among us getting enough power to make illegal what they would otherwise be forced to witness. In this way, privacy helps protect the rights of the minority from the tyranny of the majority.

This is how we got Prohibition in the 1920s, and if we had had today’s surveillance capabilities in the 1920s it would have been far more effectively enforced. Recipes for making your own spirits would have been much harder to distribute. Speakeasies would have been impossible to keep secret. The criminal trade in illegal alcohol would also have been more effectively suppressed. There would have been less discussion about the harms of Prohibition, less “what if we didn’t…” thinking. Political organizing might have been difficult. In that world, the law might have stuck to this day.

China serves as a cautionary tale. The country has long been a world leader in the ubiquitous surveillance of its citizens, with the goal not of crime prevention but of social control. They are about to further enhance their system, giving every citizen a “social credit” rating. The details are yet unclear, but the general concept is that people will be rated based on their activities, both online and off. Their political comments, their friends and associates, and everything else will be assessed and scored. Those who are conforming, obedient, and apolitical will be given high scores. People without those scores will be denied privileges like access to certain schools and foreign travel. If the program is half as far-reaching as early reports indicate, the subsequent pressure to conform will be enormous. This social surveillance system is precisely the sort of surveillance designed to maintain the status quo.

For social norms to change, people need to deviate from these inherited norms. People need the space to try alternate ways of living without risking arrest or social ostracization. People need to be able to read critiques of those norms without anyone’s knowledge, discuss them without their opinions being recorded, and write about their experiences without their names attached to their words. People need to be able to do things that others find distasteful, or even immoral. The minority needs protection from the tyranny of the majority.

Privacy makes all of this possible. Privacy encourages social progress by giving the few room to experiment free from the watchful eye of the many. Even if you are not personally chilled by ubiquitous surveillance, the society you live in is, and the personal costs are unequivocal."
freedom  surveillance  authoritarianism  privacy  2018  bruceschneier  experimentation  ostracization  prohibition  history  legalization  society  liberty  creativity  unschooling  deschooling  us  parenting  schooling  learning  howwelearn  behavior 
november 2018 by robertogreco
PureOS
"A user friendly, secure and freedom respecting OS for your daily usage.
With PureOS, you are the only one in control of your digital life.

Free/libre software
PureOS is a derivative of Debian GNU/Linux, with the best privacy-protecting software applications preinstalled.

Cutting-edge technology
With GNOME 3 and Wayland, enjoy fluid high-framerate videos, frame-perfect animations and better power management.

Security and Privacy
PureOS helps you surf the web safely, without being tracked by advertisers or marketers."
linux  privacy  security  free  pureos  debian  opensource 
june 2018 by robertogreco
King GAFA – And The Magical 0-1 Crop
"Let us tell you a fairytale – a fairytale about online privacy and data sovereignty. But with dragons! Episode by episode, you’ll be equipped with the tools and knowledge of a privacy knight – fight for your digital rights! Follow the arrow to enter the kingdom…"
ethics  privacy  data  online  web  internet  sovereignty 
may 2018 by robertogreco
Are.na / Blog – Alternate Digital Realities
"Writer David Zweig, who interviewed Grosser about the Demetricator for The New Yorker, describes a familiar sentiment when he writes, “I’ve evaluated people I don’t know on the basis of their follower counts, judged the merit of tweets according to how many likes and retweets they garnered, and felt the rush of being liked or retweeted by someone with a large following. These metrics, I know, are largely irrelevant; since when does popularity predict quality? Yet, almost against my will, they exert a pull on me.” Metrics can be a drug. They can also influence who we think deserves to be heard. By removing metrics entirely, Grosser’s extension allows us to focus on the content—to be free to write and post without worrying about what will get likes, and to decide for ourselves if someone is worth listening to. Additionally, it allows us to push back against a system designed not to cultivate a healthy relationship with social media but to prioritize user-engagement in order to sell ads."
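Grosser's Demetricator works by rewriting the interface in the browser. As a rough illustration of the kind of transform involved — this is not the extension's actual code, and the real tool edits the live DOM rather than plain strings — a pure function can strip the counts from metric labels while leaving the action words intact:

```python
import re

# Illustrative only: the real Demetricator is a browser extension; this
# pure function just demonstrates the text transform it performs.
METRIC = re.compile(r"[\d,.]+[KkMm]?\s*(Likes?|Retweets?|Followers?|Replies)",
                    re.IGNORECASE)

def demetricate(label: str) -> str:
    """Turn '1,204 Likes' into 'Likes' and '12.4K Followers' into 'Followers'."""
    return METRIC.sub(lambda m: m.group(1), label)

print(demetricate("1,204 Likes"))      # -> Likes
print(demetricate("12.4K Followers"))  # -> Followers
```

With the number gone, only the content word remains — which is exactly the shift in attention the excerpt describes.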
digital  online  extensions  metrics  web  socialmedia  internet  omayeliarenyeka  2018  race  racism  activism  davidzeig  bejamingrosser  twitter  google  search  hangdothiduc  reginafloresmir  dexterthomas  whitesupremacy  tolulopeedionwe  patriarchy  daniellesucher  jennyldavis  mosaid  shannoncoulter  taeyoonchoi  rodrigotello  elishacohen  maxfowler  jamesbaldwin  algorithms  danielhowe  helennissenbaum  mushonzer-aviv  browsers  data  tracking  surveillance  ads  facebook  privacy  are.na 
april 2018 by robertogreco
This Children's Book About Sex And Gender Is A Total Game-Changer
"Sex is a Funny Word is nothing short of revolutionary. Cory Silverberg and Fiona Smyth's newest book is brilliant in its approach to giving caregivers and educators the tools they need to talk to kids about their bodies. Not only is it "the first trans-inclusive book for kids," but it also uses inclusionary language and diverse representation across race, ability, gender, and sexuality, to home in on the most important aspects of discussing sex and bodies with kids aged 8-12. It is the second in a trilogy of books – the first, What Makes a Baby, is a beautiful, balanced, and many-gendered explanation of baby-making for kids aged 5-8.

(While Sex is a Funny Word discusses body parts, gender, touch, and other topics related to the word “sex,” it doesn’t delve into reproduction — intercourse is being reserved for the third book, planned for release in fall 2017, which will be geared toward older kids.)

Sex is a Funny Word is revolutionizing the way caregivers can talk to kids about their bodies."



"Although I could have made this a list of the 7,000 things that Sex is a Funny Word does to revolutionize talking to kids about their bodies, out of respect for everyone's time I’ve narrowed it down to ten. It was really hard to do.

1. Representation of all bodies should be the norm, rather than an exception.



2. Honesty + information = kids’ confidence.



3. Gender is complicated… and kids know it!



4. Conversation > silence.



5. "Justice" is an essential word when speaking about bodies.



6. Privacy isn’t just for grown-ups.



7. Consent matters at every age."
books  children  sex  gender  consent  justice  privacy  bodies  conversation  silence  honesty  information  representation  sexed  parenting  corysilverberg  fionasmyth  2015  body 
january 2018 by robertogreco
The Identities Research Project - The Identities Project
"The Identities research project explores user experiences of identity technology, brought to you by Caribou Digital, Omidyar Network and the International Institute of Information Technology, Bangalore (IIITB).

About the Identities research
The need for user-centered research in “digital identity” arose out of concerns around top-down identity systems and lack of insights on how these are being understood and used, particularly amongst lower income populations.

Identities research methodology
For our user research, our choice of states was determined by both policy and practical considerations. We started in Bengaluru and rural Karnataka because our research team is familiar with the area and has the language skills. Bengaluru is also a prime site for research in its own right: it is the second-fastest-growing metropolis in India and the Indian city with the highest number of educated immigrants."



"EPISODE 1
Changing details on your ID or using it in a transaction can be a bureaucratic and frustrating experience. Our research team has been hearing directly from users about these frustrations, and how they're working together to solve them.

EPISODE 2
If you're involved in the implementation of a new technology, adoption can seem like a binary issue - people either sign up, or they don't. But the reality is far more complex. In this episode, we're sharing some of the issues and insights about adoption we've learnt from users.

EPISODE 3
We're doing something very different with the Identities Project. Instead of inviting you all to read the report when it's finished, we are sharing our research as it happens via this website, our newsletter and a series of roundtable events in India, the United States and Sweden.

EPISODE 4
So far, The Identities Project has been telling the story of the users who rely on identity systems to manage their lives. As well as interviewing users, our research team have been speaking to officials in urban and rural centers who are at the front line of identity systems. They are the people who have to make the technology and policies actually work on the ground.

EPISODE 5
Some of the most valuable insights from our research have been about vulnerability, privacy and inclusion. Digital identity systems should improve every citizen’s interaction with the state. But they’re not always designed with the needs of every citizen in mind. This episode includes stories about how enrolling can expose vulnerable users to risk, how disabled users can be excluded from digital identity systems, and four insights from our research team on privacy and vulnerability. Finally, our video explores the common myth that poorer communities don’t care about privacy.

EPISODE 6
With our research, we wanted to especially focus on two questions with regard to gender: do women face different challenges to men in obtaining and formalizing their identity? And secondly, once they do have access to "an identity" — does it empower them in some way to ameliorate unequal gender dynamics?

EPISODE 7
Our hypothesis throughout this project is that research reports presented as locked PDFs are increasingly either not read or not acted on. Therefore, we have presented the research episodically, while it was still happening, and have consciously avoided the large conference circuit in favor of small, intimate workshop meetings. In this way, we hope we’ve been able to present the final research in the most useful and engaging way possible."

[See also: "The ID Question: Who decides who you are in the digital age?"
https://howwegettonext.com/the-id-question-6fb3b56052b5

"Building the Foundation for a More Inclusive, Secure Digital Identity in India"
http://www.omidyar.com/blog/building-foundation-more-inclusive-secure-digital-identity-india ]
identity  bangalore  anjaliramachandran  policy  rural  bengaluru  karnataka  india  technology  digital  digitalidentity  privacy  vulnerability  inclusion  inclusivity  gender 
november 2017 by robertogreco
privacy not included
"A Guide to Make Shopping for Connected Gifts Safer, Easier, and Way More Fun"
privacy  security  2017  mozilla 
november 2017 by robertogreco
An Xiao Busingye Mina on Instagram: “All of these things, including the (functioning) light bulb and the panda bear 🐼 have cameras for transmitting live streams. What happens…”
"All of these things, including the (functioning) light bulb and the panda bear 🐼 have cameras for transmitting live streams. What happens as this scales up? What are the implications for surveillance and voyeurism? For documentation of police brutality and human rights abuses? Welcome to your privacy nightmare, though if there's anything we've learned from the past few years, cameras can also empower the vulnerable under certain circumstances."
anxiaomina  surveillance  2017  privacy  technology  cameras  policebrutality  voyeurism 
august 2017 by robertogreco
Who Has Your Back? Government Data Requests 2017 | Electronic Frontier Foundation
"In this era of unprecedented digital surveillance and widespread political upheaval, the data stored on our cell phones, laptops, and especially our online services are a magnet for government actors seeking to track citizens, journalists, and activists.

In 2016, the United States government sent at least 49,868 requests to Facebook for user data. In the same time period, it sent 27,850 requests to Google and 9,076 to Apple. These companies are not alone: where users see new ways to communicate and store data, law enforcement agents see new avenues for surveillance.

There are three safeguards to ensure that data we send to tech companies don’t end up in a government database: technology, law, and corporate policies. Technology—including the many ways data is deleted, obscured, or encrypted to render it unavailable to the government—is beyond the scope of this report. Instead, we’ll focus on law and corporate policies. We’ll turn a spotlight on how the policies of technology companies either advance or hinder the privacy rights of users when the U.S. government comes knocking, and we’ll highlight those companies advocating to shore up legal protections for user privacy.

Since the Electronic Frontier Foundation started publishing Who Has Your Back seven years ago, we’ve seen major technology companies bring more transparency to how and when they divulge our data to the government. This shift has been fueled in large part by public attention. The Snowden revelations of 2013 and the resulting public conversation about digital privacy served as a major catalyst for widespread changes among the privacy policies of big companies. While only two companies earned credit in all of our criteria in 2013 (at a time when the criteria were somewhat less stringent than today), in our 2014 report, there were nine companies earning credit in every category.

Today, technology users expect tech companies to have transparency around government access to user data, and to stand up for user privacy when appropriate. And companies are increasingly meeting those expectations.

But there are still many companies that lag behind, fail to enact best practices around transparency, or don’t prioritize standing up for user privacy.

The role of Who Has Your Back is to provide objective measurements for analyzing the policies and advocacy positions of major technology companies when it comes to handing data to the government. We focus on a handful of specific, measurable criteria that can act as a vital stopgap against unfettered government access to user data. Through this report, we hope to galvanize widespread changes in the policies of technology companies to ensure our digital lives are not subject to invasive and undemocratic government searches.

Major Findings and Trends

Our major findings include:

• Every company we evaluate has adopted baseline industry best practices, such as publishing a transparency report and requiring a warrant before releasing user content to the government.

• Nine companies are receiving credit in all five categories: Adobe, Credo, Dropbox, Lyft, Pinterest, Sonic, Uber, Wickr, and Wordpress.

• The four lowest performing companies are all telecoms: AT&T, Comcast, T-Mobile, and Verizon.

• Amazon and WhatsApp’s policies fall short of other similar technology companies.

We are pleased to announce that nine companies earned stars in every category we evaluated in this year’s report: Adobe, Credo, Dropbox, Lyft, Pinterest, Sonic, Uber, Wickr, and Wordpress. Not only that, but each of these companies has a track record of defending user privacy. Lyft and Uber both earned credit in each of our categories for both years they have been in our report. Credo and Sonic have earned credit for standing up for transparency and user privacy in every category we evaluate for as long as they have been included in our report. The other all-star companies—Adobe, Dropbox, Pinterest, Wickr, and Wordpress—have improved their policies over the years, and are recognized repeatedly in this annual report for adopting the best practices around privacy and transparency."
eff  data  privacy  security  2017 
july 2017 by robertogreco
Frontier notes on metaphors: the digital as landscape and playground - Long View on Education
"I am concerned with the broader class of metaphors that suggest the Internet is an inert and open place for us to roam. Scott McLeod often uses the metaphor of a ‘landscape’: “One of schools’ primary tasks is to help students master the dominant information landscape of their time.”

McLeod’s central metaphor – mastering the information landscape – fits into a larger historical narrative that depicts the Internet as a commons in the sense of “communally-held space, one which it is specifically inappropriate for any single individual or subset of the community (including governments) to own or control.” Adriane Lapointe continues, “The internet is compared to a landscape which can be used in various ways by a wide range of people for whatever purpose they please, so long as their actions do not interfere with the actions of others.”

I suspect that the landscape metaphor resonates with people because it captures how they feel the Internet should work. Sarah T. Roberts argues that we are tempted to imagine the digital as “valueless, politically neutral and as being without material consequences.” However, the digital information landscape is an artifact shaped by capitalism, the US military, and corporate power. It’s a landscape that actively tracks and targets us, buys and sells our information. And it’s mastered only by the corporations, CEOs and venture capitalists.

Be brave? I have no idea what it would mean to teach students how to ‘master’ the digital landscape. The idea of ‘mastering’ recalls the popular frontier and pioneer metaphors that have fallen out of fashion since the 1990s as the Internet became ubiquitous, as Jan Rune Holmevik notes. There is of course a longer history of the “frontiers of knowledge” metaphor going back to Francis Bacon and passing through Vannevar Bush, and thinking this way has become, according to Gregory Ulmer, “ubiquitous, a reflex, a habit of mind that shapes much of our thinking about inquiry” – and one that needs to be rethought if we take the postcolonial movement seriously.

While we might worry about being alert online, we aren’t exposed to enough stories about the physical and material implications of the digital. It’s far too easy to think that the online landscape exists only on our screens, never intersecting with the physical landscape in which we live. Yet, the Washington Post reports that in order to pave the way for new data centers, “the Prince William County neighborhood [in Virginia] of mostly elderly African American homeowners is being threatened by plans for a 38-acre computer data center that will be built nearby. The project requires the installation of 100-foot-high towers carrying 230,000-volt power lines through their land. The State Corporation Commission authorized Dominion Virginia Power in late June to seize land through eminent domain to make room for the towers.” In this case, the digital is transforming the physical landscape with hostile indifference to the people that live there.

Our students cannot be digitally literate citizens if they don’t know stories about the material implications about the digital. Cathy O’Neil has developed an apt metaphor for algorithms and data – Weapons of Math Destruction – which have the potential to destroy lives because they feed on systemic biases. In her book, O’Neil explains that while attorneys cannot cite the neighborhood people live in as a reason to deny prisoners parole, it is permissible to package that judgment into an algorithm that generates a prediction of recidivism."



"When I talk to students about the implications of their searches being tracked, I have no easy answers for them. How can youth use the net for empowerment when there’s always the possibility that their queries will count against them? Yes, we can use google to ask frank questions about our sexuality, diet, and body – or any of the other ways we worry about being ‘normal’ – but when we do so, we do not wander a non-invasive landscape. And there are few cues that we need to be alert or smart.

Our starting point should not be the guiding metaphors of the digital as a playground where we need to practice safety or a landscape that we can master, but Shoshana Zuboff’s analysis of surveillance capitalism: “The game is selling access to the real-time flow of your daily life –your reality—in order to directly influence and modify your behavior for profit. This is the gateway to a new universe of monetization opportunities: restaurants who want to be your destination. Service vendors who want to fix your brake pads. Shops who will lure you like the fabled Sirens.”



So what do we teach students? I think that Chris Gilliard provides the right pedagogical insight to end on:
Students are often surprised (and even angered) to learn the degree to which they are digitally redlined, surveilled, and profiled on the web and to find out that educational systems are looking to replicate many of those worst practices in the name of “efficiency,” “engagement,” or “improved outcomes.” Students don’t know any other web—or, for that matter, have any notion of a web that would be different from the one we have now. Many teachers have at least heard about a web that didn’t spy on users, a web that was (theoretically at least) about connecting not through platforms but through interfaces where individuals had a significant amount of choice in saying how the web looked and what was shared. A big part of the teaching that I do is to tell students: “It’s not supposed to be like this” or “It doesn’t have to be like this.”
"
banjamindoxtdator  2017  landscapes  playgrounds  georgelakoff  markjohnson  treborscolz  digitalcitizenship  internet  web  online  mckenziewark  privacy  security  labor  playbor  daphnedragona  gamification  uber  work  scottmcleod  adrianelapointe  sarahroberts  janruneholmevik  vannevabush  gregoryulmer  francisbacon  chrisgilliard  pedagogy  criticalthinking  shoshanazuboff  surveillance  surveillancecapitalism  safiyanoble  google  googleglass  cathyo'neil  algorithms  data  bigdata  redlining  postcolonialism  race  racism  criticaltheory  criticalpedagogy  bias 
july 2017 by robertogreco
Hayati - Fabrica
"Hayati, “my life” in Arabic, is an intimate photographic diary created entirely on a smartphone by Karim El Maktafi, in which the author reflects on his own identity as an Italian born from Moroccan parents. The photographer chose a smartphone, a medium he considers less intrusive than a camera. With this tool he creates suspended, enigmatic images that capture the sense of uncertainty, doubt and disorientation of those who live between two seemingly incompatible realities. Embracing a single status is not easy; feeling like an odd cultural hybrid happens often. Yet, while trying to define this identity, one understands the advantage of “standing on a doorstep”. One can decide who to be or where to belong, or else create new ties, keeping everything learnt along the path: more languages, more cultural taboos and references, more prohibitions to withstand and explain. Hayati explores some of these realities, using the photographer’s own life, family and friends as a case study, sometimes concealing their faces to respect their wish for privacy.

Born in Desenzano del Garda, Karim El Maktafi graduated from the Italian Institute of Photography in Milan in 2013. He has collaborated with several photographers in various fields and has then explored the concept of identity through reportages and portraits. His work has been presented in exhibitions such as the Brescia Photo Festival, the Festival of Ethical Photography, and YES Collective in Auckland. Hayati was realised between 2016 and 2017, during El Maktafi’s residency at Fabrica. It was awarded the PHM 2017 Grant – New Generation Prize, and is shortlisted for the CAP Prize 2017 – Contemporary African Photography Prize."
photography  smartphones  karimelmaktafi  fabrica  classideas  privacy  intimacy  hybrids  thirdculturekids  uncertinty  doubt  immigration  migration  identity  disorientation  incompatibility 
may 2017 by robertogreco
How Google Took Over the Classroom - The New York Times
"The tech giant is transforming public education with low-cost laptops and free apps. But schools may be giving Google more than they are getting."



"Mr. Casap, the Google education evangelist, likes to recount Google’s emergence as an education powerhouse as a story of lucky coincidences. The first occurred in 2006 when the company hired him to develop new business at its office on the campus of Arizona State University in Tempe.

Mr. Casap quickly persuaded university officials to scrap their costly internal email service (an unusual move at the time) and replace it with a free version of the Gmail-and-Docs package that Google had been selling to companies. In one semester, the vast majority of the university’s approximately 65,000 students signed up.

And a new Google business was born.

Mr. Casap then invited university officials on a road show to share their success story with other schools. “It caused a firestorm,” Mr. Casap said. Northwestern University, the University of Southern California and many others followed.

This became Google’s education marketing playbook: Woo school officials with easy-to-use, money-saving services. Then enlist schools to market to other schools, holding up early adopters as forward thinkers among their peers.

The strategy proved so successful in higher education that Mr. Casap decided to try it with public schools.

As it happened, officials at the Oregon Department of Education were looking to help local schools cut their email costs, said Steve Nelson, a former department official. In 2010, the state officially made Google’s education apps available to its school districts.

“That caused the same kind of cascade,” Mr. Casap said. School districts around the country began contacting him, and he referred them to Mr. Nelson, who related Oregon’s experience with Google’s apps.

By then, Google was developing a growth strategy aimed at teachers — the gatekeepers to the classroom — who could influence the administrators who make technology decisions. “The driving force tends to be the pedagogical side,” Mr. Bout, the Google education executive, said. “That is something we really embraced.”

Google set up dozens of online communities, called Google Educator Groups, where teachers could swap ideas for using its tech. It started training programs with names like Certified Innovator to credential teachers who wanted to establish their expertise in Google’s tools or teach their peers to use them.

Soon, teachers began to talk up Google on social media and in sessions at education technology conferences. And Google became a more visible exhibitor and sponsor at such events. Google also encouraged school districts that had adopted its tools to hold “leadership symposiums” where administrators could share their experiences with neighboring districts.

Although business practices like encouraging educators to spread the word to their peers have become commonplace among education technology firms, Google has successfully deployed these techniques on such a large scale that some critics say the company has co-opted public school employees to gain market dominance.

“Companies are exploiting the education space for sales and public good will,” said Douglas A. Levin, the president of EdTech Strategies, a consulting firm. Parents and educators should be questioning Google’s pervasiveness in schools, he added, and examining “how those in the public sector are carrying the message of Google branding and marketing.”

Mr. Bout of Google disagreed, saying that the company’s outreach to educators was not a marketing exercise. Rather, he said, it was an effort to improve education by helping teachers learn directly from their peers how to most effectively use Google’s tools.

“We help to amplify the stories and voices of educators who have lessons learned,” he said, “because it can be challenging for educators to find ways to share with each other.”"
google  sfsh  education  apple  data  privacy  billfitzgerald  chicago  publicschools  technology  edtech  googleclassroom  googleapps  learning  schools  advertising  jaimecasap 
may 2017 by robertogreco
A lawyer rewrote Instagram's terms of service for kids. Now you can understand all of the private data you and your teen are giving up to social media — Quartz
"– Officially you own any original pictures and videos you post, but we are allowed to use them, and we can let others use them as well, anywhere around the world. Other people might pay us to use them and we will not pay you for that.

– […] we may keep, use and share your personal information with companies connected with Instagram. This information includes your name, email address, school, where you live, pictures, phone number, your likes and dislikes, where you go, who your friends are, how often you use Instagram, and any other personal information we find such as your birthday or who you are chatting with, including in private messages (DMs).

– We might send you adverts connected to your interests which we are monitoring. You cannot stop us doing this and it will not always be obvious that it is an advert.

– We can change or end Instagram, or stop you accessing Instagram at any time, for any reason and without letting you know in advance. We can also delete posts and other content randomly, without telling you, for any reason. If we do this, we will not be responsible for paying out any money and you won’t have any right to complain.

– We can force you to give up your username for any reason.

– We can, but do not have to, remove, edit, block and/or monitor anything posted or any accounts that we think breaks any of these rules. We are not responsible if somebody breaks the law or breaks these rules; but if you break them, you are responsible."
instagram  facebook  privacy  security  tos  termsofservice  2017  law  parenting 
january 2017 by robertogreco
Opera Free VPN - Unlimited Ad-Blocking VPN on the App Store
"Opera VPN blocks ads and lets you change your virtual location. Unblock more content and stop trackers from following you around the web — completely free.

With Opera VPN, you get:
- One of the fastest, most reliable VPN services
- Unblocked access via your choice of five virtual locations (with more coming soon)
- A built-in ad blocker for ads in Safari, Chrome and other apps
- A built-in tracker blocker to enhance online privacy

Opera VPN is one of the best and fastest ways to access more of your favorite online content for free. With super-fast VPN servers and other premium features included for free, Opera VPN is the smart choice for you.

Opera VPN includes free ad and tracker blocking! Block annoying ads and save time, battery life and sanity. You can also help prevent pesky sites from tracking your footsteps and activities on the web.

Opera VPN is a service provided by SurfEasy, Inc., an Opera company. Opera's 20-year history of web innovation enables more than 350 million people worldwide to do what matters most to them online. Get the performance you need from people you can trust."

[via: https://www.youtube.com/watch?v=xH1pFr5819E ]
vpn  ios  opera  adblocking  adblockers  privacy  iphone 
december 2016 by robertogreco
TILTY #21 - Selected Annotated Bibliography for the Librarian Resistance
"I am writing but I am mostly still listening. Letting my friends and community know I am here for them. And reading poetry.

[screenshot of Wendell Berry’s "The Peace of Wild Things"]

Not to be all "Hey it's going to be fine if we all just reconnect with nature and not let it bother us" but more that self-care is useful and the birds don't give a shit about this election so sometimes it can be good to just sit with them to recenter before you get back to work.

Post-election time in America is time for a lot of reflection, frustration, and planning and scheming for whatever is coming down the road. I've been reading and assessing.

My peripatetic lifestyle has always held some risks and that hasn't changed. My position otherwise is not that risky. Many people are being thrown into incredibly vulnerable positions as a result of this election--positions that were only getting slightly stabilized over the last decade--and this is happening at a national or international level, not just in our local communities. I'm proud of what libraries have been able to accomplish in the world so far. I offer a reading list and hopes that we can weather this storm together and form an effective and ruthlessly efficient resistance.


Brief Annotated Bibliography for the Librarian Resistance

• While I am still helping people get their first email addresses, people are blaming algorithms for losing the election for HRC. I am not forwarding this position personally (also not NOT forwarding it) but it's a fascinating look at what can happen when we can't get under the hood of our systems. Noted for later.

• The folks from We Need Diverse Books came out with a post-election statement.

• EFF has provided a very good Surveillance Self-Defense page for those who feel they need to communicate significantly more securely than they have been.

• Helping people with questions about what this all means for them? Lambda Legal has a post-election FAQ for GLBTQ folks. More specifics for other vulnerable populations can be found at Concrete Suggestions in Preparation for January 2017’s Change in American Government a nice repurposable online document (sometimes overloaded with readers, try again if you can't get it).

• Libraries can be a health lifeline for people most at risk, according to a US study (headline is from Reuters, let me know if you'd like me to email you the PDF of the study)

• Rebecca Solnit's book Hope in the Dark is available for free for a few more days.

• Libraries step up (in times of crisis) is a place on Facebook where you can get help with library issues concerning this recent election.

• How to weather the Trump Administration? Head to the library. An OpEd piece in the LA Times.

Librarians may be the only first responders holding the line between America and a raging national pandemic of absolutism. More desperately than ever, we need our libraries now, and all three of their traditional pillars: 1) education, 2) good reading and 3) the convivial refuge of a place apart. In other words, libraries may be the last coal we have left to blow on.

**********

Urban Libraries Unite is having their annual fund drive and will send you a My Library is for Everyone button if you donate, or you could just make your own button (but donating anyhow is a good idea, I did).

[image]

Maybe you don't know what to do? Letting people know that the library is for everyone, maybe just "surfacing" the policies that you already have like Lawrence Public Library has done, can show people that you know that this is a tough time for many and that you are there for them.

[image]

Or something like this? Other suggestions from Programming Librarian.

**********

I am bad at talking about my feelings, so I will continue mostly not to. I am better at talking about, and taking, actions. Pointers welcome. Replies to this newsletter always read and replied to. Signing off with a quote from Toni Morrison

"I know the world is bruised and bleeding, and though it is important not to ignore its pain, it is also critical to refuse to succumb to its malevolence. Like failure, chaos contains information that can lead to knowledge—even wisdom. Like art."

and another poem from Wendell Berry.

[screenshot of Wendell Berry’s "The Real Work"]"
jessamynwest  libraries  politics  resistance  donaldtrump  2016  wendellberry  tonimorrison  poetry  librarians  inclusivity  protection  rebeccasolnit  eff  security  privacy  refuge 
november 2016 by robertogreco
Privacy Concerns for ClassDojo and Other Tracking Apps for Schoolchildren - The New York Times
"One morning in mid-October, Mr. Fletcher walked to the front of the classroom where an interactive white board displayed ClassDojo, a behavior-tracking app that lets teachers award points or subtract them based on a student’s conduct. On the board was a virtual classroom showing each student’s name, a cartoon avatar and the student’s scores so far that week.

“I’m going to have to take a point for no math homework,” Mr. Fletcher said to a blond boy in a striped shirt and then clicked on the boy’s avatar, a googly-eyed green monster, and subtracted a point.

The program emitted a disappointed pong sound, audible to the whole class — and sent a notice to the child’s parents if they had signed up for an account on the service."
children  data  panopticon  surveillance  edtech  classdojo  2016  teaching  education  schools  privacy  via:lukeneff 
november 2016 by robertogreco
Surveillance Self-Defense | Tips, Tools and How-tos for Safer Online Communications
"Modern technology has given those in power new abilities to eavesdrop and collect data on innocent people. Surveillance Self-Defense is EFF's guide to defending yourself and your friends from surveillance by using secure technology and developing careful practices.

Select an article from our index to learn about a tool or issue, or check out one of our playlists to take a guided tour through a new set of skills."

[See also:

"Worried about the NSA under Trump? Here's how to protect yourself: We don’t yet know Trump’s surveillance plans, but follow these guidelines if you think it’s better to be safe than sorry"
https://www.theguardian.com/technology/2016/nov/10/nsa-trump-protect-yourself

"Surveillance Self-Defense Against the Trump Administration"
https://theintercept.com/2016/11/12/surveillance-self-defense-against-the-trump-administration/

"A 70-Day Web Security Action Plan for Artists and Activists Under Siege"
https://medium.com/@TeacherC/90dayactionplan-ff86b1de6acb

"Surveillance and inaction"
https://phiffer.org/writing/surveillance-and-inaction/

CryptoParty
https://www.cryptoparty.in/

"Digital Security and Source Protection for Journalists – A Handbook"
http://susanemcgregor.com/digital-security/

"Don’t panic! Download “A First Look at Digital Security”"
https://www.accessnow.org/a-first-look-at-digital-security/

"Protecting Your Digital Life in 7 Easy Steps"
http://www.nytimes.com/2016/11/17/technology/personaltech/encryption-privacy.html

"The Source Guide to Defending Accounts Against Common Attacks"
https://source.opennews.org/en-US/guides/defending-accounts/ ]
eff  privacy  security  surveillance  howto  tutorials  technology  2016  nsa  onlinetoolkit  digital  internet  web  online 
november 2016 by robertogreco
Courtney Martin: The new American Dream | TED Talk Subtitles and Transcript | TED.com
[via: https://twitter.com/campcreek/status/792521887343607810 ]

"Now, artist Ann Hamilton has said, "Labor is a way of knowing." Labor is a way of knowing. In other words, what we work on is what we understand about the world. If this is true, and I think it is, then women who have disproportionately cared for the little ones and the sick ones and the aging ones, have disproportionately benefited from the most profound kind of knowing there is: knowing the human condition. By prioritizing care, men are, in a sense, staking their claim to the full range of human existence.

Now, this means the nine-to-five no longer works for anyone. Punch clocks are becoming obsolete, as are career ladders. Whole industries are being born and dying every day. It's all nonlinear from here. So we need to stop asking kids, "What do you want to be when you grow up?" and start asking them, "How do you want to be when you grow up?" Their work will constantly change. The common denominator is them. So the more they understand their gifts and create crews of ideal collaborators, the better off they will be.

The challenge ahead is to reinvent the social safety net to fit this increasingly fragmented economy. We need portable health benefits. We need policies that reflect that everyone deserves to be vulnerable or care for vulnerable others, without becoming destitute. We need to seriously consider a universal basic income. We need to reinvent labor organizing. The promise of a work world that is structured to actually fit our 21st century values, not some archaic idea about bringing home the bacon, is long overdue -- just ask your mother.

Now, how about the second question: How should we live? We should live like our immigrant ancestors. When they came to America, they often shared apartments, survival tactics, child care -- always knew how to fill one more belly, no matter how small the food available. But they were told that success meant leaving the village behind and pursuing that iconic symbol of the American Dream, the white picket fence. And even today, we see a white picket fence and we think success, self-possession. But when you strip away the sentimentality, what it really does is divides us. Many Americans are rejecting the white picket fence and the kind of highly privatized life that happened within it, and reclaiming village life, reclaiming interdependence instead.

Fifty million of us, for example, live in intergenerational households. This number exploded with the Great Recession, but it turns out people actually like living this way. Two-thirds of those who are living with multiple generations under one roof say it's improved their relationships. Some people are choosing to share homes not with family, but with other people who understand the health and economic benefits of daily community. CoAbode, an online platform for single moms looking to share homes with other single moms, has 50,000 users. And people over 65 are especially prone to be looking for these alternative living arrangements. They understand that their quality of life depends on a mix of solitude and solidarity. Which is true of all of us when you think about it, young and old alike. For too long, we've pretended that happiness is a king in his castle. But all the research proves otherwise. It shows that the healthiest, happiest and even safest -- in terms of both climate change disaster, in terms of crime, all of that -- are Americans who live lives intertwined with their neighbors.

Now, I've experienced this firsthand. For the last few years, I've been living in a cohousing community. It's 1.5 acres of persimmon trees, this prolific blackberry bush that snakes around a community garden, all smack-dab, by the way, in the middle of urban Oakland. The nine units are all built to be different, different sizes, different shapes, but they're meant to be as green as possible. So big, shiny black solar cells on our roof mean our electricity bill rarely exceeds more than five bucks in a month. The 25 of us who live there are all different ages and political persuasions and professions, and we live in homes that have everything a typical home would have. But additionally, we share an industrial-sized kitchen and eating area, where we have common meals twice a week.

Now, people, when I tell them I live like this, often have one of two extreme reactions. Either they say, "Why doesn't everyone live like this?" Or they say, "That sounds totally horrifying. I would never want to do that." So let me reassure you: there is a sacred respect for privacy among us, but also a commitment to what we call "radical hospitality" -- not the kind advertised by the Four Seasons, but the kind that says that every single person is worthy of kindness, full stop, end of sentence.

The biggest surprise for me of living in a community like this? You share all the domestic labor -- the repairing, the cooking, the weeding -- but you also share the emotional labor. Rather than depending only on the idealized family unit to get all of your emotional needs met, you have two dozen other people that you can go to to talk about a hard day at work or troubleshoot how to handle an abusive teacher. Teenagers in our community will often go to an adult that is not their parent to ask for advice. It's what bell hooks called "revolutionary parenting," this humble acknowledgment that kids are healthier when they have a wider range of adults to emulate and count on. Turns out, adults are healthier, too. It's a lot of pressure, trying to be that perfect family behind that white picket fence.

The "new better off," as I've come to call it, is less about investing in the perfect family and more about investing in the imperfect village, whether that's relatives living under one roof, a cohousing community like mine, or just a bunch of neighbors who pledge to really know and look out for one another. It's good common sense, right? And yet, money has often made us dumb about reaching out. The most reliable wealth is found in relationship.

The new better off is not an individual prospect at all. In fact, if you're a failure or you think you're a failure, I've got some good news for you: you might be a success by standards you have not yet honored. Maybe you're a mediocre earner but a masterful father. Maybe you can't afford your dream home, but you throw legendary neighborhood parties. If you're a textbook success, the implications of what I'm saying could be more grim for you. You might be a failure by standards you hold dear but that the world doesn't reward. Only you can know.

I know that I am not a tribute to my great-grandmother, who lived such a short and brutish life, if I earn enough money to afford every creature comfort. You can't buy your way out of suffering or into meaning. There is no home big enough to erase the pain that she must have endured. I am a tribute to her if I live a life as connected and courageous as possible. In the midst of such widespread uncertainty, we may, in fact, be insecure. But we can let that insecurity make us brittle or supple. We can turn inward, lose faith in the power of institutions to change -- even lose faith in ourselves. Or we can turn outward, cultivate faith in our ability to reach out, to connect, to create.

Turns out, the biggest danger is not failing to achieve the American Dream. The biggest danger is achieving a dream that you don't actually believe in."
happiness  interdependence  courtneymartin  life  living  relationships  economics  success  solidarity  community  agesegregation  cohousing  us  2016  vulnerability  policy  health  housing  unschooling  deschooling  education  learning  privacy  hospitality  radicalhospitality  kindness  bellhooks  intergenerational  emotionallabor  labor  work  domesticlabor  families  money  wealth  individualism  failure  insecurity  meaningmaking  consumerism  materialism  connectedness  courage  sfsh  openstudioproject  lcproject 
october 2016 by robertogreco
Qubes OS Project
"What is Qubes OS?

Qubes OS is a security-oriented operating system (OS). The OS is the software that runs all the other programs on a computer. Some examples of popular OSes are Microsoft Windows, Mac OS X, Android, and iOS. Qubes is free and open-source software (FOSS). This means that everyone is free to use, copy, and change the software in any way. It also means that the source code is openly available so others can contribute to and audit it.

Why is OS security important?

Most people use an operating system like Windows or OS X on their desktop and laptop computers. These OSes are popular because they tend to be easy to use and usually come pre-installed on the computers people buy. However, they present problems when it comes to security. For example, you might open an innocent-looking email attachment or website, not realizing that you’re actually allowing malware (malicious software) to run on your computer. Depending on what kind of malware it is, it might do anything from showing you unwanted advertisements to logging your keystrokes to taking over your entire computer. This could jeopardize all the information stored on or accessed by this computer, such as health records, confidential communications, or thoughts written in a private journal. Malware can also interfere with the activities you perform with your computer. For example, if you use your computer to conduct financial transactions, the malware might allow its creator to make fraudulent transactions in your name.

Aren’t antivirus programs and firewalls enough?

Unfortunately, conventional security approaches like antivirus programs and (software and/or hardware) firewalls are no longer enough to keep out sophisticated attackers. For example, nowadays it’s common for malware creators to check to see if their malware is recognized by any popular antivirus programs. If it’s recognized, they scramble their code until it’s no longer recognizable by the antivirus programs, then send it out. The best antivirus programs will subsequently get updated once the antivirus programmers discover the new threat, but this usually occurs at least a few days after the new attacks start to appear in the wild. By then, it’s typically too late for those who have already been compromised. In addition, bugs are inevitably discovered in the common software we all use (such as our web browsers), and no antivirus program or firewall can prevent all of these bugs from being exploited.

How does Qubes OS provide security?

Qubes takes an approach called security by compartmentalization, which allows you to compartmentalize the various parts of your digital life into securely isolated compartments called qubes.

This approach allows you to keep the different things you do on your computer securely separated from each other in isolated qubes so that one qube getting compromised won’t affect the others. For example, you might have one qube for visiting untrusted websites and a different qube for doing online banking. This way, if your untrusted browsing qube gets compromised by a malware-laden website, your online banking activities won’t be at risk. Similarly, if you’re concerned about malicious email attachments, Qubes can make it so that every attachment gets opened in its own single-use disposable qube. In this way, Qubes allows you to do everything on the same physical computer without having to worry about a single successful cyberattack taking down your entire digital life in one fell swoop.

Moreover, all of these isolated qubes are integrated into a single, usable system. Programs are isolated in their own separate qubes, but all windows are displayed in a single, unified desktop environment with unforgeable colored window borders so that you can easily identify windows from different security levels. Common attack vectors like network cards and USB controllers are isolated in their own hardware qubes while their functionality is preserved through secure networking, firewalls, and USB device management. Integrated file and clipboard copy and paste operations make it easy to work across various qubes without compromising security. The innovative Template system separates software installation from software use, allowing qubes to share a root filesystem without sacrificing security (and saving disk space, to boot). Qubes even allows you to sanitize PDFs and images in a few clicks. Users concerned about privacy will appreciate the integration of Whonix with Qubes, which makes it easy to use Tor securely, while those concerned about physical hardware attacks will benefit from Anti Evil Maid.

How does Qubes OS compare to using a “live CD” OS?

Booting your computer from a live CD (or DVD) when you need to perform sensitive activities can certainly be more secure than simply using your main OS, but this method still preserves many of the risks of conventional OSes. For example, popular live OSes (such as Tails and other Linux distributions) are still monolithic in the sense that all software is still running in the same OS. This means, once again, that if your session is compromised, then all the data and activities performed within that same session are also potentially compromised.

How does Qubes OS compare to running VMs in a conventional OS?

Not all virtual machine software is equal when it comes to security. You may have used or heard of VMs in relation to software like VirtualBox or VMware Workstation. These are known as “Type 2” or “hosted” hypervisors. (The hypervisor is the software, firmware, or hardware that creates and runs virtual machines.) These programs are popular because they’re designed primarily to be easy to use and run under popular OSes like Windows (which is called the host OS, since it “hosts” the VMs). However, the fact that Type 2 hypervisors run under the host OS means that they’re really only as secure as the host OS itself. If the host OS is ever compromised, then any VMs it hosts are also effectively compromised.

By contrast, Qubes uses a “Type 1” or “bare metal” hypervisor called Xen. Instead of running inside an OS, Type 1 hypervisors run directly on the “bare metal” of the hardware. This means that an attacker must be capable of subverting the hypervisor itself in order to compromise the entire system, which is vastly more difficult.

Qubes makes it so that multiple VMs running under a Type 1 hypervisor can be securely used as an integrated OS. For example, it puts all of your application windows on the same desktop with special colored borders indicating the trust levels of their respective VMs. It also allows for things like secure copy/paste operations between VMs, securely copying and transferring files between VMs, and secure networking between VMs and the Internet."
qubesos  os  linux  privacy  security 
october 2016 by robertogreco
Have I been pwned? Check if your email has been compromised in a data breach
"Check if you have an account that has been compromised in a data breach"
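The site's companion Pwned Passwords service is notable for its k-anonymity design: only the first five hex characters of a password's SHA-1 hash are ever sent to the server, and the match against the returned candidate suffixes happens locally. A minimal sketch of that split (the helper name is mine; the range endpoint URL is the service's documented one, and the lookup itself is left as a comment rather than performed):

```python
import hashlib

# Pwned Passwords range endpoint: GET this URL plus the 5-char prefix.
PWNED_RANGE_URL = "https://api.pwnedpasswords.com/range/"

def range_query_parts(password: str) -> tuple[str, str]:
    """Split a password's SHA-1 digest into the 5-char prefix sent to
    the API and the 35-char suffix that is matched locally."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

prefix, suffix = range_query_parts("password")
# You would GET PWNED_RANGE_URL + prefix, then scan the returned
# "SUFFIX:COUNT" lines for `suffix` -- the server never sees enough
# of the hash to identify which password was checked.
print(prefix, suffix)
```

(The main breached-account API, by contrast, requires an API key.)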
security  privacy  passwords  web  hacking 
august 2016 by robertogreco
The Great Affluence Fallacy - The New York Times
"In 18th-century America, colonial society and Native American society sat side by side. The former was buddingly commercial; the latter was communal and tribal. As time went by, the settlers from Europe noticed something: No Indians were defecting to join colonial society, but many whites were defecting to live in the Native American one.

This struck them as strange. Colonial society was richer and more advanced. And yet people were voting with their feet the other way.

The colonials occasionally tried to welcome Native American children into their midst, but they couldn’t persuade them to stay. Benjamin Franklin observed the phenomenon in 1753, writing, “When an Indian child has been brought up among us, taught our language and habituated to our customs, yet if he goes to see his relations and make one Indian ramble with them, there is no persuading him ever to return.”

During the wars with the Indians, many European settlers were taken prisoner and held within Indian tribes. After a while, they had plenty of chances to escape and return, and yet they did not. In fact, when they were “rescued,” they fled and hid from their rescuers.

Sometimes the Indians tried to forcibly return the colonials in a prisoner swap, and still the colonials refused to go. In one case, the Shawanese Indians were compelled to tie up some European women in order to ship them back. After they were returned, the women escaped the colonial towns and ran back to the Indians.

Even as late as 1782, the pattern was still going strong. Hector de Crèvecoeur wrote, “Thousands of Europeans are Indians, and we have no examples of even one of those aborigines having from choice become European.”

I first read about this history several months ago in Sebastian Junger’s excellent book “Tribe.” It has haunted me since. It raises the possibility that our culture is built on some fundamental error about what makes people happy and fulfilled.

The native cultures were more communal. As Junger writes, “They would have practiced extremely close and involved child care. And they would have done almost everything in the company of others. They would have almost never been alone.”

If colonial culture was relatively atomized, imagine American culture of today. As we’ve gotten richer, we’ve used wealth to buy space: bigger homes, bigger yards, separate bedrooms, private cars, autonomous lifestyles. Each individual choice makes sense, but the overall atomizing trajectory sometimes seems to backfire. According to the World Health Organization, people in wealthy countries suffer depression by as much as eight times the rate as people in poor countries.

There might be a Great Affluence Fallacy going on — we want privacy in individual instances, but often this makes life generally worse.

Every generation faces the challenge of how to reconcile freedom and community — “On the Road” versus “It’s a Wonderful Life.” But I’m not sure any generation has faced it as acutely as millennials.

In the great American tradition, millennials would like to have their cake and eat it, too. A few years ago, Macklemore and Ryan Lewis came out with a song called “Can’t Hold Us,” which contained the couplet: “We came here to live life like nobody was watching/I got my city right behind me, if I fall, they got me.” In the first line they want complete autonomy; in the second, complete community.

But, of course, you can’t really have both in pure form. If millennials are heading anywhere, it seems to be in the direction of community. Politically, millennials have been drawn to the class solidarity of the Bernie Sanders campaign. Hillary Clinton — secretive and a wall-builder — is the quintessence of boomer autonomy. She has trouble with younger voters.

Professionally, millennials are famous for bringing their whole self to work: turning the office into a source of friendships, meaning and social occasions.

I’m meeting more millennials who embrace the mentality expressed in the book “The Abundant Community,” by John McKnight and Peter Block. The authors are notably hostile to consumerism.

They are anti-institutional and anti-systems. “Our institutions can offer only service — not care — for care is the freely given commitment from the heart of one to another,” they write.

Millennials are oriented around neighborhood hospitality, rather than national identity or the borderless digital world. “A neighborhood is the place where you live and sleep.” How many of your physical neighbors know your name?

Maybe we’re on the cusp of some great cracking. Instead of just paying lip service to community while living for autonomy, I get the sense a lot of people are actually about to make the break and immerse themselves in demanding local community movements. It wouldn’t surprise me if the big change in the coming decades were this: an end to the apotheosis of freedom; more people making the modern equivalent of the Native American leap."
society  capitalism  davidbrooks  2016  history  sebastianjunger  communalism  nativeamericans  abundance  depression  us  affluence  millenials  johnmcknight  peterblock  consumerism  care  hospitality  nationalism  local  community  privacy  isolation  competition  autonomy  berniesanders  solidarity  wealth  atomization  well-being  qualityoflife  hectordecrèvecoeur 
august 2016 by robertogreco
Snapchat, Instagram Stories, and the Internet of Forgetting - The New Yorker
"There was seemingly nothing wrong with Instagram, up to the moment that it underwent an identity crisis. Each day, as usual, some three hundred million users had been meticulously curating and sharing images of their lives, meals, selves, and bookshelves. Earlier this week, though, the app took a hard right turn. It introduced Stories, a feature that allows users to post photos and videos, sometimes embellished with text and illustrations, in a kind of slide show, which automatically disappears after twenty-four hours. The content must have been recorded recently—nothing older than a day can be uploaded—so the result is like viewing the backstage footage rather than the rehearsed performance. Stories looked nothing like Instagram and everything like Snapchat, another app that has for years offered users a platform for this very same interaction.

It has always been common for software developers to improve their work by co-opting their competitors’ ideas. Many functions of the iPhone, for instance, are the result of Apple’s artful borrowing—the Reading List in Safari closely resembles dedicated link-saving services, and after apps like Instagram and Hipstamatic became popular the company added a half-hearted set of analog-looking filters to its camera app. Snapchat itself is not immune to the practice. A few weeks ago, it débuted a feature called Memories, whereby users can post old photos or videos from their phone’s camera roll, rather than having to film or shoot in the moment. Instagram, which pre-dates Snapchat by less than a year, has offered this since it first appeared on the App Store, in 2010. Even Memories, however, doesn’t totally erase the immediacy of Snapchat, since a photo, no matter how old, still disappears after twenty-four hours, consistent with the over-all spirit of the app. Instagram’s more recent move, by contrast, seems to run counter to its precious spirit—a betrayal of all the careful curation and perfect visuals.

As a way of reaching new demographics, Stories makes sense. The posting tools mimic Snapchat, but they’re built right into an otherwise familiar app. Most of Snapchat’s interface is obscured and requires knowing the right taps and swipes to get around, even to add a friend, and it’s notoriously hard for people over the age of thirty—“olds,” in Internet speak—to master. Now these people have access to Snapchat-like socializing without the burden of navigating the app. But Stories is also an accommodation of the off-label ways in which another important demographic—teens—use Instagram. The average teen posts often but erases often, too, especially if the posts don’t receive enough likes, interaction, or attention from the right people. A recent Washington Post profile of Katherine Pommerening, an eighth grader from Virginia, noted that she never has more than a couple dozen posts visible on her Instagram profile at any given time. Teens love to post, but they love nearly as much to delete and unburden themselves of past gauche choices—the selfie taken in bad light, or with a then friend, now enemy. Pommerening and her cohort, in other words, have been rigging Instagram to do what Snapchat does automatically.

Snapchat has often been depicted as seedy and fly-by-night, a place for people to exchange illicit pictures without leaving much in the way of a virtual paper trail. This was particularly the case when the app first became popular. (Never mind the function that alerts you when someone has screenshotted one of your photos.) But it has since become clear that Snapchat holds a deeper appeal. It satisfies a craving for immediacy and ephemerality, one that has lately grown to encompass all of social media. Posts can’t simply disappear after they’re viewed—they have to expire, whether they’ve been seen or not. Back in 2013, Facebook released a study showing that the bigger and more diverse your online audience seems, the more pressure you feel to say the right thing, and so hesitate to post anything at all. But never posting, ironically, makes your not-so-recent history terrifyingly within reach; it could take a new friend only a few scrolls to reach the Facebook status updates from your college years, when they were meant to be seen only by a few close friends. The solution, then, is deletion—like the third-party Twitter tools that nuke your tweets after a set amount of time (a day, a week, a month).

Part of the explanation for this new desire, if indeed it is new, is that our collective understanding of the role of social media has changed. In 2012, Facebook spooked its users by making all their posts searchable; old status updates from when Facebook was a more closed environment felt so jarringly intimate that people were sure the company had published private-message exchanges by accident. This wasn’t true; we just used Facebook differently then, and we were younger then, and now we were suddenly, uncomfortably confronted with our past. Today, there are scattered indications that people want some space to be fully themselves online as they are, without years of their past selves trailing behind them. Teens, perhaps, feel this desire more acutely, and Instagram has responded.

For Facebook, which acquired Instagram in 2012, Stories is part of a concerted strategy. The company embodies the ship-of-Theseus paradox: we still use it every day, but over the years all of the parts have been upgraded, swapped, replaced. It has survived in large part through what might charitably be called inspiration—most recently, it picked up on the trend of live-streaming video from the apps Periscope and Meerkat, and integrated its own live streams right into the News Feed. Instagram has changed relatively little since Facebook bought it. But the app’s introduction of an expiring highlight reel is more than a shameless grab for one of Snapchat’s core features. It’s a response to a demand: on an Internet that always remembers, we are fighting for places we can go to forget."
instagram  facebook  socialmedia  snapchat  preservation  images  identity  youth  teen  privacy  ephemerality  immediacy  caseyjohnston  internet  forgetting  web  online  ephemeral 
august 2016 by robertogreco
Meet Moxie Marlinspike, the Anarchist Bringing Encryption to All of Us | WIRED
"Marlinspike isn’t particularly interested in a debate, either; his mind was made up long ago, during years as an anarchist living on the fringes of society. “From very early in my life I’ve had this idea that the cops can do whatever they want, that they’re not on your team,” Marlinspike told me. “That they’re an armed, racist gang.”

Marlinspike views encryption as a preventative measure against a slide toward Orwellian fascism that makes protest and civil disobedience impossible, a threat he traces as far back as J. Edgar Hoover’s FBI wiretapping and blackmailing of Martin Luther King Jr. “Moxie is compelled by the troublemakers of history and their stories,” says Tyler Rein­hard, a designer who worked on Signal. “He sees encryption tools not as taking on the state directly but making sure that there’s still room for people to have those stories.”

ASK MARLINSPIKE TO tell his own story, and—no surprise for a privacy zealot—he’ll often answer with diversions, mono­syllables, and guarded smiles. But anyone who’s crossed paths with him seems to have an outsize anecdote: how he once biked across San Francisco carrying a 40-foot-tall sailboat mast. The time he decided to teach himself to pilot a hot-air balloon, bought a used one from Craigslist, and spent a month on crutches after crashing it in the desert. One friend swears he’s seen Marlinspike play high-stakes rock-paper-scissors dozens of times—with bets of hundreds of dollars or many hours of his time on the line—and has never seen him lose.

But before Marlinspike was a subcultural contender for “most interesting man in the world,” he was a kid growing up with a different and far less interesting name on his birth certificate, somewhere in a region of central Georgia that he describes as “one big strip mall.” His parents—who called him Moxie as a nickname—separated early on. He lived mostly with his mother, a secretary and paralegal at a string of companies. Any other family details, like his real name, are among the personal subjects he prefers not to comment on.

Marlinspike hated the curiosity-killing drudgery of school. But he had the idea to try programming videogames on an Apple II in the school library. The computer had a Basic interpreter but no hard drive or even a floppy disk to save his code. Instead, he’d retype simple programs again and again from scratch with every reboot, copying in commands from manuals to make shapes fill the screen. Browsing the computer section of a local bookstore, the preteen Marlin­spike found a copy of 2600 magazine, the catechism of the ’90s hacker scene. After his mother bought a cheap desk­top computer with a modem, he used it to trawl bulletin board services, root friends’ computers to make messages appear on their screens, and run a “war-dialer” program overnight, reaching out to distant servers at random.

To a bored middle schooler, it was all a revelation. “You look around and things don’t feel right, but you’ve never been anywhere else and you don’t know what you’re missing,” Marlin­spike says. “The Internet felt like a secret world hidden within this one.”

By his teens, Marlinspike was working after school for a German software company, writing developer tools. After graduating high school—barely—he headed to Silicon Valley in 1999. “I thought it would be like a William Gibson novel,” he says. “Instead it was just office parks and highways.” Jobless and homeless, he spent his first nights in San Francisco sleeping in Alamo Square Park beside his desktop computer.

Eventually, Marlinspike found a programming job at BEA-owned Web­Logic. But almost as soon as he’d broken in to the tech industry, he wanted out, bored by the routine of spending 40 hours a week in front of a keyboard. “I thought, ‘I’m supposed to do this every day for the rest of my life?’” he recalls. “I got interested in experimenting with a way to live that didn’t involve working.”

For the next few years, Marlinspike settled into a Bay Area scene that was, if not cyberpunk, at least punk. He started squatting in abandoned buildings with friends, eventually moving into an old postal service warehouse. He began bumming rides to political protests around the country and uploading free audio books to the web: recordings of himself reading anarchist theorists like Emma Goldman.

He took up hitchhiking, then he upgraded his wanderlust to hopping freight trains. And in 2003 he spontaneously decided to learn to sail. He spent a few hundred dollars—all the money he had—on a beat-up 27-foot Catalina and rashly set out alone from San Francisco’s harbor for Mexico, teaching himself by trial and error along the way. The next year, Marlin­spike filmed his own DIY sailing documentary, called Hold Fast. It follows his journey with three friends as they navigate a rehabilitated, leaky sloop called the Pestilence from Florida to the Bahamas, finally ditching the boat in the Dominican Republic.

Even today, Marlinspike describes those reckless adven­tures in the itinerant underground as a kind of peak in his life. “Looking back, I and everyone I knew was looking for that secret world hidden in this one,” he says, repeating the same phrase he’d used to describe the early Internet. “I think we were already there.”

If anything can explain Marlinspike’s impulse for privacy, it may be that time spent off society’s grid: a set of experi­ences that have driven him to protect a less observed way of life. “I think he likes the idea that there is an unknown,” says Trevor Perrin, a security engineer who helped Marlinspike design Signal’s core protocol. “That the world is not a completely surveilled thing.”"



"Beneath its ultrasimple interface, Moxie Marlinspike’s crypto protocol hides a Rube Goldberg machine of automated moving parts. Here’s how it works.

1. When Alice installs an app that uses Marlinspike’s protocol, it generates pairs of numeric sequences known as keys. With each pair, one sequence, known as a public key, will be sent to the app’s server and shared with her contacts. The other, called a private key, is stored on Alice’s phone and is never shared with anyone. The first pair of keys serves as an identity for Alice and never changes. Subsequent pairs will be generated with each message or voice call, and these temporary keys won’t be saved.

2. When Alice contacts her friend Bob, the app combines their public and private keys—both their identity keys and the temporary ones generated for a new message or voice call—to create a secret shared key. The shared key is then used to encrypt and decrypt their messages or calls.

3. The secret shared key changes with each message or call, and old shared keys aren’t stored. That means an eavesdropper who is recording their messages can’t decrypt their older communications even if that spy hacks one of their devices. (Alice and Bob should also periodically delete their message history.)

4. To make sure she’s communicating with Bob and not an impostor, Alice can check Bob’s fingerprint, a shortened version of his public identity key. If that key changes, either because someone is impersonating Bob in a so-called man-in-the-middle attack or simply because he ­reinstalled the app, Alice’s app will display a warning."
moxiemarlinspike  encryption  privacy  security  2016  2600  surveillance  whatsapp  signal  messaging  anarchists  anarchism  openwhispersystems  tylerreinhard  emmagoldman  unschooling  education  learning  autodidacts  internet  web  online  work  economics  life  living  lawenforcement 
august 2016 by robertogreco
The Device is the Message
["THE-DEVICE-IS-THE-MESSAGE_PART_I"
http://blog.newhive.com/the-device-is-the-message_part_i/v

"The Device is the Message by Liliana Farber

Storage Un.it is a small project space located in a storage unit @ arebyte Gallery in London. The space features a series of projects, which take place online and investigate the relationship between the URL & IRL. The space was initiated in Nov 2015 as part of ‘The Wrong’ online Biennale.

The second residency in storage-un.it is artist Liliana Farber and her work titled the-device-is-the-message_Part_I.

The work focuses on the idea of the smartphone as an active agent in the way we interact with the real world, the art world and the online world, but also with each other. Confrontations become digitized and repercussions between the machine and its user are staged virtually.

In relation to the way in which the smartphone has become integral to the modern world, Farber will interrogate how this reliance affects real interactions — but also how the specific language of the virtual is shaping our perceptions of time, space and place in the real. The symbiotic relationship between the user, the machine and the notion of privacy is of interest for the artist and will be explored further via recordings and research with relation to her personal data usage.

A precise intimacy is at play between the user and the screen; private experiences are created but can also become part of the public domain. This idea of the boundaries between public and private can be seen by the way in which Farber is conducting her research and documenting the project’s progress. All aspects are continually updated via NewHive, and viewers can watch the project update in real time through September 10th, 2016.

Once the online residency is completed, the research undertaken will be presented in an exhibition displayed through the smartphone screen – reflecting both on the temporal nature of imagery and our constant exposure to content, and commenting on the subsequent reliance on the screen to divulge information."]
thedeviceisthemessage  lilianafarber  newhive  smartphones  mobile  art  2016  privacy  online  internet  phones  time  space  place  public  private  imagery  netart 
july 2016 by robertogreco
Remarks at the SASE Panel On The Moral Economy of Tech
"I am only a small minnow in the technology ocean, but since it is my natural habitat, I want to make an effort to describe it to you.

As computer programmers, our formative intellectual experience is working with deterministic systems that have been designed by other human beings. These can be very complex, but the complexity is not the kind we find in the natural world. It is ultimately always tractable. Find the right abstractions, and the puzzle box opens before you.

The feeling of competence, control and delight in discovering a clever twist that solves a difficult problem is what makes being a computer programmer sometimes enjoyable.

But as anyone who's worked with tech people knows, this intellectual background can also lead to arrogance. People who excel at software design become convinced that they have a unique ability to understand any kind of system at all, from first principles, without prior training, thanks to their superior powers of analysis. Success in the artificially constructed world of software design promotes a dangerous confidence.

Today we are embarked on a great project to make computers a part of everyday life. As Marc Andreessen memorably frames it, "software is eating the world". And those of us writing the software expect to be greeted as liberators.

Our intentions are simple and clear. First we will instrument, then we will analyze, then we will optimize. And you will thank us.

But the real world is a stubborn place. It is complex in ways that resist abstraction and modeling. It notices and reacts to our attempts to affect it. Nor can we hope to examine it objectively from the outside, any more than we can step out of our own skin.

The connected world we're building may resemble a computer system, but really it's just the regular old world from before, with a bunch of microphones and keyboards and flat screens sticking out of it. And it has the same old problems.

Approaching the world as a software problem is a category error that has led us into some terrible habits of mind.

BAD MENTAL HABITS

First, programmers are trained to seek maximal and global solutions. Why solve a specific problem in one place when you can fix the general problem for everybody, and for all time? We don't think of this as hubris, but as a laudable economy of effort. And the startup funding culture of big risk, big reward encourages this grandiose mode of thinking. There is powerful social pressure to avoid incremental change, particularly any change that would require working with people outside tech and treating them as intellectual equals.

Second, treating the world as a software project gives us a rationale for being selfish. The old adage has it that if you are given ten minutes to cut down a tree, you should spend the first five sharpening your axe. We are used to the idea of bootstrapping ourselves into a position of maximum leverage before tackling a problem.

In the real world, this has led to a pathology where the tech sector maximizes its own comfort. You don't have to go far to see this. Hop on BART after the conference and take a look at Oakland, or take a stroll through downtown San Francisco and try to persuade yourself you're in the heart of a boom that has lasted for forty years. You'll see a residential theme park for tech workers, surrounded by areas of poverty and misery that have seen no benefit and ample harm from our presence. We pretend that by maximizing our convenience and productivity, we're hastening the day when we finally make life better for all those other people.

Third, treating the world as software promotes fantasies of control. And the best kind of control is control without responsibility. Our unique position as authors of software used by millions gives us power, but we don't accept that this should make us accountable. We're programmers—who else is going to write the software that runs the world? To put it plainly, we are surprised that people seem to get mad at us for trying to help.

Fortunately we are smart people and have found a way out of this predicament. Instead of relying on algorithms, which we can be accused of manipulating for our benefit, we have turned to machine learning, an ingenious way of disclaiming responsibility for anything. Machine learning is like money laundering for bias. It's a clean, mathematical apparatus that gives the status quo the aura of logical inevitability. The numbers don't lie.

Of course, people obsessed with control have to eventually confront the fact of their own extinction. The response of the tech world to death has been enthusiastic. We are going to fix it. Google Ventures, for example, is seriously funding research into immortality. Their head VC will call you a "deathist" for pointing out that this is delusional.

Such fantasies of control come with a dark side. Witness the current anxieties about an artificial superintelligence, or Elon Musk's apparently sincere belief that we're living in a simulation. For a computer programmer, that's the ultimate loss of control. Instead of writing the software, you are the software.

We obsess over these fake problems while creating some real ones.

In our attempt to feed the world to software, techies have built the greatest surveillance apparatus the world has ever seen. Unlike earlier efforts, this one is fully mechanized and in a large sense autonomous. Its power is latent, lying in the vast amounts of permanently stored personal data about entire populations.

We started out collecting this information by accident, as part of our project to automate everything, but soon realized that it had economic value. We could use it to make the process self-funding. And so mechanized surveillance has become the economic basis of the modern tech industry.

SURVEILLANCE CAPITALISM

Surveillance capitalism has some of the features of a zero-sum game. The actual value of the data collected is not clear, but it is definitely an advantage to collect more than your rivals do. Because human beings develop an immune response to new forms of tracking and manipulation, the only way to stay successful is to keep finding novel ways to peer into people's private lives. And because much of the surveillance economy is funded by speculators, there is an incentive to try flashy things that will capture the speculators' imagination, and attract their money.

This creates a ratcheting effect where the behavior of ever more people is tracked ever more closely, and the collected information retained, in the hopes that further dollars can be squeezed out of it.

Just like industrialized manufacturing changed the relationship between labor and capital, surveillance capitalism is changing the relationship between private citizens and the entities doing the tracking. Our old ideas about individual privacy and consent no longer hold in a world where personal data is harvested on an industrial scale.

Those who benefit from the death of privacy attempt to frame our subjugation in terms of freedom, just like early factory owners talked about the sanctity of contract law. They insisted that a worker should have the right to agree to anything, from sixteen-hour days to unsafe working conditions, as if factory owners and workers were on an equal footing.

Companies that perform surveillance are attempting the same mental trick. They assert that we freely share our data in return for valuable services. But opting out of surveillance capitalism is like opting out of electricity, or cooked foods—you are free to do it in theory. In practice, it will upend your life.

Many of you had to obtain a US visa to attend this conference. The customs service announced yesterday it wants to start asking people for their social media profiles. Imagine trying to attend your next conference without a LinkedIn profile, and explaining to the American authorities why you are so suspiciously off the grid.

The reality is, opting out of surveillance capitalism means opting out of much of modern life.

We're used to talking about the private and public sector in the real economy, but in the surveillance economy this boundary doesn't exist. Much of the day-to-day work of surveillance is done by telecommunications firms, which have a close relationship with government. The techniques and software of surveillance are freely shared between practitioners on both sides. All of the major players in the surveillance economy cooperate with their own country's intelligence agencies, and are spied on (very effectively) by all the others.

As a technologist, this state of affairs gives me the feeling of living in a forest that is filling up with dry, dead wood. The very personal, very potent information we're gathering about people never goes away, only accumulates. I don't want to see the fire come, but at the same time, I can't figure out a way to persuade other people of the great danger.

So I try to spin scenarios.

THE INEVITABLE LIST OF SCARY SCENARIOS

One of the candidates running for President this year has promised to deport eleven million undocumented immigrants living in the United States, as well as block Muslims from entering the country altogether. Try to imagine this policy enacted using the tools of modern technology. The FBI would subpoena Facebook for information on every user born abroad. Email and phone conversations would be monitored to check for the use of Arabic or Spanish, and sentiment analysis applied to see if the participants sounded "nervous". Social networks, phone metadata, and cell phone tracking would lead police to nests of hiding immigrants.

We could do a really good job deporting people if we put our minds to it.

Or consider the other candidate running for President, the one we consider the sane alternative, who has been a longtime promoter of a system of extrajudicial murder that uses blanket surveillance of cell phone traffic, email, and social media to create lists of people to be tracked and killed with autonomous aircraft. … [more]
culture  ethics  privacy  surveillance  technology  technosolutionism  maciegceglowski  2016  computing  coding  programming  problemsolving  systemsthinking  systems  software  control  power  elonmusk  marcandreessen  siliconvalley  sanfrancisco  oakland  responsibility  machinelearning  googlevntures  vc  capitalism  speculation  consent  labor  economics  poland  dystopia  government  politics  policy  immortality 
june 2016 by robertogreco
Guide to Chromebook Privacy Settings for Students | Electronic Frontier Foundation
[via: https://boingboing.net/2015/12/01/what-happened-when-a-parent-fo.html ]

"If your child's school issued them a Chromebook, there are some important settings you can chance to improve their privacy.

Be sure to also check out our Guide to Google Account Privacy Settings for Students.
https://www.eff.org/deeplinks/2015/11/guide-google-account-privacy-settings-students

Open the Chromebook’s settings by clicking on your username in the bottom right-hand corner, then clicking “Settings.”"
chromebooks  privacy  2015 
june 2016 by robertogreco
Brave Software | Building a Better Web
"We have a mission to save the web by increasing browsing speed and safety for users, while growing ad revenue share for content creators.

The web has become a different place. With the ad-tech ecosystem out of control, users have revolted and blocking ads has become the new weapon of choice. But this results in a race to the bottom where nobody wins. Without the ability for content creators to earn money for their efforts, they may need to shut down or find a new source of revenue. Users could be left with nowhere to browse, relegated to hand-picked content from controlled sources.

Brave is saving the web by building a new suite of cutting-edge web browsers that feature class leading speed, security and protection, plus a new ad revenue sharing solution to help keep publishers in business."
browsers  mac  osx  privacy  software  browser 
april 2016 by robertogreco
Databite No. 76: Neil Selwyn - live stream - YouTube
"Neil Selwyn presents (Dis)Connected Learning: the messy realities of digital schooling: In this Databite, Neil Selwyn will work through some emerging headline findings from a new three year study of digital technology use in Australian high schools. In particular Neil will highlight the ways in which schools’ actual uses of technology often contradict presumptions of ‘connected learning’, ‘digital education’ and the like. Instead Neil will consider ….

• how and why recent innovations such as maker culture, personalised learning and data-driven education are subsumed within more restrictive institutional ‘logics’;

• the tensions of ‘bring your own device’ and other permissive digital learning practices;

• how alternative and resistant forms of technology use by students tend to mitigate *against* educational engagement and/or learning gains;

• the ways in which digital technologies enhance (rather than disrupt) existing forms of advantage and privilege amongst groups of students;

• how the distributed nature of technology leadership and innovation throughout schools tends to restrict widespread institutional change and reform;

• the ambiguous role that digital technologies play in teachers’ work and the labor of teaching;

• the often surprising ways that technology seems to take hold throughout schools – echoing broader imperatives of accountability, surveillance and control.

The talk will provide plenty of scope to consider how technology use in schools might be ‘otherwise’, and alternate agendas to be pursued by educators, policymakers, technology developers and other stakeholders in the ed-tech space."

[via: "V interesting talk by Neil Selwyn on ed-tech and (dis)connected learning in school"
https://twitter.com/audreywatters/status/718900001271783424 ]

"the grammar of schooling"
neilselwyn  edtech  byod  via:audreywatters  logitics  technology  teaching  learning  howweteacher  power  mobile  phones  ipads  laptops  pedagogy  instruction  resistance  compliance  firewalls  making  makingdo  youth  schools  design  micromanagement  lms  application  sameoldsameold  efficiency  data  privacy  education  howweteach  regimentation  regulation  rules  flexibility  shininess  time  schooliness  assessment  engagement  evidence  resilience  knowledge  schedules  class  leadership  performativity  schooldesign  connectedlearning  surveillance  control  accountability  change  institutions  deschooling  quest2play  relationships  curriculum  monitoring  liberation  dml  liberatorytechnology  society  culture  ethnography  schooling  sorting  discipline  ipad 
april 2016 by robertogreco
intimacy gradients - Text Patterns - The New Atlantis
"Pay attention to the links here: Tim Maly pointed me to this 2004 post by Christopher Allen that draws on the famous 1977 architectural treatise A Pattern Language to talk about online life.

Got all that?

The key concept is intimacy gradients. In a well-known passage from A Pattern Language the authors write,
The street cafe provides a unique setting, special to cities: a place where people can sit lazily, legitimately, be on view, and watch the world go by... Encourage local cafes to spring up in each neighborhood. Make them intimate places, with several rooms, open to a busy path, where people can sit with coffee or a drink and watch the world go by. Build the front of the cafe so that a set of tables stretch out of the cafe, right into the street.

That's the passage as quoted in the book's Wikipedia page. But if you actually look at that section of the book, you'll see that the authors place a great deal of emphasis on the need for the ideal street café to create intimacy as well as public openness. Few people want always to "be on view"; some people almost never do. Therefore,

In addition to the terrace which is open to the street, the cafe contains several other spaces: with games, fire, soft chairs, newspapers.... This allows a variety of people to start using it, according to slightly different social styles.

And "When these conditions are present" — all of these conditions, the full appropriate range of intimacy gradients — "and the cafe takes hold, it offers something unique to the lives of the people who use it: it offers a setting for discussions of great spirit — talks, two-bit lectures, half-public, half-private learning, exchange of thought."

Twitter actually has a pretty highly developed set of intimacy gradients: public and private accounts, replies that will be seen automatically only by the person you’re replying to and people who are connected to both of you, direct messages, and so on. Where it fails is in the provision of “intimate places”: smaller rooms where friends can talk without being interrupted. It gives you the absolute privacy of one-to-one conversations (DMs) and it gives you all that comes with “being on view” at a table that extends “right into the street,” where anyone who happens to go by can listen in or make comments; but, for public accounts anyway, not much in between.

And you know, if you’re using a public Twitter account, you can’t really complain about this. If you tweet something hoping that your friends will notice and respond, that’s fine; but you’re not in a small room with just your friends, you’re in a vast public space — you’re in the street. And when you stand in the street and make a statement through a megaphone, you can’t reasonably be offended if total strangers have something to say in reply. If you want to speak only to your friends, you need to invite them into a more intimate space.

And as far as I can tell, that’s what private Twitter accounts provide: a place to talk just with friends, where you can’t be overheard.

Now, private accounts tend to work against the grain of Twitter as self-promotion, Twitter as self-branding, Twitter as “being on view.” And if we had to choose, many of us might forego community for presentation. But we don’t have to choose: it’s possible to do both, to have a private and a public presence. For some that will be too much to manage; for others, perhaps for many others, that could be where Twitter is headed.

Okay, I’m done talking about Twitter. Coming up in the next week: book reports."
alanjacobs  2014  intimacygradients  apatternlanguage  christopheralexander  cities  twitter  society  sociology  internet  culture  architecture  space  public  private  privacy 
april 2016 by robertogreco
IndieWebCamp
"What is the IndieWeb?

The IndieWeb is a people-focused alternative to the ‘corporate web’.

Your content is yours
When you post something on the web, it should belong to you, not a corporation. Too many companies have gone out of business and lost all of their users’ data. By joining the IndieWeb, your content stays yours and in your control.

You are better connected
Your articles and status messages can go to all services, not just one, allowing you to engage with everyone. Even replies and likes on other services can come back to your site so they’re all in one place.

You are in control
You can post anything you want, in any format you want, with no one monitoring you. In addition, you share simple readable links such as example.com/ideas. These links are permanent and will always work.



Beyond Blogging and Decentralization

The IndieWeb effort is different from previous efforts/communities:

• Principles over project-centrism. Others assume a monoculture of one project for all. We are developing a plurality of projects.

• Selfdogfood instead of email. Show before tell. Prioritize by scratching your own itches, creating, iterating on your own site.

• Design first, protocols & formats second. Focus on good UX & selfdogfood prototypes to create minimum necessary formats & protocols."
web  online  internet  independent  openweb  via:kissane  ownership  selfdogfood  plurality  indieweb  privacy  data  content 
april 2016 by robertogreco
'I Love My Label': Resisting the Pre-Packaged Sound in Ed-Tech
"I’ve argued elsewhere, drawing on a phrase by cyborg anthropologist Amber Case, that many of the industry-provided educational technologies we use create and reinforce a “templated self,” restricting the ways in which we present ourselves and perform our identities through their very technical architecture. The learning management system is a fine example of this, particularly with its “permissions” that shape who gets to participate and how, who gets to create, review, assess data and content. Algorithmic profiling now will be layered on top of these templated selves in ed-tech – the results, again: the pre-packaged student.

Indie ed-tech, much like the indie music from which it takes its inspiration, seeks to offer an alternative to the algorithms, the labels, the templates, the profiling, the extraction, the exploitation, the control. It’s a big task – an idealistic one, no doubt. But as the book Our Band Could Be Your Life, which chronicles the American indie music scene of the 1980s (and upon which Jim Groom drew for his talk on indie ed-tech last fall), notes, “Black Flag was among the first bands to suggest that if you didn’t like ‘the system,’ you should simply create one of your own.” If we don’t like ‘the system’ of ed-tech, we should create one of our own.

It’s actually not beyond our reach to do so.

We’re already working in pockets doing just that, with various projects to claim and reclaim and wire and rewire the Web so that it’s more just, more open, less exploitative, and counterintuitively perhaps less “personalized.” “The internet is shit today,” Pirate Bay founder Peter Sunde said last year. “It’s broken. It was probably always broken, but it’s worse than ever.” We can certainly say the same for education technology, with its long history of control, measurement, standardization.

We aren’t going to make it better by becoming corporate rockstars. This fundamental brokenness means we can’t really trust those who call for a “Napster moment” for education or those who hail the coming Internet/industrial revolution for schools. Indie means we don’t need millions of dollars, but it does mean we need community. We need a space to be unpredictable, for knowledge to be emergent not algorithmically fed to us. We need intellectual curiosity and serendipity – we need it from scholars and from students. We don’t need intellectual discovery to be trademarked, to a tab that we click on to be fed the latest industry updates, what the powerful, well-funded people think we should know or think we should become."
2016  audreywatters  edupunk  edtech  independent  indie  internet  online  technology  napster  history  serendipity  messiness  curiosity  control  measurement  standardization  walledgardens  privacy  data  schools  education  highered  highereducation  musicindustry  jimgroom  ambercase  algorithms  bigdata  prediction  machinelearning  machinelistening  echonest  siliconvalley  software 
march 2016 by robertogreco
Three Short Futures: On Children, Data and the Internet of Things — Phase Change — Medium
"Her mother had started wearing a fitness tracker long after James was born, back when it didn’t matter. Even now, James would return back to her family home to see the familiar shape of her mother digging up the weeds in the window box, her antiquated tracker abandoned to the counter. ‘I don’t like to wear that thing when I’m gardening. Gets in the way.’ She would whistle to herself to cover the alarm that sounded from inactivity, singing against it like a bird.

As the government regulations started to phase in, James joined millions of young people who secretly cursed their parents for not being more careful with their future. All the technology had been there, many cried, so why didn’t they use it?"



"Earlier that day, a minor server dropout had caused a loss of data in maternity, with hundreds of signals lighting up the nurses’ station as mothers, and fathers, noticed a temporary pause. A child had started crying as its mother pulled away to prod at the controls blindly, smiling at Jack as he fled down the ward.

As usual, mothers had panicked at the potential loss of resolution, of clarity, in their child’s future, as empty/silent/dropout points were routinely questioned when it came to further down the line of a child’s life. Although often minor, to a first-time (and second-time, and third-time) parent it was potentially devastating unless you had the money to make up for it later on. Those who weren’t afforded the luxury of choice tried in vain to gain advantages where possible, cheating where they could, with stories of repurposed sibling data perpetually reaching Jack’s newsfeed. He had been told to watch out for this in his retraining, thinking to himself that sibling rivalry had never been more overanalyzed. His own brother didn’t know what he was talking about."



"‘It’s not called an Xbox anymore Mum.’ Robin ran her hand across her face, blew out all the air from her lungs and continued. Typing in Alice’s unique ID, a code hidden away under the skin of her second-hand feline companion, Ted, to authorise. She sat for a minute before thinking about sandwiches and taking the bins out, listening to the sounds of the house. Across the hall she heard her eldest tease the youngest about the creatures that lived in the woods. ‘If you don’t have your tracker on, they’ll eat you up!’ Alice screamed.

This new change to a more data-dependent education, from primary school onwards, had been great at first. The way her school dealt with her health concerns felt helpful, vital even, but after the third or fourth probing email, Robin had started to feel uncomfortable. She didn’t enjoy receiving reports of her daughter’s meal choices, or how many times she was active during the day, and so still sat and listened in faux-surprise as Alice, and Ted, told her how good the chips had been that day.

Soon it became a matter of school performance and security, with Ofsted regularly marking down schools without a good data hygiene policy. Alongside personal and social care, data care had become compulsory, as reporting a blackout in their records became as important as reporting a school bully. Cleaning your data, telling a responsible adult about any unusual behaviour, glitches, all were analysed and fed back into school reports. A way of fighting not only absence and career ambitions, but perceived radicalisation by one too many politicians.

This particular summer would be spent at a camp that taught kids how to deal with their data better, those that didn’t quite grasp it. Her oldest son, Jo, had attended one a few years back, one of the first in fact, and through games, and hiking, and competitions, they learned how to be better and smarter at collecting their data. A journey to becoming a legible young person. Paid for and regulated by their local government authority, attendance was a matter of being a good citizen; “tomorrow’s child, today.” Character building, the email had said, “An investment in your child’s future.” She couldn’t say no, other parents vocally expressing how irresponsible it would be to opt-out, and Robin would feel guilty. She already did, for so many reasons.

Over lunch, Robin’s mother compared it to a finishing school, but instead of books on the head, it would be a perfectly legible data trail. ‘I know it’s a bit much, but she’ll thank you for it. Look how much it helped Jo.’ Her son had left that summer a wildly unpredictable, spontaneous child, but in the months that followed, became obsessed with making sure that everything was up to spec, in peak condition, and always updated. It had helped him, in some part, he was doing well in school, but he had become hardened somehow, less forgiving of error."
children  data  privacy  iot  internetofthings  2016  nataliekane  speculativefiction  education  edtech 
march 2016 by robertogreco
Why We Post: Discoveries
"Discovery 1: Social media is not making us more individualistic

Discovery 2: For some people social media does not detract from education – it is education.

Discovery 3: There are many different genres of selfie.

Discovery 4: Equality online doesn't mean equality offline.

Discovery 5: It's the people who use social media who create it, not the developers of platforms.

Discovery 6: Public social media is conservative.

Discovery 7: We used to just talk; now we talk photos.

Discovery 8: Social media is not making the world more homogenous

Discovery 9: Social media promotes social commerce not all commerce.

Discovery 10: Social media has created new spaces for groups between the public and private.

Discovery 11: People feel social media is now somewhere they live as well as a means for communication.

Discovery 12: Social media can have a profound impact on gender relations sometimes through using fake accounts.

Discovery 13: Each social media platform only makes sense in relation to alternative platforms and the media.

Discovery 14: Memes have become the moral police of online life.

Discovery 15: We tend to assume social media is a threat to privacy but sometimes it can increase privacy."



[https://www.ucl.ac.uk/why-we-post/about-us

"Project aims

The world seen through social media

Ignore glib claims that we are all becoming more superficial or more virtual because of social media. What is really going on is far more incredible. These are social media, intensely woven into the texture of our relationships. In our study, social media gave us intimate insight into the worlds of Chinese factory workers, young Muslim women on the Syrian/Turkish border, IT professionals in India and many others.

On this website you can gain a first impression of some of our discoveries, browse the films we made while conducting our research and read some stories about our research participants. If you want to find out more you can take our free online course and read our 11 free open access books.

In particular we recommend the book ‘How the World Changed Social Media’. Here you will find summaries of our results as they relate to topics ranging from gender, education, commerce, politics, communication, and many more."]
socialmedia  privacy  homogeneity  gender  individualism  education  memes  equality  online  internet  web  conservatism  communication  media  via:anne 
march 2016 by robertogreco
Frances Stonor Saunders · Where on Earth are you? · LRB 3 March 2016
"The one border we all cross, so often and with such well-rehearsed reflexes that we barely notice it, is the threshold of our own home. We open the front door, we close the front door: it’s the most basic geographical habit, and yet one lifetime is not enough to recount all our comings and goings across this boundary. What threshold rites do you perform before you leave home? Do you appease household deities, or leave a lamp burning in your tabernacle? Do you quickly pat down pockets or bag to check you have the necessary equipment for the journey? Or take a final check in the hall mirror, ‘to prepare a face to meet the faces that you meet’?

You don’t have a slave to guard your door, as the ancients did, so you set the alarm (or you set the dog, cave canem). Keys? Yes, they’re in your hand. You have ‘the power of the keys’, the right of possession that connects you to thousands of years of legal history, to the rights of sovereigns and states, to the gates of salvation and damnation. You open the door, step through, and turn to close it – through its diminishing arc, the details of your life inside recede. ‘On one side, me and my place,’ Georges Perec wrote:
The private, the domestic (a space overfilled with my possessions: my bed, my carpet, my table, my typewriter, my books, my odd copies of the Nouvelle Revue française); on the other side, other people, the world, the public, politics. You can’t simply let yourself slide from one into the other, can’t pass from one to the other, neither in one direction nor in the other. You have to have the password, have to cross the threshold, have to show your credentials, have to communicate … with the world outside.

You lock the door. You’ve crossed the border. You’ve ignored Pascal’s warning that all humanity’s misery derives from not being able to sit alone in a quiet room. When the Savoyard aristocrat Xavier De Maistre was sentenced to six weeks’ house arrest for duelling in 1790, he turned his detention into a grand imaginary voyage. ‘My room is situated on the 45th degree of latitude,’ he records in A Journey around my Room. ‘It stretches from east to west; it forms a long rectangle, 36 paces in perimeter if you hug the wall.’ And so he sets off, charting a course from his desk towards a painting hung in a corner, and from there he continues obliquely towards the door, but is waylaid by his armchair, which he sits in for a while, poking the fire, daydreaming. Then he bestirs himself again, presses north towards his bed, the place where ‘for one half of our life’ we forget ‘the sorrows of the other half’. And so on, ‘from the expedition of the Argonauts to the Assembly of Notables, from the lowest depths of hell to the last fixed star beyond the Milky Way, to the confines of the universe, to the gates of chaos’. ‘This,’ he declares, ‘is the vast terrain which I wander across in every direction at leisure.’

Whether around your room in forty days, or around the world in eighty days, or around the Circle Line in eighty minutes, whether still or still moving, the self is an act of cartography, and every life a study of borders. The moment of conception is a barrier surpassed, birth a boundary crossed. Günter Grass’s Oskar, the mettlesome hero of The Tin Drum, narrates, in real time, his troubling passage through the birth canal and his desire, once delivered into the world, to reverse the process. The room is cold. A moth beats against the naked light bulb. But it’s too late to turn back, the midwife has cut the cord.

Despite this uncommon ability to report live on his own birth, even Oskar’s power of self-agency is subject to the one inalienable rule: there is only one way into this life, and one way out of it. Everything that happens in between – all the thresholds we cross and recross, all the ‘decisions and revisions that a minute will reverse’ – is bordered by this unbiddable truth. What we hope for is safe passage between these two fixed boundaries, to be able to make something of the experience of being alive before we are required to stop being alive. There’s no negotiating birth or death. What we have is the journey.

On the evening of 3 October 2013, a boat carrying more than five hundred Eritreans and Somalis foundered just off the tiny island of Lampedusa. In the darkness, locals mistook their desperate cries for the sound of seagulls. The boat sank within minutes, but survivors were in the water for five hours, some of them clinging to the bodies of their dead companions as floats. Many of the 368 people who drowned never made it off the capsizing boat. Among the 108 people trapped inside the bow was an Eritrean woman, thought to be about twenty years old, who had given birth as she drowned. Her waters had broken in the water. Rescue divers found the dead infant, still attached by the umbilical cord, in her leggings. The longest journey is also the shortest journey.

Already, in the womb, our brains are laying down neural pathways that will determine how we perceive the world and our place in it. Cognitive mapping is the way we mobilise a definition of who we are, and borders are the way we protect this definition. All borders – the lines and symbols on a map, the fretwork of walls and fences on the ground, and the often complex enmeshments by which we organise our lives – are explanations of identity. We construct borders, literally and figuratively, to fortify our sense of who we are; and we cross them in search of who we might become. They are philosophies of space, credibility contests, latitudes of neurosis, signatures to the social contract, soothing containments, scars.

They’re also death zones, portals to the underworld, where explanations of identity are foreclosed. The boat that sank half a mile from Lampedusa had entered Italian territorial waters, crossing the imaginary line drawn in the sea – the impossible line, if you think about it. It had gained the common European border, only to encounter its own vanishing point, the point at which its human cargo simply dropped off the map. Ne plus ultra, nothing lies beyond.

I have no theory, no grand narrative to explain why so many people are clambering into their own hearses before they are actually dead. I don’t understand the mechanisms by which globalisation, with all its hype of mobility and the collapse of distance and terrain, has instead delivered a world of barricades and partition, in which entire populations seem to be living – and dying – in a different history from mine. All I know is that a woman who believed in the future drowned while giving birth, and we have no idea who she was. And it’s this, her lack of known identity, which places us, who are fat with it, in direct if hopelessly unequal relationship to her.

Everyone reading this has a verified self, an identity, formed through and confirmed by identification, that is attested to be ‘true’. You can’t function in the world without it: you can’t open a bank account, get a credit card or national insurance number, or a driving licence, or access to your email and social media accounts, or a passport or visa, or points on your reward card. You can’t have your tonsils removed without it. You can’t die without it. Whether you’re conscious of it or not, whether you like it or not, the verified self is the governing calculus of your life, the spectrum on which you, as an individual, are plotted from cradle to grave. As Pierre-Joseph Proudhon explained, you must be ‘noted, registered, enumerated, accounted for, stamped, measured, classified, audited, patented, licensed, authorised, endorsed, reprimanded, prevented, reformed, rectified and corrected, in every operation, every transaction, every movement.’"



"All migrants know that the reply to the question ‘Who on earth are you?’ is another question: ‘Where on earth are you?’ And so they want what we’ve got, a verified self that will transport them to our side of history. Thus, the migrant identity becomes a burden to be unloaded. Migrants often make the journey without identity documents, and I mentioned one reason for this, namely that the attempt to obtain them in their country of origin can be very dangerous. Others lose them at the outset when they’re robbed by police or border guards, or by people traffickers en route. Many destroy them deliberately because they fear, not without reason, that our system of verification will be a mechanism for sending them back. In Algeria, they’re called harraga, Arabic for ‘those who burn’. And they don’t only burn their documents: many burn their fingertips on hobs or with lighters or acid, or mutilate them with razors, to avoid biometric capture and the prospect of expulsion. These are the weapons of the weak.

The boat carrying more than five hundred Eritreans and Somalis sank off Lampedusa in October 2013, barely three months after the pope’s visit. Whether they had lost their identity papers, or destroyed them, when facing death the people on board wanted to be known. As the boat listed and took on water, and with most of the women and children stuck below deck, those who knew they wouldn’t make it called out their names and the names of their villages, so that survivors might carry ashore news of their deaths. There isn’t really any other way: there’s no formal identification procedure for those who drown. In Lampedusa’s cemetery, the many plaques that read ‘unidentified migrant’ merely tell us that people have been dying in the Mediterranean for at least 25 years – more than twenty thousand of them, according to current estimates.

Everyone must be counted, but only if they count. Dead migrants don’t count. The woman who drowned while giving birth was not a biometric subject, she was a biodegradable one. I don’t want to reconstitute her as a sentimental artefact, an object to be smuggled into the already crowded room of my bad conscience. But … [more]
borders  identity  cartography  francesstonorsaunders  georgesperec  lampedusa  güntergrass  refugees  identification  personhood  geopolitics  legibility  mobility  passports  pierre-josephproudhon  globalization  thresholds  homes  milankundera  socialmedia  digitalexhaust  rfid  data  privacy  smartphones  verification  biometrics  biometricdata  migration  immigration  popefrancis  facialidentification  visas  paulfussell  stefanzweig  xenophobia  naomimitchison  nobility  surveillance  intentionality  gilbertharding  whauden  lronhubbard  paulekman 
march 2016 by robertogreco
Sandstorm
"Sandstorm is an open source operating system for personal and private clouds."

"What can I do with it?

Create
Create Google-Docs-like spreadsheets, documents, forms, etc. with EtherCalc, Etherpad, Sandforms, Draw.io, and more.

Collaborate
Share documents, diagrams, and other files with your colleagues and friends, and collaborate in real-time.

Communicate
Sync up with your colleagues securely with great chat applications like Rocket.Chat.

How is it different?

Usability | Designed for Humans
Sandstorm is the easiest way there has ever been to run a server.

Sandstorm requires no technical expertise to use.

Installing apps on Sandstorm is as easy as installing apps on your phone. No need to read documentation and edit config files – and no need to wait for IT to do it for you.

Sandstorm emphasizes users over apps.

You log into Sandstorm, not into each app separately.

All of your data across all apps (documents, chat rooms, whatever) can be found and searched in one place, rather than logging into each one separately.

You can share and collaborate on anything – or keep it private.

Security | Secure by Default
Sandstorm is ridiculously secure.

The biggest challenge to securing any server is buggy apps. Some app developers are good at security, but some are not, and it's usually impossible to know who is who without doing a costly security audit.

Sandstorm, therefore, takes a different approach: break data down into "grains" (for example, individual documents, or chat rooms) and isolate each one in a secure sandbox from which it cannot talk to the world without your express permission. With this approach, no matter how buggy your document editor might be, each document can only possibly be accessed by the people you shared it with. No matter how buggy your chat room, only the people you permitted will ever see the logs.

Skeptical? Check out our security docs and list of security non-events to learn more.

Because Sandstorm manages access control on every document, it can tell you who has accessed your data and allow you to revoke that access at any time. Prove that your sensitive data is secure by reviewing all the systems it is connected to."
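[The per-grain access-control model the Sandstorm copy describes can be sketched in a few lines of Python. All names here are hypothetical and illustrative – this is not Sandstorm's actual API, just the shape of the idea: each grain is isolated, access exists only by explicit share, and every access is logged and revocable.

```python
class Grain:
    """One isolated unit of data (a document, a chat room, ...)."""

    def __init__(self, owner, content):
        self.owner = owner
        self.content = content
        self._shared_with = set()   # users the owner has explicitly granted access
        self._access_log = []       # record of who has actually read the grain

    def share(self, user):
        # Access exists only through an explicit share by the owner.
        self._shared_with.add(user)

    def revoke(self, user):
        # Access can be withdrawn at any time, per grain.
        self._shared_with.discard(user)

    def read(self, user):
        # However buggy the app inside the sandbox, only the owner and
        # explicitly-shared users can ever reach the content.
        if user != self.owner and user not in self._shared_with:
            raise PermissionError(f"{user} has no access to this grain")
        self._access_log.append(user)
        return self.content

    def audit(self):
        # Review every access, as the quoted text describes.
        return list(self._access_log)
```

So a leaked document editor bug can at worst expose one grain to the people it was already shared with; `audit()` shows who touched it, and `revoke()` closes the door.]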
cloud  opensource  privacy  security  servers  sandstorm  onlinetoolkit  ethercalc  etherpad  sandforms  draw.io 
february 2016 by robertogreco
Toward Humane Tech — Medium
"If you make technology, or work in the tech industry, I have good news for you: we won."

"We’re not nerds, or outsiders, or underdogs anymore. What we do, and what we make, shapes culture and society, deeply influencing everything from artistic expression to policy and regulation to the way we see our friends, family and selves.

But we haven’t taken responsibility for ourselves in a manner that befits the wealthiest and most powerful industry that’s ever been created. We fancy ourselves outlaws while we shape laws, and consider ourselves disruptive without sufficient consideration for the people and institutions we disrupt. We have to do better, and we will.

While thinking about this reality, and these problems, I’ve struggled with all the different dimensions of the challenge. We could address our profound issues around inclusion and diversity but still be wildly irresponsible about our environmental impact. We could start to respect legal processes and the need for thoughtful engagement with policy makers but still be cavalier about the privacy and security of our users’ data. We could continue to invest in design and user experience but remain thoughtless about the emotional and psychological impacts of the experiences we create. We could continue to bemoan the shortcomings of legacy industries while exacerbating issues like income inequality or social inequity.

I’m not hopeless about it; in fact, if there’s one unifying value that connects everyone in tech, no matter how critical or complacent they may be, it’s an underlying vein of optimism. I want to tap into that optimism, but direct it toward making sure we’re actually making things better, and not just for ourselves.

So I’m going to start to keep some notes, about the functional, pragmatic things we can do to make sure our technologies, and the community that creates those technologies, become far more humane. The conversation about the tech industry has changed profoundly in the past few years. It is no longer radical to raise issues of ethics or civics when evaluating a new product or company. But that’s the simplest starting point, a basic acknowledgment that what we do matters and actually affects people.

We have to think about inclusion, acceptance and diversity, to start. We need to think deeply about our language and communications, and the way we express what technology does. We need to question the mythologies we build around concepts like “founders” or “inventions” or even “startups”. We need to challenge our definitions of success and progress, and to stop considering our work in solely commercial terms. We need to radically improve our systems of compensation, to be responsible about credit and attribution, and to be generous and fair with reward and remuneration. We need to consider the impact our work has on the planet. We need to consider the impact our work has on civic and academic institutions, on artistic expression, on culture.

I’m optimistic, but I think this is going to continue to require a lot of hard work over a long period of time. My first step is to start taking notes about the goal we’re working toward. Let’s get to work."
anildash  2016  technology  siliconvalley  inclusion  inclusivity  diversity  acceptance  gender  language  communication  compensation  responsibility  attribution  environment  privacy  security  inequality  incomeinequality  law  legal  disruption  culture  society 
january 2016 by robertogreco
Smart Pipe | Infomercials | Adult Swim - YouTube
"Everything in our lives is connected to the internet, so why not our toilets? Take a tour of Smart Pipe, the hot new tech startup that turns your waste into valuable information and fun social connectivity."
adultswim  designfiction  2014  data  bigdata  privacy  smartcities  internetofthings  iot  information  connectivity 
december 2015 by robertogreco
Speak Up & Stay Safe(r): | A Guide to Protecting Yourself From Online Harassment
"NOTE: This guide contains things we’ve learned about how to keep yourself safe from individuals, loosely organized groups & cybermobs online. If you’re concerned with attacks from governments, major corporations, or other massively organized and/or resourced institutions, we recommend this great guide.

This guide is for anyone who fears they might be targeted, or who is already under attack, for speaking their mind online, but is especially designed for women, people of color, trans and genderqueer people, and everyone else whose existing oppressions are made worse by digital violence. It details best security practices for social media, email, online gaming, website platforms, and ensuring privacy of personal information online, as well as the documentation and reporting of harassment, and caring for yourself emotionally during an online attack. You don’t need any specialized knowledge to use this guide – just basic computer and internet skills.

The authors of the guide have all been targets of cyber attacks ourselves, so we’ve written the guide we wish had already existed when the attacks on us began. We’re all based in the US, but we’ve done our best to make it useful no matter where you live.

We wish we didn’t have to write this. Going through even some of these steps to protect your online safety will cost you real time and sometimes money. It’s a tax on women, people of color, queer and trans people and other oppressed groups for daring to express our opinions in public.

None of this is fair. It should not be our meticulous labor and precious funds that keep us safe, it should be our basic humanity. But that has proven heartbreakingly, maddeningly insufficient more times than we can count. So below are some of the things that we’ve learned that can help, even though we shouldn’t have to do any of them. While we fight for a just world, this is the one we’re living in, and we want to share what we know.

We also want to acknowledge that people with more financial and leisure-based privilege will have better access to implementing comprehensive strategies — a structural unfairness that highlights how unjust online harassment is. It’s also true that none of these are foolproof — you could employ all of these strategies and still be targeted.

And just to be crystal clear: if someone attacks, harasses or threatens you online, it’s not your fault, even if you haven’t previously taken any safety precautions.

It’s never your fault. Never. Ever."
harassment  safety  security  privacy  internet  online 
december 2015 by robertogreco
Goodbye privacy, hello Alexa: here's to Amazon echo, the home robot who hears it all | Technology | The Guardian
"We had Rory Carroll invite ‘Alexa’ aka the Echo into his home. There was helpful cooking assistance, endless facts and figures, an amusing misunderstanding – and concerns over what exactly Amazon does with all that interaction data"



"People who think about technology for a living have a wide range of views on Alexa. “With Amazon Echo, it was love at first sight,” wrote Re/code’s Joe Brown. “The allure of Alexa is her companionship. She’s like a genie in a sci-fi-looking bottle – one not quite at the peak of her powers, and with a tiny bit of an attitude.”

In an interview Ronald Arkin, a robot ethicist and director of the Mobile Robot Laboratory at the Georgia Institute of Technology, was more phlegmatic. Technology advances bring benefits and drawbacks – you can’t stop the tide but can choose whether to stay out, paddle or plunge in, he said.

“Amazon and Google have all sorts of data about our preferences. You don’t have to use their products. If you do, you’re saying OK, I’m willing to allow this potential violation of my privacy. No one is forcing this on anyone. It’s not mandated à la 1984.”

It is up to us if artificial intelligence technology makes us smarter or dumber, more industrious or lazy, says Arkin. “It is changing us, the way we operate. The question is, how much control do you want to relinquish?”

The Echo, says Arkin, is a well-engineered advance in voice recognition. “What’s interesting is it’s another step into turning our homes into robots.” The prospect does not alarm him. “You see this in sci-fi: Star Trek, Knight Rider. It’s the natural progression.”

Ellen Ullman, a writer and computer programmer in San Francisco, sounded much more worried. The more the internet penetrates your home, car or body, the greater the danger, she said. “The boundary between the outside world and the self is penetrated. And the boundary between your home and the outside world is penetrated.”

Ullman thinks people are mad to use email supplied by big corporations – “on the internet there is no place to hide and everything can be hacked” – and even madder to embrace something like Alexa.

Such devices exist to supply data to corporate masters: “It’s going to give you services, and whatever services you get will become data. It’s sucked up. It’s a huge new profession, data science. Machine learning. It seems benign. But if you add it all up to what they know about you ... they know what you eat.”

Ullman, the author of Close to the Machine: Technophilia and Its Discontents, is no luddite. She writes code. But, she warned, every time we become attached to a device our sense of our lives is changed. “With every advance you have to look over your shoulder and know what you’re giving up – look over your shoulder and look at what falls away.”

Ullman’s warning sounds prescient. Yet I’m not rushing to banish Alexa. She still perches in my living room, perhaps counting down the days until her Guardian media embed ends and she can return to Seattle.

She turns my musings and requests into data and uploads them to the cloud, possibly into the maw of Amazon algorithms. But she’s useful. And I am weak.

I bow to the god of convenience. A day will come when I’m alone in the kitchen, cooking with sticky fingers, and I’ll need reminding how many teaspoons are in a tablespoon."
amazon  alexa  echo  privacy  data  technology  cortana  microsoft  2015  amazonecho  ethics  surveillance  technophilia  internet  ellenullman 
december 2015 by robertogreco
Cops are asking Ancestry.com and 23andMe for their customers’ DNA | Fusion
"When companies like Ancestry.com and 23andMe first invited people to send in their DNA for genealogy tracing and medical diagnostic tests, privacy advocates warned about the creation of giant genetic databases that might one day be used against participants by law enforcement. DNA, after all, can be a key to solving crimes. It “has serious information about you and your family,” genetic privacy advocate Jeremy Gruber told me back in 2010 when such services were just getting popular.

Now, five years later, when 23andMe and Ancestry both have over a million customers, those warnings are looking prescient. “Your relative’s DNA could turn you into a suspect,” warns Wired, writing about a case from earlier this year, in which New Orleans filmmaker Michael Usry became a suspect in an unsolved murder case after cops did a familial genetic search using semen collected in 1996. The cops searched an Ancestry.com database and got a familial match to a saliva sample Usry’s father had given years earlier. Usry was ultimately determined to be innocent and the Electronic Frontier Foundation called it a “wild goose chase” that demonstrated “the very real threats to privacy and civil liberties posed by law enforcement access to private genetic databases.”

The FBI maintains a national genetic database with samples from convicts and arrestees, but this was the most public example of cops turning to private genetic databases to find a suspect. It's not the only time it's happened, though, and it means that people who submitted genetic samples for reasons of health, curiosity, or to advance science could now end up in a genetic line-up of criminal suspects.

Both Ancestry.com and 23andMe stipulate in their privacy policies that they will turn information over to law enforcement if served with a court order. 23andMe says it’s received a couple of requests from both state law enforcement and the FBI, but that it has “successfully resisted them.”

23andMe’s first privacy officer Kate Black, who joined the company in February, says 23andMe plans to launch a transparency report, like those published by Google, Facebook and Twitter, within the next month or so. The report, she says, will reveal how many government requests for information the company has received, and presumably, how many it complies with. (Update: The company released the report a week later.)

“In the event we are required by law to make a disclosure, we will notify the affected customer through the contact information provided to us, unless doing so would violate the law or a court order,” said Black by email.

Ancestry.com would not say specifically how many requests it’s gotten from law enforcement. It wanted to clarify that in the Usry case, the particular database searched was a publicly available one that Ancestry has since taken offline with a message about the site being “used for purposes other than that which it was intended.” Police came to Ancestry.com with a warrant to get the name that matched the DNA.

“On occasion when required by law to do so, and in this instance we were, we have cooperated with law enforcement and the courts to provide only the specific information requested but we don’t comment on the specifics of cases,” said a spokesperson.

As NYU law professor Erin Murphy told the New Orleans Advocate regarding the Usry case, gathering DNA information is “a series of totally reasonable steps by law enforcement.” If you’re a cop trying to solve a crime, and you have DNA at your disposal, you’re going to want to use it to further your investigation. But the fact that your signing up for 23andMe or Ancestry.com means that you and all of your current and future family members could become genetic criminal suspects is not something most users probably have in mind when trying to find out where their ancestors came from.

“It has this really Orwellian state feeling to it,” Murphy said to the Advocate.

If the idea of investigators poking through your DNA freaks you out, both Ancestry.com and 23andMe have options to delete your information with the sites. 23andMe says it will delete information within 30 days upon request."
23andme  dna  police  privacy  lawenforcement  ancestry.com  kashmirhill 
november 2015 by robertogreco
Doing Something About the ‘Impossible Problem’ of Abuse in Online Games | Re/code
"It’s often easier to turn a blind eye than confront the ugliness of negative online behaviors. However, online society has become an integral part of life, from conversations via Snapchat to networking on LinkedIn. As we spend more and more of our time online, we need to acknowledge that online harassment and toxicity is not an impossible problem, and that it is a problem worth spending time on.

For the past three years, a team of game designers and cross-discipline scientists at Riot Games have been doing just that, combining efforts to study online behavior in its game League of Legends. It might surprise some people that a video game could be shedding light on what’s seen as a hopeless cause, but with League’s highly competitive gameplay and more than 67 million players around the world giving it their all in-game, the team has uncovered a wealth of interactions that have led to remarkable insights.

Our team found that if you classified online citizens from negative to positive, the vast majority of negative behavior (which ranges from trash talk to non-extreme but still generally offensive language) did not originate from the persistently negative online citizens; in fact, 87 percent of online toxicity came from the neutral and positive citizens just having a bad day here or there.

Given this finding, the team realized that pairing negative players against each other only creates a downward spiral of escalated negative behaviors. The answer had to be community-wide reform of cultural norms. We had to change how people thought about online society and change their expectations of what was acceptable.

But that led to a big question: How do you introduce structure and governance into a society that didn’t have one before? The answer wasn’t as simple as abolishing anonymity. Privacy has become increasingly important online as data becomes more widely available, and numerous studies have shown that anonymity is not the strongest cause of online toxicity. While anonymity can be a catalyst for online toxicity, we focused on the more powerful factor of whether or not there are consequences (both negative and positive) for behaviors.

To deliver meaningful consequences, we had to focus on the speed and clarity of feedback. At Riot, we built a system called the “Tribunal,” which automatically created “case files” of behaviors that players reported as unacceptable in the community. The system allowed players to review game data and chat logs and vote on whether the behaviors were okay or not. (Later this year, the system will also create positive “case files” so players can vote on the full spectrum of behaviors). These cases were public, so players could see and discuss the behaviors, and the results were inspiring. The vast majority of online citizens were against hate speech of all kinds; in fact, in North America, homophobic slurs were the most rejected phrases in the English language.

It turns out that people just need a voice, a way to enact change.

100 million Tribunal votes later, we turned machine learning loose on the dataset to see if we could classify words and phrases in 15 different languages from negative to positive. Just classifying words was easy, but what about more advanced linguistics such as whether something was sarcastic or passive-aggressive? What about more positive concepts, like phrases that supported conflict resolution?

To tackle the more challenging problems, we wanted to collaborate with world-class labs. We offered the chance to work on these datasets and solve these problems with us. Scientists leapt at the chance to make a difference and the breakthroughs followed. We began to better understand collaboration between strangers, how language evolves over time and the relationship between age and toxicity; surprisingly, there was no link between age and toxicity in online societies.

By opening our doors to the academic community, we’ve started collaborations that are redefining how research is conducted in the future, and we hope other companies follow this lead.

In League of Legends, we’re now able to deliver feedback to players in near-real-time. Every single time a player “reports” another player in the game for a negative act, it informs the machine-learning system. Every time a player “honors” another player in the game for a positive act, it also trains the machine-learning system. As soon as we detect these behaviors in-game, we can deliver the appropriate consequence, whether it is a customized penalty or an incentive. Critically, players in the society are driving the decisions behind the machine-learning feedback system — their votes determine what is considered acceptable behavior in this online society.

As a result of these governance systems changing online cultural norms, incidences of homophobia, sexism and racism in League of Legends have fallen to a combined 2 percent of all games. Verbal abuse has dropped by more than 40 percent, and 91.6 percent of negative players change their act and never commit another offense after just one reported penalty.

These results have inspired us, because we realize that this isn’t an impossible problem after all.

In the office, I still have a copy of a letter a boy wrote me after receiving in-game feedback from his peers about his usage of racial slurs: “Dr. Lyte, this is the first time someone told me that you should not say the ‘N’ word online. I am sorry and I will never say it again.” I remember forwarding this letter to the entire team, because this was the moment we realized that we had started a journey that would end beyond games.

Is it our responsibility to make online society a better place? Of course it is, for all of us. It is our society. As we collaborate with those outside of games, we are realizing that the concepts we’re using in games can apply in any online context. We are at a pivotal point in the timeline of online platforms and societies, and it is time to make a difference."
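
The report/honor feedback loop described in the excerpt can be sketched roughly as a crowd-labeled text classifier: community votes supply the labels, and the model learns which phrases the community rejects. This is a minimal, hypothetical sketch (a tiny Naive Bayes over invented chat lines), not Riot's actual system.

```python
# Hypothetical sketch of a crowd-vote feedback loop: player "reports" label
# text as toxic, "honors" label it as ok, and a tiny Naive Bayes model
# accumulates those labels. All data and class names here are invented.
from collections import Counter
import math

class PhraseClassifier:
    def __init__(self):
        self.counts = {"ok": Counter(), "toxic": Counter()}
        self.totals = {"ok": 0, "toxic": 0}

    def train(self, text, label):
        # Each community vote ("report" -> toxic, "honor" -> ok) updates counts.
        for word in text.lower().split():
            self.counts[label][word] += 1
        self.totals[label] += 1

    def score(self, text):
        # Log-odds that the text is toxic, with add-one smoothing;
        # positive means "toxic", negative means "ok".
        logodds = math.log((self.totals["toxic"] + 1) / (self.totals["ok"] + 1))
        for word in text.lower().split():
            p_t = (self.counts["toxic"][word] + 1) / (sum(self.counts["toxic"].values()) + 2)
            p_o = (self.counts["ok"][word] + 1) / (sum(self.counts["ok"].values()) + 2)
            logodds += math.log(p_t / p_o)
        return logodds

clf = PhraseClassifier()
for text, label in [
    ("gg well played", "ok"), ("nice save", "ok"),
    ("uninstall you are trash", "toxic"), ("you are so bad", "toxic"),
]:
    clf.train(text, label)

print(clf.score("well played") < 0)    # True: judged ok
print(clf.score("you are trash") > 0)  # True: judged toxic
```

The key property the article describes survives even in this toy: the players' votes, not the designers, determine what the model treats as unacceptable.
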
riotgames  edg  srg  games  gaming  abuse  online  web  internet  racism  sexism  homophobia  2015  leagueoflegends  privacy  culture  jeffreylin  socialsystems  behavior  community  language  harassment  communitymanagement  verbalabuse 
november 2015 by robertogreco
Haunted By Data
[https://www.youtube.com/watch?v=GAXLHM-1Psk
https://www.oreilly.com/ideas/haunted-by-data ]

"You're thinking, okay Maciej, your twelve minutes of sophistry and labored analogies have convinced me that my entire professional life is a lie. What should I do about it?

I hope to make you believe data collection is a trade-off. It hurts the people whose data you collect, but it also hurts your ability to think clearly. Make sure that it's worth it!

I'm not claiming that the sponsors of this conference are selling you a bill of goods. I'm just heavily implying it.

Here's what I want you to do specifically:

Don't collect it!

If you can get away with it, just don't collect it! Just like you don't worry about getting mugged if you don't have any money, your problems with data disappear if you stop collecting it.

Switch from the hoarder's mentality of 'keep everything in case it comes in handy' to a minimalist approach of collecting only what you need.

Your marketing team will love you. They can go tell your users you care about privacy!

If you have to collect it, don't store it!

Instead of stocks and data mining, think in terms of sampling and flows. "Sampling and flows" even sounds cooler. It sounds like hip-hop!

You can get a lot of mileage out of ephemeral data. There's an added benefit that people will be willing to share things with you they wouldn't otherwise share, as long as they can believe you won't store it. All kinds of interesting applications come into play.
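
The "sampling and flows" idea above can be made concrete with reservoir sampling: keep a fixed-size uniform sample of an event stream and never retain the raw log. This is an illustrative sketch of the general technique, not anything from the talk itself.

```python
# "Sampling and flows" instead of stocks: keep a fixed-size random sample
# of an event stream rather than storing every event (reservoir sampling).
# Event names and counts here are invented for illustration.
import random

def reservoir_sample(stream, k, rng=None):
    """Return k uniformly-sampled items from a stream of unknown length,
    holding at most k items in memory at any time."""
    rng = rng or random.Random()
    sample = []
    for i, event in enumerate(stream):
        if i < k:
            sample.append(event)
        else:
            # Item i replaces a reservoir slot with probability k/(i+1),
            # which keeps the sample uniform over everything seen so far.
            j = rng.randint(0, i)
            if j < k:
                sample[j] = event
    return sample

# Aggregate statistics survive; the raw event log is never retained.
events = (f"pageview:{n}" for n in range(10_000))
print(len(reservoir_sample(events, k=100)))  # 100
```
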

If you have to store it, don't keep it!

Certainly don't keep it forever. Don't sell it to Acxiom! Don't put it in Amazon glacier and forget it.

I believe there should be a law that limits behavioral data collection to 90 days, not because I want to ruin Christmas for your children, but because I think it will give us all better data while clawing back some semblance of privacy.
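
A 90-day retention limit like the one proposed above is mechanically simple: drop behavioral records older than the cutoff. A toy sketch, with invented record shapes:

```python
# Toy illustration of a 90-day behavioral-data retention limit: a pruning
# pass that drops records older than the cutoff. Record format is invented.
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)

def prune(records, now=None):
    """Keep only records within the retention window.
    Each record is a (timestamp, payload) pair."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - RETENTION
    return [(ts, payload) for ts, payload in records if ts >= cutoff]

now = datetime(2015, 10, 1, tzinfo=timezone.utc)
records = [
    (now - timedelta(days=5), "clicked ad"),
    (now - timedelta(days=120), "searched for shoes"),
]
print(prune(records, now=now))  # only the 5-day-old record survives
```
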

Finally, don't be surprised. The current model of total surveillance and permanent storage is not tenable.

If we keep it up, we'll have our own version of Three Mile Island, some widely-publicized failure that galvanizes popular opinion against the technology.

At that point people who are angry, mistrustful, and may not understand a thing about computers will regulate your industry into the ground. You'll be left like those poor saps who work in the nuclear plants, who have to fill out a form in triplicate anytime they want to sharpen a pencil.

You don't want that. Even I don't want that.

We can have that radiant future but it will require self-control, circumspection, and much more concern for safety than we've been willing to show.

It's time for us all to take a deep breath and pull off those radium underpants.

Thank you very much for your time, and please enjoy the rest of your big data conference."
maciejceglowski  data  privacy  surveillance  bigdata  2015  storage  radioactivity  datacollection  maciejcegłowski 
october 2015 by robertogreco
A Guide to Grounding Helicopter Parents
"Considering things like school websites, where parents can track grades, are schools actually enabling helicopter parents – and hurting students’ chances to be independent?

Because Lythcott-Haims's book inspired Anna's question, I thought she'd be a great person to field it. Some of her responses have been edited for length.

Lythcott-Haims: School leaders and teachers are in a really tough spot these days, particularly in communities where parents are used to doing a lot of hand-holding for their children and exerting influence. Still, I agree with Anna that yes, in many ways they’ve become enablers of overparenting behaviors and are inhibiting opportunities for kids to develop independence – such as the example of the principal setting the independence bar for his middle-schoolers absurdly low.

Middle-schoolers can handle things far more challenging than packing their own backpack. Take registration – reviewing the forms, signing them and turning them in. Middle-schoolers can handle that, and they probably should, particularly if we want them to be capable of handling it when they’re in high school, or college.

When my eldest began middle school, I caved to the overparenting mindset by filling out the forms and going to registration with him, which meant standing in long lines with hundreds of other parents doing the same. (The lines were so long, in part, because an excessive number of people were there instead of just the new middle-schoolers). When my second child was starting middle school two years later, I’d learned my lesson. She filled out the forms, asked me and her dad for signatures as needed and went off to registration by herself. The point is, life is full of bureaucracy and our kids have to learn to navigate it.

In terms of counteracting overparenting instead of enabling it, I’ve seen progress at the level of the individual teacher (who, for example, might announce at Back to School Night that parental involvement in homework is absolutely not allowed and a child’s grade will be docked a few points if there’s evidence of any such thing). But in my view, the bolder step would be adopting a school-wide and even district-wide philosophy that proclaims that part of getting an education is taking responsibility and being accountable for one’s own actions, and that as a result, parents doing things kids should be able to do for themselves is highly discouraged and might even be penalized (e.g. completing homework and projects, bringing homework and lunch to school, talking with teachers about the course material and concerns over grades).

MK: How do “parent portals” or school websites factor into over-parenting?

Lythcott-Haims: Parents obsessively checking the school website/portal isn’t good for the teacher, child or parent. Yes, the portal can deliver information quickly when we need it. The question we must ask ourselves as parents is, how frequently do we really need that information? Like the ability to track our children via GPS at all moments, just because the technology is there doesn’t mean we should use it all the time. …

Obsessively checking up on our kids’ performance means we then end up talking with our kids about their academic performance on a weekly or even daily basis – which sends a rather insidious message that their worth and value to us is based on grades – instead of what they’re learning and enjoying about school. Instead of building a relationship of trust with our kids where we’d expect them to inform us when they are struggling or need help, it erodes trust, raises anxiety and makes our kids feel that every single homework assignment or quiz is a “make or break” moment for their entire future.

As for me, I refuse to look at the online portal. I’m fine with a quarterly report. I expect my kids to update me as needed, and if they don’t, and it turns out there’s a greater consequence such as failing a class, I accept that that’s a part of childhood and something we’ll just have to work through when that time comes. To me, the developmental benefits to my kids that come from having greater autonomy, privacy and personal responsibility are more important than whatever short-term “win” I could achieve by trying to fix every micro-moment of imperfection."
parenting  helicopterparents  helicopterparenting  2015  mariokoran  children  schools  education  autonomy  independence  julielythcott-haims  responsibility  privacy 
september 2015 by robertogreco
The Terror of the Archive | Hazlitt
The digitally inflected individual is often not quite an individual, not quite alone. Our past selves seem to be suspended around us like ghostly, shimmering holograms, versions of who we were lingering like memories made manifest in digital, diaphanous bodies. For me, many of those past selves are people I would like to put behind me—that same person who idly signed up for Ashley Madison is someone who hurt others by being careless and self-involved. Now, over a decade on, I’m left wondering to what extent that avatar of my past still stands for or defines me—and about the statute of limitations on past wrongs. Though we’ve always been an accumulation of our past acts, now that digital can splay out our many, often contradictory selves in such an obvious fashion, judging who we are has become more fraught and complicated than ever. How, I wonder, do we ethically evaluate ourselves when the conflation of past and present has made things so murky?

*

Sometimes, I aimlessly trawl through old and present email accounts, and it turns out I am often inadvertently mining for awfulness. In one instance—in a Hotmail account I named after my love for The Simpsons—I find myself angrily and thoughtlessly shoving off a woman’s renewed affection because I am, I tell her, “sick of this.” I reassure myself that I am not that person anymore—that I now have the awareness and the humility to not react that way. Most days, looking at how I’ve grown since then, I almost believe this is true.

Yet, to be human is to constantly make mistakes and, as a result, we often hurt others, if not through our acts then certainly our inaction. There is for each of us, if we are honest, a steady stream of things we could have done differently or better: could have stopped to offer a hand; could have asked why that person on the subway was crying; could have been kinder, better, could have taken that leap. But, we say, we are only who we are.

We joke about the horror of having our Google searches publicized, or our Twitter DMs revealed, but in truth, we know the mere existence of such a digital database makes it likely that something will emerge from the murky space in which digital functions as a canvas for our fantasies or guilt.

That is how we justify ourselves. Our sense of who we are is subject to a kind of recency bias, and a confirmation bias, too—a selection of memories from the recent past that conform to the fantasy of the self as we wish it to be. Yet the slow accretion of selective acts that forms our self-image is also largely an illusion—a convenient curation of happenings that flatters our ego, our desire to believe we are slowly getting better. As it turns out, grace and forgiveness aren’t the purview of some supernatural being, but temporality—the simple erasure of thought and feeling that comes from the forward passage of time."



"The line between evasiveness and forgiveness, cowardice and grace, is thin, often difficult to locate, but absolutely vital. It seems, though, that our ethical structures may slowly be slipping out of step with our subjectivities. If we have abandoned the clean but totalitarian simplicity of Kant’s categorical imperative, instead embracing that postmodern cliché of a fluid morality, we still cling to the idea that the self being morally judged is a singular ethical entity, either good or bad. It’s common on social media, for example, for someone to be dismissed permanently for one transgression—some comedian or actor who is good at race but bad at gender (or vice versa) to be moved from the accepted pile to the trash heap. If our concept of morality is fluid, our idea of moral judgment is not similarly so.

That notion of self assumes morality is accretive and cumulative: that we can get better over time, but nevertheless remain a sum of the things we’ve done. Obviously, for the Bill Cosbys or Jian Ghomeshis or Jared Fogles of the world, this is fine. In those cases, it is the repetition of heinous, predatory behaviour over time that makes forgiveness almost impossible—the fact that there is no distance between past and present is precisely the point. For most of us, though, that simple idea of identity assumes that selves are singular, totalized things, coherent entities with neat boundaries and linear histories that arrived here in the present as complete. Even if that ever were true, what digitality helps lay bare is that who we are is actually a multiplicity, a conglomeration of acts, often contradictory, that slips backward and forward and sideways through time incessantly."



"Is the difficulty of digitality for our ethics, then, not the multiplicity of the person judged, but our Janus-faced relation to the icebergs of our psyches—the fact that our various avatars are actually interfaces for our subconscious, exploratory mechanisms for what we cannot admit to others or ourselves?

Freud said that we endlessly repeat past hurts, forever re-enacting the same patterns in a futile attempt to patch the un-healable wound. This, more than anything, is the terror of the personal, digital archive: not that it reveals some awful act from the past, some old self that no longer stands for us, but that it reminds us that who we are is in fact a repetition, a cycle, a circular relation of multiple selves to multiple injuries. It’s the self as a bundle of trauma, forever acting out the same tropes in the hopes that we might one day change.

What I would like to tell you is that I am a better man now than when, years ago, I tried my best to hide from the world and myself. In many ways that is true. Yet, all those years ago, what dragged me out of my depressive spiral was meeting someone—a beautiful, kind, warm person with whom, a decade later, I would repeat similar mistakes. I was callous again: took her for granted, pushed her away when I wanted to, and couldn’t take responsibility for either my or her emotions. Now, when a piece of the past pushes its way through the ether to remind me of who I was or am, I can try to push it down—but in a quiet moment, I might be struck by the terror that some darker, more cowardly part of me is still too close for comfort, still there inside me. The hologram of my past self, its face a distorted, shadowy reflection of me with large, dark eyes, is my mirror, my muse. And any judgment of my character depends not on whether I, in some simple sense, am still that person, but whether I—whether we, multiple and overlapped—can reckon with, can meet and return the gaze of the ghosts of our past."
navneetalang  archives  internet  memory  grace  forgiveness  circulation  change  past  present  mistakes  ashleymadison  twitter  email  privacy  facebook  socialmedia  dropbox  google  secrets  instagram  self  ethics  morality  judgement  identity 
september 2015 by robertogreco
Ashley Madison leak exposes a prurient and uncaring society - Eureka Street
"The blithe disregard for such questions suggests the kiss up, kick down culture prevailing in the media and seemingly internalised throughout society as a whole.

We're increasingly acclimatised to the wealthy and the powerful facing no sanctions whatsoever for their wrongdoing, even as the poor are ground into the dirt for minor transgressions."
jeffsparrow  via:anne  ashleymadison  humiliation  transgressions  morality  comeuppance  masochism  punishment  society  cruelty  wealth  power  inequality  surveillance  privacy 
august 2015 by robertogreco
pinboard private tags //5880.me (–⅃-)
"Holy smokes!

I ... just learned about private tags on Pinboard.

If you start a tag with a dot, only you will see it

As someone who works on client projects, I am so thrilled to learn there's a way to tag what I learn on a project with that project name without making the link itself private. Stoked."
pinboard  tags  tagging  privacy  maxfenton  2015 
august 2015 by robertogreco
Teaching Machines and Turing Machines: The History of the Future of Labor and Learning
"In all things, all tasks, all jobs, women are expected to perform affective labor – caring, listening, smiling, reassuring, comforting, supporting. This work is not valued; often it is unpaid. But affective labor has become a core part of the teaching profession – even though it is, no doubt, “inefficient.” It is what we expect – stereotypically, perhaps – teachers to do. (We can debate, I think, if it’s what we reward professors for doing. We can interrogate too whether all students receive care and support; some get “no excuses,” depending on race and class.)

What happens to affective teaching labor when it runs up against robots, against automation? Even the tasks that education technology purports to now be able to automate – teaching, testing, grading – are shot through with emotion when done by humans, or at least when done by a person who’s supposed to have a caring, supportive relationship with their students. Grading essays isn’t necessarily burdensome because it’s menial, for example; grading essays is burdensome because it is affective labor; it is emotionally and intellectually exhausting.

This is part of our conundrum: teaching labor is affective not simply intellectual. Affective labor is not valued. Intellectual labor is valued in research. At both the K12 and college level, teaching of content is often seen as menial, routine, and as such replaceable by machine. Intelligent machines will soon handle the task of cultivating human intellect, or so we’re told.

Of course, we should ask what happens when we remove care from education – this is a question about labor and learning. What happens to thinking and writing when robots grade students’ essays, for example. What happens when testing is standardized, automated? What happens when the whole educational process is offloaded to the machines – to “intelligent tutoring systems,” “adaptive learning systems,” or whatever the latest description may be? What sorts of signals are we sending students?

And what sorts of signals are the machines gathering in turn? What are they learning to do?

Often, of course, we do not know the answer to those last two questions, as the code and the algorithms in education technologies (most technologies, truth be told) are hidden from us. We are becoming, as law professor Frank Pasquale argues, a “black box society.” And the irony is hardly lost on me that one of the promises of massive collection of student data under the guise of education technology and learning analytics is to crack open the “black box” of the human brain.

We still know so little about how the brain works, and yet, we’ve adopted a number of metaphors from our understanding of that organ to explain how computers operate: memory, language, intelligence. Of course, our notion of intelligence – its measurability – has its own history, one wrapped up in eugenics and, of course, testing (and teaching) machines. Machines now both frame and are framed by this question of intelligence, with little reflection on the intellectual and ideological baggage that we carry forward and hard-code into them."



"We’re told by some automation proponents that instead of a future of work, we will find ourselves with a future of leisure. Once the robots replace us, we will have immense personal freedom, so they say – the freedom to pursue “unproductive” tasks, the freedom to do nothing at all even, except I imagine, to continue to buy things.

On one hand that means that we must address questions of unemployment. What will we do without work? How will we make ends meet? How will this affect identity, intellectual development?

Yet despite predictions about the end of work, we are all working more. As games theorist Ian Bogost and others have observed, we seem to be in a period of hyper-employment, where we find ourselves not only working numerous jobs, but working all the time on and for technology platforms. There is no escaping email, no escaping social media. Professionally, personally – no matter what you say in your Twitter bio that your Tweets do not represent the opinions of your employer – we are always working. Computers and AI do not (yet) mark the end of work. Indeed, they may mark the opposite: we are overworked by and for machines (for, to be clear, their corporate owners).

Often, we volunteer to do this work. We are not paid for our status updates on Twitter. We are not compensated for our check-ins in Foursquare. We don’t get kickbacks for leaving a review on Yelp. We don’t get royalties from our photos on Flickr.

We ask our students to do this volunteer labor too. They are not compensated for the data and content that they generate that is used in turn to feed the algorithms that run TurnItIn, Blackboard, Knewton, Pearson, Google, and the like. Free labor fuels our technologies: Forum moderation on Reddit – done by volunteers. Translation of the courses on Coursera and of the videos on Khan Academy – done by volunteers. The content on pretty much every “Web 2.0” platform – done by volunteers.

We are working all the time; we are working for free.

It’s being framed, as of late, as the “gig economy,” the “freelance economy,” the “sharing economy” – but mostly it’s the service economy that now comes with an app and that’s creeping into our personal not just professional lives thanks to billions of dollars in venture capital. Work is still precarious. It is low-prestige. It remains unpaid or underpaid. It is short-term. It is feminized.

We all do affective labor now, cultivating and caring for our networks. We respond to the machines, the latest version of ELIZA, typing and chatting away hoping that someone or something responds, that someone or something cares. It’s a performance of care, disguising what is the extraction of our personal data."



"Personalization. Automation. Management. The algorithms will be crafted, based on our data, ostensibly to suit us individually, more likely to suit power structures in turn that are increasingly opaque.

Programmatically, the world’s interfaces will be crafted for each of us, individually, alone. As such, I fear, we will lose our capacity to experience collectivity and resist together. I do not know what the future of unions looks like – pretty grim, I fear; but I do know that we must enhance collective action in order to resist a future of technological exploitation, dehumanization, and economic precarity. We must fight at the level of infrastructure – political infrastructure, social infrastructure, and yes technical infrastructure.

It isn’t simply that we need to resist “robots taking our jobs,” but we need to challenge the ideologies, the systems that loathe collectivity, care, and creativity, and that champion some sort of Randian individual. And I think the three strands at this event – networks, identity, and praxis – can and should be leveraged to precisely those ends.

A future of teaching humans not teaching machines depends on how we respond, how we design a critical ethos for ed-tech, one that recognizes, for example, the very gendered questions at the heart of the Turing Machine’s imagined capabilities, a parlor game that tricks us into believing that machines can actually love, learn, or care."
2015  audreywatters  education  technology  academia  labor  work  emotionallabor  affect  edtech  history  highered  highereducation  teaching  schools  automation  bfskinner  behaviorism  sexism  howweteach  alanturing  turingtest  frankpasquale  eliza  ai  artificialintelligence  robots  sharingeconomy  power  control  economics  exploitation  edwardthorndike  thomasedison  bobdylan  socialmedia  ianbogost  unemployment  employment  freelancing  gigeconomy  serviceeconomy  caring  care  love  loving  learning  praxis  identity  networks  privacy  algorithms  freedom  danagoldstein  adjuncts  unions  herbertsimon  kevinkelly  arthurcclarke  sebastianthrun  ellenlagemann  sidneypressey  matthewyglesias  karelčapek  productivity  efficiency  bots  chatbots  sherryturkle 
august 2015 by robertogreco
The Web We Need to Give Students — Bright — Medium
"Giving students their own digital domain is a radical act. It gives them the ability to work on the Web and with the Web, to have their scholarship be meaningful and accessible by others. It allows them to demonstrate their learning to others beyond the classroom walls. To own one’s domain gives students an understanding of how Web technologies work. It puts them in a much better position to control their work, their data, their identity online."

[See also: http://bavatuesdays.com/domains-and-the-cost-of-innovation/

"All this said, I know this is a broader reality playing out across higher education right now. The late logic of capital has come home to roost in academia: do more with less, lucky to have a job, tenuous tenure, the mission, austerity, budget cuts, everyone’s expendable, etc. But the fact is I firmly believe none of us at UMW are expendable. It really was, is, and will continue to be about the people. So if anyone out there is considering a Domain of One’s Own project, know this, the tech can be very, very cheap. It’s the right people that will be expensive, and for good reason—they determine its success. And success means integrating a digital-based curriculum across a university culture—this takes support, resources, and a concerted effort of talent. If you’re thinking about doing something like this I highly recommend you invest in some excellent people, pay them what they deserve, and trust them to do great things. Major kudos to VCU’s ALT Lab in this regard, they have been creating positions at really competitive salaries. Not sure how Gardner Campbell is doing it, but it lifts us all up." ]
adomainofonesown  audreywatters  education  schools  2015  online  internet  technology  edtech  ownership  jimgroom  web  identity  privacy  data 
july 2015 by robertogreco
Is It Time to Give Up on Computers in Schools?
"This is a version of the talk I gave at ISTE today on a panel titled "Is It Time to Give Up on Computers in Schools?" with Gary Stager, Will Richardson, Martin Levins, David Thornburg, and Wayne D'Orio. It was pretty damn fun.

Take one step into that massive shit-show called the Expo Hall and it’s hard not to agree: “yes, it is time to give up on computers in schools.”

Perhaps, once upon a time, we could believe ed-tech would change things. But as Seymour Papert noted in The Children’s Machine,
Little by little the subversive features of the computer were eroded away: … the computer was now used to reinforce School’s ways. What had started as a subversive instrument of change was neutralized by the system and converted into an instrument of consolidation.

I think we were naive when we ever thought otherwise.

Sure, there are subversive features, but I think the computers also involve neoliberalism, imperialism, libertarianism, and environmental destruction. They now involve high stakes investment by the global 1% – it’s going to be a $60 billion market by 2018, we’re told. Computers are implicated in the systematic de-funding and dismantling of a public school system and a devaluation of human labor. They involve the consolidation of corporate and governmental power. They involve scientific management. They are designed by white men for white men. They re-inscribe inequality.

And so I think it’s time now to recognize that if we want education that is more just and more equitable and more sustainable, that we need to get the ideologies that are hardwired into computers out of the classroom.

In the early days of educational computing, it was often up to innovative, progressive teachers to put a personal computer in their classroom, even paying for the computer out of their own pocket. These were days of experimentation, and as Seymour teaches us, a re-imagining of what these powerful machines could enable students to do.

And then came the network and, again, the mainframe.

You’ll often hear the Internet hailed as one of the greatest inventions of mankind – something that connects us all and that has, thanks to the World Wide Web, enabled the publishing and sharing of ideas at an unprecedented pace and scale.

What “the network” introduced in educational technology was also a more centralized control of computers. No longer was it up to the individual teacher to have a computer in her classroom. It was up to the district, the Central Office, IT. The sorts of hardware and software that were purchased had to meet those needs – the needs and the desire of the administration, not the needs and the desires of innovative educators, and certainly not the needs and desires of students.

The mainframe never went away. And now, virtualized, we call it “the cloud.”

Computers and mainframes and networks are points of control. They are tools of surveillance. Databases and data are how we are disciplined and punished. Quite to the contrary of Seymour’s hopes that computers will liberate learners, this will be how we are monitored and managed. Teachers. Students. Principals. Citizens. All of us.

If we look at the history of computers, we shouldn’t be that surprised. The computers’ origins are as weapons of war: Alan Turing, Bletchley Park, code-breakers and cryptography. IBM in Germany and its development of machines and databases that it sold to the Nazis in order to efficiently collect the identity and whereabouts of Jews.

The latter should give us great pause as we tout programs and policies that collect massive amounts of data – “big data.” The algorithms that computers facilitate drive more and more of our lives. We live in what law professor Frank Pasquale calls “the black box society.” We are tracked by technology; we are tracked by companies; we are tracked by our employers; we are tracked by the government, and “we have no clear idea of just how far much of this information can travel, how it is used, or its consequences.” When we compel the use of ed-tech, we are doing this to our students.

Our access to information is constrained by these algorithms. Our choices, our students’ choices are constrained by these algorithms – and we do not even recognize it, let alone challenge it.

We have convinced ourselves, for example, that we can trust Google with its mission: “To organize the world’s information and make it universally accessible and useful.” I call “bullshit.”

Google is at the heart of two things that computer-using educators should care deeply and think much more critically about: the collection of massive amounts of our personal data and the control over our access to knowledge.

Neither of these are neutral. Again, these are driven by ideology and by algorithms.

You’ll hear the ed-tech industry gleefully call this “personalization.” More data collection and analysis, they contend, will mean that the software bends to the student. To the contrary, as Seymour pointed out long ago, instead we find the computer programming the child. If we do not unpack the ideology, if the algorithms are all black-boxed, then “personalization” will be discriminatory. As Tressie McMillan Cottom has argued “a ‘personalized’ platform can never be democratizing when the platform operates in a society defined by inequalities.”

If we want schools to be democratizing, then we need to stop and consider how computers are likely to entrench the very opposite. Unless we stop them.

In the 1960s, the punchcard – an older piece of “ed-tech” – had become a symbol of our dehumanization by computers and by a system – an educational system – that was inflexible, impersonal. We were being reduced to numbers. We were becoming alienated. These new machines were increasing the efficiency of a system that was setting us up for a life of drudgery and that were sending us off to war. We could not be trusted with our data or with our freedoms or with the machines themselves, we were told, as the punchcards cautioned: “Do not fold, spindle, or mutilate.”

Students fought back.

Let me quote here from Mario Savio, speaking on the stairs of Sproul Hall at UC Berkeley in 1964 – over fifty years ago, yes, but I think still one of the most relevant messages for us as we consider the state and the ideology of education technology:
We’re human beings!

There is a time when the operation of the machine becomes so odious, makes you so sick at heart, that you can’t take part; you can’t even passively take part, and you’ve got to put your bodies upon the gears and upon the wheels, upon the levers, upon all the apparatus, and you’ve got to make it stop. And you’ve got to indicate to the people who run it, to the people who own it, that unless you’re free, the machine will be prevented from working at all!

We’ve upgraded from punchcards to iPads. But underneath, a dangerous ideology – a reduction to 1s and 0s – remains. And so we need to stop this ed-tech machine."
edtech  education  audreywatters  bias  mariosavio  politics  schools  learning  tressiemcmillancottom  algorithms  seymourpapert  personalization  data  security  privacy  howweteach  howwelearn  subversion  computers  computing  lms  neoliberalism  imperialism  environment  labor  publicschools  funding  networks  cloud  bigdata  google  history 
july 2015 by robertogreco
New Topics in Social Computing: Data and Education by EyebeamNYC
"In this discussion, we will consider how younger generations are growing up with data collection normalized and with increasingly limited opportunities to opt-out. Issues of surveillance, privacy, and consent have particular implications in the context of school systems. As education and technology writer Audrey Watters explains, “many journalists, politicians, entrepreneurs, government officials, researchers, and others … argue that through mining and modeling, we can enhance student learning and predict student success.” Administrators, even working with the best intentions, might exaggerate systemic biases or create other unintended consequences through use of new technologies. We will consider new structural obstacles involving metrics like learning analytics, the labor politics of data, and issues of data privacy and ownership.

Panelists: Sava Saheli Singh, Tressie McMillan Cottom, and Karen Gregory"
savasahelisingh  tressiemcmillancottom  karengregory  education  personalization  race  class  gender  2015  publicschools  testing  privacy  government  audreywatters  politics  policy  surveillance  consent  social  journalism  learning  howwelearn  howweteach  labor  work  citizenship  civics  learninganalytics  technology  edtech  data  society  socialcontract 
july 2015 by robertogreco
San Jose Museum of Art: Covert Operations: Investigating the Known Unknowns
"Part 1: June 30, 2015 through January 10, 2016
Part 2: August 29, 2015 through January 10, 2016

The world is a very different place after 9/11. Surveillance, security, data collection, and privacy have become everyday concerns. Covert Operations is the first survey of a generation of artists who respond to the uncertainties of the post-9/11 world. They employ the tools of democracy to bear witness to attacks on liberty and the abuse of power: constitutional ideals, open government, safety, and civil rights are primary values here. They unearth, collect, and explore previously covert information, using legal procedures as well as resources such as the Freedom of Information Act, government archives, field research, and insider connections. In thirty-five powerful works, international artists push our idea of art beyond conventional thinking.

Many of the artists examine the complicity behind human rights violations or pry into the hidden economy of the United States’ intelligence community and so-called “black sites,” locations of clandestine governmental operations. Covert Operations sheds light on the complicated relationship between freedom and security, individuals and the state, fundamental extremism and democracy. The first phase of Covert Operations, opening June 30, showcases artists’ stylistic use of technology, gaming, and computer-generated imagery. It will include works by Electronic Disturbance Theater 2.0, Harun Farocki, and collaborators Anne-Marie Schleiner and Luis Hernandez Galvan. The second phase will open August 29 with works by Ahmed Basiony, Thomas Demand, Hasan Elahi, Jenny Holzer, Trevor Paglen, Taryn Simon, and Kerry Tribe.

Covert Operations: Investigating the Known Unknowns was organized by the Scottsdale Museum of Contemporary Art.

This exhibition is made possible by an Emily Hall Tremaine Exhibition Award. The Exhibition Award program was founded in 1998 to honor Emily Hall Tremaine. It rewards innovation and experimentation among curators by supporting thematic exhibitions that challenge audiences and expand the boundaries of contemporary art. Additional support for the exhibition catalogue was provided by Walter and Karla Goldschmidt Foundation."
sanjose  tosee  2015  art  surveillance  security  data  datacollection  privacy  exhibits  togo  government  democracy  harunfarocki  anne-marieschleiner  luishernandezgalvan  ahmedbasiony  thomasdemand  hasanelahi  jennyholzer  trevorpaglen  tarynsimon  kerrytribe  covertoperations  us  blacksites  liberty  freedom 
july 2015 by robertogreco
Crowdforcing: When What I “Share” Is Yours
"One phenomenon that has so far flown under the radar in discussions of peer-to-peer production and the sharing economy but that demands recognition on its own is one for which I think an apt name would be crowdforcing. Crowdforcing in the sense I am using it refers to practices in which one or more persons decides for one or more others whether he or she will share his or her resources, without the other person’s consent or even, perhaps more worryingly, knowledge. While this process has analogs and has even itself occurred prior to the digital revolution and the widespread use of computational tools, it has positively exploded thanks to them, and thus in the digital age may well constitute a difference in kind as well as amount.

Once we conceptualize it this way, crowdforcing can be found with remarkable frequency in current digital practice."



"Crowdforcing effects also overlap with phenomena researchers refer to by names like “neighborhood effects” and “social contagion.” In each of these, what some people do ends up affecting what many other people do, in a way that goes much beyond the ordinary majoritarian aspects of democratic culture. That is, we know that only one candidate will win an election, and that therefore those who did not vote for that candidate will be (temporarily) forced to acknowledge the political rule of people with whom they don’t agree. But this happens in the open, with the knowledge and even the formal consent of all those involved, even if that consent is not always completely understood.

Externalities produced by economic transactions often look something like crowdforcing. For example, when people with means routinely hire tutors and coaches for their children for standardized tests, they end up skewing the results even more in their favor, thus impacting those without means in ways they frequently do not understand and may not be aware of. This can happen in all sorts of markets, even in cultural markets (fashion, beauty, privilege, skills, experience). But it is only the advent of society-wide digital data collection and analysis techniques that makes it so easy to sell your neighbor out without their knowledge and consent, and to have what is sold be so central to their lifeworld.

Dealing with this problem requires, first of all, conceptualizing it as a problem. That’s all I’ve tried to do here: suggest the shape of a problem that, while not entirely new, comes into stark relief and becomes widespread due to the availability of exactly the tools that are routinely promoted as “crowdsourcing” and “collective intelligence” and “networks.” As always, this is by no means to deny the many positive effects these tools and methods can have; it is to suggest that we are currently overly committed to finding those positive effects and not to exploring or dwelling on the negative effects, as profound as they may be. As the examples I’ve presented here show, the potential for crowdforcing effects on the whole population are massive, disturbing, and only increasing in scope.

In a time when so much cultural energy is devoted to the self, maximizing, promoting, decorating and sharing it, it has become hard to think with anything like the scrutiny required about how our actions impact others. From an ethical perspective, this is typically the most important question we can ask: arguably it is the foundation of ethics itself. Despite the rhetoric of sharing, we are doing our best to turn away from examining how our actions impact others. Our world could do with a lot more, rather than less, of that kind of thinking."

[Quote below relevant to a specific concern in my neighborhood]

"Sharing pictures of your minor children on Facebook is already an interesting enough issue. Obviously, you have the parental right to decide whether or not to post photos of your minor children, but parents likely do not understand all the ramifications of such sharing for themselves, let alone for their children, not least since none of us know what Facebook and the data it harvests will be like in 10 or 20 years. Yet an even more pressing issue occurs when people share pictures on Facebook and elsewhere of other peoples’ minor children, without the consent or even knowledge of those parents. Facebook makes it easy to tag photos with the names of people who don’t belong to it. The refrain we hear ad nauseam—“if you’re concerned about Facebook, don’t use it”—is false in many ways, among which the most critical may be that those most concerned about Facebook, who have therefore chosen not to use it, may thereby have virtually no control over not just the “shadow profile” Facebook reportedly maintains for everyone in the countries where it operates, but even what appears to be ordinary sharing data that can be used by all the data brokers and other social analytic providers. Thus while you may make a positive, overt decision not to share about yourself, and even less about the minor children of whom you have legal guardianship, others can and routinely do decide you are going to anyway."

[related to that concern: http://soheresus.com/2015/06/12/down-syndrome-genoma-copyright-infringement/ ]
davidgolumbia  crowdforcing  crowdsourcing  collaboration  access  data  2015  photography  privacy  sharingeconomy  externalities  airbnb  uber  economics  neighborhoodeffects  socialcontagion  children  facebook  socialmedia  internet  online  web  socialnetworks 
june 2015 by robertogreco
What's Your Algorithmic Citizenship? | Citizen Ex
"Every time you connect to the internet, you pass through time, space, and law. Information is sent out from your computer all over the world, and sent back from there. This information is stored and tracked in multiple locations, and used to make decisions about you, and determine your rights. These decisions are made by people, companies, countries and machines, in many countries and legal jurisdictions. Citizen Ex shows you where those places are.

Your Algorithmic Citizenship is how you appear to the internet, as a collection of data extending across many nations, with a different citizenship and different rights in every place. One day perhaps we will all live like we do on the internet. Until then, there's Citizen Ex."
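One way to read "Algorithmic Citizenship" is as a traffic-weighted distribution over jurisdictions. This sketch is illustrative only — the function, hostnames, and byte counts are invented, and a real extension like Citizen Ex would additionally resolve each server's location with a geo-IP database — but it shows how a browsing session could be reduced to per-country shares:

```python
from collections import Counter

def algorithmic_citizenship(requests):
    """Given (hostname, country, bytes) records for a browsing session,
    return each country's share of the total traffic."""
    totals = Counter()
    for _, country, nbytes in requests:
        totals[country] += nbytes
    grand = sum(totals.values())
    return {c: n / grand for c, n in totals.items()}

session = [
    ("example.com", "US", 6000),
    ("cdn.example.org", "IE", 3000),
    ("tracker.example.net", "US", 1000),
]
shares = algorithmic_citizenship(session)
# shares -> {"US": 0.7, "IE": 0.3}
```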

[http://citizen-ex.com/download

"Citizen Ex is a browser extension for Chrome, Firefox, and Safari, which shows you where on the web you really are, and what that means."]
geolocation  identity  immigration  jamesbridle  internet  web  privacy  law  time  space  data  location  legal  extensions  browsers  chrome  safari  firefox  citizenship  browser 
june 2015 by robertogreco
Continuous Monuments and Imaginable Alternatives - Amateur Cities
"In 1969, Superstudio, a radical Italian design group, made a proposal for what they called the ‘Continuous Monument’. It was a homogenous block of architecture that would encircle the earth depicting the global and total dimension of design and architecture of that time. We currently live in the time of a similar monument that harvests and feeds off ‘data’ – the golden ambrosia of the 2010s."



"In an ideal situation the model of a mesh network has the potential to become a platform for broadly horizontal networked politics as defined by its inherent structure. It could bring a new type of commons in the face of the death of network neutrality, government and corporate surveillance and exploitation as embodied by the current network structure.

Too often we are confronted with visions and stories of the future that say: ‘In the future everyone will live this way or that way. In the future everyone will have these things. In the future everyone will want that thing.’ This can often lead to acceptance of the idea that the future has been predetermined by powers greater than us. We need to imagine instead, what futures might bring. There are dozens of other small, niggling but significant alternatives that can challenge the theoretical basis for how the future might open up to a plethora of possible imaginable alternatives. Take for instance; domestic solar power, crypto currencies, end-to-end encryption or personal manufacturing. They are but a few that have the potential to either become incredibly empowering or to be sucked into our current continuous monument.

It is often said by military strategists, business leaders and alike that knowledge is the most powerful weapon. But imagination is also a significant one.

The political theorist David Graeber writes about how, since the protests of the late 1960s, the same entities pursuing the project of legibility have pursued a ‘. . . relentless campaign against the human imagination.’ It has resulted in ‘. . . the imposition of an apparatus of hopelessness, designed to squelch any sense of an alternative future.’

Activating imagination in everyday practice is hard. Financial imperatives and competition do not give space and time to explore alternatives and freely play with ideas without consequences. But there is a great reward in giving time to exploration. Inspiration can be found in things like mesh networks, but there are other examples. Jugaad culture – the repurposing of technology predominantly occurring in India – is an excellent example. It provides an alternative by giving a particular design a different lifespan and shows how, in William Gibson’s words – ‘the street finds its own uses.’ The speculative design canon proposes objects and systems that are not intended for our world. They aim to stimulate our imagination about the hidden effects and repercussions of our design culture.

The purpose of such design and of introducing imagination is to widen the scope of possibilities. It could prevent the carte blanche master plan of the Smart City to become the inevitable endpoint of the current technological narrative. Furthermore it could perhaps lead to the development of real, functioning designs, such as mesh networks that will work better for people.

Knowledge of the systems, structures and technologies at play in our own continuous monument is vital for technologists, designers, urbanists, architects and everyone involved. It is impossible to be a wholesome practitioner and to remain ignorant of the wider context in which one situates one’s work. But what is equally as important is the activation of imagination; imagining beyond the given context to what could be, not just what, as is often presented, inevitably will be."
tobiasrevell  superstudio  architecture  government  resistance  cities  data  jamescscott  seeinglikeastate  davidgraeber  infrastructure  internet  privacy  surveillance  technology  design  systemsthinking  smartcities  legibility  illegibility  imagination  meshnetworks  2015 
may 2015 by robertogreco
Three Moments With WeChat | 八八吧 · 88 Bar
"Despite being only four years old, WeChat is more popular in China than Facebook is in the US: 72% of all Chinese people with mobile devices use it, versus the 67% penetration rate Facebook has among American internet users. Yet its Facebook-esque feature, Moments, manages to avoid feeling like the Walmart of social interaction. When my soon-to-be cousin-in-law posted that photo, he no doubt received both sincere congratulations from his professional contacts and older relatives as well as jokes from his closer friends. On Moments, however, each user can only see activity from their own contacts: not even a total count of Likes is available to anyone other than the original poster. This automated privacy curtain means that group social dynamics can remain hidden in plain sight without any moderation effort required from the original poster. In other words, my cousin-in-law could perform his groomal duties without worrying about messy (and potentially embarrassing) context collapse.

This decision to prioritize context separation over the ability to perform social popularity is an important concession to what sociologist Tricia Wang calls the Elastic Self. In a culture where connections are everything, many of WeChat’s features are subtly optimized for “saving face” in complicated situations. You can chat with people without adding them as contacts: someone you met on a chat-coordinated dinner doesn’t automatically become a Contact with access to details about your social life. Even while adding someone as a Contact, there is an option to secretly prevent them from seeing your Moments updates. There’s also a conspicuous lack of presence and typing status indicators as compared to iMessage and other apps, allowing the receiver some measure of plausible deniability about when each message is received.

These days, the buzz around WeChat centers on its impressive sprawl into an entire operating system of features: in certain regions, a user can hail a cab, shop, and even manage their bank accounts all in the app. But these features, introduced in late 2013, only work because they capitalize on WeChat’s already dizzying adoption rate. What lies at the core of WeChat’s success is a series of smart design decisions that reflect the culture they were created in and, together, generate a unique experience that is as functional as it is addictive."



"WeChat privileges another mode of communication equally to text: “Hold to Talk.” This feature, used by almost as many people as texting, allows the sender to record a short voice message which is then sent in the conversation. The receiver taps it when they want to hear it, and if there are multiple messages, each subsequent one autoplays. It’s a brilliant feature that marries the intimacy and simplicity of voice with the convenience of asynchronicity that makes texting so appealing.
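The autoplay behavior can be modeled as a simple rule. One assumption on my part, since the text only says subsequent messages autoplay: the run continues through consecutive voice clips and stops when a non-voice message interrupts it.

```python
def autoplay_sequence(messages, tapped_index):
    """Given a chat transcript (dicts with 'id' and 'type'), return the ids
    played after tapping the voice clip at tapped_index: the tapped clip,
    then each consecutive voice clip after it (a text message ends the run)."""
    assert messages[tapped_index]["type"] == "voice", "only voice clips are tappable here"
    played = []
    for msg in messages[tapped_index:]:
        if msg["type"] != "voice":
            break
        played.append(msg["id"])
    return played
```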

“Hold to Talk” may have been created for its convenience, but it’s also a powerfully expressive feature with interesting affordances of its own. In the process of writing this piece, I was thinking about a Chinese phrase I only half-remembered. Forgetting a language is funny — there are some words I can read but not pronounce, and others that I can parse while listening but not recognize visually. I remembered the vague shape and meaning of the phrase, so I sent two voice clips to my mom, fumbling the words awkwardly. An hour later, she responded with a voice clip of her own. I listened to her laugh and rib me about my illiteracy, and chuckled alongside it as if she were next to me."



"Periodically, one of our hosts would pull out his phone (a Samsung Galaxy S4, possibly shanzhai) to shoot video clips of the gathering, documenting everyone who was there. Other relatives crowded around the phone afterwards, watching all of the videos. They were so interested in the videos of our hosts’ lives in Beijing, where they lived for most of the year as migrant workers, that they resorted to desperate measures to copy them.

WeChat natively supports a surprising number of media formats: images, custom animated stickers, uploaded videos, natively captured short videos called Sights, and even PowerPoint and Word documents. It also facilitates passing these files from one conversation to another through a prominent “forwarding” option for files.
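That forwarding flow can be sketched minimally. The set of media kinds and the function below are my own illustration of the behavior described, not WeChat's internals: a message of a supported kind is copied wholesale into another conversation's transcript.

```python
# Media kinds mentioned above (names are illustrative, not WeChat's internal ones).
SUPPORTED_KINDS = {"image", "sticker", "video", "sight", "powerpoint", "word"}

def forward(message, target_conversation):
    """Copy a message into another conversation's list of messages."""
    if message["kind"] not in SUPPORTED_KINDS:
        raise ValueError(f"cannot forward a {message['kind']!r} message")
    target_conversation.append(dict(message))  # shallow copy keeps the payload
    return target_conversation
```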

Now that my 80 year old grandmother is on WeChat, the whole family forwards anything amusing they find to the group chat we share so that she can see it. Often, it’s jokes, articles, and photos of ourselves and our food."



"Scrolling through my WeChat today, I see pictures of my cousin and cousin-in-law surfing and glowing on their honeymoon, pictures of my parents from a friend’s graduation ceremony, at least five jokes I can’t quite grok, and even the occasional dispatch from Nanzhai village. Using a chat app to hail a cab with your phone is cool, but at the end of the day the killer feature of WeChat will always be its ability to shorten distances and navigate social situations as deftly as we need to."

[via: http://tumblr.iamdanw.com/post/119597750700/despite-being-only-four-years-old-wechat-is-more ]
christinaxu  socialmedia  facebook  2015  wechat  china  contextcollapse  privacy  metrics  socialdynamics  social  interaction  moderation  mobile  application  socialnetworks  communication  tumblr  vine 
may 2015 by robertogreco