
Against Economics | by David Graeber | The New York Review of Books
“There is a growing feeling, among those who have the responsibility of managing large economies, that the discipline of economics is no longer fit for purpose. It is beginning to look like a science designed to solve problems that no longer exist.

A good example is the obsession with inflation. Economists still teach their students that the primary economic role of government—many would insist, its only really proper economic role—is to guarantee price stability. We must be constantly vigilant over the dangers of inflation. For governments to simply print money is therefore inherently sinful. If, however, inflation is kept at bay through the coordinated action of government and central bankers, the market should find its “natural rate of unemployment,” and investors, taking advantage of clear price signals, should be able to ensure healthy growth. These assumptions came with the monetarism of the 1980s, the idea that government should restrict itself to managing the money supply, and by the 1990s had come to be accepted as such elementary common sense that pretty much all political debate had to set out from a ritual acknowledgment of the perils of government spending. This continues to be the case, despite the fact that, since the 2008 recession, central banks have been printing money frantically in an attempt to create inflation and compel the rich to do something useful with their money, and have been largely unsuccessful in both endeavors.

We now live in a different economic universe than we did before the crash. Falling unemployment no longer drives up wages. Printing money does not cause inflation. Yet the language of public debate, and the wisdom conveyed in economic textbooks, remain almost entirely unchanged.

One expects a certain institutional lag. Mainstream economists nowadays might not be particularly good at predicting financial crashes, facilitating general prosperity, or coming up with models for preventing climate change, but when it comes to establishing themselves in positions of intellectual authority, unaffected by such failings, their success is unparalleled. One would have to look at the history of religions to find anything like it. To this day, economics continues to be taught not as a story of arguments—not, like any other social science, as a welter of often warring theoretical perspectives—but rather as something more like physics, the gradual realization of universal, unimpeachable mathematical truths. “Heterodox” theories of economics do, of course, exist (institutionalist, Marxist, feminist, “Austrian,” post-Keynesian…), but their exponents have been almost completely locked out of what are considered “serious” departments, and even outright rebellions by economics students (from the post-autistic economics movement in France to post-crash economics in Britain) have largely failed to force them into the core curriculum.

As a result, heterodox economists continue to be treated as just a step or two away from crackpots, despite the fact that they often have a much better record of predicting real-world economic events. What’s more, the basic psychological assumptions on which mainstream (neoclassical) economics is based—though they have long since been disproved by actual psychologists—have colonized the rest of the academy, and have had a profound impact on popular understandings of the world.”



“Economic theory as it exists increasingly resembles a shed full of broken tools. This is not to say there are no useful insights here, but fundamentally the existing discipline is designed to solve another century’s problems. The problem of how to determine the optimal distribution of work and resources to create high levels of economic growth is simply not the same problem we are now facing: i.e., how to deal with increasing technological productivity, decreasing real demand for labor, and the effective management of care work, without also destroying the Earth. This demands a different science. The “microfoundations” of current economics are precisely what is standing in the way of this. Any new, viable science will either have to draw on the accumulated knowledge of feminism, behavioral economics, psychology, and even anthropology to come up with theories based on how people actually behave, or once again embrace the notion of emergent levels of complexity—or, most likely, both.

Intellectually, this won’t be easy. Politically, it will be even more difficult. Breaking through neoclassical economics’ lock on major institutions, and its near-theological hold over the media—not to mention all the subtle ways it has come to define our conceptions of human motivations and the horizons of human possibility—is a daunting prospect. Presumably, some kind of shock would be required. What might it take? Another 2008-style collapse? Some radical political shift in a major world government? A global youth rebellion? However it will come about, books like this—and quite possibly this book—will play a crucial part.”
davidgraeber  2019  robertskidelsky  economics  economists  criticism  finances  policy  psychology  socialsciences  feminism  science  growth  productivity  change  theory  praxis  microfoundations  anthropology  behavior  humanism  complexity  simplicity  modeling  understanding  marxism  mainstream  politics  wisdom  knowledge  failure  government  governance  monetarypolicy  inflation 
4 days ago by robertogreco
Inhumanism Rising - Benjamin H Bratton - YouTube
[See also:
https://trust.support/watch/inhumanism-rising

“Benjamin H. Bratton considers the role ideologies play in technical systems that operate at scales beyond human perception. Deep time, deep learning, deep ecology and deep states force a redrawing of political divisions. What previously may have been called left and right comes to reflect various positions on what it means to be, and want to be, human. Bratton is a design theorist as much as he is a philosopher. In his work remodelling our operating system, he shows how humans might be the medium, rather than the message, in planetary-scale ways of knowing.

Benjamin H. Bratton's work spans Philosophy, Art, Design and Computer Science. He is Professor of Visual Arts and Director of the Center for Design and Geopolitics at the University of California, San Diego. He is Program Director of the Strelka Institute of Media, Architecture and Design in Moscow. He is also a Professor of Digital Design at The European Graduate School and Visiting Faculty at SCI_Arc (The Southern California Institute of Architecture).

In The Stack: On Software and Sovereignty (MIT Press, 2016. 503 pages) Bratton outlines a new theory for the age of global computation and algorithmic governance. He proposes that different genres of planetary-scale computation – smart grids, cloud platforms, mobile apps, smart cities, the Internet of Things, automation – can be seen not as so many species evolving on their own, but as forming a coherent whole: an accidental megastructure that is both a computational infrastructure and a new governing architecture. The book plots an expansive interdisciplinary design brief for The Stack-to-Come.

His current research project, Theory and Design in the Age of Machine Intelligence, is on the unexpected and uncomfortable design challenges posed by A.I. in various guises: from machine vision to synthetic cognition and sensation, and the macroeconomics of robotics to everyday geoengineering.”]
benjaminbratton  libertarianism  technology  bitcoin  blockchain  peterthiel  society  technodeterminism  organization  anarchism  anarchy  jamesbridle  2019  power  powerlessness  control  inhumanism  ecology  capitalism  fascism  interdependence  surveillance  economics  data  computation  ai  artificialintelligence  californianideology  ideology  philosophy  occult  deeplearning  deepecology  magic  deepstate  politics  agency  theory  conspiracytheories  jordanpeterson  johnmichaelgreer  anxiety  software  automation  science  psychology  meaning  meaningfulness  apophenia  posthumanism  robotics  privilege  revelation  cities  canon  tools  beatrizcolomina  markwigley  markfisher  design  transhumanism  multispecies  cyborgs  syntheticbiology  intelligence  biology  matter  machines  industry  morethanhuman  literacy  metaphysics  carlschmitt  chantalmouffe  human-centereddesign  human-centered  experience  systems  access  intuition  abstraction  expedience  ideals  users  systemsthinking  aesthetics  accessibility  singularity  primitivism  communism  duty  sovietunion  ussr  luxury  ianhacking 
4 days ago by robertogreco
There is a word for the trauma caused by distance from nature — Quartz
"Disconnection from nature can be bad for our mental health. But there was no name for this particular malaise until Australian sustainability professor Glenn Albrecht coined the term psychoterratic, creating the beginning of a vocabulary to discuss the relationship between mental health and environment.

Since then, he’s thought up a whole lexicon. In May, Albrecht’s book, Earth Emotions: New Words for a New World, will be published by Cornell University Press. It includes gems like the word ecoagnosy, a term created to describe environmental ignorance or indifference to the ecology. Then there’s solastalgia, the psychic pain of climate change and missing a home that’s transforming before your eyes."
words  forestbathing  nature  solastalgia  ecoagnosy  psychoterratic  language  vocabulary  morethanhuman  multispecies  prescribingnature  ecology  psychology 
4 days ago by robertogreco
teachers at the margins – Snakes and Ladders
“Lisa Marchiano, a psychoanalyst, describing her encounter with a student who had a “panic attack” during an exam and didn’t want to take any more exams:
I asked this young patient of mine what in fact had happened during the first exam. She responded again, I had a panic attack. I lightly pressed her to move beyond the jargon and tell me about her actual experience as she took the exam. Eventually, she was able to tell me that, as the papers were being handed out, she became flushed and light-headed. Her heart was pounding, and her hands felt clammy. What happened then? I asked. She felt like running out of the room, but she was able to calm herself down enough to take the test. Though she successfully completed the first exam — and did okay on it — the fear that she might have another “panic attack” had prevented her from attempting the second exam.

What had happened here? One way of understanding this young person’s experience is indeed that she had had a limited-symptom panic attack. According to the diagnostic criteria for panic attacks in the Diagnostic and Statistical Manual of Mental Disorders (DSM), a limited-symptom panic attack can be diagnosed based on a pounding heart, sweating, and shaking. Of course, as anyone knows who has ever taken an exam, performed in front of an audience, or asked someone they like out on a date, these are in fact utterly normal reactions to feeling nervous. I gently attempted to reflect this back to my young patient. “So you were nervous about taking the exam, but you didn’t run out of the room. You did it. You pushed through the fear feelings.” I wanted her to see this as a success, one that she could build on, that could help alter her stuck story that tells her she is too anxious to function adequately. Her response to my positive reframing was telling. She looked up at me from under her brows and held my gaze. “Yes,” she responded firmly. “But I had a panic attack.”

Reflecting on this experience, Marchiano raises a key issue: “I found myself wondering where she had learned that she ought not to be expected to tolerate ordinary distress or discomfort. How have we come to the point where we believe that emotional disquiet will cause harm, that we ought to be soothed and tranquil at all times?”

Some years ago I had a student — I’ll call her M — who came to me and said that she could no longer take the reading quizzes that I give at the beginning of many classes. If she had to take them, she preferred to do so in the office on campus that deals with students who have disabilities, even if that meant missing most or all of my classes. And M clearly, though in no way angrily or aggressively, expected that I would do as she preferred.

I ended up talking with the case worker assigned to M, and the case worker told me that M was anxious about not having time to finish the quizzes, and, further, that M had problems, not to be disclosed to me, that made it necessary for me to accommodate her preferences.

Several elements of this situation puzzled me. First, M was usually among the first to complete her quizzes. Second, she had the highest quiz average in the class, and it wasn’t even close. Third, her very intelligent contributions to class discussions about the quizzes added significantly to the value of our class time. And fourth: those facts, and my observations, had absolutely no bearing on the expectations my university had for me. M’s feelings and preferences, as interpreted by her case worker, were all that mattered — I was strongly discouraged from sharing with M any of my thoughts, no matter how positive.

I didn’t know what else to do, so I agreed to make any accommodation necessary. But M kept coming to class, kept taking the quizzes, and kept excelling in them. Why she didn’t follow through on her request I can’t say. Maybe her knowledge that I would do what she wanted was enough to relieve the pressure she had been feeling.

I’m glad M stayed in class, and that there was a peaceful resolution to the situation, but the whole sequence of events troubled me then and troubles me now. The first, and larger, problem is that we’re now in a moment at which any attempt to resist the pathologizing of perfectly ordinary experiences of nervousness or uncertainty is tagged as indifference (at best) or cruelty (at worst). To encourage students to believe that they can overcome their anxieties is, it appears, now a form of abuse.

And second — perhaps not as important but still significant to me — there is the marginalization of the teacher-student relationship. It was made very clear to me that the case worker — who had never been in my class, who had never observed either M or me — could dictate the response to M’s concerns. I didn’t push back, because I didn’t want to bring any further anxiety to a student who was already anxious, but I wonder what would have happened if I had insisted that my own view of the matter, which was after all backed by some experience, should be taken into account.

More seriously, it seemed to me that the case worker was constructing, or allowing M to construct, a narrative in which I was M’s antagonist and it was the case worker’s job to intervene to assist M in her struggle against her antagonist. The idea that I might be on M’s side and want to help her, and indeed should, as part of my job, help her was never considered.

The work done by the “bias prevention units” or “diversity offices” that have proliferated in many universities might seem to be a very different phenomenon, but that work has a similar effect on the relationship between teachers and students. A key premise — sometimes unstated but sometimes quite explicit — of such administrative offices is that faculty are often the enemies of diversity and the perpetrators of bias, and therefore these programs must step in to correct the injustices inherent in the system. Again the faculty member is cast as the students’ antagonist, or at least as a possible antagonist. I do not know of any circumstances in which the “learnings” or “training modules” produced by these offices — which are often mandatory for all students — have received any faculty input, though I suppose some faculty may occasionally be involved. The “learnings” seem to be designed to emphasize the untrustworthiness of teachers.

I think students in general have a pretty good grasp of these dynamics. My observations suggest that disgruntled students these days rarely take their complaints to department chairs or deans, but rather to these amorphous “offices” which exist independently of the faculty structure and are typically empowered by the university to impose decisions without consulting anyone in that faculty structure.

I also think that this way of doing our academic business exacerbates, quite dramatically, one of the worst features of academic life, which is its legalism. Knowing that they are being overseen by these distant and almost invisible “offices,” faculty end up writing more and more detailed syllabuses, working to close every possible loophole which might be exploited by students to get what they want even when, from the faculty point of view, they don’t deserve it. And the more desperately faculty look to close such loopholes, the more the students search for them. It’s no way to run a university — at least if the university cares about learning.

There were certainly flaws in the old way of doing these things, in which individual teachers almost certainly had too much power. But certain experiences of learning were possible in that system that the current, or emerging, system is rapidly making impossible. The marginalizing of the student-faculty relationship is not a good recipe for addressing those old flaws.”
alanjacobs  2019  teaching  howwelearn  howweteach  highered  highereducation  academia  administration  management  services  anxiety  relationships  lisamarchiano  psychology  trust  power 
25 days ago by robertogreco
▶ Audrey Watters | Gettin' Air with Terry Greene
"Audrey Watters (@audreywatters) is an ed-tech folk hero who writes at Hack Education @hackeducation where, for the past nine years, she has taken the lead in keeping the field on its toes in regards to educational technology's "progress". Her long awaited and much anticipated book, "Teaching Machines", will be out in the new year."
2019  audreywatters  edtech  terrygreene  bfskinner  technology  schools  education  turnitin  history  learning  behaviorism  cognition  cognitivescience  psychology  automation  standardization  khanacademy  howweteach  liberation  relationships  agency  curiosity  inquiry  justice  economics  journalism  criticism  vr  facebook  venturecapital  capitalism  research  fabulism  contrafabulism  siliconvalley  archives  elonmusk  markzuckerberg  gatesfoundation  billgates 
june 2019 by robertogreco
Anne Galloway 'Speculative Design and Glass Slaughterhouses' - This is HCD
"Andy: You’ve got quite an interesting background. I’m going to ask you about it in a second. I wanted to start with the quote from Ursula Le Guin that you have on your website. It’s from the Lathe of Heaven. “We’re in the world, not against it. It doesn’t work to try and stand outside things and run them that way, it just doesn’t work. It goes against life. There is a way, but you have to follow it, the world is, no matter how we think it ought to be, you have to be with it, you have to let it be.”

Then on the More Than Human website, you have these three questions. What if we refuse to uncouple nature and culture? What if we deny that human beings are exceptional? What if we stop speaking and listening only to ourselves? The More Than Human lab explores everyday entanglements of humans and non-humans and imagines more sustainable ways of thinking, making, and doing. Anne, let’s get started by first talking about what do you mean by all of that?

Anne: The Ursula Le Guin quote I love mostly because a critical perspective or an activist perspective, anything that says we ought to be changing the world in any way, it always assumes that we need to fix something, that the world is broken and that designers especially are well-suited to be able to solve some of these problems. I like thinking about what it means to respond to injustice by accepting it, not in the sense of believing that it’s okay or right, because clearly, it’s been identify as unjust. I love Le Guin’s attention to the fact that there is a way to be in the world.

As soon as we think that we’re outside of it, any choices or decisions or actions that we take are, well, they sit outside of it as well. I like being embedded in the trouble. I like Donna Haraway’s idea of staying with the trouble. It’s not that we have to accept that things are problematic, but rather that we have to work within the structures that already exist. Not to keep them that way, in fact, many should be dismantled or changed. Rather, to accept that there is a flow to the universe.

Of course, Le Guin was talking about Taoism, but here what I wanted to draw attention to is often our imperative to fix or to solve or to change things comes with a belief that we’re not part of the world that we’re trying to fix and change. It’s that that I want to highlight. That when we start asking difficult questions about the world, we can never remove ourselves from them. We’re complicit, we are on the receiving end of things. We’re never distant from it. I think that subtle but important shift in deciding how we approach our work is really important."



"Andy: Yes, okay. I was thinking about this, I was reading, in conjunction, this little Le Guin quote, I was trying to think, it’s unusual in the sense that it’s a discipline or a practice of design that uses its own practice to critique itself. It’s using design to critique design in many respects. A lot of what speculative design is talking about is, look what happens when we put stuff into the world, in some way, without much thought. I was trying to think if there was another discipline that does that. I think probably in the humanities there are, and certainly in sociology I think there probably is, where it uses its own discipline to critique itself. It’s a fairly unusual setup.

Anne: I would think actually it’s quite common in the humanities, perhaps the social sciences, where it’s not common is in the sciences. Any reflexive turn in any of the humanities would have used the discipline. Historiography is that sort of thing. Applied philosophy is that sort of thing. Reflexive anthropology is that sort of thing. I think it’s actually quite common, just not in the sciences, and design often tries to align itself with the sciences instead.

Andy: Yes, there was a great piece in the Aeon the other day, about how science doesn’t have an adequate description or explanation for consciousness. Yet, it’s the only thing it can be certain of. With that, it also doesn’t really seem to come up in the technology industry that much, because it’s so heavily aligned with science. Technology, and you’ve got this background in cultural studies and science and technology and society, technology is a really strong vein throughout speculative design. Indeed, your work, right? Counting Sheep is about the Internet of Things, and sheep. Do you want to tell us a little bit about that, and why, from the pictures I’ve seen, it basically looks like you’re living in part of the Shire in Middle Earth?

Anne: I do live in a place that looks remarkably like the Shire. It’s a bit disconcerting at times. The science and technology question in speculative design I think is first of all a matter of convenience. Science fiction, speculation, they lean historically, habitually towards science and tech. It becomes an easy target for critique. Not that it’s not necessary, but it’s right there, so why not? There’s that element to it. It has an easier ability to be transformed into something fanciful or terrifying, which allows for certain kinds of storytelling through speculation, that I think people, both creators and audiences or readers really enjoy.

Now, the irony of all of this, of course is that arguably one of the greatest concerns that people have would be tied to technological determinism, the idea that we’re going to have these technologies anyway, so what are we going to do about it? Now, when you speculate using these technologies, what you’re doing is actually reinforcing the idea that these technologies are coming, you play right into the same technological determinism that you’re trying to critique. In fact, one of the counting sheep scenarios was designed specifically to avoid the technology. It was the one that got the most positive responses."



"Andy: With all of this, and I made this point at the beginning, just before we were recording, there’s a sense, because of everything going on in the world, that if only designers could run the world, everything would be fine, right, because we can see all of the solutions to everything. What would you want designers to get out of this kind of work or this kind of perspective?

Anne: Humility. That simple. I am one of those people. It’s because of being an ethnographer as well and doing participant observation and interviewing many people and their ideas about design. I’ve run into far more people who think that designers are arrogant than ones who don’t. This has always really interested me. What is it that designers do that seems to rub non-designers the wrong way? Part of it is this sense of, or implication that they know better than the rest of us, or that a designer will come in and say, “Let me fix your problem”, before even asking if there is a problem that the person wants fixed.

I actually gave a guest lecture in a class just the other day, where I suggested that there were people in the world who thought that designers were arrogant. One of the post-graduate students in the class really took umbrage at this and wanted to know why it was that designers were arrogant for offering to fix problems, but a builder wasn’t, or a doctor wasn’t.

Andy: What was your answer?

Anne: Well, my answer was, generally speaking, people go to them first and say, “I have this problem, I need help.” Whereas, designers come up with a problem, go find people that they think have it and then tell them they’d like to solve it. I think just on a social level, that is profoundly anti-social. That is not how people enjoy socially interacting with people.

Andy: I can completely see that and I think that I would say that argument has also been levelled, quite rightly, at a lot of Silicon Valley, where the answer to everything is some kind of technology and engineering startup to fix all the problems that all the other technology and engineering startups that are no longer startups have created. It’s probably true of quite a lot of areas of business and finance, as well, and politics, for that matter. The counter, I could imagine a designer saying, “Well, that’s not really true”, because one of the things as human-centred designers, the first thing we do, we go out, we do design ethnography, we go and speak to people, we go and observe, we go and do all of that stuff. We really understand their problems. We’re not just telling people what needs to be fixed. We’re going there and understanding things. What’s your response to that?

Anne: Well, my first response is, yes, that’s absolutely true. There are lots of very good designers in the world who do precisely that. Because I work in an academic institution though, I’m training students. What my job involves is getting them to the point where they know the difference between telling somebody something and asking somebody something, what it means to actually understand their client or their user. I prefer to just refer to them as people. What it is that people want or need. One of the things that I offer in all of my classes is, after doing the participant observation, my students always have the opportunity to submit a rationale for no design intervention whatsoever.

That’s not something that is offered to people in a lot of business contexts because there’s a business case that’s being made. Whereas, I want my students to understand that sometimes the research demonstrates that people are actually okay, and that even if they have little problems, they’re still okay with that, that people are quite okay with living with contradictions and that they will accept some issues because it allows for other things to emerge. That if they want, they can provide the evidence for saying, “Actually, the worst thing we could do in this scenario is design anything and I refuse to design.”

Andy: Right, that, and that people make trade-offs all the time because the pain of change is much … [more]
annegalloway  design  2019  speculativefiction  designethnography  morethanhuman  ursulaleguin  livestock  agriculture  farming  sheep  meat  morethanhumanlab  activism  criticaldesign  donnaharaway  stayingwiththetrouble  taoism  flow  change  changemaking  systemsthinking  complicity  catherinecaudwell  injustice  justice  dunneandraby  consciousness  science  technology  society  speculation  speculativedesign  questioning  fiction  future  criticalthinking  whatif  anthropology  humanities  reflexiveanthropology  newzealand  socialsciences  davidgrape  powersoften  animals  cows  genevievebell  markpesce  technologicaldeterminism  dogs  cats  ethnography  cooperation  human-animalrelations  human-animalrelationships  slow  slowness  time  perception  psychology  humility  problemsolving  contentment  presence  peacefulness  workaholism  northamerica  europe  studsterkel  protestantworkethic  labor  capitalism  passion  pets  domestication 
june 2019 by robertogreco
1980s Metalhead Kids Are Alright: Scientific Study Shows That They Became Well-Adjusted Adults | Open Culture
"In the 1980s, The Parents Music Resource Center (PMRC), an organization co-founded by Tipper Gore and the wives of several other Washington power brokers, launched a political campaign against pop music, hoping to put warning labels on records that promoted Sex, Violence, Drug and Alcohol Use. Along the way, the PMRC issued "the Filthy Fifteen," a list of 15 particularly objectionable songs. Hits by Madonna, Prince and Cyndi Lauper made the list. But the list really took aim at heavy metal bands from the 80s -- namely, Judas Priest, Mötley Crüe, Twisted Sister, W.A.S.P., Def Leppard, Black Sabbath, and Venom. (Interesting footnote: the Soviets separately created a list of blackballed rock bands, and it looked pretty much the same.)

Above, you can watch Twisted Sister's Dee Snider appear before Congress in 1985 and accuse the PMRC of misinterpreting his band's lyrics and waging a false war against metal music. The evidence 30 years later suggests that Snider perhaps had a point.

A study by psychology researchers at Humboldt State, Ohio State, UC Riverside and UT Austin "examined 1980s heavy metal groupies, musicians, and fans at middle age" -- 377 participants in total -- and found that, although metal enthusiasts certainly lived riskier lives as kids, they were nonetheless "significantly happier in their youth and better adjusted currently than either middle-aged or current college-age youth comparison groups." This led the researchers to contemplate one possible conclusion: "participation in fringe style cultures may enhance identity development in troubled youth." Not to mention that heavy metal lyrics don't easily turn kids into damaged goods.

You can read the report, Three Decades Later: The Life Experiences and Mid-Life Functioning of 1980s Heavy Metal Groupies here. And, right above, listen to an interview with one of the researchers, Tasha Howe, a former headbanger herself, who spoke yesterday with Michael Krasny on KQED radio in San Francisco.

Note: An earlier version of this post appeared on our site in July 2015."
1980s  cv  metalheads  heavymetal  music  adolescence  youth  pmrc  tippergore  psychology 
june 2019 by robertogreco
Why Your Brain Needs Idle Time – Elemental
"Mental idle time, meanwhile, seems to facilitate creativity and problem-solving. “Our research has found that mind-wandering may foster a particular kind of productivity,” says Jonathan Schooler, a professor of psychological and brain sciences at the University of California, Santa Barbara who has studied mind-wandering extensively. He says overcoming impasses — including what he calls “a-ha!” moments — often happen when people’s minds are free to roam.

Schooler mentions the common experience of not being able to recall a word that’s on the tip of your tongue — no matter how hard you try to think of it. But as soon as you move on to another mental task, the word pops into your head. “I think it’s very possible that some unconscious processes are going on during mind-wandering, and the insights these processes produce then bubble up to the surface,” he says.

It’s also possible that depriving the brain of free time stifles its ability to complete this unconscious work. “I think we need to recognize that the brain’s internal train of thought can be of value in itself,” Schooler says. “In the same way we can experience a sleep deficit, I think we can experience a mind-wandering deficit.”

“Many people find it difficult or stressful to do absolutely nothing,” he adds. Instead, Schooler says “non-demanding” tasks that don’t require much mental engagement seem to be best at fostering “productive” mind-wandering. He mentions activities like going for a walk in a quiet place, doing the dishes, or folding laundry — chores that may occupy your hands or body but that don’t require much from your brain.

While a wandering mind can slip into some unhelpful and unhealthy states of rumination, that doesn’t mean blocking these thoughts with constant distraction is the way to go. “I think it’s about finding balance between being occupied and in the present and letting your mind wander — [and] about thinking positive thoughts and thinking about obstacles that may stand in your way,” says Schooler.

There may be no optimal amount of time you can commit to mental freedom to strike that balance. But if you feel like it takes “remarkable effort” for you to disengage from all your favorite sources of mental stimulation, that’s probably a good sign you need to give your brain more free time, Immordino-Yang says. “To just sit and think is not pleasant when your brain is trained out of practicing that, but that’s really important for well-being,” she adds.

Frank recommends starting small — maybe take a 15-minute, distraction-free walk in the middle of your day. “You might find your world changes,” he says."
brain  jonathnschooler  idleness  2019  cognition  psychology  neuroscience  downtime  daydreaming  mindwandering  walking  quiet  chores  mentalload  cognitiveload  thinking  howwethink  epiphanies  creativity  problemsolving  mentalhealth  attention  distraction  doingnothing 
may 2019 by robertogreco
The Design Thinking Movement is Absurd – Lee Vinsel – Medium
"A couple of years ago, I saw a presentation from a group known as the University Innovation Fellows at a conference in Washington, DC. The presentation was one of the weirder and more disturbing things I’ve witnessed in an academic setting.

The University Innovation Fellows, its webpage states, “empowers students to become leaders of change in higher education. Fellows are creating a global movement to ensure that all students gain the necessary attitudes, skills, and knowledge to compete in the economy of the future.” You’ll notice this statement presumes that students aren’t getting the “attitudes, skills, and knowledge” they need and that, more magically, the students know what “attitudes, skills, and knowledge” they themselves need for . . . the future.

The UIF was originally funded by the National Science Foundation and led by VentureWell, a non-profit organization that “funds and trains faculty and student innovators to create successful, socially beneficial businesses.” VentureWell was founded by Jerome Lemelson, who some people call “one of the most prolific American inventors of all time” but who really is most famous for virtually inventing patent trolling. Could you imagine a more beautiful metaphor for how Design Thinkers see innovation? Socially beneficial, indeed.

Eventually, the UIF came to find a home in . . . you guessed it, the d.school.

It’s not at all clear what the UIF change agents do on their campuses . . . beyond recruiting other people to the “movement.” A blog post titled, “Only Students Could Have This Kind of Impact,” describes how in 2012 the TEDx student representatives at Wake Forest University had done a great job recruiting students to their event. It was such a good job that it was hard to see how others would match it the next year. But, good news, the 2013 students were “killing it!” Then comes this line (bolding and capitalization in the original):

*THIS* is Why We Believe Students Can Change the World

Because they can fill audiences for TED talks, apparently. The post goes on, “Students are customers of the educational experiences colleges and universities are providing them. They know what other students need to hear and who they need to hear it from. . . . Students can leverage their peer-to-peer marketing abilities to create a movement on campus.”

Meanwhile, UIF blog posts with titles like “Columbia University — Biomedical Engineering Faculty Contribute to Global Health,” which examine the creation of potentially important new things, mostly focus on individuals with the abbreviation “Dr.” before their names, which is what you’d expect given that making noteworthy contributions to science and engineering typically takes years of hard work.

At its gatherings, the UIF inducts students into all kinds of innovation-speak and paraphernalia. They stand around in circles, filling whiteboards with Post-It Notes. Unsurprisingly, the gatherings include sessions on topics like “lean startups” and Design Thinking. The students learn crucial skills during these Design Thinking sessions. As one participant recounted, “I just learned how to host my own TEDx event in literally 15 minutes from one of the other fellows.”

The UIF has many aspects of classic cult indoctrination, including periods of intense emotional highs, giving individuals a special lingo barely recognizable to outsiders, and telling its members that they are different and better than ordinary others — they are part of a “movement.” Whether the UIF also keeps its fellows from getting decent sleep and feeds them only peanut butter sandwiches is unknown.

This UIF publicity video contains many of the ideas and trappings so far described in this essay. Watch for all the Post-It notes, whiteboards, hoodies, look-alike black t-shirts, and jargon, like change agents.

When I showed a friend this video, after nearly falling out of his chair, he exclaimed, “My God, it’s the Hitlerjugend of contemporary bullshit!”

Tough but fair? Personally, I think that’s a little strong. A much better analogy to my mind is Chairman Mao’s Cultural Revolution.

When I saw the University Innovation Fellows speak in Washington, DC, a group of college students got up in front of the room and told all of us that they were change agents bringing innovation and entrepreneurship to their respective universities. One of the students, a spritely slip of a man, said something like, “Usually professors are kind of like this,” and then he made a little mocking weeny voice — wee, wee, wee, wee. The message was that college faculty and administrators are backwards thinking barriers that get in the way of this troop of thought leaders.

After the presentation, a female economist who was sitting next to me told the UIFers that she had been a professor for nearly two decades, had worked on the topic of innovation that entire time, and had done a great deal to nurture and advance the careers of her students. She found the UIF’s presentation presumptuous and offensive. When the Q&A period was over, one of UIF’s founders and co-directors, Humera Fasihuddin, and the students came running over to insist that they didn’t mean faculty members were sluggards and stragglers. But those of us sitting at the table were like, “Well then, why did you say it?”

You might think that this student’s antics were a result of being overly enthusiastic and getting carried away, but you would be wrong. This cultivated disrespect is what the UIF teaches its fellows. That young man was just parroting what he’d been taught to say.

A UIF blog post titled “Appealing to Your University’s Faculty and Staff” lays it all out. The author refers to Fasihuddin as a kind of guru figure, “If you participated in the Fall 2013 cohort, you may recall Humera repeating a common statement throughout session 5, ‘By connecting to other campuses that have been successful, and borrowing from those ideas you hear from your UIF peers, it removes the fear of the unknown for the faculty.”

Where does the faculty’s fear come from? The blog post explains, “The unfortunate truth in [Humera’s] statement is that universities are laggards (i.e. extremely slow adopters). The ironic part is universities shouldn’t be, and we as University Innovation Fellows, understand this.”

Now, on the one hand, this is just Millennial entitlement all hopped up on crystal meth. But on the other hand, there is something deeper and more troubling going on here. The early innovation studies thinker Everett Rogers used the term “laggard” in this way to refer to the last individuals to adopt new technologies. But in the UIF, Rogers’ vision becomes connected to the more potent ideology of neoliberalism: through bodies of thought like Chicago School economics and public choice theory, neoliberalism sees established actors as self-serving agents who only look to maintain their turf and, thus, resist change.

This mindset is quite widespread among Silicon Valley leaders. It’s what led billionaire Ayn Rand fan Peter Thiel to put $1.7 million into The Seasteading Institute, an organization that, it says, “empowers people to build floating startup societies with innovative governance models.” Seasteaders want to build cities that would float around oceans, so they can escape existing governments and live in libertarian, free market paradise. It’s the same notion undergirding the Silicon Valley “startup accelerator” YCombinator’s plan to build entire cities from scratch because old ones are too hard to fix. Elon Musk pushes this view when he tweets things, like “Permits are harder than technology,” implying that the only thing in the way of his genius inventions are other human beings — laggards, no doubt. Individuals celebrated this ideological vision, which holds that existing organizations and rules are mere barriers to entrepreneurial action, when Uber-leader Travis Kalanick used a piece of software to break city laws. And then they were shocked, shocked, shocked when Kalanick turned out to be a total creep.

Now, if you have never been frustrated by bureaucracy, you have not lived. Moreover, when I was young, I often believed my elders were old and in the way. But once you grow up and start getting over yourself, you come to realize that other people have a lot to teach you, even when — especially when — they disagree with you.

This isn’t how the UIF sees things. The blog post “Appealing to Your University’s Faculty and Staff” advises fellows to watch faculty members’ body language and tone of voice. If these signs hint that the faculty member isn’t into what you’re saying — or if he or she speaks as if you are not an “equal” or “down at you” — the UIF tells you to move on and find a more receptive audience. The important thing is to build the movement. “So I close with the same recurring statement,” the blog post ends, “By connecting to other campuses that have been successful . . . it removes the fear of the unknown for faculty.”

Is there any possibility that the students themselves could just be off-base? Sure, if while you are talking someone’s body tightens up or her head looks like it’s going to explode or her voice changes or she talks down to you and doesn’t treat you as an equal, it could be because she is a demonic, laggard-y enemy of progress, or it could be because you are being a fucking moron — an always-embarrassing realization that I have about myself far more often than I’d like to admit. Design Thinkers and the UIF teach a thoroughly adolescent conception of culture.

Edmund Burke once wrote, “You had all of these advantages . . . but you chose to act as if you had never been molded into civil society, and had everything to begin anew. You began ill, because you began by despising everything that belonged to you.” The brain-rotting … [more]
leevinsel  designthinking  2018  d.school  tedtalks  tedx  cults  innovation  daveevans  design  d.life  humerafasihuddin  edmundburke  natashajen  herbertsimon  peterrowe  robertmckim  petermiller  liberalarts  newage  humanpotentialmovement  esaleninstitute  stanford  hassoplattner  davidkelly  johnhennessy  business  education  crit  post-its  siliconvalley  architecture  art  learning  elitism  designimperialism  ideo  playpump  openideo  thommoran  colonialism  imperialism  swiffer  andrewrussell  empathy  problemsolving  delusion  johnleary  stem  steam  margaretbrindle  peterstearns  christophermckenna  georgeorwell  thinking  howwwethink  highered  highereducation  tomkelly  nathanrosenberg  davidmowery  stevenklepper  davidhounshell  patrickmccray  marianamazzucato  commercialization  civilrightsmovement  criticism  bullshit  jeromelemelson  venturewell  maintenance  themaintainers  maintainers  cbt  psychology  hucksterism  novelty  ruthschwartzcowan  davidedgerton 
may 2019 by robertogreco
Anxiety ‘epidemic’ brewing on college campuses | University of California
"The number of 18- to 26-year-old students who report suffering from anxiety disorder has doubled since 2008, perhaps as a result of rising financial stress and increased time spent on digital devices, according to preliminary findings released Thursday by a team of UC Berkeley researchers.

The percentage of all students nationally who reported being diagnosed with or treated for anxiety disorder climbed from 10 percent in 2008 to 20 percent in 2018, according to the findings by a research team led by Richard Scheffler, a professor at the Goldman School of Public Policy and School of Public Health.

Rates of anxiety disorder grew at higher rates for students who identified as transgender, Latinx and black, and they increased the closer all students got to graduation.

“It is what I am calling a ‘new epidemic,’ and that the data supports using that term, on college campuses,” Scheffler said. “We need a heightened national awareness of this very serious epidemic.”

Scheffler and his team examined nine years of data from the annual student National College Health Assessment survey and the National Longitudinal Survey of Youth — two nationwide examinations of student well-being. The group also conducted 45-minute interviews with 30 UC Berkeley students who identified as suffering from anxiety.

While Scheffler said he cannot firmly establish the causes for the rise in anxiety, he found strong correlations between anxiety disorder and financial instability, the amount of leisure time spent on digital devices and the level of education attained by a young adult’s mother.

“The correlations and the data are pretty powerful,” he said.

Factors increasing anxiety

Specifically, the findings show that:

• Young adults who come from families that have trouble paying bills are 2.7 times more likely to have anxiety than students who come from families that have no difficulty paying bills.

• Young adults who spend more than 20 hours of leisure time per week on digital devices were 53 percent more likely to have anxiety than young adults who spend fewer than 5 hours a week on digital devices.

• Young adults with mothers who had at least an undergraduate degree had a 45 percent greater chance of having anxiety than young adults whose mothers had less than a college degree. The surveys used in the analysis did not ask about the fathers’ level of education.

Scheffler also found that anxiety is associated with other serious problems beyond the overwhelming feelings of worry or nervousness associated with the disorder.

A student with anxiety is 3.2 times more likely to abuse alcohol or drugs, the findings show. Other negative outcomes correlated with anxiety included increased probability of having been sexually assaulted or attempting suicide.

All factors being equal, Scheffler also found that between 2008 and 2014, young adults with anxiety earned 11 percent less than those without anxiety.

“Anxiety has really very dire consequences for these students,” Scheffler said. “That’s a lot of pain and suffering.”

‘Something’s going on here’
Scheffler, who joined UC Berkeley’s faculty in 1981, said he first began thinking about student anxiety 10 years ago, when he looked out at the 100 students in his lecture hall and saw faces stricken with worry.

“More than half the students were not looking at me, they were looking at their phones or their computers,” Scheffler said. “I told everyone to turn their phones off and put their computers away. I had four or five students who were so addicted, they could not do it. I actually had to go and take their phones away from them.”

“I said to myself, ‘You know, something’s going on here,’” he added. “That was the beginning. And then I watched for several years.”

While Scheffler doesn’t make policy recommendations in his preliminary findings, he said the first step in dealing with the rise in anxiety is increasing awareness among faculty and college administrators.

“I want the faculty and the university leadership here at UC Berkeley and across the country to know that this epidemic is out there, and they need to understand it,” he said. “The students need help.”

To that end, the Berkeley Institute for the Future of Young Americans, a research center affiliated with the Goldman School of Public Policy, is hosting a panel discussion on the findings Thursday afternoon that will feature Chancellor Carol Christ, student leaders and UC Berkeley health system administrators.

“We know that Millennials and GenZ are experiencing anxiety like no previous generation,” said Sarah Swanbeck, the executive director of the institute. “While there’s a lot we still don’t know about what’s causing the spike in anxiety disorder, this report highlights that the problem is actually getting worse. It’s an important signal to college administrators that much more must be done to tackle this issue.”

Scheffler will present his findings and hopes the audience of deans, counselors, students and program coordinators will take his message to heart.

UC Berkeley offers a number of resources for students, including counseling at the Tang Center, help meeting basic needs and other programs."
education  highered  highereducation  colleges  universities  anxiety  mentalhealth  psychology  2019 
april 2019 by robertogreco
Are.na Blog / Workshop Debrief: How to Use the Internet Mindfully
"Last weekend I got to collaborate with Willa Köerner of The Creative Independent (TCI) to facilitate a workshop at IAM Weekend, called “How to Use the Internet Mindfully.” The workshop built on an essay series TCI and Are.na published together last year, which asked a group of artists to reflect on the habits and philosophies that help them contend with the online attention economy. This time we wanted to do something similar in person, in a space where creative internet people could talk about our feelings together.

We asked participants to complete a worksheet designed to help them get a better handle on their internet and technology habits. (You can download the worksheet if you’d like to try this—it takes about 35 minutes to complete). The first step was making a mind map of one’s various screen-based activities. Using different colors, everyone then labeled those activities as either harmful or helpful on a personal level. Finally, people jotted down a few “relationship goals” between them and the Internet and brainstormed practical steps for building up their personal agency.

We spent the last part of the workshop sharing results with one another and thinking about reclaiming the web as an intimate, creative social space. Lots of interesting ideas emerged in our conversation, so I want to highlight a few things here that stood out in particular:

1. We often have mixed feelings about certain tools (and specific ways of using those tools). For example, posting to Instagram can be an exploratory and rewarding creative process. But the anxiety about “likes” that comes afterward usually feels empty and harmful. It’s hard to reconcile these opposing feelings within the realm of personal behavior. While we know that we’re ultimately in control of our own behavior, we also know that apps like Instagram are designed to promote certain patterns of use. We don’t want to quit altogether, but we’re struggling to swim against the current of “persuasive” tech.

2. We don’t have enough spaces for talking about the emotional side effects of living with the web. Before we really dug into strategies for using the Internet more mindfully, participants really wanted to share their feelings about social media, Internet burnout, and how the two are connected. We talked about mental health and how hard it is to feel in control of apps that are essentially designed for dependency. We discussed how few of us feel happy with our habits, even though everyone’s experience is different. We wondered about the stigma that surrounds any form of “addiction,” and whether it’s ok to talk about widespread Internet use in those terms. I’m really glad these questions bubbled up, since they helped build enough trust in the room to share the more personal elements of each person’s mind map.

3. We all want to feel personal autonomy, which takes many different forms. We had a lively exchange about different ways to limit the amount of digital junk food we allow ourselves to consume. Apple’s new screen-time tracker was one example that drew mixed responses. Some people felt that a subtle reminder helped, while others felt it was totally ineffective. Some preferred to impose a hard limit on themselves through a tool like Self Control, while others rejected the premise of measuring screen time in the first place. A lot of participants focused on wanting to control their own experience, whether by owning one’s own content or simply feeling enough agency to decide how to navigate the web. We talked a bit about the dilemma of feeling like our decision-making psychology has been “hacked” by addictive design, and how crappy it feels to replace our own intuition with another technical solution. We also acknowledged that setting our own boundaries means spending even more time and emotional capital than our apps have already taken from us. That additional effort is labor we consumers complete for free, even if we don’t usually see it that way.

4. The web feels too big for healthy interaction. We also talked about how using mainstream social media platforms these days can feel like shouting into a giant room with everyone else on Earth. Many of the healthy spaces where participants felt they could genuinely share ideas were ones where they put considerable time and emotional labor into building an intimate social context. People had a lot to say about the fact that users are locked in to their online personas with all kinds of personal and professional incentives. You simply can’t stop looking, or downsize your social circles, or abandon your long-term presence, without breaking an informal social contract you never realized you signed.

The context of the conference also made me think about how we frame the work we put into our relationship with technology. When we get in front of a group, what kind of “solutions” should we be advocating? At what point do individual strategies lead to politics and advocacy?

When you focus on personal habits for long enough, it’s easy to process societal issues as problems originating in your own behavior. But as with other kinds of “self-help,” this is a framing that ignores a grotesque power dynamic. Addiction and burnout are not only matters of consumer choice, but the costs of business decisions made by enormous technology companies. The tech industry – like big tobacco and big oil – has knowingly caused a set of serious social problems and then pushed the work of remediating them onto individual consumers. Now it’s up to users to defend themselves with tools like browser plug-ins and VPNs and finstas and time trackers. As we keep talking about using the internet mindfully, I hope we can connect the dots between this kind of individual action and the larger project of securing universal rights to privacy, anonymity, and personal autonomy. By asking ourselves which tools we want to use, and how we want to use them, hopefully we can open up a broader conversation about how we move beyond surveillance capitalism itself.

I’d be interested in talking more about these connections between individual and collective actions if we get to repeat the workshop. It would be great to work with a smaller group, simplify the worksheet slightly, and get really specific about what questions we’re trying to answer. I’d like to draw on a few other ways of thinking as well, like the Human Systems framework for example. If you’d be interested in collaborating, or just have thoughts on any of this, please send one of us an email: leo@are.na or willa@kickstarter.com. We’d love to hear your thoughts."
internet  mindfulness  are.na  2019  leoshaw  willaköerner  web  online  autonomy  technology  politics  advocacy  browsers  extensions  plug-ins  vpns  finstas  trackers  surveillancecapitalism  surveillance  self-help  power  socialmedia  presence  socialcontract  attention  psychology  burnout  addiction  instagram  creativity  likes  behavior 
april 2019 by robertogreco
How Inuit Parents Raise Kids Without Yelling — And Teach Them To Control Anger : Goats and Soda : NPR
"Across the board, all the moms mention one golden rule: Don't shout or yell at small children.

Traditional Inuit parenting is incredibly nurturing and tender. If you took all the parenting styles around the world and ranked them by their gentleness, the Inuit approach would likely rank near the top. (They even have a special kiss for babies, where you put your nose against the cheek and sniff the skin.)

The culture views scolding — or even speaking to children in an angry voice — as inappropriate, says Lisa Ipeelie, a radio producer and mom who grew up with 12 siblings. "When they're little, it doesn't help to raise your voice," she says. "It will just make your own heart rate go up."

Even if the child hits you or bites you, there's no raising your voice?

"No," Ipeelie says with a giggle that seems to emphasize how silly my question is. "With little kids, you often think they're pushing your buttons, but that's not what's going on. They're upset about something, and you have to figure out what it is."

Traditionally, the Inuit saw yelling at a small child as demeaning. It's as if the adult is having a tantrum; it's basically stooping to the level of the child, Briggs documented.

Elders I spoke with say intense colonization over the past century is damaging these traditions. And, so, the community is working hard to keep the parenting approach intact.

Goota Jaw is at the front line of this effort. She teaches the parenting class at the Arctic College. Her own parenting style is so gentle that she doesn't even believe in giving a child a timeout for misbehaving.

"Shouting, 'Think about what you just did. Go to your room!' " Jaw says. "I disagree with that. That's not how we teach our children. Instead you are just teaching children to run away."

And you are teaching them to be angry, says clinical psychologist and author Laura Markham. "When we yell at a child — or even threaten with something like 'I'm starting to get angry,' we're training the child to yell," says Markham. "We're training them to yell when they get upset and that yelling solves problems."

In contrast, parents who control their own anger are helping their children learn to do the same, Markham says. "Kids learn emotional regulation from us."

I asked Markham if the Inuit's no-yelling policy might be their first secret of raising cool-headed kids. "Absolutely," she says."



"What Briggs documented is a central component to raising cool-headed kids.

When a child in the camp acted in anger — hit someone or had a tantrum — there was no punishment. Instead, the parents waited for the child to calm down and then, in a peaceful moment, did something that Shakespeare would understand all too well: They put on a drama. (As the Bard once wrote, "the play's the thing wherein I'll catch the conscience of the king.")

"The idea is to give the child experiences that will lead the child to develop rational thinking," Briggs told the CBC in 2011.

In a nutshell, the parent would act out what happened when the child misbehaved, including the real-life consequences of that behavior.

The parent always had a playful, fun tone. And typically the performance starts with a question, tempting the child to misbehave.

For example, if the child is hitting others, the mom may start a drama by asking: "Why don't you hit me?"

Then the child has to think: "What should I do?" If the child takes the bait and hits the mom, she doesn't scold or yell but instead acts out the consequences. "Ow, that hurts!" she might exclaim.

The mom continues to emphasize the consequences by asking a follow-up question. For example: "Don't you like me?" or "Are you a baby?" She is getting across the idea that hitting hurts people's feelings, and "big girls" wouldn't hit. But, again, all questions are asked with a hint of playfulness.

The parent repeats the drama from time to time until the child stops hitting the mom during the dramas and the misbehavior ends.

Ishulutak says these dramas teach children not to be provoked easily. "They teach you to be strong emotionally," she says, "to not take everything so seriously or to be scared of teasing."

Psychologist Peggy Miller, at the University of Illinois, agrees: "When you're little, you learn that people will provoke you, and these dramas teach you to think and maintain some equilibrium."

In other words, the dramas offer kids a chance to practice controlling their anger, Miller says, during times when they're not actually angry.

This practice is likely critical for children learning to control their anger. Because here's the thing about anger: Once someone is already angry, it is not easy for that person to squelch it — even for adults.

"When you try to control or change your emotions in the moment, that's a really hard thing to do," says Lisa Feldman Barrett, a psychologist at Northeastern University who studies how emotions work.

But if you practice having a different response or a different emotion at times when you're not angry, you'll have a better chance of managing your anger in those hot-button moments, Feldman Barrett says.

"That practice is essentially helping to rewire your brain to be able to make a different emotion [besides anger] much more easily," she says.

This emotional practice may be even more important for children, says psychologist Markham, because kids' brains are still developing the circuitry needed for self-control.

"Children have all kinds of big emotions," she says. "They don't have much prefrontal cortex yet. So what we do in responding to our child's emotions shapes their brain."

Markham recommends an approach close to that used by Inuit parents. When the kid misbehaves, she suggests, wait until everyone is calm. Then in a peaceful moment, go over what happened with the child. You can simply tell them the story about what occurred or use two stuffed animals to act it out.

"Those approaches develop self-control," Markham says.

Just be sure you do two things when you replay the misbehavior, she says. First, keep the child involved by asking many questions. For example, if the child has a hitting problem, you might stop midway through the puppet show and ask, "Bobby wants to hit right now. Should he?"

Second, be sure to keep it fun. Many parents overlook play as a tool for discipline, Markham says. But fantasy play offers oodles of opportunities to teach children proper behavior.

"Play is their work," Markham says. "That's how they learn about the world and about their experiences."

Which seems to be something the Inuit have known for hundreds, perhaps even thousands, of years."
anger  parenting  2019  anthropology  psychology  inuit  children  yelling  self-control  punishment  emotions  behavior 
april 2019 by robertogreco
For Anxious Kids, Parents May Need To Learn To Let Them Face Their Fears : Shots - Health News : NPR
"For instance, when Joseph would get scared about sleeping alone, Jessica and her husband, Chris Calise, did what he asked and comforted him. "In my mind, I was doing the right thing," she says. "I would say, 'I'm right outside the door' or 'Come sleep in my bed.' I'd do whatever I could to make him feel not anxious or worried."

But this comforting — something psychologists call accommodation — can actually be counterproductive for children with anxiety disorders, Lebowitz says.

"These accommodations lead to worse anxiety in their child, rather than less anxiety," he says. That's because the child is always relying on the parents, he explains, so kids never learn to deal with stressful situations on their own and never learn they have the ability to cope with these moments.

"When you provide a lot of accommodation, the unspoken message is, 'You can't do this, so I'm going to help you,' " he says.

Lebowitz wondered if it would help to train parents to change that message and to encourage their children to face anxieties rather than flee from them.

Currently the established treatment for childhood anxiety is cognitive behavioral therapy delivered directly to the child.

When researchers have tried to involve parents in their child's therapy in the past, the outcomes from studies suggested that training parents in cognitive behavioral therapy didn't make much of a difference for the child's recovery. Lebowitz says that this might be because cognitive behavioral therapy asks the child to change their behavior. "When you ask the parents to change their child's behavior, you are setting them up for a very difficult interaction," he says.

Instead, Lebowitz's research explores whether training only the parents without including direct child therapy can help. He is running experiments to compare cognitive behavioral therapy for the child with parent-only training. A study of the approach appeared in the Journal of the American Academy of Child & Adolescent Psychiatry last month."
children  parenting  anxiety  2019  elilebowitz  fear  psychology  accommodation  comfort  behavior 
april 2019 by robertogreco
Ben Franklin Effect: Ask someone for a favor to make them like you - Business Insider
"No one likes to feel like a mooch.

Which is why asking someone to do you a favor — proofread your résumé, walk your dog, loan you $20 because you forgot this was a cash-only restaurant — can be so stressful.

But if you're stressing because you feel like the person helping you out will find you annoying and like you less, don't. There's a psychological phenomenon commonly known as the "Ben Franklin Effect" that explains why people wind up liking you more when they do you a favor.

David McRaney, author of the book "You Are Not So Smart," explains how the phenomenon got its name on YouAreNotSoSmart.com. Supposedly, Benjamin Franklin had a hater — someone he considered a "gentleman of fortune and education" who would probably become influential in government.

In order to recruit the hater to his side, Franklin decided to ask the man if he could borrow one of the books from his library. The man was flattered and lent it; Franklin returned it one week later with a thank-you note.

The next time they saw each other, the man was exceedingly friendly to Franklin and Franklin said they stayed friends until the man died.

When psychologists tested the Ben Franklin effect in 1969, they found the effect really did hold water. In the small study, volunteers participated in an experiment in which they could win money.

One-third of the volunteers were then approached by a secretary who said that the psychology department had paid for the study and funds were running out, and asked the volunteer to return the payment. One-third were approached by the experimenter and told that he himself had paid for the study and funds were running out, and asked the volunteer to return the payment. The final third were allowed to keep their money.

Results showed that volunteers liked the experimenter most when they'd done him the favor of returning his money, and least when they'd gotten to keep their money.

In other words, the researchers concluded, doing someone a favor makes us like that person more. The researchers suspected that the Ben Franklin effect works because of "cognitive dissonance": We find it difficult to reconcile the fact that we did someone a favor and we hate them, so we assume that we like them.

More recently, another psychologist conducted a similar, small study on the Ben Franklin effect in the United States and Japan.

Participants in both countries ended up liking another person who was presumably working on the same task more when he asked for help completing a project than when he didn't. Interestingly, however, they didn't like that person more when the experimenter asked them to help that person.

The psychologist behind this study, Yu Niiya of Hosei University in Tokyo, therefore suggests that the Ben Franklin effect isn't a result of cognitive dissonance. Instead, she says it happens because the person being asked for help can sense that the person asking for help wants to get chummy with them and in turn reciprocates the liking.

Regardless of the specific mechanism behind the Ben Franklin Effect, the bottom line is that you shouldn't freak out every time you ask someone to lend a hand. In fact, you can deploy your requests for help strategically, a la Franklin, to win over detractors."
psychology  2016  favors  vulnerability  relationships 
march 2019 by robertogreco
I Embraced Screen Time With My Daughter—and I Love It | WIRED
I often turn to my sister, Mimi Ito, for advice on these issues. She has raised two well-adjusted kids and directs the Connected Learning Lab at UC Irvine, where researchers conduct extensive research on children and technology. Her opinion is that “most tech-privileged parents should be less concerned with controlling their kids’ tech use and more about being connected to their digital lives.” Mimi is glad that the American Academy of Pediatrics (AAP) dropped its famous 2x2 rule—no screens for the first two years, and no more than two hours a day until a child hits 18. She argues that this rule fed into stigma and parent-shaming around screen time at the expense of what she calls “connected parenting”—guiding and engaging in kids’ digital interests.

One example of my attempt at connected parenting is watching YouTube together with Kio, singing along with Elmo as Kio shows off the new dance moves she’s learned. Every day, Kio has more new videos and favorite characters that she is excited to share when I come home, and the songs and activities follow us into our ritual of goofing off in bed as a family before she goes to sleep. Her grandmother in Japan is usually part of this ritual in a surreal situation where she is participating via FaceTime on my wife’s iPhone, watching Kio watching videos and singing along and cheering her on. I can’t imagine depriving us of these ways of connecting with her.

The (Unfounded) War on Screens

The anti-screen narrative can sometimes read like the War on Drugs. Perhaps the best example is Glow Kids, in which Nicholas Kardaras tells us that screens deliver a dopamine rush rather like sex. He calls screens “digital heroin” and uses the term “addiction” when referring to children unable to self-regulate their time online.

More sober (and less breathlessly alarmist) assessments by child psychologists and data analysts offer a more balanced view of the impact of technology on our kids. Psychologist and baby observer Alison Gopnik, for instance, notes: “There are plenty of mindless things that you could be doing on a screen. But there are also interactive, exploratory things that you could be doing.” Gopnik highlights how feeling good about digital connections is a normal part of psychology and child development. “If your friends give you a like, well, it would be bad if you didn’t produce dopamine,” she says.

Other research has found that the impact of screens on kids is relatively small, and even the conservative AAP says that cases of children who have trouble regulating their screen time are not the norm, representing just 4 percent to 8.5 percent of US children. This year, Andrew Przybylski and Amy Orben conducted a rigorous analysis of data on more than 350,000 adolescents and found a nearly negligible effect on psychological well-being at the aggregate level.

In their research on digital parenting, Sonia Livingstone and Alicia Blum-Ross found widespread concern among parents about screen time. They posit, however, that “screen time” is an unhelpful catchall term and recommend that parents focus instead on quality and joint engagement rather than just quantity. The Connected Learning Lab’s Candice Odgers, a professor of psychological sciences, reviewed the research on adolescents and devices and found as many positive as negative effects. She points to the consequences of unbalanced attention on the negative ones. “The real threat isn’t smartphones. It’s this campaign of misinformation and the generation of fear among parents and educators.”

We need to immediately begin rigorous, longitudinal studies on the effects of devices and the underlying algorithms that guide their interfaces and their interactions with and recommendations for children. Then we can make evidence-based decisions about how these systems should be designed, optimized for, and deployed among children, and not put all the burden on parents to do the monitoring and regulation.

My guess is that for most kids, this issue of screen time is statistically insignificant in the context of all the other issues we face as parents—education, health, day care—and for those outside my elite tech circles even more so. Parents like me, and other tech leaders profiled in a recent New York Times series about tech elites keeping their kids off devices, can afford to hire nannies to keep their kids off screens. Our kids are the least likely to suffer the harms of excessive screen time. We are also the ones least qualified to be judgmental about other families who may need to rely on screens in different ways. We should be creating technology that makes screen entertainment healthier and fun for all families, especially those who don’t have nannies.

I’m not ignoring the kids and families for whom digital devices are a real problem, but I believe that even in those cases, focusing on relationships may be more important than focusing on controlling access to screens.

Keep It Positive

One metaphor for screen time that my sister uses is sugar. We know sugar is generally bad for you and has many side effects and can be addictive to kids. However, the occasional bonding ritual over milk and cookies might have more benefit to a family than an outright ban on sugar. Bans can also backfire, fueling binges and shame as well as mistrust and secrecy between parents and kids.

When parents allow kids to use computers, they often use spying tools, and many teens feel parental surveillance is invasive to their privacy. One study showed that using screen time to punish or reward behavior actually increased net screen time use by kids. Another study by Common Sense Media shows what seems intuitively obvious: Parents use screens as much as kids. Kids model their parents—and have a laserlike focus on parental hypocrisy.

In Alone Together, Sherry Turkle describes the fracturing of family cohesion because of the attention that devices get and how this has disintegrated family interaction. While I agree that there are situations where devices are a distraction—I often declare “laptops closed” in class, and I feel that texting during dinner is generally rude—I do not feel that iPhones necessarily draw families apart.

In the days before the proliferation of screens, I ran away from kindergarten every day until they kicked me out. I missed more classes than any other student in my high school and barely managed to graduate. I also started more extracurricular clubs in high school than any other student. My mother actively supported my inability to follow rules and my obsessive tendency to pursue my interests and hobbies over those things I was supposed to do. In the process, she fostered a highly supportive trust relationship that allowed me to learn through failure and sometimes get lost without feeling abandoned or ashamed.

It turns out my mother intuitively knew that it’s more important to stay grounded in the fundamentals of positive parenting. “Research consistently finds that children benefit from parents who are sensitive, responsive, affectionate, consistent, and communicative,” says education professor Stephanie Reich, another member of the Connected Learning Lab who specializes in parenting, media, and early childhood. One study shows measurable cognitive benefits from warm and less restrictive parenting.

When I watch my little girl learning dance moves from every earworm video that YouTube serves up, I imagine my mother looking at me while I spent every waking hour playing games online, which was my pathway to developing my global network of colleagues and exploring the internet and its potential early on. I wonder what wonderful as well as awful things will have happened by the time my daughter is my age, and I hope a good relationship with screens and the world beyond them can prepare her for this future."
joiito  parenting  screentime  mimiito  techology  screens  children  alisongopnik  2019  computers  computing  tablets  phones  smartphones  mobile  nicholaskardaras  addiction  prohibition  andrewprzybylski  aliciablum-ross  sonialvingstone  amyorben  adolescence  psychology  candiceodgers  research  stephaniereich  connectedlearning  learning  schools  sherryturkle  trust 
march 2019 by robertogreco
Yong Zhao "What Works May Hurt: Side Effects in Education" - YouTube
"Proponents of standardized testing and privatization in education have sought to prove their effectiveness in improving education with an abundance of evidence. These efforts, however, can have dangerous side effects, causing long-lasting damage to children, teachers, and schools. Yong Zhao, Foundation Distinguished Professor in the School of Education at the University of Kansas, will argue that education interventions are like medical products: They can have serious, sometimes detrimental, side effects while also providing cures. Using standardized testing and privatization as examples, Zhao, author of the internationally bestselling Who’s Afraid of the Big Bad Dragon? Why China Has the Best (and Worst) Education System in the World, will talk about his new book on why and how pursuing a narrow set of short-term outcomes causes irreparable harm in education."
yongzhao  2018  schools  schooling  pisa  education  testing  standardizedtesting  standardization  china  us  history  testscores  children  teaching  howweteach  howwelearn  sideeffects  privatization  tims  math  reading  confidence  assessment  economics  depression  diversity  entrepreneurship  japan  creativity  korea  vietnam  homogenization  intolerance  prosperity  tolerance  filtering  sorting  humans  meritocracy  effort  inheritance  numeracy  literacy  achievementgap  kindergarten  nclb  rttt  policy  data  homogeneity  selectivity  charterschools  centralization  decentralization  local  control  inequity  curriculum  autonomy  learning  memorization  directinstruction  instruction  poverty  outcomes  tfa  teachforamerica  finland  singapore  miltonfriedman  vouchers  resilience  growthmindset  motivation  psychology  research  positivepsychology  caroldweck  intrinsicmotivation  choice  neoliberalism  high-stakestesting 
march 2019 by robertogreco
On Instagram, Seeing Between the (Gender) Lines - The New York Times
"SOCIAL MEDIA HAS TURNED OUT TO BE THE PERFECT TOOL FOR NONBINARY PEOPLE TO FIND — AND MODEL — THEIR UNIQUE PLACES ON THE GENDER SPECTRUM."



"Around the same time, Moore became aware of a performance-and-poetry group (now disbanded) called Dark Matter. Moore became transfixed by videos of one of its members, Alok Vaid-Menon, who was able to eloquently dismiss conventional notions of gender, particularly the idea that there are only two. Seeing people like Vaid-Menon online gave Moore the courage to reconsider how they approached gender. Moore began experimenting with their outward appearance. Before Moore changed the pronoun they used, Moore had favored a more masculine, dandy-like aesthetic — close-cropped hair, button-down shirts and bow ties — in large part to fit in at work. Moore began wearing their hair longer and often chose less gender-specific clothing, like T-shirts or boxy tops, which felt more natural and comfortable to them. Vaid-Menon’s assuredness, Moore said, “boosted my confidence in terms of defining and asserting my own identity in public spaces.”

A shift in technology emboldened Moore, too. In 2014, Facebook updated its site to include nonbinary gender identities and pronouns, adding more than 50 options for users who don’t identify as male or female, including agender, gender-questioning and intersex. It was a profound moment for Moore. “They had options I didn’t even know about,” Moore told me. That summer, Moore selected “nonbinary,” alerting their wider social spheres, including childhood friends and family members who also used the site. For Moore, it saved them some of the energy of having to explain their name and pronoun shift. Moore also clarified their gender pronouns on Instagram. “I wrote it into my profile to make it more explicit.” To some, the act might seem small, but for Moore, their identity “felt crystallized, and important.”

Several societies and cultures understand gender as more varied than just man or woman, but in the United States, a gender binary has been the norm. “In our cultural history, we’ve never had anything close to a third category, or even the notion that you could be in between categories,” said Barbara Risman, a sociology professor at the University of Illinois at Chicago. Risman, who recently published a book called “Where the Millennials Will Take Us: A New Generation Wrestles With the Gender Structure,” contrasted her early research with what she is seeing now. Few of the people she interviewed for the book in 2012 and 2013 were openly using nongendered pronouns, if they even knew about them. Just four years later, she began researching nonbinary young adults because the landscape had changed so radically. “It was reflexive with their friends at school, social groups. Many colleges classes start out with ‘Name, major and preferred pronouns,’ ” Risman told me. In Risman’s experience, it used to take decades to introduce new ideas about sex, sexuality or gender, and even longer for them to trickle upstream into society. “What’s fascinating is how quickly the public conversation has led to legal changes,” Risman said. California and Washington, among others, now allow people to select “x” as their gender, instead of “male” or “female,” on identity documents. “And I am convinced that it has to do with — like everything else in society — the rapid flow of information.”

Helana Darwin, a sociologist at the State University of New York at Stony Brook who began researching nonbinary identities in 2014, found that the social-media community played an unparalleled role in people’s lives, especially those who were geographically isolated from other nonbinary people. “Either they were very confused about what was going on or just feeling crushingly lonely and without support, and their online community was the only support in their lives,” Darwin told me. “They turned to the site to understand they aren’t alone.” Most of her subjects said social media was instrumental in deepening their understanding of their identities. “A 61-year-old person in my sample told me that they lived the vast majority of their life as though they were a gay man and was mistaken often as a drag queen after coming out. They didn’t discover nonbinary until they were in their 50s, and it was a freeing moment of understanding that nothing is wrong. They didn’t have to force themselves into the gay-man or trans-woman box — they could just be them. They described it as transcendent.”

When Darwin began her study four years ago, she was shocked to discover that the body of research on nonbinary people was nearly nonexistent. “Even as nonbinary people are becoming increasingly visible and vocal, there were still only a handful of articles published in the field of sociology that were even tangentially about nonbinary people and even fewer that were explicitly about nonbinary people.” What little research there was tended to lump the nonbinary experience into trans-woman and trans-man experience, even though all signs pointed to deep differences. The void in the field, she thinks, was due to society’s reliance on the notion that all humans engage in some sense of gender-based identity performance, which reaffirms the idea that gender exists. “There was an academic lag that isn’t keeping with the very urgent and exponentially profound gender revolution happening in our culture.”

Her research found that social media is a gathering place for discussing the logistics of gender — providing advice, reassurance and emotional support, as well as soliciting feedback about everything from voice modulation to hairstyles. The internet is a place where nonbinary people can learn about mixing masculine and feminine elements to the point of obscuring concrete identification as either. As one person she interviewed put it, “Every day someone can’t tell what I am is a good day.”

Nearly everyone Darwin interviewed remarked about the power of acquiring language that spoke to their identity, and they tended to find that language on the internet. But Harry Barbee, a nonbinary sociologist at Florida State University who studies sex, gender and sexuality, cautioned against treating social media as a curative. “When the world assumes you don’t exist, you’re forced to define yourself into existence if you want some semblance of recognition and social viability, and so the internet and social media helps achieve this,” Barbee said. “But it’s not a dream world where we are free to be you and me, because it can also be a mechanism for social control.” Barbee has been researching what it means to live as nonbinary in a binary world. Social media, Barbee said, is “one realm where they do feel free to share who they are, but they’re realistic about the limitations of the space. Even online, they are confronted by hostility and people who are telling them they’re just confused or that makes no sense, or want to talk to them about their genitals.”"



"Psychologists often posit that as children, we operate almost like scientists, experimenting and gathering information to make sense of our surroundings. Children use their available resources — generally limited to their immediate environment — to gather cues, including information about gender roles, to create a sense of self. Alison Gopnik, a renowned philosopher and child psychologist, told me that it’s not enough to simply tell children that other identities or ways of being exist. “That still won’t necessarily change their perspective,” she said. “They have to see it.”

In her 2009 book, “The Philosophical Baby,” Gopnik writes that “when we travel, we return to the wide-ranging curiosity of childhood, and we discover new things about ourselves.” In a new geographic area, our attention is heightened, and everything, from differently labeled condiments to streetwear, becomes riveting. “This new knowledge lets us imagine new ways that we could live ourselves,” she asserts. Flying over feeds in social media can feel like viewing portholes into new dimensions and realities, so I asked Gopnik if it’s possible that social media can function as a foreign country, where millions of new ideas and identities and habitats are on display — and whether that exposure can pry our calcified minds open in unexpected ways. “Absolutely,” she said. “Having a wider range of possibilities to look at gives people a sense of a wider range of possibilities, and those different experiences might lead to having different identities.”

When we dive into Instagram or Facebook, we are on exploratory missions, processing large volumes of information that help us shape our understanding of ourselves and one another. And this is a country that a majority of young adults are visiting on a regular basis. A Pew study from this year found that some 88 percent of 18-to-29-year-olds report using some form of social media, and 71 percent of Americans between ages 18 and 24 use Instagram. Social media is perhaps the most influential form of media they now have. They turn to it for the profound and the mundane — to shape their views and their aesthetics. Social media is a testing ground for expression, the locus of experimentation and exploration — particularly for those who cannot yet fully inhabit themselves offline for fear of discrimination, or worse. Because of that, it has become a lifeline for many people struggling to find others just like them."



"Although social media generally conditions users to share only their highlights — the success reel of their lives — Vaid-Menon thinks it’s important to share the reality of living in a gender-nonconforming body; they want people to understand what the daily experience can be like. “The majority of nonbinary, gender-nonconforming cannot manifest themselves because to do so would mean violence, death, harassment and punishment,” Vaid-Menon told me. … [more]
jennawortham  2018  instagam  internet  web  online  gender  gendernonconforming  culture  us  alisongopnik  maticemoore  alokvaid-memon  barbararisman  helanadarwin  psychology  learning  howwelearn  nonbinary  sexuality  jacobtobia  pidgeonpagonis  danezsmith  akwaekeemezi  jonelyxiumingaagaardandersson  ahomariturner  raindove  taylormason  asiakatedillon  twitter  instagram  children  dennisnorisii  naveenbhat  elisagerosenberg  sevaquinnparraharrington  ashleighshackelford  hengamehyagoobifarah  donaldtrump  socialmedia  socialnetworks  discrimination  fear  bullying  curiosity  childhood  identity  self  language 
february 2019 by robertogreco
Silicon Valley Thinks Everyone Feels the Same Six Emotions
"From Alexa to self-driving cars, emotion-detecting technologies are becoming ubiquitous—but they rely on out-of-date science"
emotions  ai  artificialintelligence  2018  psychology  richfirth-godbehere  faces 
january 2019 by robertogreco
Deprived, but not depraved: Prosocial behavior is an adaptive response to lower socioeconomic status. - PubMed - NCBI
"Individuals of lower socioeconomic status (SES) display increased attentiveness to others and greater prosocial behavior compared to individuals of higher SES. We situate these effects within Pepper & Nettle's contextually appropriate response framework of SES. We argue that increased prosocial behavior is a contextually adaptive response for lower-SES individuals that serves to increase control over their more threatening social environments."
generosity  2017  poverty  wealth  behavior  social  research  ses  socioeconomicststatus  society  mutualaid  unschooling  deschooling  economics  psychology  care  caring  helpfulness 
january 2019 by robertogreco
On Bullsh*t Jobs | David Graeber | RSA Replay - YouTube
"In 2013 David Graeber, professor of anthropology at LSE, wrote an excoriating essay on modern work for Strike! magazine. “On the Phenomenon of Bullshit Jobs” was read over a million times and the essay translated in seventeen different languages within weeks. Graeber visits the RSA to expand on this phenomenon, and will explore how the proliferation of meaningless jobs - more associated with the 20th-century Soviet Union than latter-day capitalism - has impacted modern society. In doing so, he looks at how we value work, and how, rather than being productive, work has become an end in itself; the way such work maintains the current broken system of finance capital; and, finally, how we can get out of it."
davidgraeber  bullshitjobs  employment  jobs  work  2018  economics  neoliberalism  capitalism  latecapitalism  sovietunion  bureaucracy  productivity  finance  policy  politics  unschooling  deschooling  labor  society  purpose  schooliness  debt  poverty  inequality  rules  anticapitalism  morality  wealth  power  control  technology  progress  consumerism  suffering  morals  psychology  specialization  complexity  systemsthinking  digitization  automation  middlemanagement  academia  highered  highereducation  management  administration  adminstrativebloat  minutia  universalbasicincome  ubi  supplysideeconomics  creativity  elitism  thecultofwork  anarchism  anarchy  zero-basedaccounting  leisure  taylorism  ethics  happiness  production  care  maintenance  marxism  caregiving  serviceindustry  gender  value  values  gdp  socialvalue  education  teaching  freedom  play  feminism  mentalhealth  measurement  fulfillment  supervision  autonomy  humans  humnnature  misery  canon  agency  identity  self-image  self-worth  depression  stress  anxiety  solidarity  camaraderie  respect  community 
january 2019 by robertogreco
The Stories We Were Told about Education Technology (2018)
"It’s been quite a year for education news, not that you’d know that by listening to much of the ed-tech industry (press). Subsidized by the Chan Zuckerberg Initiative, some publications have repeatedly run overtly and covertly sponsored articles that hawk the future of learning as “personalized,” as focused on “the whole child.” Some of these attempt to stretch a contemporary high-tech vision of social emotional surveillance so it can map onto a strange vision of progressive education, overlooking no doubt how the history of progressive education has so often been intertwined with race science and eugenics.

Meanwhile this year, immigrant, refugee children at the United States border were separated from their parents and kept in cages, deprived of legal counsel, deprived of access to education, deprived in some cases of water.

“Whole child” and cages – it’s hardly the only jarring juxtaposition I could point to.

2018 was another year of #MeToo, when revelations about sexual assault and sexual harassment shook almost every section of society – the media and the tech industries, unsurprisingly, but the education sector as well – higher ed, K–12, and non-profits alike, as well school sports all saw major and devastating reports about cultures and patterns of sexual violence. These behaviors were, once again, part of the hearings and debates about a Supreme Court Justice nominee – a sickening deja vu not only for those of us that remember Anita Hill ’s testimony decades ago but for those of us who have experienced something similar at the hands of powerful people. And on and on and on.

And yet the education/technology industry (press) kept up with its rosy repetition that social equality is surely its priority, a product feature even – that VR, for example, a technology it has for so long promised is “on the horizon,” is poised to help everyone, particularly teachers and students, become more empathetic. Meanwhile, the founder of Oculus Rift is now selling surveillance technology for a virtual border wall between the US and Mexico.

2018 was a year in which public school teachers all over the US rose up in protest over pay, working conditions, and funding, striking in red states like West Virginia, Kentucky, and Oklahoma despite an anti-union ruling by the Supreme Court.

And yet the education/technology industry (press) was wowed by teacher influencers and teacher PD on Instagram, touting the promise of more income via a side-hustle like tutoring rather than by structural or institutional agitation. Don’t worry, teachers. Robots won’t replace you, the press repeatedly said. Unsaid: robots will just de-professionalize, outsource, or privatize the work. Or, as the AI makers like to say, robots will make us all work harder (and no doubt, with no unions, cheaper).

2018 was a year of ongoing and increased hate speech and bullying – racism and anti-Semitism – on campuses and online.

And yet the education/technology industry (press) still maintained that blockchain would surely revolutionize the transcript and help ensure that no one lies about who they are or what they know. Blockchain would enhance “smart spending” and teach financial literacy, the ed-tech industry (press) insisted, never once mentioning the deep entanglements between anti-Semitism and the alt-right and blockchain (specifically Bitcoin) backers.

2018 was a year in which hate and misinformation, magnified and spread by technology giants, continued to plague the world. Their algorithmic recommendation engines peddled conspiracy theories (to kids, to teens, to adults). “YouTube, the Great Radicalizer” as sociologist Zeynep Tufekci put it in a NYT op-ed.

And yet the education/technology industry (press) still talked about YouTube as the future of education, cheerfully highlighting (that is, spreading) its viral bullshit. Folks still retyped the press releases Google issued and retyped the press releases Facebook issued, lauding these companies’ (and their founders’) efforts to reshape the curriculum and reshape the classroom.

This is the ninth year that I’ve reviewed the stories we’re being told about education technology. Typically, this has been a ten (or more) part series. But I just can’t do it any more. Some people think it’s hilarious that I’m ed-tech’s Cassandra, but it’s not funny at all. It’s depressing, and it’s painful. And no one fucking listens.

If I look back at what I’ve written in previous years, I feel like I’ve already covered everything I could say about 2018. Hell, I’ve already written about the whole notion of the “zombie idea” in ed-tech – that bad ideas never seem to go away, that they just get rebranded and repackaged. I’ve written about misinformation and ed-tech (and ed-tech as misinformation). I’ve written about the innovation gospel that makes people pitch dangerously bad ideas like “Uber for education” or “Alexa for babysitting.” I’ve written about the tech industry’s attempts to reshape the school system as its personal job training provider. I’ve written about the promise to “rethink the transcript” and to “revolutionize credentialing.” I’ve written about outsourcing and online education. I’ve written about coding bootcamps as the “new” for-profit higher ed, with all the exploitation that entails. I’ve written about the dangers of data collection and data analysis, about the loss of privacy and the lack of security.

And yet here we are, with Mark Zuckerberg – education philanthropist and investor – blinking before Congress, promising that AI will fix everything, while the biased algorithms keep churning out bias, while the education/technology industry (press) continues to be so blinded by “disruption” it doesn’t notice (or care) what’s happened to desegregation, and with so many data breaches and privacy gaffes that they barely make headlines anymore.

Folks. I’m done.

I’m also writing a book, and frankly that’s where my time and energy is going.

There is some delicious irony, I suppose, in the fact that there isn’t much that’s interesting or “innovative” to talk about in ed-tech, particularly since industry folks want to sell us on the story that tech is moving faster than it’s ever moved before, so fast in fact that the ol’ factory model school system simply cannot keep up.

I’ve always considered these year-in-review articles to be mini-histories of sorts – history of the very, very recent past. Now, instead, I plan to spend my time taking a longer, deeper look at the history of education technology, with particular attention for the next few months, as the title of my book suggests, to teaching machines – to the promises that machines will augment, automate, standardize, and individualize instruction. My focus is on the teaching machines of the mid-twentieth century, but clearly there are echoes – echoes of behaviorism and personalization, namely – still today.

In his 1954 book La Technique (published in English a decade later as The Technological Society), the sociologist Jacques Ellul observes how education had become oriented towards creating technicians, less interested in intellectual development than in personality development – a new “psychopedagogy” that he links to Maria Montessori. “The human brain must be made to conform to the much more advanced brain of the machine,” Ellul writes. “And education will no longer be an unpredictable and exciting adventure in human enlightenment, but an exercise in conformity and apprenticeship to whatever gadgetry is useful in a technical world.” I believe today we call this “social emotional learning” and once again (and so insistently by the ed-tech press and its billionaire backers), Montessori’s name is invoked as the key to preparing students for their place in the technological society.

Despite scant evidence in support of the psychopedagogies of mindsets, mindfulness, wellness, and grit, the ed-tech industry (press) markets these as solutions to racial and gender inequality (among other things), as the psychotechnologies of personalization are now increasingly intertwined not just with surveillance and with behavioral data analytics, but with genomics as well. “Why Progressives Should Embrace the Genetics of Education,” a NYT op-ed piece argued in July, perhaps forgetting that education’s progressives (including Montessori) have been down this path before.

This is the only good grit:

[image of Gritty]

If I were writing a lengthier series on the year in ed-tech, I’d spend much more time talking about the promises made about personalization and social emotional learning. I’ll just note here that the most important “innovator” in this area this year (other than Gritty) was surely the e-cigarette maker Juul, which offered a mindfulness curriculum to schools – offered them the curriculum and $20,000, that is – to talk about vaping. “‘The message: Our thoughts are powerful and can set action in motion,’ the lesson plan states.”

The most important event in ed-tech this year might have occurred on February 14, when a gunman opened fire on his former classmates at Marjory Stoneman Douglas High School in Parkland, Florida, killing 17 students and staff and injuring 17 others. (I chose this particular school shooting because of the student activism it unleashed.)

Oh, I know, I know – school shootings and school security aren’t ed-tech, ed-tech evangelists have long tried to insist, an argument I’ve heard far too often. But this year – the worst year on record for school shootings (according to some calculations) – I think that argument started to shift a bit. Perhaps because there’s clearly a lot of money to be made in selling schools “security” products and services: shooting simulation software, facial recognition technology, metal detectors, cameras, social media surveillance software, panic buttons, clear backpacks, bulletproof backpacks, … [more]
audreywatters  education  technology  edtech  2018  surveillance  privacy  personalization  progressive  schools  quantification  gamification  wholechild  montessori  mariamontessori  eugenics  psychology  siliconvalley  history  venturecapital  highereducation  highered  guns  gunviolence  children  youth  teens  shootings  money  influence  policy  politics  society  economics  capitalism  mindfulness  juul  marketing  gritty  innovation  genetics  psychotechnologies  gender  race  racism  sexism  research  socialemotional  psychopedagogy  pedagogy  teaching  howweteach  learning  howwelearn  teachingmachines  nonprofits  nonprofit  media  journalism  access  donaldtrump  bias  algorithms  facebook  amazon  disruption  data  bigdata  security  jacquesellul  sociology  activism  sel  socialemotionallearning 
december 2018 by robertogreco
Laziness Does Not Exist – Devon Price – Medium
"I’ve been a psychology professor since 2012. In the past six years, I’ve witnessed students of all ages procrastinate on papers, skip presentation days, miss assignments, and let due dates fly by. I’ve seen promising prospective grad students fail to get applications in on time; I’ve watched PhD candidates take months or years revising a single dissertation draft; I once had a student who enrolled in the same class of mine two semesters in a row, and never turned in anything either time.

I don’t think laziness was ever at fault.

Ever.

In fact, I don’t believe that laziness exists.



I’m a social psychologist, so I’m interested primarily in the situational and contextual factors that drive human behavior. When you’re seeking to predict or explain a person’s actions, looking at the social norms, and the person’s context, is usually a pretty safe bet. Situational constraints typically predict behavior far better than personality, intelligence, or other individual-level traits.

So when I see a student failing to complete assignments, missing deadlines, or not delivering results in other aspects of their life, I’m moved to ask: what are the situational factors holding this student back? What needs are currently not being met? And, when it comes to behavioral “laziness”, I’m especially moved to ask: what are the barriers to action that I can’t see?

There are always barriers. Recognizing those barriers— and viewing them as legitimate — is often the first step to breaking “lazy” behavior patterns.



It’s really helpful to respond to a person’s ineffective behavior with curiosity rather than judgment. I learned this from a friend of mine, the writer and activist Kimberly Longhofer (who publishes under Mik Everett). Kim is passionate about the acceptance and accommodation of disabled people and homeless people. Their writing about both subjects is some of the most illuminating, bias-busting work I’ve ever encountered. Part of that is because Kim is brilliant, but it’s also because at various points in their life, Kim has been both disabled and homeless.

Kim is the person who taught me that judging a homeless person for wanting to buy alcohol or cigarettes is utter folly. When you’re homeless, the nights are cold, the world is unfriendly, and everything is painfully uncomfortable. Whether you’re sleeping under a bridge, in a tent, or at a shelter, it’s hard to rest easy. You are likely to have injuries or chronic conditions that bother you persistently, and little access to medical care to deal with them. You probably don’t have much healthy food.

In that chronically uncomfortable, over-stimulating context, needing a drink or some cigarettes makes fucking sense. As Kim explained to me, if you’re lying out in the freezing cold, drinking some alcohol may be the only way to warm up and get to sleep. If you’re under-nourished, a few smokes may be the only thing that kills the hunger pangs. And if you’re dealing with all this while also fighting an addiction, then yes, sometimes you just need to score whatever will make the withdrawal symptoms go away, so you can survive.


[image of cover of "Self-Published Kindling: The Memoirs of a Homeless Bookstore Owner," by Mik Everett with caption "Kim’s incredible book about their experiences being homeless while running a bookstore."]

Few people who haven’t been homeless think this way. They want to moralize the decisions of poor people, perhaps to comfort themselves about the injustices of the world. For many, it’s easier to think homeless people are, in part, responsible for their suffering than it is to acknowledge the situational factors.

And when you don’t fully understand a person’s context — what it feels like to be them every day, all the small annoyances and major traumas that define their life — it’s easy to impose abstract, rigid expectations on a person’s behavior. All homeless people should put down the bottle and get to work. Never mind that most of them have mental health symptoms and physical ailments, and are fighting constantly to be recognized as human. Never mind that they are unable to get a good night’s rest or a nourishing meal for weeks or months on end. Never mind that even in my comfortable, easy life, I can’t go a few days without craving a drink or making an irresponsible purchase. They have to do better.

But they’re already doing the best they can. I’ve known homeless people who worked full-time jobs, and who devoted themselves to the care of other people in their communities. A lot of homeless people have to navigate bureaucracies constantly, interfacing with social workers, case workers, police officers, shelter staff, Medicaid staff, and a slew of charities both well-meaning and condescending. It’s a lot of fucking work to be homeless. And when a homeless or poor person runs out of steam and makes a “bad decision”, there’s a damn good reason for it.

If a person’s behavior doesn’t make sense to you, it is because you are missing a part of their context. It’s that simple. I’m so grateful to Kim and their writing for making me aware of this fact. No psychology class, at any level, taught me that. But now that it is a lens that I have, I find myself applying it to all kinds of behaviors that are mistaken for signs of moral failure — and I’ve yet to find one that can’t be explained and empathized with.



Let’s look at a sign of academic “laziness” that I believe is anything but: procrastination.

People love to blame procrastinators for their behavior. Putting off work sure looks lazy, to an untrained eye. Even the people who are actively doing the procrastinating can mistake their behavior for laziness. You’re supposed to be doing something, and you’re not doing it — that’s a moral failure right? That means you’re weak-willed, unmotivated, and lazy, doesn’t it?

For decades, psychological research has been able to explain procrastination as a functioning problem, not a consequence of laziness. When a person fails to begin a project that they care about, it’s typically due to either a) anxiety about their attempts not being “good enough” or b) confusion about what the first steps of the task are. Not laziness. In fact, procrastination is more likely when the task is meaningful and the individual cares about doing it well.

When you’re paralyzed with fear of failure, or you don’t even know how to begin a massive, complicated undertaking, it’s damn hard to get shit done. It has nothing to do with desire, motivation, or moral upstandingness. Procrastinators can will themselves to work for hours; they can sit in front of a blank word document, doing nothing else, and torture themselves; they can pile on the guilt again and again — none of it makes initiating the task any easier. In fact, their desire to get the damn thing done may worsen their stress and make starting the task harder.

The solution, instead, is to look for what is holding the procrastinator back. If anxiety is the major barrier, the procrastinator actually needs to walk away from the computer/book/word document and engage in a relaxing activity. Being branded “lazy” by other people is likely to lead to the exact opposite behavior.

Often, though, the barrier is that procrastinators have executive functioning challenges — they struggle to divide a large responsibility into a series of discrete, specific, and ordered tasks. Here’s an example of executive functioning in action: I completed my dissertation (from proposal to data collection to final defense) in a little over a year. I was able to write my dissertation pretty easily and quickly because I knew that I had to a) compile research on the topic, b) outline the paper, c) schedule regular writing periods, and d) chip away at the paper, section by section, day by day, according to a schedule I had pre-determined.

Nobody had to teach me to slice up tasks like that. And nobody had to force me to adhere to my schedule. Accomplishing tasks like this is consistent with how my analytical, hyper-focused, Autistic little brain works. Most people don’t have that ease. They need an external structure to keep them writing — regular writing group meetings with friends, for example — and deadlines set by someone else. When faced with a major, massive project, most people want advice for how to divide it into smaller tasks, and a timeline for completion. In order to track progress, most people require organizational tools, such as a to-do list, calendar, datebook, or syllabus.

Needing or benefiting from such things doesn’t make a person lazy. It just means they have needs. The more we embrace that, the more we can help people thrive.



I had a student who was skipping class. Sometimes I’d see her lingering near the building, right before class was about to start, looking tired. Class would start, and she wouldn’t show up. When she was present in class, she was a bit withdrawn; she sat in the back of the room, eyes down, energy low. She contributed during small group work, but never talked during larger class discussions.

A lot of my colleagues would look at this student and think she was lazy, disorganized, or apathetic. I know this because I’ve heard how they talk about under-performing students. There’s often rage and resentment in their words and tone — why won’t this student take my class seriously? Why won’t they make me feel important, interesting, smart?

But my class had a unit on mental health stigma. It’s a passion of mine, because I’m a neuroatypical psychologist. I know how unfair my field is to people like me. The class & I talked about the unfair judgments people levy against those with mental illness; how depression is interpreted as laziness, how mood swings are framed as manipulative, how people with “severe” mental illnesses are … [more]
devonprice  2018  laziness  procrastination  psychology  mikeverett  kimberlylonghofer  teaching  howweteach  howwelearn  learning  mentalhealth  executivefunctioning  neurodiversity  discrimination  stress  anxiety  trauma  colleges  universities  academia  unschooling  deschooling  depression  mentalillness 
december 2018 by robertogreco
Stanford professor: "The workplace is killing people and nobody cares"
"From the disappearance of good health insurance to the psychological effects of long hours, the modern workplace is taking its toll on all of us."
work  labor  health  2018  workplace  culture  capitalism  management  administration  psychology  stress  childcare  jeffreypfeffer  socialpollution  society  nuriachinchilla  isolation  fatigue  time  attention 
december 2018 by robertogreco
The Relentlessness of Modern Parenting - The New York Times
"Experts agree that investing in children is a positive thing — they benefit from time with their parents, stimulating activities and supportive parenting styles. As low-income parents have increased the time they spend teaching and reading to their children, the readiness gap between kindergarten students from rich and poor families has shrunk. As parental supervision has increased, most serious crimes against children have declined significantly.

But it’s also unclear how much of children’s success is actually determined by parenting.

“It’s still an open question whether it’s the parenting practices themselves that are making the difference, or is it simply growing up with college-educated parents in an environment that’s richer in many dimensions?” said Liana Sayer, a sociologist at the University of Maryland and director of the Time Use Laboratory there. “I don’t think any of these studies so far have been able to answer whether these kids would be doing well as adults regardless, simply because of resources.”

There has been a growing movement against the relentlessness of modern-day parenting. Utah passed a free-range parenting law, exempting parents from accusations of neglect if they let their children play or commute unattended.

Psychologists and others have raised alarms about children’s high levels of stress and dependence on their parents, and the need to develop independence, self-reliance and grit. Research has shown that children with hyper-involved parents have more anxiety and less satisfaction with life, and that when children play unsupervised, they build social skills, emotional maturity and executive function.

Parents, particularly mothers, feel stress, exhaustion and guilt at the demands of parenting this way, especially while holding a job. American time use diaries show that the time women spend parenting comes at the expense of sleep, time alone with their partners and friends, leisure time and housework. Some pause their careers or choose not to have children. Others, like Ms. Sentilles, live in a state of anxiety. She doesn’t want to hover, she said. But trying to oversee homework, limit screen time and attend to Isaac’s needs, she feels no choice.

“At any given moment, everything could just fall apart,” she said.

“On the one hand, I love my work,” she said. “But the way it’s structured in this country, where there’s not really child care and there’s this sense that something is wrong with you if you aren’t with your children every second when you’re not at work? It isn’t what I think feminists thought they were signing up for.”"
parenting  helicopterparents  anxiety  stress  surveillance  children  inequality  2018  schools  schooliness  glvo  hovering  capitalism  economics  freedom  free-rangeparenting  unschooling  deschooling  learning  youth  psychology  society  attention  helicopterparenting 
december 2018 by robertogreco
Should America Be Run by … Trader Joe’s? (Ep. 359) - Freakonomics Freakonomics
"ROBERTO: “I’d like to open a new kind of grocery store. We’re not going to have any branded items. It’s all going to be private label. We’re going to have no television advertising and no social media whatsoever. We’re never going to have anything on sale. We’re not going to accept coupons. We’ll have no loyalty card. We won’t have a circular that appears in the Sunday newspaper. We’ll have no self-checkout. We won’t have wide aisles or big parking lots. Would you invest in my company?”



"So we put on our Freakonomics goggles in an attempt to reverse-engineer the secrets of Trader Joe’s. Which, it turns out, are incredibly Freakonomical: things like choice architecture and decision theory. Things like nudging and an embrace of experimentation. In fact, if Freakonomics were a grocery store, it might be a Trader Joe’s, or at least try to be. It’s like a real-life case study of behavioral economics at work. So, here’s the big question: if Trader Joe’s is really so good, should their philosophy be applied elsewhere? Should Trader Joe’s — I can’t believe I’m going to say this, but … should Trader Joe’s be running America?"
traderjoes  2018  freakanomics  retail  groceries  psychology  choice  paradoxofchoice  decisionmaking  michaelroberto  competition  microsoft  satyanadella  markgardiner  sheenaiyengar  economics  behavior  hiring 
december 2018 by robertogreco
Skim reading is the new normal. The effect on society is profound | Maryanne Wolf | Opinion | The Guardian
"When the reading brain skims texts, we don’t have time to grasp complexity, to understand another’s feelings or to perceive beauty. We need a new literacy for the digital age"



"Look around on your next plane trip. The iPad is the new pacifier for babies and toddlers. Younger school-aged children read stories on smartphones; older boys don’t read at all, but hunch over video games. Parents and other passengers read on Kindles or skim a flotilla of email and news feeds. Unbeknownst to most of us, an invisible, game-changing transformation links everyone in this picture: the neuronal circuit that underlies the brain’s ability to read is subtly, rapidly changing - a change with implications for everyone from the pre-reading toddler to the expert adult.

As work in neurosciences indicates, the acquisition of literacy necessitated a new circuit in our species’ brain more than 6,000 years ago. That circuit evolved from a very simple mechanism for decoding basic information, like the number of goats in one’s herd, to the present, highly elaborated reading brain. My research depicts how the present reading brain enables the development of some of our most important intellectual and affective processes: internalized knowledge, analogical reasoning, and inference; perspective-taking and empathy; critical analysis and the generation of insight. Research surfacing in many parts of the world now cautions that each of these essential “deep reading” processes may be under threat as we move into digital-based modes of reading.

This is not a simple, binary issue of print vs digital reading and technological innovation. As MIT scholar Sherry Turkle has written, we do not err as a society when we innovate, but when we ignore what we disrupt or diminish while innovating. In this hinge moment between print and digital cultures, society needs to confront what is diminishing in the expert reading circuit, what our children and older students are not developing, and what we can do about it.

We know from research that the reading circuit is not given to human beings through a genetic blueprint like vision or language; it needs an environment to develop. Further, it will adapt to that environment’s requirements – from different writing systems to the characteristics of whatever medium is used. If the dominant medium advantages processes that are fast, multi-task oriented and well-suited for large volumes of information, like the current digital medium, so will the reading circuit. As UCLA psychologist Patricia Greenfield writes, the result is that less attention and time will be allocated to slower, time-demanding deep reading processes, like inference, critical analysis and empathy, all of which are indispensable to learning at any age.

Increasing reports from educators and from researchers in psychology and the humanities bear this out. English literature scholar and teacher Mark Edmundson describes how many college students actively avoid the classic literature of the 19th and 20th centuries because they no longer have the patience to read longer, denser, more difficult texts. We should be less concerned with students’ “cognitive impatience,” however, than with what may underlie it: the potential inability of large numbers of students to read with a level of critical analysis sufficient to comprehend the complexity of thought and argument found in more demanding texts, whether in literature and science in college, or in wills, contracts and the deliberately confusing public referendum questions citizens encounter in the voting booth.

Multiple studies show that digital screen use may be causing a variety of troubling downstream effects on reading comprehension in older high school and college students. In Stavanger, Norway, psychologist Anne Mangen and her colleagues studied how high school students comprehend the same material in different mediums. Mangen’s group asked subjects questions about a short story whose plot had universal student appeal (a lust-filled, love story); half of the students read Jenny, Mon Amour on a Kindle, the other half in paperback. Results indicated that students who read on print were superior in their comprehension to screen-reading peers, particularly in their ability to sequence detail and reconstruct the plot in chronological order.

Ziming Liu from San Jose State University has conducted a series of studies which indicate that the “new norm” in reading is skimming, with word-spotting and browsing through the text. Many readers now use an F or Z pattern when reading in which they sample the first line and then word-spot through the rest of the text. When the reading brain skims like this, it reduces time allocated to deep reading processes. In other words, we don’t have time to grasp complexity, to understand another’s feelings, to perceive beauty, and to create thoughts of the reader’s own.

Karin Littau and Andrew Piper have noted another dimension: physicality. Piper, Littau and Anne Mangen’s group emphasize that the sense of touch in print reading adds an important redundancy to information – a kind of “geometry” to words, and a spatial “thereness” for text. As Piper notes, human beings need a knowledge of where they are in time and space that allows them to return to things and learn from re-examination – what he calls the “technology of recurrence”. The importance of recurrence for both young and older readers involves the ability to go back, to check and evaluate one’s understanding of a text. The question, then, is what happens to comprehension when our youth skim on a screen whose lack of spatial thereness discourages “looking back.”

US media researchers Lisa Guernsey and Michael Levine, American University’s linguist Naomi Baron, and cognitive scientist Tami Katzir from Haifa University have examined the effects of different information mediums, particularly on the young. Katzir’s research has found that the negative effects of screen reading can appear as early as fourth and fifth grade - with implications not only for comprehension, but also on the growth of empathy.

The possibility that critical analysis, empathy and other deep reading processes could become the unintended “collateral damage” of our digital culture is not a simple binary issue about print vs digital reading. It is about how we all have begun to read on any medium and how that changes not only what we read, but also the purposes for why we read. Nor is it only about the young. The subtle atrophy of critical analysis and empathy affects us all. It affects our ability to navigate a constant bombardment of information. It incentivizes a retreat to the most familiar silos of unchecked information, which require and receive no analysis, leaving us susceptible to false information and demagoguery.

There’s an old rule in neuroscience that does not alter with age: use it or lose it. It is a very hopeful principle when applied to critical thought in the reading brain because it implies choice. The story of the changing reading brain is hardly finished. We possess both the science and the technology to identify and redress the changes in how we read before they become entrenched. If we work to understand exactly what we will lose, alongside the extraordinary new capacities that the digital world has brought us, there is as much reason for excitement as caution.

We need to cultivate a new kind of brain: a “bi-literate” reading brain capable of the deepest forms of thought in either digital or traditional mediums. A great deal hangs on it: the ability of citizens in a vibrant democracy to try on other perspectives and discern truth; the capacity of our children and grandchildren to appreciate and create beauty; and the ability in ourselves to go beyond our present glut of information to reach the knowledge and wisdom necessary to sustain a good society."
reading  howweread  skimming  digital  2018  maryannewolf  literacy  truth  meaning  karinlittau  andrewpiper  annemagen  patriciagreenfield  sherryturkle  attention  technology  screens  speed  psychology  behavior 
december 2018 by robertogreco
Opinion | What Straight-A Students Get Wrong - The New York Times
"A decade ago, at the end of my first semester teaching at Wharton, a student stopped by for office hours. He sat down and burst into tears. My mind started cycling through a list of events that could make a college junior cry: His girlfriend had dumped him; he had been accused of plagiarism. “I just got my first A-minus,” he said, his voice shaking.

Year after year, I watch in dismay as students obsess over getting straight A’s. Some sacrifice their health; a few have even tried to sue their school after falling short. All have joined the cult of perfectionism out of a conviction that top marks are a ticket to elite graduate schools and lucrative job offers.

I was one of them. I started college with the goal of graduating with a 4.0. It would be a reflection of my brainpower and willpower, revealing that I had the right stuff to succeed. But I was wrong.

The evidence is clear: Academic excellence is not a strong predictor of career excellence. Across industries, research shows that the correlation between grades and job performance is modest in the first year after college and trivial within a handful of years. For example, at Google, once employees are two or three years out of college, their grades have no bearing on their performance. (Of course, it must be said that if you got D’s, you probably didn’t end up at Google.)

Academic grades rarely assess qualities like creativity, leadership and teamwork skills, or social, emotional and political intelligence. Yes, straight-A students master cramming information and regurgitating it on exams. But career success is rarely about finding the right solution to a problem — it’s more about finding the right problem to solve.

In a classic 1962 study, a team of psychologists tracked down America’s most creative architects and compared them with their technically skilled but less original peers. One of the factors that distinguished the creative architects was a record of spiky grades. “In college our creative architects earned about a B average,” Donald MacKinnon wrote. “In work and courses which caught their interest they could turn in an A performance, but in courses that failed to strike their imagination, they were quite willing to do no work at all.” They paid attention to their curiosity and prioritized activities that they found intrinsically motivating — which ultimately served them well in their careers.

Getting straight A’s requires conformity. Having an influential career demands originality. In a study of students who graduated at the top of their class, the education researcher Karen Arnold found that although they usually had successful careers, they rarely reached the upper echelons. “Valedictorians aren’t likely to be the future’s visionaries,” Dr. Arnold explained. “They typically settle into the system instead of shaking it up.”

This might explain why Steve Jobs finished high school with a 2.65 G.P.A., J.K. Rowling graduated from the University of Exeter with roughly a C average, and the Rev. Dr. Martin Luther King Jr. got only one A in his four years at Morehouse.

If your goal is to graduate without a blemish on your transcript, you end up taking easier classes and staying within your comfort zone. If you’re willing to tolerate the occasional B, you can learn to program in Python while struggling to decipher “Finnegans Wake.” You gain experience coping with failures and setbacks, which builds resilience.

Straight-A students also miss out socially. More time studying in the library means less time to start lifelong friendships, join new clubs or volunteer. I know from experience. I didn’t meet my 4.0 goal; I graduated with a 3.78. (This is the first time I’ve shared my G.P.A. since applying to graduate school 16 years ago. Really, no one cares.) Looking back, I don’t wish my grades had been higher. If I could do it over again, I’d study less. The hours I wasted memorizing the inner workings of the eye would have been better spent trying out improv comedy and having more midnight conversations about the meaning of life.

So universities: Make it easier for students to take some intellectual risks. Graduate schools can be clear that they don’t care about the difference between a 3.7 and a 3.9. Colleges could just report letter grades without pluses and minuses, so that any G.P.A. above a 3.7 appears on transcripts as an A. It might also help to stop the madness of grade inflation, which creates an academic arms race that encourages too many students to strive for meaningless perfection. And why not let students wait until the end of the semester to declare a class pass-fail, instead of forcing them to decide in the first month?

Employers: Make it clear you value skills over straight A’s. Some recruiters are already on board: In a 2003 study of over 500 job postings, nearly 15 percent of recruiters actively selected against students with high G.P.A.s (perhaps questioning their priorities and life skills), while more than 40 percent put no weight on grades in initial screening.

Straight-A students: Recognize that underachieving in school can prepare you to overachieve in life. So maybe it’s time to apply your grit to a new goal — getting at least one B before you graduate."
education  grades  grading  colleges  universities  academia  2018  adamgrant  psychology  gpa  assessment  criticalthinking  anxiety  stress  learning  howwelearn  motivation  gradschool  jkrowling  stevejobs  martinlutherkingjr  perfectionism  srg  edg  mlk 
december 2018 by robertogreco
So cute you could crush it? | University of California
"Until now, research exploring how and why cute aggression occurs has been the domain of behavioral psychology, said Katherine Stavropoulos, an assistant professor of special education at the University of California, Riverside. But recently Stavropoulos, a licensed clinical psychologist with a background in neuroscience, has taken formal study of the phenomenon a few steps further.

In her research, Stavropoulos uses electrophysiology to evaluate surface-level electrical activity that arises from neurons firing in people’s brains. By studying that activity, she gauges neural responses to a range of external stimuli."



"Another result that Stavropoulos said lends weight to prior theories: The relationship between how cute something is and how much cute aggression someone experiences toward it appears to be tied to how overwhelmed that person is feeling.

“Essentially, for people who tend to experience the feeling of ‘not being able to take how cute something is,’ cute aggression happens,” Stavropoulos said. “Our study seems to underscore the idea that cute aggression is the brain’s way of ‘bringing us back down’ by mediating our feelings of being overwhelmed.”

Stavropoulos likened this process of mediation to an evolutionary adaptation. Such an adaptation may have developed as a means of ensuring people are able to continue taking care of creatures they consider particularly cute.

“For example, if you find yourself incapacitated by how cute a baby is — so much so that you simply can’t take care of it — that baby is going to starve,” Stavropoulos said. “Cute aggression may serve as a tempering mechanism that allows us to function and actually take care of something we might first perceive as overwhelmingly cute.”

In the future, Stavropoulos hopes to use electrophysiology to study the neural bases of cute aggression in a variety of populations and groups, such as mothers with postpartum depression, people with autism spectrum disorder, and participants with and without babies or pets.

“I think if you have a child and you’re looking at pictures of cute babies, you might exhibit more cute aggression and stronger neural reactions,” she said. “The same could be true for people who have pets and are looking at pictures of cute puppies or other small animals.”"
nervio  cuteness  2018  psychology  katherinestavropoulos  neuroscience  cuteaggression 
december 2018 by robertogreco
When starting school, younger children are more likely to be diagnosed with ADHD, study says – Harvard Gazette
"Could a child’s birthday put him or her at risk for an ADHD misdiagnosis? The answer appears to be yes, at least among children born in August who start school in states where enrollment is cut off at a Sept. 1 birth date, according to a new study led by Harvard Medical School researchers.

The findings, published Nov. 28 in The New England Journal of Medicine, show that children born in August in those states are 30 percent more likely to receive an ADHD diagnosis, compared with their slightly older peers enrolled in the same grade.

The rate of ADHD diagnoses among children has risen dramatically over the past 20 years. In 2016 alone, more than 5 percent of U.S. children were being actively treated with medication for ADHD. Experts believe the rise is fueled by a combination of factors, including a greater recognition of the disorder, a true rise in the incidence of the condition and, in some cases, improper diagnosis.

The results of the new study underscore the notion that, at least in a subset of elementary school students, the diagnosis may be a factor of earlier school enrollment, the research team said.

“Our findings suggest the possibility that large numbers of kids are being overdiagnosed and overtreated for ADHD because they happen to be relatively immature compared to their older classmates in the early years of elementary school,” said study lead author Timothy Layton, assistant professor of health care policy in the Blavatnik Institute at Harvard Medical School.

Most states have arbitrary birth date cutoffs that determine which grade a child will be placed in and when they can start school. In states with a Sept. 1 cutoff, a child born on Aug. 31 will be nearly a full year younger on the first day of school than a classmate born on Sept. 1. At this age, Layton noted, the younger child might have a harder time sitting still and concentrating for long periods of time in class. That extra fidgeting may lead to a medical referral, Layton said, followed by diagnosis and treatment for ADHD.

For example, the researchers said, what may be normal behavior in a boisterous 6-year-old could seem abnormal relative to the behavior of older peers in the same classroom.

This dynamic may be particularly true among younger children given that an 11- or 12-month difference in age could lead to significant differences in behavior, the researchers added.

“As children grow older, small differences in age equalize and dissipate over time, but behaviorally speaking, the difference between a 6-year-old and a 7-year-old could be quite pronounced,” said study senior author Anupam Jena, the Ruth L. Newhouse Associate Professor of Health Care Policy at Harvard Medical School and an internal medicine physician at Massachusetts General Hospital. “A normal behavior may appear anomalous relative to the child’s peer group.”

Using the records of a large insurance database, the investigators compared the difference in ADHD diagnosis by birth month — August versus September — among more than 407,000 elementary school children born between 2007 and 2009, who were followed until the end of 2015.

In states that use Sept. 1 as a cutoff date for school enrollment, children born in August had a 30 percent greater chance of an ADHD diagnosis than children born in September, the analysis showed. No such differences were observed between children born in August and September in states with cutoff dates other than Sept. 1.

For example, 85 of 10,000 students born in August were either diagnosed with or treated for ADHD, compared with 64 students of 10,000 born in September. When investigators looked at ADHD treatment only, the difference was also large — 53 of 10,000 students born in August received ADHD medication, compared with 40 of 10,000 for those born in September.

Jena pointed to a similar phenomenon described in Malcolm Gladwell’s book “Outliers.” Canadian professional hockey players are much more likely to have been born early in the year, according to research cited in Gladwell’s book. Canadian youth hockey leagues use Jan. 1 as a cutoff date for age groups. In the formative early years of youth hockey, players born in the first few months of the year were older and more mature, and therefore likelier to be tracked into elite leagues, with better coaching, more time on the ice, and a more talented cohort of teammates. Over the years this cumulative advantage gave the relatively older players an edge over their younger competitors.

Similarly, Jena noted, a 2017 working paper from the National Bureau of Economic Research suggested that children born just after the cutoff date for starting school tended to have better long-term educational performance than their relatively younger peers born later in the year.

“In all of those scenarios, timing and age appear to be potent influencers of outcome,” Jena said.

Research has shown wide variations in ADHD diagnosis and treatment across different regions in the U.S. ADHD diagnosis and treatment rates have also climbed dramatically over the last 20 years. In 2016 alone, more than 5 percent of all children in the U.S. were taking medication for ADHD, the authors noted. All of these factors have fueled concerns about ADHD overdiagnosis and overtreatment.

The reasons for the rise in ADHD incidence are complex and multifactorial, Jena said. Arbitrary cutoff dates are likely just one of many variables driving this phenomenon, he added. In recent years, many states have adopted measures that hold schools accountable for identifying ADHD and give educators incentives to refer any child with symptoms suggesting ADHD for medical evaluation.

“The diagnosis of this condition is not just related to the symptoms, it’s related to the context,” Jena said. “The relative age of the kids in class, laws and regulations, and other circumstances all come together.”

It is important to look at all of these factors before making a diagnosis and prescribing treatment, Jena said.

“A child’s age relative to his or her peers in the same grade should be taken into consideration and the reasons for referral carefully examined.”

Additional co-authors include researchers from the Department of Health Care Policy, the National Bureau of Economic Research, and the Department of Health Policy and Management at the Harvard T.H. Chan School of Public Health."
adhd  children  schools  schooling  schooliness  2018  psychology  health  drugs  diagnosis  behavior 
november 2018 by robertogreco
HEWN, No. 291
"Ed Yong wrote about that viral video of the baby bear and mama bear making their way across a snow-covered cliff. You know the one — the one that some educators have said shows the bear had “grit.” Yong points out that the bears were being filmed by a drone, and the mother would never have made her baby take such a precarious path had it not been for the technological intrusion. Come to think of it, the whole thing — the ignorance and dismissal of trauma, the lack of attention to structural violence, the use of technology to shape behavior — is a perfect analogy for how “grit” gets wielded in schools."
grit  audreywatters  2018  edtech  technology  schools  education  trauma  violence  behavior  psychology  intrusion  surveillance 
november 2018 by robertogreco
Dr. Michelle Fine on Willful Subjectivity and Strong Objectivity in Education Research - Long View on Education
"In this interview, Dr. Michelle Fine makes the argument for participatory action research as a sophisticated epistemology. Her work uncovers the willful subjectivity and radical wit of youth. In the last ten minutes, she gives some concrete recommendations for setting up a classroom that recognizes and values the gifts that students bring. Please check out her publications on ResearchGate [https://www.researchgate.net/profile/Michelle_Fine ] and her latest book Just Research in Contentious Times (Teachers College, 2018). [https://www.amazon.com/Just-Research-Contentious-Times-Methodological/dp/0807758736/ ]

Michelle Fine is a Distinguished Professor of Critical Psychology, Women’s Studies, American Studies and Urban Education at the Graduate Center CUNY.

Thank you to Dr. Kim Case and Professor Tanya L. Domi."
michellefine  reasearch  dispossession  privilege  resistance  solidarity  participatory  participatoryactionresearch  ethnography  education  benjamindoxtdatorcritical  pedagogy  race  racism  postcolonialism  criticaltheory  imf  epistemology  research  focusgroups  subjectivity  youth  teens  stories  socialjustice  criticalparticipatoryactionresearch  sexuality  centering  oppression  pointofview  action  quantitative  qualitative  injustice  gender  deficit  resilience  experience  radicalism  incarceration  billclinton  pellgrants  willfulsubjectivity  survivance  wit  radicalwit  indigeneity  queer  justice  inquiry  hannaharendt  criticalbifocality  psychology  context  history  structures  gigeconomy  progressive  grit  economics  victimblaming  schools  intersectionality  apolitical  neoliberalism  neutrality  curriculum  objectivity  contestedhistories  whiteprivilege  whitefragility  islamophobia  discrimination  alienation  conversation  disengagement  defensiveness  anger  hatred  complexity  diversity  self-definition  ethnicity 
november 2018 by robertogreco
Audrey Watters on Twitter: "I'm sorry. But I have a rant about "personalized learning" https://t.co/lgVgCZBae7"
"I'm sorry. But I have a rant about "personalized learning" https://www.npr.org/2018/11/16/657895964/the-future-of-learning-well-it-s-personal

"Personalized learning" is not new. Know your history. It predates "Silicon Valley" and it pre-dates educational computing and it most certainly pre-dates Khan Academy and it pre-dates Sal Khan.

Even the way in which Sal Khan describes "personalized learning" -- "students move at their own pace" until they've mastered a question or topic -- is very, very old.

Educational psychologists have been building machines to do this -- supposedly to function like a tutor -- for almost 100 years.

The push to "personalize" education *with machines* has been happening for over a century thanks to educational psychology AND of course educational testing. This push is also deeply intertwined with ideas about efficiency and individualism. (& as such it is profoundly American)

Stop acting like "personalized learning" is this brand new thing just because the ed-tech salespeople and ed reformers want you to buy it. Maybe start asking why all these efforts have failed in the past -- with and without machines. Ever heard of the Dalton Plan, for example?

And good god, don't say past efforts failed because computers are so amazing today. School software sucks. People who tell you otherwise are liars.

Also: as democracy seems to be collapsing all around us, perhaps it's not such a fine time to abandon shared intellectual spaces and shared intellectual understanding, eh? Perhaps we should be talking about more communal, democratic practices and less personalized learning?

Also: stop taking people seriously who talk about the history of school and the only book they seem to have read on the topic is one by John Taylor Gatto. Thanks in advance.

(On the other hand, keep it up. This all makes a perfect Introduction for my book)"
personalization  personalizedlearning  2018  audreywatters  history  education  edtech  siliconvalley  memory  salkhan  khanacademy  psychology  testing  individualism  efficiency  democracy  daltonplan  johntaylorgatto  communalism  lcproject  openstudioproject  sfsh  tcsnmy  collectivism  us 
november 2018 by robertogreco
The Educational Tyranny of the Neurotypicals | WIRED
"Ben Draper, who runs the Macomber Center for Self Directed Learning, says that while the center is designed for all types of children, kids whose parents identify them as on the autism spectrum often thrive at the center when they’ve had difficulty in conventional schools. Ben is part of the so-called unschooling movement, which believes that not only should learning be self-directed, in fact we shouldn't even focus on guiding learning. Children will learn in the process of pursuing their passions, the reasoning goes, and so we just need to get out of their way, providing support as needed.

Many, of course, argue that such an approach is much too unstructured and verges on irresponsibility. In retrospect, though, I feel I certainly would have thrived on “unschooling.” In a recent paper, Ben and my colleague Andre Uhl, who first introduced me to unschooling, argue that it not only works for everyone, but that the current educational system, in addition to providing poor learning outcomes, impinges on the rights of children as individuals.

MIT is among a small number of institutions that, in the pre-internet era, provided a place for non-neurotypical types with extraordinary skills to gather and form community and culture. Even MIT, however, is still trying to improve to give these kids the diversity and flexibility they need, especially in our undergraduate program.

I'm not sure how I'd be diagnosed, but I was completely incapable of being traditionally educated. I love to learn, but I go about it almost exclusively through conversations and while working on projects. I somehow kludged together a world view and life with plenty of struggle, but also with many rewards. I recently wrote a PhD dissertation about my theory of the world and how I developed it. Not that anyone should generalize from my experience—one reader of my dissertation said that I’m so unusual, I should be considered a "human sub-species." While I take that as a compliment, I think there are others like me who weren’t as lucky and ended up going through the traditional system and mostly suffering rather than flourishing. In fact, most kids probably aren’t as lucky as me and while some types are more suited for success in the current configuration of society, a huge percentage of kids who fail in the current system have a tremendous amount to contribute that we aren’t tapping into.

In addition to equipping kids for basic literacy and civic engagement, industrial age schools were primarily focused on preparing kids to work in factories or perform repetitive white-collar jobs. It may have made sense to try to convert kids into (smart) robotlike individuals who could solve problems on standardized tests alone with no smartphone or the internet and just a No. 2 pencil. Sifting out non-neurotypical types or trying to remediate them with drugs or institutionalization may have seemed important for our industrial competitiveness. The tools for instruction were also limited by the technology of the times. In a world where real robots are taking over many of those tasks, perhaps we need to embrace neurodiversity and encourage collaborative learning through passion, play, and projects, in other words, to start teaching kids to learn in ways that machines can’t. We can also use modern technology for connected learning that supports diverse interests and abilities and is integrated into our lives and communities of interest.

At the Media Lab, we have a research group called Lifelong Kindergarten, and the head of the group, Mitchel Resnick, recently wrote a book by the same name. The book is about the group’s research on creative learning and the four Ps—Passion, Peers, Projects, and Play. The group believes, as I do, that we learn best when we are pursuing our passion and working with others in a project-based environment with a playful approach. My memory of school was "no cheating,” “do your own work,” "focus on the textbook, not on your hobbies or your projects," and "there’s time to play at recess, be serious and study or you'll be shamed"—exactly the opposite of the four Ps.

Many mental health issues, I believe, are caused by trying to “fix” some type of neurodiversity or by simply being insensitive or inappropriate for the person. Many mental “illnesses” can be “cured” by providing the appropriate interface to learning, living, or interacting for that person focusing on the four Ps. My experience with the educational system, both as its subject and, now, as part of it, is not so unique. I believe, in fact, that at least the one-quarter of people who are diagnosed as somehow non-neurotypical struggle with the structure and the method of modern education. People who are wired differently should be able to think of themselves as the rule, not as an exception."
neurotypicals  neurodiversity  education  schools  schooling  learning  inequality  elitism  meritocracy  power  bias  diversity  autism  psychology  stevesilberman  schooliness  unschooling  deschooling  ronsuskind  mentalhealth  mitchresnick  mit  mitemedialab  medialab  lifelongkindergarten  teaching  howweteach  howwelearn  pedagogy  tyranny  2018  economics  labor  bendraper  flexibility  admissions  colleges  universities  joiito 
november 2018 by robertogreco
Reducing your carbon footprint still matters.
"Recent articles in Vox, the Guardian, and the Outline have warned that individuals “going green” in daily life won’t make enough of a difference to be worth the effort. In fact, they argue, such efforts could actually make matters worse, as focusing on individual actions might distract people from pressuring corporations and government officials to lower greenhouse gas emissions and enact the broader policy change we need to meet our climate goals. These articles and others like them (including in Slate) tend to conclude that the only truly meaningful action people can take to influence our climate future is to vote.

Voting is crucial, but this perspective misses a large point of individual actions. We don’t recommend taking personal actions like limiting plane rides, eating less meat, or investing in solar energy because all of these small tweaks will build up to enough carbon savings (though it could help). We do so because people taking action in their personal lives is actually one of the best ways to get to a society that implements the policy-level change that is truly needed. Research on social behavior suggests lifestyle change can build momentum for systemic change. Humans are social animals, and we use social cues to recognize emergencies. People don’t spring into action just because they see smoke; they spring into action because they see others rushing in with water. The same principle applies to personal actions on climate change.

Psychologists Bibb Latane and John Darley tested this exact scenario in a now-classic study. Participants filled out a survey in a quiet room, which suddenly began to fill with smoke (from a vent set up by the experimenters). When alone, participants left the room and reported the apparent fire. But in the presence of others who ignored the smoke, participants carried on as though nothing were wrong."



"There are plenty of things to do about climate change beyond voting. Take a train or bus instead of a plane, even if inconvenient—in fact, especially when inconvenient. Take a digital meeting instead of an in-person one, even if you give up expensed travel. Go to a protest, invest in noncarbon energy, buy solar panels, eat at meatless restaurants, canvass for climate-conscious candidates. Do whichever of these you can, as conspicuously as you can. With each step, you communicate an emergency that needs all hands on deck. Individual action—across supermarkets, skies, roads, homes, workplaces, and ballot boxes—sounds an alarm that might just wake us from our collective slumber and build a foundation for the necessary political change."
leorhackel  greggsparkman  climatechange  2018  politics  social  humans  globalwarming  bibblatane  johndarley  psychology  action  activism  environment  sustainability 
november 2018 by robertogreco
‘Silence Is Health’: How Totalitarianism Arrives | by Uki Goñi | NYR Daily | The New York Review of Books
"A nagging question that first popped into my head while I was a twenty-three-year-old reporter at the Buenos Aires Herald has returned to haunt me lately. What would happen if the US, the country where I was born and spent my childhood, spiraled down the kind of totalitarian vortex I was witnessing in Argentina back then? What if the most regressive elements in society gained the upper hand? Would they also lead a war against an abhorred pluralist democracy? The backlash in the US today against immigrants and refugees, legal abortion, even marriage equality, rekindles uncomfortable memories of the decay of democracy that preceded Argentina’s descent into repression and mass murder."



"This normalization of totalitarian undertones accelerated after my family moved back to Argentina when I was nineteen. To make myself better acquainted with Buenos Aires, I would take long walks through the capital. One day, in 1974, I found myself frozen in my steps on the broad 9 de Julio Avenue that divides Buenos Aires in half. In the middle of this avenue rises a tall white obelisk that is the city’s most conspicuous landmark, and in those days a revolving billboard had been suspended around it. Round and round turned the display and inscribed upon it in large blue letters on a plain white background was the slogan “Silence Is Health.”

With every turn, the billboard schooled Argentines in the total censorship and suppression of free speech that the dictatorship would soon impose. The billboard message was the brainchild of Oscar Ivanissevich, Argentina’s reactionary minister of education, ostensibly to caution motorists against excessive use of the horn. His other mission was an “ideological purge” of Argentina’s universities, which had become a hotbed of student activism. During an earlier ministerial term in 1949, Ivanissevich had led a bitter campaign against the “morbid… perverse… godless” trend of abstract art, recalling the Nazis’ invective against “degenerate” art. During that period, his sister and his nephew were both involved in smuggling Nazis into Argentina.

Ivanissevich’s Orwellian billboard made its appearance just as right-wing violence erupted in the buildup to the military coup. That same year, 1974, Ivanissevich had appointed as rector of Buenos Aires University a well-known admirer of Hitler’s, Alberto Ottalagano, who titled his later autobiography I’m a Fascist, So What? His job was to get rid of the kind of young left-wing protesters who gathered outside the Sheraton Hotel demanding that it be turned into a children’s hospital, and he warmed to the task of persecuting and expelling them. Being singled out by him was more than merely a matter of academic discipline; some fifteen of these students were murdered by right-wing death squads while Ottalagano was rector.

As a partial stranger in my own land, I noticed what those who had already been normalized could not: this was a population habituated to intolerance and violence. Two years later, Ivanissevich’s slogan made a macabre reappearance. In the basement of the dictatorship’s death camp based at the Navy Mechanics School (known as ESMA), where some 5,000 people were exterminated, officers hung two banners along the corridor that opened onto its torture cells. One read “Avenue of Happiness,” the other “Silence Is Health.”

*

To comprehend would-be totalitarians requires understanding their view of themselves as victims. And in a sense, they are victims—of their delusional fear of others, the nebulous, menacing others that haunt their febrile imaginations. This is something I saw repeated in the many interviews I carried out with both the perpetrators of Argentina’s dictatorship and the aging Nazis who had been smuggled to Argentina’s shores three decades earlier. (My interviews with the latter are archived at the US Holocaust Memorial Museum in Washington, D.C.) Their fears were, in both cases, irrational given the unassailable dominance of the military in Argentina and of the Nazis in Germany, but that was of no account to my interviewees.

Because my method was to grant them the respect and patience to which they felt entitled (difficult though that was for me to do), they sometimes seemed briefly to be aware that they had become willing hosts to violent delusions. Getting them to admit that, fully and consciously, was another matter. The chimera of a powerfully malign enemy, responsible for all their perceived ills, made complex, ambiguous realities comprehensible by reducing them to Manichean simplicities. These people were totalitarians not only because they believed in absolute power, but also because their binary thought patterns admitted only total explanations.

Argentina’s military and a large number of like-minded civilians were especially prone to fears of a loosely defined but existential threat. The youth culture of the 1960s, the sexual revolution, the student protests of the 1970s, all struck alarm in their hearts. That a younger generation would question their strongly held religious beliefs, challenge their hypocritical sexual mores, and propose alternative political solutions seemed positively blasphemous. The military set out to violently reverse these trends and protect Argentina from the rising tide of modernity. To do so, they devised a plan of systematic annihilation that targeted especially young Argentines. It was not just an ideological struggle, but a generational war: about 83 percent of the dictatorship’s estimated 30,000 fatal victims were under thirty-five. (A disproportionate number also were Jewish.)"



"If you want to know what sustains totalitarian violence in a society, psychology is probably more useful than political analysis. Among the elite, support for the dictatorship was enthusiastic. “It was seen as kind of a social faux pas to talk about ‘desaparecidos’ or what was going on,” says Raymond McKay, a fellow journalist at the Buenos Aires Herald, in Messenger on a White Horse, a 2017 documentary about the newspaper. “It was seen as bad taste because the people didn’t want to know.”

Those who have lived their entire lives in functioning democracies may find it hard to grasp how easily minds can be won over to the totalitarian dark side. We assume such a passage would require slow, laborious persuasion. It does not. The transition from day to night is bewilderingly swift. Despite what many assume, civilized coexistence in a culture of tolerance is not always the norm, or even universally desired. Democracy is a hard-won, easily rolled back state of affairs from which many secretly yearn to be released.

Lest there be any doubt of its intention, the dictatorship titled itself the “Process of National Reorganization.” Books were burned. Intellectuals went into exile. Like medieval Inquisitors, the dictatorship proclaimed itself—in fiery speeches that I hear echoed in the conspiracist rants of American populists and nationalists today—to be waging a war to save “Western and Christian civilization” from oblivion. Such a war by definition included the physical annihilation of infected minds, even if they had committed no crime.

Another horrifying characteristic of totalitarianism is how it picks on the weakest elements in society, immigrants and children. The Darré-inspired Lebensborn program seized Aryan-looking children from Nazi-occupied territories, separating them from their parents and raising them as “pure” Germans in Lebensborn homes. In 1970s Argentina, the military devised a similar program. There were a large number of pregnant women among the thousands of young captives in the dictatorship’s death camps. Killing them while carrying their babies was a crime that not even Argentina’s military could bring themselves to commit. Instead, they kept the women alive as human incubators, murdering them after they gave birth and handing their babies to God-fearing military couples to raise as their own. A society that separates children from their parents, for whatever reason, is a society that is already on the path to totalitarianism.

This heinous practice partly inspired Margaret Atwood’s 1985 book The Handmaid’s Tale. “The generals in Argentina were dumping people out of airplanes,” Atwood said in an interview with The Los Angeles Times last year. “But if it was a pregnant woman, they would wait until she had the baby and then they gave the baby to somebody in their command system. And then they dumped the woman out of the airplane.”

This was the ultimate revenge of fearful older men upon a rebellious younger generation. Not only would they obliterate their perceived enemy, but the children of that enemy would be raised to become the model authority-obeying citizens against whom their biological parents had rebelled. It is estimated that some five hundred babies were taken from their murdered mothers this way, though so far only 128 have been found and identified via DNA testing. Not all of these have accepted reunification with their biological families."



"For many Argentines, then, the military represented not a subjugation to arbitrary rule, but a release from the frustrations, complexity, and compromises of representative government. A large part of society clasped with joy the extended hand of totalitarian certainty. Life was suddenly simplified by conformity to a single, uncontested power. For those who cherish democracy, it is necessary to comprehend the secret delight with which many greeted its passing. A quick fix to the insurgency seemed infinitely preferable to plodding investigations, piecemeal arrests, and case-by-case lawful trials. Whipped up by the irrational fear of a communist takeover, this impatience won the day. And once Argentina had accepted the necessity for a single, absolute solution, the killing could begin."
argentina  totalitarianism  fascism  history  2018  margaretatwood  nazis  wwii  ww2  hatred  antisemitism  germany  surveillance  trust  democracy  certainty  robertcox  ukigoñi  richardwaltherdarré  repressions  government  psychology  politics  christianity  catholicism  catholicchurch  antoniocaggiano  adolfeichmann  military  power  control  authoritarianism  patriarchy  paternalism  normalization  silence  resistance  censorship  dictatorship  oscarivanissevich  education  raymondmackay  juanperón  evita  communism  paranoia  juliomeinvielle  exile  generations 
november 2018 by robertogreco
Silicon Valley Nannies Are Phone Police for Kids - The New York Times
[This is one of three connected articles:]

"Silicon Valley Nannies Are Phone Police for Kids
Child care contracts now demand that nannies hide phones, tablets, computers and TVs from their charges."
https://www.nytimes.com/2018/10/26/style/silicon-valley-nannies.html

"The Digital Gap Between Rich and Poor Kids Is Not What We Expected
America’s public schools are still promoting devices with screens — even offering digital-only preschools. The rich are banning screens from class altogether."
https://www.nytimes.com/2018/10/26/style/digital-divide-screens-schools.html

"A Dark Consensus About Screens and Kids Begins to Emerge in Silicon Valley
“I am convinced the devil lives in our phones.”"
https://www.nytimes.com/2018/10/26/style/phones-children-silicon-valley.html

[See also:
"What the Times got wrong about kids and phones"
https://www.cjr.org/criticism/times-silicon-valley-kids.php

https://twitter.com/edifiedlistener/status/1058438953299333120
"Now that I've had a chance to read this article [specifically: "The Digital Gap Between Rich and Poor Kids Is Not What We Expected"] and some others related to children and screen time and the wealthy and the poor, I have some thoughts. 1/

First, this article on the unexpected digital divide between rich and poor seems entirely incomplete. There is an early reference to racial differences in screen usage but in the article there are no voices of black or brown folks that I could detect. 2/

We are told a number of things: Wealthy parents are shunning screens in their children's lives, psychologists underscore the addictive nature of screen time on kids, and of course, whatever the short end of the stick is - poor kids get that. 3/

We hear "It could happen that the children of poorer and middle-class parents will be raised by screens," while wealthy kids will perhaps enjoy "wooden toys and the luxury of human interaction." 4/

Think about that and think about the stories that have long been told about poor families, about single parents, about poor parents of color - They aren't as involved in their kids' education, they are too busy working. Familiar stereotypes. 5/

Many of these judgments often don't hold up under scrutiny. So much depends upon who gets to tell those stories and how those stories are marketed, sold and reproduced. 6/

In this particular story about the privilege of being able to withdraw from or reduce screen time, we get to fall back into familiar narratives especially about the poor and non-elite. 7/

Of course those with less will be told after a time by those with much more - "You're doing it wrong." And "My child will be distinguished by the fact that he/she/they is not dependent on a device for entertainment or diversion." 8/

My point is not that I doubt the risks and challenges of excessive screen time for kids and adults. Our dependence on tech *is* a huge social experiment and the outcomes are looking scarier by the day. 9/

I do, however, resist the consistent need of the wealthy elite to seek ways to maintain their distance to the mainstream. To be the ones who tell us what's "hot, or not" - 10/

Chris Anderson points out “The digital divide was about access to technology, and now that everyone has access, the new digital divide is limiting access to technology,” - 11/

This article and its recent close cousins about spying nannies in SV & more elite parent hand-wringing over screens in the NYT feel like their own category of expensive PR work - again allowing SV to set the tone. 12/

It's not really about screens or damage to children's imaginations - it's about maintaining divides, about ensuring that we know what the rich do (and must be correct) vs what the rest of us must manage (sad, bad). 13/fin]
siliconvalley  edtech  children  technology  parenting  2018  nelliebowles  addiction  psychology  hypocrisy  digitaldivide  income  inequality  ipads  smartphones  screentime  schools  education  politics  policy  rules  childcare  policing  surveillance  tracking  computers  television  tv  tablets  phones  mobile  teaching  learning  howwelearn  howweteach  anyakamenetz  sherrispelic  ipad 
october 2018 by robertogreco
Carol Black on Twitter: "FYI: Dr. Chester M. Pierce, who coined the term "microaggression," also coined the term "childism:" https://t.co/vYyMkeWWpj HT @TobyRollo #Childism… https://t.co/2ZOH24MVIf"
"FYI:

Dr. Chester M. Pierce, who coined the term "microaggression," also coined the term "childism:"

https://www.healio.com/psychiatry/journals/psycann/1975-7-5-7/%7B289c676d-8693-4e7a-841e-2ce5d7f6d9f2%7D/childism HT @TobyRollo #Childism
"We contend that childism is the basic form of oppression in our society and underlies all alienation and violence, for it teaches everyone how to be an oppressor and makes them focus on the exercise of raw power rather than on volitional humaneness...

"Like its derivatives, sexism and racism, it is found in virtually everyone. Modification of childist practices would alter other oppressive systems that retard the development of humankind to its full potential."

—CHESTER M. PIERCE, MD GAIL B. ALLEN, MD

2. "In childism, the child-victim is put on the defensive. He is expected to accommodate himself to the adult-aggressor, and is hardly ever permitted to initiate action or control a situation."

3. "The vehicle for most adult action is microaggression; the child is not rendered a gross brutalization, but is treated in such a way as to lower his self-esteem, dignity, and worthiness by means of subtle, cumulative, and unceasing adult deprecation."

4. "As a result of this constant barrage of micro-aggression, the child remains on the defensive, mobilizing constantly to conform and perform. This incessant mobilization is not without cost, psychologically and probably physiologically."

5. "These children have not been physically assaulted. They have, however, been subjected to a number of pejorative acts; the posture, gestures, tone of voice... were an abuse that indicates their inferiority, for no other reason than their social attribute of childhood."

6. "If such abuse were an isolated occurrence, it could be ignored. Yet in all probability these youngsters receive the same gratuitously abusive behavior many times a day from "loving parents," "devoted teachers," "kindly physicians," "concerned policemen..."

7. "This places the child in circumstances that bring about serious, protracted... stress... It has a cumulative effect that may exert a powerful influence on his adult behavior, just as sexist or racist practices affect the entire future of women or members of a minority group."

8. "Children remain the most oppressed group... The more we understand the oppression of children, the more we understand oppression of any individual or group. With a more informed understanding of this process, many traditional dominance patterns could be modified."

~ Chester M. Pierce, MD, former Professor of Psychiatry at Harvard Medical School and Professor of Education at Harvard University, and Gail B. Allen, MD. http://www.mghglobalpsychiatry.org/chesterpierce.php "
chesterpierce  gailallen  carolblack  childism  ageism  2018  microagression  tobyrollo  authoritarianism  deschooling  schooling  unschooling  schooliness  psychology  oppression  power  control  adults  behavior  stress  sexism  racism  children  dominance 
october 2018 by robertogreco
The Shifting Landscape of Buddhism in America - Lion's Roar
"The first wave of academic scholarship on these communities was published around the turn of the millennium, as the study of Buddhism in America emerged as a distinct academic subfield. Influential books included Charles S. Prebish’s Luminous Passage: The Practice and Study of Buddhism in America (1999), Richard Hughes Seager’s Buddhism in America (1999), and James Coleman’s The New Buddhism: The Western Transformation of an Ancient Religion (2002). One common distinction made in this early research was between the so-called “two Buddhisms” in America: “ethnic” and “convert.” According to the researchers, the ethnic or “immigrant” Buddhism of Asian Americans (what scholars now commonly refer to as heritage Buddhism) focused on communal, devotional, and merit-making activities within a traditional cosmological context, whereas the convert Buddhism of overwhelmingly white, upper-middle class practitioners was individualistic, primarily focused on meditation practice and psychological in its approach.

An early challenge to the “two Buddhisms” typology came from scholar Jan Nattier, who observed that not all converts are white, and that some convert-populated communities, such as Soka Gakkai, do not privilege meditation. She proposed an alternative “three Buddhisms” typology—import, export, and baggage—that moved away from ethnicity and race and focused on the mode by which various forms of Buddhism were brought to the U.S.

As Scott Mitchell and Natalie Quli note in their coedited collection Buddhism Beyond Borders: New Perspectives on Buddhism in the United States (2015), and as Mitchell unpacks in his Buddhism in America: Global Religion, Local Contexts (2016), there have been numerous dramatic changes in the social and cultural landscape of America since those studies were published over a decade ago. These changes, as evidenced by the Maha Teacher Council, have brought new questions and concerns to meditation-based convert communities: Who has the authority to define and represent “American” Buddhism? What is the impact of mindfulness transitioning from a countercultural religious practice to a mainstream secular one? How have technology and the digital age affected Buddhist practice? In what ways are generational and demographic shifts changing meditation-based convert communities?

My research explores these questions through a series of case studies, highlighting four areas in which major changes are occurring, pushing these communities beyond their first-generation expressions.

Addressing the Exclusion of Asian Americans

Central to the shifting landscape of contemporary American Buddhism is a rethinking of the distinction between “convert” and “heritage” Buddhisms as practitioners and scholars have become increasingly aware of the problematic nature of both the “two Buddhisms” and “three Buddhisms” typologies. An early challenge came from Rev. Ryo Imamura, a Jodo Shinshu Buddhist priest, in a letter to Tricycle: The Buddhist Review in 1992. That winter, magazine founder and editor Helen Tworkov had written that “The spokespeople for Buddhism in America have been, almost exclusively, educated members of the white middle class. Asian American Buddhists so far have not figured prominently in the development of something called American Buddhism.” Rev. Imamura correctly pointed out that this statement disregarded the contributions of Asian American immigrants who had nurtured Buddhism in the U.S. since the eighteenth century and implied that Buddhism only became truly American when white Americans practiced it. Although written twenty-five years ago, Rev. Imamura’s letter was only recently published in its entirety with a commentary by Funie Hsu on the Buddhist Peace Fellowship’s website. Hsu and Arunlikhati, who has curated the blog Angry Asian Buddhist since 2011, have emerged as powerful voices in bringing long-overdue attention to the erasure of Asian Americans from Buddhism in the U.S. and challenging white privilege in American meditation-based convert communities.

Another shortcoming of the heritage/convert distinction is that it does not account for practitioners who bridge or disrupt this boundary. Where, for example, do we place second- and third-generation Asian Americans who have grown up in Asian American Buddhist communities but now practice in meditation-based lineages? What about Asian Americans who have converted to Buddhism from other religions, or from non-religious backgrounds? Chenxing Han’s promising research, featured in Buddhadharma’s Summer 2016 edition, brings the many different voices of these marginalized practitioners to the forefront. Similarly, how do we categorize “cradle Buddhists,” sometimes jokingly referred to as “dharma brats,” who were born into Buddhist “convert” communities? Millennials Lodro Rinzler and Ethan Nichtern—two of the most popular young American Buddhist teachers—fall into this category, having grown up in the Shambhala Buddhist tradition. How do such new voices affect meditation-based convert lineages?

Rev. Imamura’s letter echoes the early characterization of primarily white, meditation-based convert communities, observing that “White practitioners practice intensive psychotherapy on their cushions in a life-or-death struggle with the individual ego, whereas Asian Buddhists seem to just smile and eat together.” It is of little surprise then that the theme of community appears strongly in the work of Arunlikhati, Hsu, and Han. Arunlikhati has most recently written about the need to create refuges for Buddhists of color—”spaces where people can find true comfort and well-being”—and shares that his dream “is for Western Buddhism to be like a family that accepts all of its members openly.” In challenging white privilege, Asian Americans and other practitioners of color have been instrumental in recovering and building the neglected third refuge—sangha—in meditation-based convert Buddhism."



"Three Emerging Turns
In my forthcoming book, I posit three emerging turns, or sensibilities, within meditation-based convert Buddhism: critical, contextual, and collective. The critical turn refers to a growing acknowledgement of limitations within Buddhist communities. First-generation practitioners tended to be very celebratory of “American Buddhism,” enthusing that they were creating new, more modern, and “essential” forms of Buddhism that were nonhierarchical, gender-egalitarian, and free of the cultural and religious “baggage” of their Asian predecessors. While the modernization and secularization of Buddhism certainly continues, there is now much more discussion about the problems and pitfalls of these processes, with some exposing the Western ethnocentrism that has operated behind the “essential” versus “cultural” distinction. This understanding acknowledges that meditation-based convert Buddhism is as culturally shaped as any other form of Buddhism. Some, drawing attention to what is lost when the wider religious context of Buddhism is discarded, have called for a reengagement with neglected aspects of the tradition such as ritual and community.

The contextual turn refers to the increasing awareness of how Buddhist practice is shaped and limited by the specific social and cultural contexts in which it unfolds. In the case of the mindfulness debates, critics have argued that mindfulness has become commodified and assimilated into the context of global capitalism and neoliberalism. Another heated debate is around power and privilege in American Buddhist communities. Take, for instance, Pablo Das’s response to Buddhist teachers’ reflections on the U.S. presidential election, in which he critiques their perspectives as reflective of a privileged social location that negates the trauma of marginalized communities. Das suggests that calls to meditate and to “sit with what is” are not sufficient to create safety for vulnerable populations, and he warns against misusing Buddhist teachings on impermanence, equanimity, and anger to dismiss the realities of such groups. Insight teachers Sebene Selassie and Brian Lesage have fostered a dialogue between sociocultural awareness and Buddhism, developing a course for the Barre Center for Buddhist Studies titled “Buddha’s Teaching and Issues of Cultural Spiritual Bypassing,” which explores how unconscious social conditioning manifests both individually and collectively.

The collective turn refers to the multiple challenges to individualism as a cornerstone of meditation-based convert lineages. One shift has come in the form of efforts toward building inclusive sanghas. Another is the development of relational forms of meditation practice such as external mindfulness. And a third expression is the concept of “collective awakening,” hinted at in Thich Nhat Hanh’s suggestion that “the next Buddha might take the form of a community,” as well as the application of Buddhist principles and practices to the collective dukkha caused by racism and capitalism.

The first generation of meditation-based convert practitioners brought the discourses of psychology, science, and liberal feminism to their encounter with already modernized forms of Asian Buddhism. With the “three turns,” previously excluded, neglected, or entirely new conversations—around critical race theory, postcolonial thought, and cultural studies—are shaping the dialogue of Buddhist modernism. These are not necessarily replacing earlier influences but sitting alongside them and engaging in often-heated debates. Moreover, due to social media and the lively Buddhist blogosphere, these dialogues are also finding a much larger audience. While it is difficult to predict the extent to which these new perspectives will shape the future of Buddhism in America, the fact that they are particularly evident in Gen X and millennial practitioners suggests that their impact will be significant… [more]
us  buddhism  religion  2018  conversion  race  identity  mindfulness  annagleig  whiteprivilege  inclusion  racialjustice  history  diversity  meditation  babyboomers  generations  genx  millennials  pluralism  individualism  accountability  psychology  converts  boomers 
august 2018 by robertogreco
Maya Children In Guatemala Are Great At Paying Attention. What's Their Secret? : Goats and Soda : NPR
"So maybe the Maya children are more attentive in the origami/toy experiment — not because they have better attention spans — but because they are more motivated to pay attention. Their parents have somehow motivated them to pay attention even without being told.

To see this Maya parenting firsthand, I traveled down to a tiny Maya village in Yucatan, Mexico, and visited the home of Maria Tun Burgos. Researchers have been studying her family and this village for years.

On a warm April afternoon, Tun Burgos is feeding her chickens in the backyard. Her three daughters are outside with her, but they're doing basically whatever they want.

The oldest daughter, Angela, age 12, is chasing a baby chick that's gotten out of the pen. The middle girl, Gelmy, age 9, is running in and out of the yard with neighborhood kids. Most of the time, no one is really sure where she is. And the littlest daughter, Alexa, who is 4 years old, has just climbed up a tree.

"Alone, without mama," the little daredevil declares.

Right away, I realize what these kids have that many American kids miss out on: an enormous amount of freedom. The freedom to largely choose what they do, where they go, whom they do it with. That means, they also have the freedom to control what they pay attention to.

Even the little 4-year-old has the freedom to leave the house by herself, her mother says.

"Of course she can go shopping," Tun Burgos says. "She can buy some eggs or tomatoes for us. She knows the way and how to stay out of traffic."

Now the kids aren't just playing around in the yard. They're still getting work done. They go to school. They do several after-school activities — and many, many chores. When I was with the family, the oldest girl did the dishes even though no one asked her to, and she helped take care of her little sisters.

But the kids, to a great extent, set their schedules and agendas, says Suzanne Gaskins, a psychologist at Northeastern Illinois University, who has studied the kids in this village for decades.

"Rather than having the mom set the goal — and then having to offer enticements and rewards to reach that goal — the child is setting the goal," Gaskins says. "Then the parents support that goal however they can."

The parents intentionally give their children this autonomy and freedom because they believe it's the best way to motivate kids, Gaskins says.

"The parents feel very strongly that every child knows best what they want," she says. "And that goals can be achieved only when a child wants it."

And so they will do chores when they want to be helpful for their family.

With this strategy, Maya children also learn how to manage their own attention, instead of always depending on adults to tell them what to pay attention to, says Barbara Rogoff, who is a professor at the University of California Santa Cruz.

"It may be the case that [some American] children give up control of their attention when it's always managed by an adult," she says.

Turns out these Maya moms are onto something. In fact, they are master motivators.

Motivating kids, the Maya way
Although neuroscientists are just beginning to understand what's happening in the brain while we pay attention, psychologists already have a pretty good understanding of what's needed to motivate kids.

Psychologist Edward Deci has been studying it for nearly 50 years at the University of Rochester. And what does he say is one of the most important ingredients for motivating kids?

"Autonomy," Deci says. "To do something with this full sense of willingness and choice."

Many studies have shown that when teachers foster autonomy, it stimulates kids' motivation to learn, tackle challenges and pay attention, Deci says.

But in the last few decades, some parts of our culture have turned in the other direction, he says. They've started taking autonomy away from kids — especially in some schools.

"One of the things we've been doing in the American school system is making it more and more controlling rather than supportive," Deci says.

And this lack of autonomy in school inhibits kids' ability to pay attention, he says.

"Oh without question it does," Deci says. "So all of the high stakes tests are having negative consequences on the motivation, the attention and the learning of our children."

Now, many parents in the U.S. can't go full-on Maya to motivate kids. It's often not practical — or safe — to give kids that much autonomy in many places, for instance. But there are things parents here can do, says cognitive psychologist Mike Esterman.

For starters, he says, ask your kid this question: 'What would you do if you didn't have to do anything else?' "

"Then you start to see what actually motivates them and what they want to engage their cognitive resources in when no one tells them what they have to do," Esterman says.

Then create space in their schedule for this activity, he says.

"For my daughter, I've been thinking that this activity will be like her 'passion,' and it's the activity I should be fostering," he says.

Because when a kid has a passion, Esterman says, it's golden for the child. It's something that will bring them joy ... and hone their ability to pay attention."
children  attention  education  parenting  psychology  passion  2018  maya  barbararogoff  maricelacorrea-chavez  behavior  autonomy  motivation  intrinsicmotivation 
july 2018 by robertogreco
Dan Ariely on Irrationality, Bad Decisions, and the Truth About Lies
"On this episode of the Knowledge Project, I’m joined by the fascinating Dan Ariely. Dan just about does it all. He has delivered 6 TED talks with a combined 20 million views, he’s a multiple New York Times best-selling author, a widely published researcher, and the James B. Duke Professor of Psychology and Behavioral Economics at Duke University.

For the better part of three decades, Dan has been immersed in researching why humans do some of the silly, irrational things we do. And yes, as much as we’d all like to be exempt, that includes you too.

In this captivating interview, we tackle a lot of interesting topics, including:

• The three types of decisions that control our lives and how understanding our biases can help us make smarter decisions

• How our environment plays a big role in our decision making and the small changes we can make to automatically improve our outcomes

• The “behavioral driven” bathroom scale Dan has been working on to revolutionize weight loss

• Which of our irrational behaviors transfer across cultures and which ones are unique to certain parts of the world (for example, find out which country is the most honest)

• The dishonesty spectrum and why we as humans insist on flirting with the line between “honest” and “dishonest”

• 3 sneaky mental tricks Dan uses to avoid making ego-driven decisions [https://www.fs.blog/smart-decisions/ ]

• “Pluralistic ignorance” [https://www.fs.blog/2013/05/pluralistic-ignorance/ ] and how it dangerously affects our actions and inactions (As a bonus, Dan shares the hilarious way he demonstrates this concept to his students on their first day of class)

• The rule Dan created specifically for people with spinach in their teeth

• The difference between habits, rules and rituals, and why they are critical to shaping us into who we want to be

This was a riveting discussion and one that easily could have gone for hours. If you’ve ever wondered how you’d respond in any of these eye-opening experiments, you have to listen to this interview. If you’re anything like me, you’ll learn something new about yourself, whether you want to or not."
danariely  decisionmaking  decisions  truth  lies  rationality  irrationality  2018  habits  rules  psychology  ritual  rituals  danielkahneman  bias  biases  behavior  honesty  economics  dishonesty  human  humans  ego  evolutionarypsychology  property  capitalism  values  ownership  wealth  care  caretaking  resilience  enron  cheating 
may 2018 by robertogreco
[Essay] | Punching the Clock, by David Graeber | Harper's Magazine
"In 1901, the German psychologist Karl Groos discovered that infants express extraordinary happiness when they first discover their ability to cause predictable effects in the world. For example, they might scribble with a pencil by randomly moving their arms and hands. When they realize that they can achieve the same result by retracing the same pattern, they respond with expressions of utter joy. Groos called this “the pleasure at being the cause,” and suggested that it was the basis for play.

Before Groos, most Western political philosophers, economists, and social scientists assumed that humans seek power out of either a desire for conquest and domination or a practical need to guarantee physical gratification and reproductive success. Groos’s insight had powerful implications for our understanding of the formation of the self, and of human motivation more generally. Children come to see that they exist as distinct individuals who are separate from the world around them by observing that they can cause something to happen, and happen again. Crucially, the realization brings a delight, the pleasure at being the cause, that is the very foundation of our being.

Experiments have shown that if a child is allowed to experience this delight but then is suddenly denied it, he will become enraged, refuse to engage, or even withdraw from the world entirely. The psychiatrist and psychoanalyst Francis Broucek suspected that such traumatic experiences can cause many mental health issues later in life.

Groos’s research led him to devise a theory of play as make-believe: Adults invent games and diversions for the same reason that an infant delights in his ability to move a pencil. We wish to exercise our powers as an end in themselves. This, Groos suggested, is what freedom is—the ability to make things up for the sake of being able to do so.

The make-believe aspect of the work is precisely what performers of bullshit jobs find the most infuriating. Just about anyone in a supervised wage-labor job finds it maddening to pretend to be busy. Working is meant to serve a purpose—if make-believe play is an expression of human freedom, then make-believe work imposed by others represents a total lack of freedom. It’s unsurprising, then, that the first historical occurrence of the notion that some people ought to be working at all times, or that work should be made up to fill their time even in the absence of things that need doing, concerns workers who are not free: prisoners and slaves."



"The idea that workers have a moral obligation to allow their working time to be dictated has become so normalized that members of the public feel indignant if they see, say, transit workers lounging on the job. Thus busywork was invented: to ameliorate the supposed problem of workers not having enough to do to fill an eight-hour day. Take the experience of a woman named Wendy, who sent me a long history of pointless jobs she had worked:

“As a receptionist for a small trade magazine, I was often given tasks to perform while waiting for the phone to ring. Once, one of the ad sales people dumped thousands of paper clips on my desk and asked me to sort them by color. She then used them interchangeably.

“Another example: my grandmother lived independently in an apartment in New York City into her early nineties, but she did need some help. We hired a very nice woman to live with her, help her do shopping and laundry, and keep an eye out in case she fell or needed help. So, if all went well, there was nothing for this woman to do. This drove my grandmother crazy. ‘She’s just sitting there!’ she would complain. Ultimately, the woman quit.”

This sense of obligation is common across the world. Ramadan, for example, is a young Egyptian engineer working for a public enterprise in Cairo.

The company needed a team of engineers to come in every morning and check whether the air conditioners were working, then hang around in case something broke. Of course, management couldn’t admit that; instead, the firm invented forms, drills, and box-ticking rituals calculated to keep the team busy for eight hours a day. “I discovered immediately that I hadn’t been hired as an engineer at all but really as some kind of technical bureaucrat,” Ramadan explained. “All we do here is paperwork, filling out checklists and forms.” Fortunately, Ramadan gradually figured out which ones nobody would notice if he ignored and used the time to indulge a growing interest in film and literature. Still, the process left him feeling hollow. “Going every workday to a job that I considered pointless was psychologically exhausting and left me depressed.”

The end result, however exasperating, doesn’t seem all that bad, especially since Ramadan had figured out how to game the system. Why couldn’t he see it, then, as stealing back time that he’d sold to the corporation? Why did the pretense and lack of purpose grind him down?

A bullshit job—where one is treated as if one were usefully employed and forced to play along with the pretense—is inherently demoralizing because it is a game of make-believe not of one’s own making. Of course the soul cries out. It is an assault on the very foundations of self. A human being unable to have a meaningful impact on the world ceases to exist."
davidgraeber  2018  work  bullshitjobs  capitalism  karlgroos  purpose  well-being  life  living  labor  play  pleasure  delight  employment  depression  slave  wageslavery  wages  freedom  humans  psychology  obligation  morality  care  caring  despair  consumerism 
may 2018 by robertogreco
The Best Mother's Day Gift: Get Mom Out Of The Box : Goats and Soda : NPR
"Secrets Of A Maya Supermom: What Parenting Books Don't Tell You"

[via: https://twitter.com/cblack__/status/996812739073880064 ]

"As psychologist Ben Bradley argues in his book Visions of Infancy: A Critical Introduction to Psychology: "Scientific observations about babies are more like mirrors which reflect back the preoccupations and visions of those who study them than like windows opening directly on the foundations of the mind."

And sometimes the data supporting the recommendation are so flimsy that another study in a few years will come along and not only overturn the first study but completely flip the advice 180 degrees.

This is exactly what happened last year with peanuts. Back in 2000, the American Academy of Pediatrics advised parents not to give babies peanut butter because one study suggested early exposure would increase the risk of developing an allergy. But last year, the medical community made a complete about-face on the advice and now says "Let them eat peanut butter!" Early peanut exposure actually prevents allergies, follow-up studies have found.

So if science isn't the secret sauce to parenting books, what is? To answer that, we have to go back in time.

In the early 1980s, the British writer Christina Hardyment began reviewing more than 650 parenting books and manuals, dating all the way back to the mid-1700s, when advice publications started appearing in hospitals. The result is an illuminating book, called Dream Babies, which traces the history of parenting advice from 17th century English physician and philosopher John Locke to the modern-day medical couple Bill and Martha Sears.

The conclusions from the book are as clear as your baby's tears: Advice in parenting books is typically based not on rigorous scientific studies as is at times claimed but on the opinions and experiences of the authors and on theories from past parenting manuals — sometimes dating as far back as the 18th century.

Then there's the matter of consistency — or lack thereof. Since the late 1700s, "experts" have flip-flopped recommendations over and over, from advising strict routines and discipline to a more permissive, laissez-faire approach and back again.

"While babies and parents remain constants, advice on the former to the latter veers with the winds of social, philosophical and psychological change," Hardyment writes. "There is no such thing as a generally applicable blueprint for perfect parenting."

Take, for instance, the idea that babies need to feed on a particular schedule. According to Hardyment's research, that advice first appears in a London hospital pamphlet in 1748. Sleep schedules for babies start coming into fashion in the early 1900s. And sleep training? That idea was proposed by a British surgeon-turned-sports writer in 1873. If babies "are left to go to sleep in their cots, and allowed to find out that they do not get their way by crying, they at once become reconciled, and after a short time will go to bed even more readily in the cot than on the lap," John Henry Walsh wrote in his Manual of Domestic Economy.

Even the heated debate about breastfeeding has been simmering, and flaring up, for at least 250 years, Hardyment shows. In the 18th century, mothers didn't have high-tech formula but had many recommendations about what was best for the baby and the family. Should a mother send the baby off to a wet nurse's home, so her husband won't be offended by the sight of a baby suckling? And if the family couldn't afford a wet nurse, there was specially treated cow's milk available or even better, the baby could be nursed by a goat, 18th century parenting books advised. (If you're wondering how moms accomplished such a feat, Hardyment includes an 18th century drawing of a young mom pushing a swaddled newborn underneath a goat's udder.)

Goat udders aside, perhaps the bigger issue with parenting books and advice on the Web is what they aren't telling you. And boy, is there a large hole.

These sources ignore most of the world and come almost entirely from the experience of Western culture. But when it comes to understanding what a baby needs, how kids work and what to do when your toddler is lying on the sidewalk (just asking for a friend), Western society might not be the best place to focus.

"WEIRD," stressed-out parents equal anxious kids?

In 2010, three scientists at the University of British Columbia, Vancouver, rocked the psychology world.

They published a 23-page paper titled "The weirdest people in the world?" And in it, uncovered a major limitation with many psychological studies, especially those claiming to address questions of "human nature."

First, the team noted that the vast majority of studies in psychology, cognitive science and economics — about 96 percent — have been performed on people with European backgrounds. And yet, when scientists perform some of these experiments in other cultures, the results often don't match up. Westerners stick out as outliers on the spectrum of behavior, while people from indigenous cultures tend to clump together, more in the middle.

Even in experiments that appear to test basic brain function, like visual perception, Westerners can act strangely. Take one of the most famous optical illusions — the Muller-Lyer illusion, from 1889.

Americans often believe the second line is about 20 percent longer than the first, even though the two lines are exactly the same length. But when scientists gave the test to 14 indigenous cultures, none of them were tricked to the same degree as Westerners. Some cultures, such as the San foragers in southern Africa's Kalahari desert, knew the two lines were equal length.

The conclusion from these analyses was startling: People from Western society, "including young children, are among the least representative populations one could find for generalizing about humans," Joseph Henrich and his colleagues wrote. The researchers even came up with a catchy acronym to describe the phenomenon. They called our culture WEIRD, for Western, Educated, Industrialized, Rich and Democratic societies.

With that paper, the ethnocentric view of psychology cracked. It wasn't so much that the emperor of psychology had no clothes. It was more that he was dancing around in Western garb pretending to represent all humanity.

A few years later, an anthropologist from Utah State University, David Lancy, performed a similar analysis on parenting. The conclusion was just as clear-cut: When you look around the world and throughout human history, the Western style of parenting is WEIRD. We are outliers.

In many instances, what we think is "necessary" or "critical" for childhood is actually not present in any other cultures around the world or throughout time.

"The list of differences is really, really long," says Lancy, who summarizes them in the second edition of his landmark book, The Anthropology of Childhood: Cherubs, Chattel, Changelings. "There may be 40 to 50 things that we do that you don't see in indigenous cultures."

Perhaps most striking is how Western society segregates children from adults. We have created two worlds: the kid world and the adult world. And we go through great pains to keep them apart. Kids have their own special foods, their own times to go to sleep, their own activities on the weekends. Kids go to school. Parents go to work. "Much of the adult culture ... is restricted [for kids]," Lancy writes. "Children are perceived as too young, uneducated, or burdensome to be readily admitted to the adult sphere."

But in many indigenous cultures, children are immersed in the adult world early on, and they acquire great skills from the experience. They learn to socialize, to do household chores, cook food and master a family's business, Lancy writes.

Western culture is also a relative newcomer to parenting. Hunter-gatherers and other indigenous cultures have had tens of thousands of years to hone their strategies, not to mention that the parent-child relationship actually evolved in these contexts.

Of course, just because a practice is ancient, "natural" or universal doesn't mean it's necessarily better, especially given that Western kids eventually have to live — and hopefully succeed — in a WEIRD society. But widening the parenting lens, even just a smidgen, has a practical purpose: It gives parents options.

"When you look at the whole world and see the diversity out there, parents can start to imagine other ways of doing things," says Suzanne Gaskins, a developmental psychologist at Northeastern Illinois University, who for 40 years has been studying how Maya moms in the Yucatan raise helpful kids.

"Some of the approaches families use in other cultures might fit an American child's needs better than the advice they are given in books or from the pediatricians," she adds."

Who's in charge?

So what kind of different philosophies are out there?

When I spent time with Maya families that Gaskins has studied, I saw a very different approach to control.

In Western culture, parenting is often about control.

"We think of obedience from a control angle. Somebody is in charge and the other one is doing what they are told because they have to," says Barbara Rogoff, a psychologist at the University of California, Santa Cruz, who has studied the Maya culture for 30 years.

And if you pay attention to the way parents interact with children in our society, the idea is blazingly obvious. We tend to boss them around. "Put your shoes on!" or "Eat your sandwich!"

"People think either the adult is in control or the child is in control," Rogoff says.

But what if there is another way to interact with kids that removes control from the equation, almost altogether?

That's exactly what the Maya — and several other indigenous cultures — do. Instead of trying to control children, Rogoff says, parents aim to collaborate with them.

"It's kids and adults together accomplishing a common goal," Rogoff says. "It's not letting the kids do whatever they want. It's a matter of children — and parents — being willing to be … [more]
children  parenting  weird  anthropology  2018  control  maya  mothers  stress  guidance  motherhood  us  michaeleendoucleff  families  knowledge  indigenous  stephaniecoontz  culture  society  respect  johngillis  alloparents  interdependence  communities  community  collaboration  psychology  barbararogoff 
may 2018 by robertogreco
“The Workplace Is Killing People and Nobody Cares” | Stanford Graduate School of Business
"A new book examines the massive health care toll today’s work culture exacts on employees.

Jeffrey Pfeffer has an ambitious aspiration for his latest book. “I want this to be the Silent Spring of workplace health,” says Pfeffer, a professor of organizational behavior at Stanford Graduate School of Business. “We are harming both company performance and individual well-being, and this needs to be the clarion call for us to stop. There is too much damage being done.”

Dying for a Paycheck, published by HarperBusiness and released on March 20, maps a range of ills in the modern workplace — from the disappearance of good health insurance to the psychological effects of long hours and work-family conflict — and how these are killing people.

Pfeffer recently sat for an interview with Insights. The following has been edited for length and clarity."
psychology  mentalhealth  work  labor  economics  health  healthcare  2018  jeffreypfeffer  food  eating  diet  culture  society  nuriachinchilla  socialpollution  social  humans  human  employment  corporatism  latecapitalism  mindfulness  well-being 
april 2018 by robertogreco
Take your time: the seven pillars of a Slow Thought manifesto | Aeon Essays
"In championing ‘slowness in human relations’, the Slow Movement appears conservative, while constructively calling for valuing local cultures, whether in food and agriculture, or in preserving slower, more biological rhythms against the ever-faster, digital and mechanically measured pace of the technocratic society that Neil Postman in 1992 called technopoly, where ‘the rate of change increases’ and technology reigns. Yet, it is preservative rather than conservative, acting as a foil against predatory multinationals in the food industry that undermine local artisans of culture, from agriculture to architecture. In its fidelity to our basic needs, above all ‘the need to belong’ locally, the Slow Movement founds a kind of contemporary commune in each locale – a convivium – responding to its time and place, while spreading organically as communities assert their particular needs for belonging and continuity against the onslaught of faceless government bureaucracy and multinational interests.

In the tradition of the Slow Movement, I hereby declare my manifesto for ‘Slow Thought’. This is the first step toward a psychiatry of the event, based on the French philosopher Alain Badiou’s central notion of the event, a new foundation for ontology – how we think of being or existence. An event is an unpredictable break in our everyday worlds that opens new possibilities. The three conditions for an event are: that something happens to us (by pure accident, no destiny, no determinism), that we name what happens, and that we remain faithful to it. In Badiou’s philosophy, we become subjects through the event. By naming it and maintaining fidelity to the event, the subject emerges as a subject to its truth. ‘Being there,’ as traditional phenomenology would have it, is not enough. My proposal for ‘evental psychiatry’ will describe both how we get stuck in our everyday worlds, and what makes change and new things possible for us."

"1. Slow Thought is marked by peripatetic Socratic walks, the face-to-face encounter of Levinas, and Bakhtin’s dialogic conversations"

"2. Slow Thought creates its own time and place"

"3. Slow Thought has no other object than itself"

"4. Slow Thought is porous"

"5. Slow Thought is playful"

"6. Slow Thought is a counter-method, rather than a method, for thinking as it relaxes, releases and liberates thought from its constraints and the trauma of tradition"

"7. Slow Thought is deliberate"
slow  slowthought  2018  life  philosophy  alainbadiou  neilpostman  time  place  conservation  preservation  guttormfløistad  cittaslow  carlopetrini  cities  food  history  urban  urbanism  mikhailbakhtin  walking  emmanuellevinas  solviturambulando  walterbenjamin  play  playfulness  homoludens  johanhuizinga  milankundera  resistance  counterculture  culture  society  relaxation  leisure  artleisure  leisurearts  psychology  richardrorty  wittgenstein  socrates  nietzsche  jacquesderrida  vincenzodinicola  joelelkes  giorgioagamben  garcíamárquez  michelfoucault  foucault  asjalacis  porosity  reflection  conviction  laurencesterne  johnmilton  edmundhusserl  jacqueslacan  displacement  deferral  delay  possibility  anti-philosophy 
march 2018 by robertogreco
Survival of the Kindest: Dacher Keltner Reveals the New Rules of Power
"When Pixar was dreaming up the idea for Inside Out, a film that would explore the roiling emotions inside the head of a young girl, they needed guidance from an expert. So they called Dacher Keltner.

Dacher is a psychologist at UC Berkeley who has dedicated his career to understanding how human emotion shapes the way we interact with the world, how we properly manage difficult or stressful situations, and ultimately, how we treat one another.

In fact, he refers to emotions as the “language of social living.” The more fluent we are in this language, the happier and more meaningful our lives can be.

We tackle a wide variety of topics in this conversation that I think you’ll really enjoy.

You’ll learn:

• The three main drivers that determine your personal happiness and life satisfaction
• Simple things you can do everyday to jumpstart the “feel good” reward center of your brain
• The principle of “jen” and how we can use “high-jen behaviors” to bootstrap our own happiness
• How to have more positive influence in our homes, at work and in our communities.
• How to teach your kids to be more kind and empathetic in an increasingly self-centered world
• What you can do to stay grounded and humble if you are in a position of power or authority
• How to catch our own biases when we’re overly critical of another’s ideas (or overconfident in our own)

And much more. We could have spent an hour discussing any one of these points alone, but there was so much I wanted to cover. I’m certain you’ll find this episode well worth your time."
compassion  kindness  happiness  dacherkeltner  power  charlesdarwin  evolution  psychology  culture  society  history  race  racism  behavior  satisfaction  individualism  humility  authority  humans  humanism  morality  morals  multispecies  morethanhuman  objects  wisdom  knowledge  heidegger  ideas  science  socialdarwinism  class  naturalselection  egalitarianism  abolitionism  care  caring  art  vulnerability  artists  scientists  context  replicability  research  socialsciences  2018  statistics  replication  metaanalysis  socialcontext  social  borntobegood  change  human  emotions  violence  evolutionarypsychology  slvery  rape  stevenpinker  torture  christopherboehm  hunter-gatherers  gender  weapons  democracy  machiavelli  feminism  prisons  mentalillness  drugs  prisonindustrialcomplex  progress  politics  1990s  collaboration  canon  horizontality  hierarchy  small  civilization  cities  urban  urbanism  tribes  religion  dogma  polygamy  slavery  pigeons  archaeology  inequality  nomads  nomadism  anarchism  anarchy  agriculture  literacy  ruleoflaw  humanrights  governance  government  hannah 
march 2018 by robertogreco
OCCULTURE: 67. Carl Abrahamsson & Mitch Horowitz in “Occulture (Meta)” // Anton LaVey, Real Magic & the Nature of the Mind
"Look, I’m not gonna lie to you - we have a pretty badass show this time around. Carl Abrahamsson and Mitch Horowitz are in the house.

Carl Abrahamsson is a Swedish freelance writer, lecturer, filmmaker and photographer specializing in material about the arts & entertainment, esoteric history and occulture. Carl is the author of several books, including a forthcoming title from Inner Traditions called Occulture: The Unseen Forces That Drive Culture Forward.

Mitch Horowitz is the author of One Simple Idea: How Positive Thinking Reshaped Modern Life; Occult America, which received the 2010 PEN Oakland/Josephine Miles Award for literary excellence; and Mind As Builder: The Positive-Mind Metaphysics of Edgar Cayce. Mitch has written for The New York Times, The Wall Street Journal, The Washington Post, Salon, Time.com, and Politico. Mitch is currently in the midst of publishing a series of articles on Medium called "Real Magic".

And it is that series paired with Carl’s book that lays the foundation for our conversation here."
carlabrahamsson  mitchhorowitz  occult  culture  occulture  magic  belief  mind  ouijaboard  astrology  mindfulness  buddhism  religion  academia  antonlavey  materialism  mainstream  intellectualism  elitism  mindbodyspirit  2018  esotericism  authority  norms  nuance  change  enlightenment  popculture  science  humanities  socialsciences  medicine  conservatism  churches  newage  cosmology  migration  california  hippies  meaning  psychology  siliconvalley  ingenuity  human  humans  humannature  spirituality  openmindedness  nature  urbanization  urban  nyc  us  society  santería  vodou  voodoo  voudoun  climate  light  davidlynch  innovation  population  environment  meaningmaking  mikenesmith  californianideology  thought  thinking  philosophy  hoodoo  blackmetal  norway  beauty  survival  wholeperson  churchofsatan  satanism  agency  ambition  mysticism  self  stories  storytelling  mythology  humanism  beinghuman  surrealism  cv  repetition  radicalism  myths  history  renaissance  fiction  fantasy  reenchantment  counterculture  consciousness  highered  highereducation  cynicism  inquiry  realitytele 
february 2018 by robertogreco
The Tyranny of Convenience - The New York Times
"Convenience has the ability to make other options unthinkable. Once you have used a washing machine, laundering clothes by hand seems irrational, even if it might be cheaper. After you have experienced streaming television, waiting to see a show at a prescribed hour seems silly, even a little undignified. To resist convenience — not to own a cellphone, not to use Google — has come to require a special kind of dedication that is often taken for eccentricity, if not fanaticism.

For all its influence as a shaper of individual decisions, the greater power of convenience may arise from decisions made in aggregate, where it is doing so much to structure the modern economy. Particularly in tech-related industries, the battle for convenience is the battle for industry dominance.

Americans say they prize competition, a proliferation of choices, the little guy. Yet our taste for convenience begets more convenience, through a combination of economies of scale and the power of habit. The easier it is to use Amazon, the more powerful Amazon becomes — and thus the easier it becomes to use Amazon. Convenience and monopoly seem to be natural bedfellows.

Given the growth of convenience — as an ideal, as a value, as a way of life — it is worth asking what our fixation with it is doing to us and to our country. I don’t want to suggest that convenience is a force for evil. Making things easier isn’t wicked. On the contrary, it often opens up possibilities that once seemed too onerous to contemplate, and it typically makes life less arduous, especially for those most vulnerable to life’s drudgeries.

But we err in presuming convenience is always good, for it has a complex relationship with other ideals that we hold dear. Though understood and promoted as an instrument of liberation, convenience has a dark side. With its promise of smooth, effortless efficiency, it threatens to erase the sort of struggles and challenges that help give meaning to life. Created to free us, it can become a constraint on what we are willing to do, and thus in a subtle way it can enslave us.

It would be perverse to embrace inconvenience as a general rule. But when we let convenience decide everything, we surrender too much."



"By the late 1960s, the first convenience revolution had begun to sputter. The prospect of total convenience no longer seemed like society’s greatest aspiration. Convenience meant conformity. The counterculture was about people’s need to express themselves, to fulfill their individual potential, to live in harmony with nature rather than constantly seeking to overcome its nuisances. Playing the guitar was not convenient. Neither was growing one’s own vegetables or fixing one’s own motorcycle. But such things were seen to have value nevertheless — or rather, as a result. People were looking for individuality again.

Perhaps it was inevitable, then, that the second wave of convenience technologies — the period we are living in — would co-opt this ideal. It would conveniencize individuality.

You might date the beginning of this period to the advent of the Sony Walkman in 1979. With the Walkman we can see a subtle but fundamental shift in the ideology of convenience. If the first convenience revolution promised to make life and work easier for you, the second promised to make it easier to be you. The new technologies were catalysts of selfhood. They conferred efficiency on self-expression."



"I do not want to deny that making things easier can serve us in important ways, giving us many choices (of restaurants, taxi services, open-source encyclopedias) where we used to have only a few or none. But being a person is only partly about having and exercising choices. It is also about how we face up to situations that are thrust upon us, about overcoming worthy challenges and finishing difficult tasks — the struggles that help make us who we are. What happens to human experience when so many obstacles and impediments and requirements and preparations have been removed?

Today’s cult of convenience fails to acknowledge that difficulty is a constitutive feature of human experience. Convenience is all destination and no journey. But climbing a mountain is different from taking the tram to the top, even if you end up at the same place. We are becoming people who care mainly or only about outcomes. We are at risk of making most of our life experiences a series of trolley rides.

Convenience has to serve something greater than itself, lest it lead only to more convenience. In her 1963 classic, “The Feminine Mystique,” Betty Friedan looked at what household technologies had done for women and concluded that they had just created more demands. “Even with all the new labor-saving appliances,” she wrote, “the modern American housewife probably spends more time on housework than her grandmother.” When things become easier, we can seek to fill our time with more “easy” tasks. At some point, life’s defining struggle becomes the tyranny of tiny chores and petty decisions.

An unwelcome consequence of living in a world where everything is “easy” is that the only skill that matters is the ability to multitask. At the extreme, we don’t actually do anything; we only arrange what will be done, which is a flimsy basis for a life.

We need to consciously embrace the inconvenient — not always, but more of the time. Nowadays individuality has come to reside in making at least some inconvenient choices. You need not churn your own butter or hunt your own meat, but if you want to be someone, you cannot allow convenience to be the value that transcends all others. Struggle is not always a problem. Sometimes struggle is a solution. It can be the solution to the question of who you are.

Embracing inconvenience may sound odd, but we already do it without thinking of it as such. As if to mask the issue, we give other names to our inconvenient choices: We call them hobbies, avocations, callings, passions. These are the noninstrumental activities that help to define us. They reward us with character because they involve an encounter with meaningful resistance — with nature’s laws, with the limits of our own bodies — as in carving wood, melding raw ingredients, fixing a broken appliance, writing code, timing waves or facing the point when the runner’s legs and lungs begin to rebel against him.

Such activities take time, but they also give us time back. They expose us to the risk of frustration and failure, but they also can teach us something about the world and our place in it.

So let’s reflect on the tyranny of convenience, try more often to resist its stupefying power, and see what happens. We must never forget the joy of doing something slow and something difficult, the satisfaction of not doing what is easiest. The constellation of inconvenient choices may be all that stands between us and a life of total, efficient conformity."
timwu  convenience  efficiency  psychology  business  2018  inconvenience  effort  technology  economics  work  labor  conformity  value  meaning  selfhood  self-expression  change  individuality  slow  slowness  customization  individualization  amazon  facebook  apple  multitasking  experience  human  humanness  passions  hobbies  resistance  struggle  choice  skill  mobile  phones  internet  streaming  applemusic  itunes 
february 2018 by robertogreco
People with depression use language differently – here's how to spot it
"From the way you move and sleep, to how you interact with people around you, depression changes just about everything. It is even noticeable in the way you speak and express yourself in writing. Sometimes this “language of depression” can have a powerful effect on others. Just consider the impact of the poetry and song lyrics of Sylvia Plath and Kurt Cobain, who both killed themselves after suffering from depression.

Scientists have long tried to pin down the exact relationship between depression and language, and technology is helping us get closer to a full picture. Our new study, published in Clinical Psychological Science, has now unveiled a class of words that can help accurately predict whether someone is suffering from depression.

Traditionally, linguistic analyses in this field have been carried out by researchers reading and taking notes. Nowadays, computerised text analysis methods allow the processing of extremely large data banks in minutes. This can help spot linguistic features which humans may miss, calculating the percentage prevalence of words and classes of words, lexical diversity, average sentence length, grammatical patterns and many other metrics.
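The metrics named above — percentage prevalence of a word class, lexical diversity, average sentence length — are simple to compute. A minimal Python sketch (the pronoun list and sample text are illustrative, not taken from the study):

```python
import re

# Illustrative first-person-singular pronoun list (an assumption, not the study's lexicon)
FIRST_PERSON = {"i", "me", "my", "myself", "mine"}

def text_metrics(text):
    """Return first-person prevalence (%), lexical diversity, and average sentence length."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    return {
        "first_person_pct": 100 * sum(w in FIRST_PERSON for w in words) / total,
        "lexical_diversity": len(set(words)) / total,
        "avg_sentence_length": total / len(sentences),
    }

sample = "I feel sad. I keep thinking about myself. Nothing helps me at all."
print(text_metrics(sample))
```

Real studies of this kind run the same counts over data banks of millions of words, which is what makes the small but consistent differences detectable.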

So far, personal essays and diary entries by depressed people have been useful, as has the work of well-known artists such as Cobain and Plath. For the spoken word, snippets of natural language of people with depression have also provided insight. Taken together, the findings from such research reveal clear and consistent differences in language between those with and without symptoms of depression.

Content

Language can be separated into two components: content and style. The content relates to what we express – that is, the meaning or subject matter of statements. It will surprise no one to learn that those with symptoms of depression use an excessive amount of words conveying negative emotions, specifically negative adjectives and adverbs – such as “lonely”, “sad” or “miserable”.

More interesting is the use of pronouns. Those with symptoms of depression use significantly more first person singular pronouns – such as “me”, “myself” and “I” – and significantly fewer second and third person pronouns – such as “they”, “them” or “she”. This pattern of pronoun use suggests people with depression are more focused on themselves, and less connected with others. Researchers have reported that pronouns are actually more reliable in identifying depression than negative emotion words.

We know that rumination (dwelling on personal problems) and social isolation are common features of depression. However, we don’t know whether these findings reflect differences in attention or thinking style. Does depression cause people to focus on themselves, or do people who focus on themselves get symptoms of depression?

Style

The style of language relates to how we express ourselves, rather than the content we express. Our lab recently conducted a big data text analysis of 64 different online mental health forums, examining over 6,400 members. “Absolutist words” – which convey absolute magnitudes or probabilities, such as “always”, “nothing” or “completely” – were found to be better markers for mental health forums than either pronouns or negative emotion words.

From the outset, we predicted that those with depression will have a more black and white view of the world, and that this would manifest in their style of language. Compared to 19 different control forums (for example, Mumsnet and StudentRoom), the prevalence of absolutist words is approximately 50% greater in anxiety and depression forums, and approximately 80% greater for suicidal ideation forums.
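The "approximately 50% greater" figure above is a comparison of prevalences between forum types. A hedged sketch of that calculation — the absolutist word list here is a small illustrative subset, not the study's validated dictionary, and the posts are invented:

```python
import re

# Illustrative subset of absolutist words; the study used a larger validated list.
ABSOLUTIST = {"always", "never", "nothing", "completely", "totally", "every", "all"}

def prevalence(text, lexicon):
    """Percentage of tokens in `text` that belong to `lexicon`."""
    words = re.findall(r"[a-z']+", text.lower())
    return 100 * sum(w in lexicon for w in words) / len(words)

forum_post = "Nothing ever works. I always fail, completely and totally, every time."
control_post = "Some days are harder than others, but things usually improve eventually."

p_forum = prevalence(forum_post, ABSOLUTIST)
p_control = prevalence(control_post, ABSOLUTIST)
print(f"forum {p_forum:.1f}% vs control {p_control:.1f}%")
```

With real corpora both percentages are small (a few percent of all tokens), and it is the ratio between groups that carries the signal.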

Pronouns produced a similar distributional pattern as absolutist words across the forums, but the effect was smaller. By contrast, negative emotion words were paradoxically less prevalent in suicidal ideation forums than in anxiety and depression forums.

Our research also included recovery forums, where members who feel they have recovered from a depressive episode write positive and encouraging posts about their recovery. Here we found that negative emotion words were used at comparable levels to control forums, while positive emotion words were elevated by approximately 70%. Nevertheless, the prevalence of absolutist words remained significantly greater than that of controls, but slightly lower than in anxiety and depression forums.

Crucially, those who have previously had depressive symptoms are more likely to have them again. Therefore, their greater tendency for absolutist thinking, even when there are currently no symptoms of depression, is a sign that it may play a role in causing depressive episodes. The same effect is seen in use of pronouns, but not for negative emotion words.

Practical implications

Understanding the language of depression can help us understand the way those with symptoms of depression think, but it also has practical implications. Researchers are combining automated text analysis with machine learning (computers that can learn from experience without being programmed) to classify a variety of mental health conditions from natural language text samples such as blog posts.

Such classification is already outperforming that made by trained therapists. Importantly, machine learning classification will only improve as more data is provided and more sophisticated algorithms are developed. This goes beyond looking at the broad patterns of absolutism, negativity and pronouns already discussed. Work has begun on using computers to accurately identify increasingly specific subcategories of mental health problems – such as perfectionism, self-esteem problems and social anxiety.
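A toy illustration of the kind of classifier described above: bag-of-words features with a nearest-centroid decision rule. This is a sketch only — the training posts are invented, and the published systems use far larger data and more sophisticated models than this:

```python
import re
from collections import Counter

def features(text):
    """Bag-of-words token counts."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def centroid(texts):
    """Normalised word-frequency profile for a group of texts."""
    total = Counter()
    for t in texts:
        total += features(t)
    n = sum(total.values())
    return {w: c / n for w, c in total.items()}

def similarity(feat, cent):
    # Overlap score: summed centroid weight of the words in the sample.
    return sum(cent.get(w, 0.0) * c for w, c in feat.items())

def classify(text, centroids):
    feat = features(text)
    return max(centroids, key=lambda label: similarity(feat, centroids[label]))

# Invented training posts, for illustration only.
train = {
    "depressed": ["I always feel completely alone and nothing helps me",
                  "I am totally worthless and never get anything right"],
    "control": ["We had a great walk and talked about the weekend",
                "The garden is doing well and the kids enjoyed the park"],
}
centroids = {label: centroid(texts) for label, texts in train.items()}
print(classify("Nothing ever goes right for me, I always fail", centroids))
```

The same scheme scales up by swapping in richer features (pronoun and absolutist-word rates, as discussed) and a learned model in place of the nearest-centroid rule.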

That said, it is of course possible to use language associated with depression without actually being depressed. Ultimately, it is how you feel over time that determines whether you are suffering. But as the World Health Organisation estimates that more than 300m people worldwide are now living with depression, an increase of more than 18% since 2005, having more tools available to spot the condition is certainly important to improve health and prevent tragic suicides such as those of Plath and Cobain."
depression  language  usage  2018  words  wordusage  psychology 
february 2018 by robertogreco
Gifted and Talented and Complicated - The New York Times
"Child prodigies are exotic creatures, each unique and inexplicable. But they have a couple of things in common, as Ann Hulbert’s meticulous new book, “Off the Charts,” makes clear: First, most wunderkinds eventually experience some kind of schism with a devoted and sometimes domineering parent. “After all, no matter how richly collaborative a bond children forge with grown-up guides, some version of divorce is inevitable,” Hulbert writes. “It’s what modern experts would call developmentally appropriate.” Second, most prodigies grow up to be thoroughly unremarkable on paper. They do not, by and large, sustain their genius into adulthood.

What happens to alter the trajectory of shooting stars like Follett? In “Off the Charts,” Hulbert attempts to capture the complicated lives of child prodigies without descending into voyeurism or caricature. She has tried to “listen hard for the prodigies’ side of the story,” to her great credit.

This is an arduous task, and it sometimes shows in the writing, which can be stilted in its reliance on quotes and documentation. But Hulbert’s diligence results in a surprising payoff: The best advice for managing a child prodigy may be a wise strategy for parenting any child, including the many, many nonbrilliant ones.

Hulbert, The Atlantic’s literary editor, wrote her last book, “Raising America,” about the tortured history of parenting advice. So she is appropriately wary of preachy morality tales. “My goal isn’t to pile on the stark cautionary fare. Nor am I aiming to crack some ‘talent code,’” she writes in the prologue for “Off the Charts,” to our great relief.

Instead, she tries to place each of the boys and girls featured in the book in a specific time and place; their celebrity reveals much about their particular moment in American history. For example, Bobby Fischer’s chess prowess might not have been impressive enough for adults to overlook his breathtaking egotism — but for the launching of Sputnik and America’s anxiety about creeping Soviet domination in education and science. One era’s prodigy is another’s anonymous misfit.

The book begins with the story of two gifted boys who attended Harvard at the same time, in the early 1900s. Norbert Wiener, a budding philosopher and mathematician, was 14, and William Sidis, a star in linguistics and mathematics, was only 11. They were not friends, which was a shame. Both suffered under the weight of their elders’ intellectual expectations, combined with the impossibility of fitting in as boys among men. They were told they were superior, but then punished if they acted like it. Their identities depended on superhuman smarts, which made any academic failure feel like a knife to the heart.

Wiener would struggle with depression for the rest of his life, but he did manage to eventually find professional fulfillment at M.I.T., where he helped invent the field of cybernetics. Sidis was not so successful; after fleeing a criminal charge related to a political protest, he did low-level accounting work in New York. He continued to alienate others with his stubborn arrogance before dying at 46 of a cerebral hemorrhage.

What would have helped these boys and the other struggling prodigies in this book? Maybe nothing. But after poring over their words and stories, Hulbert has concluded that they might all offer parents similar advice: Accept who they are.

That doesn’t mean protecting them from failure or stress; quite the opposite. “What they want, and need, is the chance to obsess on their own idiosyncratic terms — to sweat and swerve, lose their balance, get their bearings, battle loneliness, discover resilience,” Hulbert writes. Interestingly, this is the same advice contemporary psychologists tend to give to all parents, not just the parents of prodigies. Parents must hold children accountable and help them thrive, which is easier said than done; but if they try to re-engineer the fundamentals of their offspring, they will fail spectacularly, sooner or later. And this lesson is particularly obvious in the extremes.

“Extraordinary achievement, though adults have rarely cared to admit it, takes a toll,” Hulbert writes. “It demands an intensity that rarely makes kids conventionally popular or socially comfortable. But if they get to claim that struggle for mastery as theirs, in all its unwieldiness, they just might sustain the energy and curiosity that ideally fuels such a quest.”

The special challenge for prodigies is that they are exceptional in more ways than one. “Genius is an abnormality, and abnormalities do not come one at a time,” explains Veda Kaplinsky, a longtime teacher of gifted students, in Andrew Solomon’s “Far From the Tree,” a book that is cited by Hulbert. “Many gifted kids have A.D.D. or O.C.D. or Asperger’s. When the parents are confronted with two sides of a kid, they’re so quick to acknowledge the positive, the talented, the exceptional; they are often in denial over everything else.”

The very traits that make prodigies so successful in one arena — their obsessiveness, a stubborn refusal to conform, a blistering drive to win — can make them pariahs in the rest of life. Whatever else they may say, most teachers do not in fact appreciate creativity and critical thinking in their own students. “Off the Charts” is jammed with stories of small geniuses being kicked out of places of learning. Matt Savage spent two days in a Boston-area Montessori preschool before being expelled. Thanks to parents who had the financial and emotional resources to help him find his way, he is now, at age 25, a renowned jazz musician.

Interestingly, some prodigies may actually do better when their eccentricities are seen by loving adults as disabilities first — and talents second. Hulbert tells the story of Jacob Barnett, born in 1998, who withdrew into autism as a toddler in Indiana. His parents tried every form of therapy they could find, before finally discovering that he could be drawn out through his captivation with astronomy. His mother, Kristine, took him to astronomy classes at the local university — not to jump-start his genius but to help coax him back to life. “If I had stopped and let myself bask in the awe of Jake’s amazing abilities — if I had stopped to ponder how unusual he really is — I don’t think I could have been a good mother to him,” she explained.

The most vivid section of the book comes at the end, when Hulbert reunites with the musical prodigy Marc Yu, a decade after first interviewing him at age 6. With his mother’s support, Yu had tried to ease up on his musical career and live a more normal life, an approach that had worked for other prodigies, including the child actress Shirley Temple. But Yu found that the strategies that worked at the keyboard were useless in high school, where no amount of discipline and focus could make him cool. The adorable, joke-cracking boy she’d remembered had grown into a lonely teenager. “I always expected things to go my way,” Yu told Hulbert. “If I wanted it, I worked hard enough, I got it, and people loved me. That’s no longer true, and I feel I exist in the shadow of popular kids.”

Yu’s story reinforces one of Hulbert’s central, if unsatisfying, findings: Children’s needs change. If you think you’ve got a child figured out, you will be proved wrong momentarily. As Hulbert writes: “Prodigies offer reminders writ large that children, in the end, flout our best and worst intentions.” And adults always overestimate their own influence."
children  prodigies  2017  annhulbert  success  parenting  2018  sfsh  acceptance  psychology  resilience  loneliness  depression 
january 2018 by robertogreco
Rebellious children? At least you're doing something right | Life and style | The Guardian
"We all want impeccably behaved children, right? Well maybe not, says Annalisa Barbieri. Here, she questions why there is such a fashion for taming our youngsters"

"Two stories caught my attention recently. One was a report that breastfed babies are more challenging in their behaviour and the other was about a new book called French Children Don't Throw Food: about how French children apparently behave really well, in restaurants and just generally.

(Hmm. Can I pause here to tell you a story? My aunt was French. She had twins. She'd carry round a little whip – actually several little leather straps of about 6" in length, all coming together into a wooden handle. She would hit my cousins on the back of their legs if they stepped even a tiny bit out of line. The word I remember her saying the most was "arrête". But it is absolutely true to say I never once saw them throw food.)

Most parenting books are about how to get children to do things well. By well, read obediently. When and how you - the adult - want them to do something: eat well, pee in the potty, sleep well (that's the big one), behave well. The aim, it would seem, is to raise compliant children. Because, according to these books, obedient children = successful parents, disobedient = head hanging failures. But actually is an obedient child cause for concern or celebration? The more I thought about it, the more intrigued I became by this question. Telling someone their child is obedient is (usually) meant as a compliment. But an obedient adult? Not quite so attractive is it? We have other words for that, doormat being one of them.

Alfie Kohn, author of 'Unconditional Parenting: Moving from Rewards and Punishments to Love and Reason', says, "When I ask parents, at the beginning of my lectures, what their long term goals are for the children, I hear words such as ethical, compassionate, independent, happy and so on. No-one ever says mindlessly compliant."

A compliant child becomes a particular concern, Kohn admits, when they reach adolescence. "If they take their orders from other people, that may include people we may not approve of. To put it the other way around: kids who are subject to peer pressure at its worst are kids whose parents taught them to do what they're told."

Alison Roy, lead child and adolescent psychotherapist at East Sussex Child and Adolescent Mental Health Services (CAMHS), says: "A child will push the boundaries if they have a more secure attachment. Children who have been responded to, led to believe - in a healthy way - that their voice is valued, that all they have to do is object and action will be taken - they will push boundaries. And this is really healthy behaviour. Compliance? They've learned there's no point arguing because their voice isn't valued."

So much of what we see as disobedience in children is actually just natural, curious, exploring, learning behaviour. Or reacting – in the only way they know how – to a situation over which they have no control.

"You can threaten or bribe a child into obedience for a little while," explains Kohn, "but you are missing the big picture and failing to address the underlying cause [of why they may not want to do something], which may be environmental – such as rushing a tired child through an unfamiliar place – or psychological, such as fear about something else. A very obedient or compliant child – it depends, some are more docile by temperament – but others have created a false self because they sense their parent will only love them if they are obedient. The need for autonomy doesn't vanish because kids have been cowed into doing what they're told."

A very young child isn't actually meant to be obedient all of the time, according to Roy. This is because their needs are often completely at odds with an adult's. See that lovely wall you've just painted in £100-a-pot paint? That's just one lovely big, blank canvas to a two-year-old with a contraband crayon, who doesn't understand why you praise them so much for drawing on a piece of paper but shout at them for drawing on the wall. You think it's a cold day and want to wrestle a woolly pully over your child's head but actually the child isn't cold and doesn't want it. Imagine going to a friend's house and you accidentally spill a drink and get shouted at, instead of them saying "oh don't worry" and mopping it up. And yet...

There seems to be a real fashion for taming children and the reason seems to be fear: it's not that most people are worried about one incident of wall-scribbling, but that they seem to fear what this behaviour will turn into if it's not kept in check, as if all children are just waiting to grow up into sociopaths. One of the comments I get a lot, at the end of my columns for the Family section of the Guardian (when I have advocated understanding and a more what would be called 'softly softly' approach to a child) is something along the lines of 'they'll turn into a monster if you don't put your foot down/show them who's boss'.

"It's not based on empirical evidence," argues Kohn. "It's a very dark view of human nature.

At the top of my list of what makes a great parent is the courage to say 'I still have something to learn and I need to rethink what I'm doing'. The parents who worry me are those who dismiss the kind of challenge that I and others offer, waving it away as unrealistic or not practical enough, or idealistic." Kohn advises a 'working with', rather than a 'doing to' approach to children. In short, getting to know your child, listening to them. "Talk less, ask more.""
parenting  2012  annalisabarbieri  children  rebellion  obedience  behavior  psychology  power  control  listening  compliance  alisonroy 
january 2018 by robertogreco
The Culture of Childhood: We’ve Almost Destroyed It
[previously posted here: https://www.psychologytoday.com/blog/freedom-learn/201609/biological-foundations-self-directed-education ]

"Children learn the most valuable lessons with other children, away from adults."



"I don’t want to trivialize the roles of adults in children’s lives, but, truth be told, we adults greatly exaggerate our roles in our theories and beliefs about how children develop. We have this adult-centric view that we raise, socialize, and educate children.

Certainly we are important in children’s lives. Children need us. We feed, clothe, shelter, and comfort them. We provide examples (not always so good) of what it’s like to be an adult. But we don’t raise, socialize, or educate them. They do all that for themselves, and in that process they are far more likely to look to other children than to us adults as models. If child psychologists were actually CHILD psychologists (children), theories of child development would be much less about parents and much more about peers.

Children are biologically designed to grow up in a culture of childhood.
Have you ever noticed how your child’s tastes in clothes, music, manner of speech, hobbies, and almost everything else have much more to do with what other children she or he knows are doing or like than what you are doing or like? Of course you have. Children are biologically designed to pay attention to the other children in their lives, to try to fit in with them, to be able to do what they do, to know what they know. Through most of human history, that’s how children became educated, and that’s still largely how children become educated today, despite our misguided attempts to stop it and turn the educating job over to adults.

Wherever anthropologists have observed traditional cultures and paid attention to children as well as adults, they’ve observed two cultures, the adults’ culture and the children’s culture. The two cultures, of course, are not completely independent of one another. They interact and influence one another; and children, as they grow up, gradually leave the culture of childhood and enter into the culture of adulthood. Children’s cultures can be understood, at least to some degree, as practice cultures, where children try out various ways of being and practice, modify, and build upon the skills and values of the adult culture.

I first began to think seriously about cultures of childhood when I began looking into band hunter-gatherer societies. In my reading, and in my survey of anthropologists who had lived in such societies, I learned that the children in those societies — from roughly the age of four on through their mid teen years — spent most of their waking time playing and exploring with groups of other children, away from adults (Gray, 2012, also here). They played in age-mixed groups, in which younger children emulated and learned from older ones. I found that anthropologists who had studied children in other types of traditional cultures also wrote about children’s involvement in peer groups as the primary means of their socialization and education (e.g. Lancy et al, 2010; Eibl-Eibesfeldt, 1989). Judith Harris (1998), in a discussion of such research, noted that the popular phrase It takes a village to raise a child is true if interpreted differently from the usual Western interpretation. In her words (p 161): “The reason it takes a village is not because it requires a quorum of adults to nudge erring youngsters back onto the paths of righteousness. It takes a village because in a village there are always enough kids to form a play group.”

I also realized, as I thought about all this, that my own childhood, in Minnesota and Wisconsin in the 1950s, was in many ways like that of children in traditional societies. We had school (which was not the big deal it is today) and chores, and some of us had part time jobs, but, still, most of our time was spent with other children away from adults. My family moved frequently, and in each village or city neighborhood to which we moved I found a somewhat different childhood culture, with different games, different traditions, somewhat different values, different ways of making friends. Whenever we moved, my first big task was to figure out the culture of my new set of peers, so I could become part of it. I was by nature shy, which I think was an advantage because I didn’t just blunder in and make a fool of myself. I observed, studied, practiced the skills that I saw to be important to my new peers, and then began cautiously to enter in and make friends. In the mid 20th century, a number of researchers described and documented many of the childhood cultures that could be found in neighborhoods throughout Europe and the United States (e.g. Opie & Opie, 1969)."



"Children learn the most important lessons in life from other children, not from adults.
Why, in the course of natural selection, did human children evolve such a strong inclination to spend as much time as possible with other children and avoid adults? With a little reflection, it’s not hard to see the reasons. There are many valuable lessons that children can learn in interactions with other children, away from adults, that they cannot learn, or are much less likely to learn, in interactions with adults. Here are some of them.

Authentic communication. …

Independence and courage. …

Creating and understanding the purpose and modifiability of rules. …

The famous developmental psychologist Jean Piaget (1932) noted long ago that children develop a more sophisticated and useful understanding of rules when they play with other children than when they play with adults. With adults, they get the impression that rules are fixed, that they come down from some high authority and cannot be changed. But when children play with other children, because of the more equal nature of the relationship, they feel free to challenge one another’s ideas about the rules, which often leads to negotiation and change in rules. They learn in this way that rules are not fixed by heaven, but are human contrivances to make life more fun and fair. This is an important lesson; it is a cornerstone of democracy.

Practicing and building on the skills and values of the adult culture. …

Getting along with others as equals."



"The adult battle against cultures of childhood has been going on for centuries.

Hunter-gatherer adults seemed to understand that children needed to grow up largely in a culture of childhood, with little adult interference, but that understanding seemed to decline with the rise of agriculture, land ownership, and hierarchical organizations of power among adults (Gray, 2012). Adults began to see it as their duty to suppress children’s natural willfulness, so as to promote obedience, which often involved attempts to remove them from the influences of other children and subordinate them to adult authority. The first systems of compulsory schooling, which are the forerunners of our schools today, arose quite explicitly for that purpose.

If there is a father of modern schools, it is the Pietist clergyman August Hermann Francke, who developed a system of compulsory schooling in Prussia, in the late 17th century, which was subsequently copied and elaborated upon throughout Europe and America. Francke wrote, in his instructions to schoolmasters: “Above all it is necessary to break the natural willfulness of the child. While the schoolmaster who seeks to make the child more learned is to be commended for cultivating the child’s intellect, he has not done enough. He has forgotten his most important task, namely that of making the will obedient.” Francke believed that the most effective way to break children’s wills was through constant monitoring and supervision. He wrote: “Youth do not know how to regulate their lives, and are naturally inclined toward idle and sinful behavior when left to their own devices. For this reason, it is a rule in this institution [the Prussian Pietist schools] that a pupil never be allowed out of the presence of a supervisor. The supervisor’s presence will stifle the pupil’s inclination to sinful behavior, and slowly weaken his willfulness.” [Quoted by Melton, 1988.]

We may today reject Francke’s way of stating it, but the underlying premise of much adult policy toward children is still in Francke’s tradition. In fact, social forces have conspired now to put Francke’s recommendation into practice far more effectively than occurred at Francke’s time or any other time in the past. Parents have become convinced that it is dangerous and irresponsible to allow children to play with other children, away from adults, so restrictions on such play are more severe and effective than they have ever been before. By increasing the amount of time spent in school, expanding homework, harping constantly on the importance of scoring high on school tests, banning children from public spaces unless accompanied by an adult, and replacing free play with adult-led sports and lessons, we have created a world in which children are almost always in the presence of a supervisor, who is ready to intervene, protect, and prevent them from practicing courage, independence, and all the rest that children practice best with peers, away from adults. I have argued elsewhere (Gray, 2011, and here) that this is why we see record levels of anxiety, depression, suicide, and feelings of powerlessness among adolescents and young adults today.

The Internet is the savior of children’s culture today

There is, however, one saving grace, one reason why we adults have not completely crushed the culture of childhood. That’s the Internet. We’ve created a world in which children are more or less prevented from congregating in physical space without an adult, but children have found another way. They get together in cyberspace. They play games and communicate over the Internet. They create their own rules and culture and ways of being with others over … [more]
childhood  culture  learning  children  play  rules  age  adults  parenting  schools  petergray  2016  sfsh  openstudioproject  lcproject  self-directed  self-directedlearning  games  unschooling  deschooling  society  behavior  howwelearn  democracy  change  practice  communication  autonomy  online  internet  web  authenticity  courage  hunter-gatherers  augusthermannfrancke  obedience  willfulness  youth  generations  jeanpiaget  ionaopie  peteropie  psychology  anthropology  peers 
january 2018 by robertogreco
Why are Democrats so afraid of taxes?
"Tax hikes on the rich to fund child care, universal health care, higher education, and a green infrastructure bank would immensely benefit both the college-educated and non-college folks who are seeing their standard of living threatened by the GOP. According to Global Strategy Group polling, 85 percent of working-class whites and 80 percent of college-educated whites support higher taxes on the one percent.

Class politics do not threaten the Democratic Party — they may be the only way to save it. But all camps in the Democratic Party are grasping at different parts of the problem. Many strategists on the Hillary Clinton-end of things have rightfully noted that a shift in college-educated white support for Democrats is a positive harbinger for the party. But they have seemingly failed to grasp that the Bernie Sanders wing has a point: these voters can be won over on classic tax and spend social democracy. In 2016, only three percent of college-educated white Clinton voters made more than $250,000 a year, according to the Cooperative Congressional Election Study from that year. Far from worrying about taxes, these voters are increasingly worried about providing health care and child care for their children. Most have seen their retirement security erode and worry about whether their children can afford college. Instead of trying to appeal to a mushy center that doesn’t really exist, Democrats should embrace high taxes, particularly on the rich, to fund social services. The public is ready."
democrats  taxes  policy  2018  economics  healthcare  childcare  inequality  banking  finance  richardrorty  hillaryclinton  berniesanders  spencerpiston  class  infrastructure  climatechange  publicgoods  materialism  psychology  emptiness  capitalism 
january 2018 by robertogreco
I Want it All Now! Documentary on Marin County (1978) - YouTube
"From deep in NBC's archives, a funky '70s documentary which brought Marin County, California to national attention, from its fucked up deadbeat parents to its misguided fascination with mystical oriental ooga-booga horseshit. If you ever wondered why people associate peacock feathers and suicide with Marin, this is why. Strangely, Tupac Shakur does not make a cameo.

Each story in this film is an accurate depiction of everyone in Marin and does not deviate from any Marinite's experience, without exception."

[Via: ".@NBCNews did an extraordinary profile of Marin County 40 years ago:" https://twitter.com/nikosleverenz/status/950213237236117504

in response to: "In the 1960s, Marin County pioneered slow-growth environmentalism. Today the county's also home to some of the nation's highest housing costs, decades-old patterns of segregation and has the state's largest racial disparities http://www.latimes.com/politics/la-pol-ca-marin-county-affordable-housing-20170107-story.html "
https://twitter.com/dillonliam/status/950046576554029056 ]
marin  towatch  1978  bayarea  marincounty  1970s  1960s  history  narcissism  wealth  happiness  psychology  self  self-help  selfishness  race  racism  suburbs  sanfrancisco  capitalism  californianideology 
january 2018 by robertogreco
Is everything you think you know about depression wrong? | Society | The Guardian
"So, what is really going on? When I interviewed social scientists all over the world – from São Paulo to Sydney, from Los Angeles to London – I started to see an unexpected picture emerge. We all know that every human being has basic physical needs: for food, for water, for shelter, for clean air. It turns out that, in the same way, all humans have certain basic psychological needs. We need to feel we belong. We need to feel valued. We need to feel we’re good at something. We need to feel we have a secure future. And there is growing evidence that our culture isn’t meeting those psychological needs for many – perhaps most – people. I kept learning that, in very different ways, we have become disconnected from things we really need, and this deep disconnection is driving this epidemic of depression and anxiety all around us.

Let’s look at one of those causes, and one of the solutions we can begin to see if we understand it differently. There is strong evidence that human beings need to feel their lives are meaningful – that they are doing something with purpose that makes a difference. It’s a natural psychological need. But between 2011 and 2012, the polling company Gallup conducted the most detailed study ever carried out of how people feel about the thing we spend most of our waking lives doing – our paid work. They found that 13% of people say they are “engaged” in their work – they find it meaningful and look forward to it. Some 63% say they are “not engaged”, which is defined as “sleepwalking through their workday”. And 24% are “actively disengaged”: they hate it.

Most of the depressed and anxious people I know, I realised, are in the 87% who don’t like their work. I started to dig around to see if there is any evidence that this might be related to depression. It turned out that a breakthrough had been made in answering this question in the 1970s, by an Australian scientist called Michael Marmot. He wanted to investigate what causes stress in the workplace and believed he’d found the perfect lab in which to discover the answer: the British civil service, based in Whitehall. This small army of bureaucrats was divided into 19 different layers, from the permanent secretary at the top, down to the typists. What he wanted to know, at first, was: who’s more likely to have a stress-related heart attack – the big boss at the top, or somebody below him?

Everybody told him: you’re wasting your time. Obviously, the boss is going to be more stressed because he’s got more responsibility. But when Marmot published his results, he revealed the truth to be the exact opposite. The lower an employee ranked in the hierarchy, the higher their stress levels and likelihood of having a heart attack. Now he wanted to know: why?

And that’s when, after two more years studying civil servants, he discovered the biggest factor. It turns out if you have no control over your work, you are far more likely to become stressed – and, crucially, depressed. Humans have an innate need to feel that what we are doing, day-to-day, is meaningful. When you are controlled, you can’t create meaning out of your work.

Suddenly, the depression of many of my friends, even those in fancy jobs – who spend most of their waking hours feeling controlled and unappreciated – started to look not like a problem with their brains, but a problem with their environments. There are, I discovered, many causes of depression like this. However, my journey was not simply about finding the reasons why we feel so bad. The core was about finding out how we can feel better – how we can find real and lasting antidepressants that work for most of us, beyond only the packs of pills we have been offered as often the sole item on the menu for the depressed and anxious. I kept thinking about what Dr Cacciatore had taught me – we have to deal with the deeper problems that are causing all this distress.

I found the beginnings of an answer to the epidemic of meaningless work – in Baltimore. Meredith Mitchell used to wake up every morning with her heart racing with anxiety. She dreaded her office job. So she took a bold step – one that lots of people thought was crazy. Her husband, Josh, and their friends had worked for years in a bike store, where they were ordered around and constantly felt insecure. Most of them were depressed. One day, they decided to set up their own bike store, but they wanted to run it differently. Instead of having one guy at the top giving orders, they would run it as a democratic co-operative. This meant they would make decisions collectively, they would share out the best and worst jobs and they would all, together, be the boss. It would be like a busy democratic tribe. When I went to their store – Baltimore Bicycle Works – the staff explained how, in this different environment, their persistent depression and anxiety had largely lifted.

It’s not that their individual tasks had changed much. They fixed bikes before; they fix bikes now. But they had dealt with the unmet psychological needs that were making them feel so bad – by giving themselves autonomy and control over their work. Josh had seen for himself that depressions are very often, as he put it, “rational reactions to the situation, not some kind of biological break”. He told me there is no need to run businesses anywhere in the old humiliating, depressing way – we could move together, as a culture, to workers controlling their own workplaces."



"After I learned all this, and what it means for us all, I started to long for the power to go back in time and speak to my teenage self on the day he was told a story about his depression that was going to send him off in the wrong direction for so many years. I wanted to tell him: “This pain you are feeling is not a pathology. It’s not crazy. It is a signal that your natural psychological needs are not being met. It is a form of grief – for yourself, and for the culture you live in going so wrong. I know how much it hurts. I know how deeply it cuts you. But you need to listen to this signal. We all need to listen to the people around us sending out this signal. It is telling you what is going wrong. It is telling you that you need to be connected in so many deep and stirring ways that you aren’t yet – but you can be, one day.”

If you are depressed and anxious, you are not a machine with malfunctioning parts. You are a human being with unmet needs. The only real way out of our epidemic of despair is for all of us, together, to begin to meet those human needs – for deep connection, to the things that really matter in life."
depression  society  psychology  johannhari  2018  work  labor  hierarchy  meaning  purpose  belonging  competence  culture  medication  pharmaceuticals  anxiety  workplace  democracy  cooperation  sfsh  joannecacciatore  irvingkirsch  michaelmarmot  meredithmitchell  johncacioppo  vincentfelitti  antidepressants  brain  serotonin 
january 2018 by robertogreco
Human cumulative culture: a comparative perspective [.pdf]
"Lewis G. Dean, Gill L. Vale, Kevin N. Laland, Emma Flynn and Rachel L. Kendal"

"Many animals exhibit social learning and behavioural traditions, but human culture exhibits unparalleled complexity and diversity, and is unambiguously cumulative in character. These similarities and differences have spawned a debate over whether animal traditions and human culture are reliant on homologous or analogous psychological processes. Human cumulative culture combines high-fidelity transmission of cultural knowledge with beneficial modifications to generate a ‘ratcheting’ in technological complexity, leading to the development of traits far more complex than one individual could invent alone. Claims have been made for cumulative culture in several species of animals, including chimpanzees, orangutans and New Caledonian crows, but these remain contentious. Whilst initial work on the topic of cumulative culture was largely theoretical, employing mathematical methods developed by population biologists, in recent years researchers from a wide range of disciplines, including psychology, biology, economics, biological anthropology, linguistics and archaeology, have turned their attention to the experimental investigation of cumulative culture. We review this literature, highlighting advances made in understanding the underlying processes of cumulative culture and emphasising areas of agreement and disagreement amongst investigators in separate fields."
lewisdean  gillvale  kevinlaland  emmaflynn  rachelkendal  2013  culture  animals  human  humans  anthropology  biology  crows  corvids  multispecies  psychology  economics  cumulativeculture  apes  chimpanzees  orangutans  linguistics  archaeology  morethanhuman 
january 2018 by robertogreco
Verso: Psychopolitics: Neoliberalism and New Technologies of Power, by Byung-Chul Han
"Exploring how neoliberalism has discovered the productive force of the psyche

Byung-Chul Han, a star of German philosophy, continues his passionate critique of neoliberalism, trenchantly describing a regime of technological domination that, in contrast to Foucault’s biopower, has discovered the productive force of the psyche. In the course of discussing all the facets of neoliberal psychopolitics fueling our contemporary crisis of freedom, Han elaborates an analytical framework that provides an original theory of Big Data and a lucid phenomenology of emotion. But this provocative essay proposes counter models too, presenting a wealth of ideas and surprising alternatives at every turn.

Reviews

“How do we say we? It seems important. How do we imagine collective action, in other words, how do we imagine acting on a scale sufficient to change the social order? How seriously can or should one take the idea of freedom in the era of Big Data? There seems to be something drastically wrong with common ideas about what the word act means. Psychopolitics is a beautifully sculpted attempt to figure out how to mean action differently, in an age where humans are encouraged to believe that it's possible and necessary to see everything.” – Timothy Morton

“A combination of neoliberal ethics and ubiquitous data capture has brought about a fundamental transformation and expansion of capitalist power, beyond even the fears of the Frankfurt School. In this blistering critique, Byung-Chul Han shows how capitalism has now finally broken free of liberalism, shrinking the spaces of individuality and autonomy yet further. At the same time, Psychopolitics demonstrates how critical theory can and must be rejuvenated for the age of big data.” – Will Davies

“The new star of German philosophy.” – El País

“What is new about new media? These are philosophical questions for Byung-Chul Han, and precisely here lies the appeal of his essays.” – Die Welt

“In Psychopolitics, critique of the media and of capitalism fuse into the coherent picture of a society that has been both blinded and paralyzed by alien forces. Confident and compelling.” – Spiegel Online"
books  toread  neoliberalism  technology  labor  work  latecapitalism  capitalism  postcapitalism  byung-chulhan  psychology  philosophy  liberalism  individuality  autonomy  willdavies  timothymorton  society  culture  action 
january 2018 by robertogreco
The Burnout Society | Byung-Chul Han
"Our competitive, service-oriented societies are taking a toll on the late-modern individual. Rather than improving life, multitasking, "user-friendly" technology, and the culture of convenience are producing disorders that range from depression to attention deficit disorder to borderline personality disorder. Byung-Chul Han interprets the spreading malaise as an inability to manage negative experiences in an age characterized by excessive positivity and the universal availability of people and goods. Stress and exhaustion are not just personal experiences, but social and historical phenomena as well. Denouncing a world in which every against-the-grain response can lead to further disempowerment, he draws on literature, philosophy, and the social and natural sciences to explore the stakes of sacrificing intermittent intellectual reflection for constant neural connection."
books  toread  byung-chulhan  work  labor  latecapitalism  neoliberalism  technology  multitasking  depression  attention  add  adhd  attentiondeficitdisorder  personality  psychology  philosophy  convenience  neurosis  psychosis  malaise  society  positivity  positivepsychology  capitalism  postcapitalism 
january 2018 by robertogreco
Tricia Wang on Instagram: “🏆📚🎉BEST SELF-REFLECTION BOOK OF 2017 AWARD GOES TO: Supernormal! If you’ve been through any childhood adversity (e.g. family instability…” • Instagram
"🏆📚🎉BEST SELF-REFLECTION BOOK OF 2017 AWARD GOES TO: Supernormal! If you’ve been through any childhood adversity (e.g. family instability, racial/ethnic shit immigrant background, health or mental illness, etc) and if as an adult you tend to dive into work in a way that compromises your health or mental stability, GET THIS BOOK! You are likely what the author, Meg Jay, calls a “supernormal, “everyday superheros who have made a life out of dodging bullets and leaping over obstacles, hiding in plain sight as teachers, artists, doctors, lawyers, parents, students….” This is the first book I’ve read that effectively explains the befuddling phenomena of why a subset of kids who have grown up in adverse situations succeed as adults COMBINED with the latest neuroscience research on what longterm stress does to the brain and body. I put up quotes from my favorite sections on my website.
And if you happen to have experienced a perfectly supportive, emotional stable childhood, gift this book to someone special. Thanks to #1 life meddler @latoyap for the invaluable recommendation."
books  toread  triciawang  megjay  psychology  adversity  health  stress  childhood 
january 2018 by robertogreco
Mindset Marketing, Behaviorism, and Deficit Ideology | Ryan Boren
"The marketing of mindsets is everywhere. Grit, growth mindset, project-based mindset, entrepreneurial mindset, innovator’s mindset, and a raft of canned social-emotional skills programs are vying for public money. These notions jump straight from psychology departments to aphoristic word images shared on social media and marketing festooned on school walls.

Growth mindset and Positive Behavior Support marketing have joined Leader in Me marketing at our elementary school. Instead of being peppered with synergy and Franklin Covey’s trademarks and proprietary jargon, we’re now peppered with LiM and growth mindset and PBS. Like every marketed mindset going back to the self-esteem movement, these campaigns are veneers on the deficit model that ignore long-standing structural problems like poverty, racism, sexism, ableism, and childism. The practice and implementation of these mindsets are always suborned by deficit ideology, bootstrap ideology, meritocracy myths, and greed.

“Money Doesn’t Have to Be an Obstacle,” “Race Doesn’t Matter,” “Just Work Harder,” “Everyone Can Go to College,” and “If You Believe, Your Dreams Will Come True.” These notions have helped fuel inequity in the U.S. public education system. Mindset marketing without structural ideology, restorative practices, and inclusion is more harmful than helpful. This marketing shifts responsibility for change from our systems to children. We define kids’ identities through the deficit and medical models, gloss over the structural problems they face, and then tell them to get some grit and growth mindset. This is gaslighting. It is abusive.

Canned social-emotional skills programs, behaviorism, and the marketing of mindsets have serious side effects. They reinforce the cult of compliance and encourage submission to authoritarian rule. They line the pockets of charlatans and profiteers. They encourage surveillance and avaricious data collection. Deficit model capitalism’s data-based obsession proliferates hucksterism and turns kids into someone’s business model. The behaviorism of PBS is of the mindset of abusers and manipulators. It is ideological and intellectual kin with ABA, which autistic people have roundly rejected as abusive, coercive, and manipulative torture. We call it autistic conversion therapy. The misbehavior of behaviorism is an ongoing harm.

Instead, acknowledge pipeline problems and the meritocracy myth, stop bikeshedding the structural problems of the deficit model, and stop blaming kids and families. Develop a school culture based not on deficit ideologies and cargo cult shrink wrap, but on diversity & inclusion, neurodiversity, the social model of disability, structural ideology, and indie ed-tech. Get rid of extrinsics, and adopt instead the intrinsic motivation of autonomy, mastery, and purpose. Provide fresh air, sunlight, and plenty of time for major muscle movement instead of mindset bandages for the pathologies caused by the lack of these three critical things.

“Self-esteem that’s based on external sources has mental health consequences.” Stop propagating the latest deficit/bootstrap/behaviorism fads. Develop the critical capacity to see beyond the marketing. Look beyond deficit model compliance to social model inclusion. The social model and structural ideology are the way forward. Growth mindset and behaviorism, as usually implemented, are just more bootstrap metaphors that excuse systems from changing and learning.

Deficit ideology, surveillance capitalism, mindset marketing, and behaviorism are an unholy alliance. Fix injustice, not kids. “It essentially boils down to whether one chooses to do damage to the system or to the student.”"
ryanboren2017  mindset  marketing  behavior  behaviorism  deficitideology  disabilities  disability  race  education  learning  grit  growthmindset  projectbasedlearning  entrepreneurship  innovation  psychology  racism  poverty  sexism  bootstrapping  meritocracy  greed  childism  ableism  socialemotional  surveillance  surveillancecapitalism  capitalism  health  intrinsicmotivation  extrinsicmotivation  diversity  inclusion  neurodiversity  edtech  autonomy  mastery  purpose  self-esteem  compliance  socialemotionallearning 
december 2017 by robertogreco
Our personalities are shaped by the climate we grew up in, new study says - The Washington Post
"Take two children with similar backgrounds. Both are boys. They’re raised in families with the same socioeconomic status. They live in similar-looking neighborhoods and have the same access to education and health care.

The only difference is that one of the boys grows up in San Diego, where it’s comfortably warm most of the year and the average high temperature is about 70 degrees. The other is in Marquette, Mich., which is significantly colder. The average high there is just 50 degrees.

One of these kids is significantly more likely to be agreeable, open and emotionally stable, according to a new study, simply because he grew up in a warmer climate.

We know anecdotally that weather affects our mood. Summertime temperatures seem to lift our spirits, while the coldest weeks of winter put us in a funk. The study, which was published in Nature on Monday, says it does more than that in the long run.

All else being equal, the kid in San Diego is more likely to grow up to be friendlier, more outgoing and more willing to explore new things, the study suggests."

[Study: https://www.nature.com/articles/s41562-017-0240-0.epdf ]
weather  friendliness  personality  sandiego  california  2017  psychology  mood  openness  climate  stability  emotions 
november 2017 by robertogreco
Jonathan Mooney: "The Gift: LD/ADHD Reframed" - YouTube
"The University of Oregon Accessible Education Center and AccessABILITY Student Union present renowned speaker, neuro-diversity activist and author Jonathan Mooney.

Mooney vividly, humorously and passionately brings to life the world of neuro-diversity: the research behind it, the people who live in it and the lessons it has for all of us who care about the future of education. Jonathan explains the latest theories and provides concrete examples of how to prepare students and implement frameworks that best support their academic and professional pursuits. He blends research and human interest stories with concrete tips that parents, students, teachers and administrators can follow to transform learning environments and create a world that truly celebrates cognitive diversity."
neurodiversity  2012  jonathanmooney  adhd  cognition  cognitivediversity  sfsh  accessibility  learning  education  differences  howwelearn  disability  difference  specialeducation  highered  highereducation  dyslexia  dropouts  literacy  intelligence  motivation  behavior  compliance  stillness  norms  shame  brain  success  reading  multiliteracies  genius  smartness  eq  emotions  relationships  tracking  maryannewolf  intrinsicmotivation  extrinsicmotivation  punishment  rewards  psychology  work  labor  kids  children  schools  agency  brokenness  fixingpeople  unschooling  deschooling  strengths  strengths-basedoutlook  assets  deficits  identity  learningdisabilities  schooling  generalists  specialists  howardgardner  howweteach  teams  technology  support  networks  inclusivity  diversity  accommodations  normal  average  standardization  standards  dsm  disabilities  bodies  body 
november 2017 by robertogreco
Happiness Is Other People - The New York Times
"And according to research, if we want to be happy, we should really be aiming to spend less time alone. Despite claiming to crave solitude when asked in the abstract, when sampled in the moment, people across the board consistently report themselves as happier when they are around other people than when they are on their own. Surprisingly this effect is not just true for people who consider themselves extroverts but equally strong for introverts as well."
happiness  psychology  culture  2017  solitude  ruthwhippman  anxiety  individualism  society  community  self-care 
november 2017 by robertogreco
Frontiers | Less-structured time in children's daily lives predicts self-directed executive functioning | Psychology
"Executive functions (EFs) in childhood predict important life outcomes. Thus, there is great interest in attempts to improve EFs early in life. Many interventions are led by trained adults, including structured training activities in the lab, and less-structured activities implemented in schools. Such programs have yielded gains in children's externally-driven executive functioning, where they are instructed on what goal-directed actions to carry out and when. However, it is less clear how children's experiences relate to their development of self-directed executive functioning, where they must determine on their own what goal-directed actions to carry out and when. We hypothesized that time spent in less-structured activities would give children opportunities to practice self-directed executive functioning, and lead to benefits. To investigate this possibility, we collected information from parents about their 6–7 year-old children's daily, annual, and typical schedules. We categorized children's activities as “structured” or “less-structured” based on categorization schemes from prior studies on child leisure time use. We assessed children's self-directed executive functioning using a well-established verbal fluency task, in which children generate members of a category and can decide on their own when to switch from one subcategory to another. The more time that children spent in less-structured activities, the better their self-directed executive functioning. The opposite was true of structured activities, which predicted poorer self-directed executive functioning. These relationships were robust (holding across increasingly strict classifications of structured and less-structured time) and specific (time use did not predict externally-driven executive functioning). We discuss implications, caveats, and ways in which potential interpretations can be distinguished in future work, to advance an understanding of this fundamental aspect of growing up."

[via: https://twitter.com/cblack__/status/924720295465721856 ]
2014  deschooling  unschooling  psychology  executivefunctioning  self-directed  self-directedlearning  learning  education  sfsh  childhood  freedom  children  experience  structure  janebarker  andreisemenov  lauramichaelson  lindsayprovan  hannahsnyder  yukomunakata 
october 2017 by robertogreco
The Touch of Madness - Pacific Standard
"So Jones grew alarmed when, soon after starting at DePaul in the fall of 2007, at age 27, she began having trouble retaining things she had just read. She also struggled to memorize the new characters she was learning in her advanced Chinese class. She had experienced milder versions of these cognitive and memory blips a couple times before, most recently as she’d finished her undergraduate studies earlier that year. These new mental glitches were worse. She would study and draw the new logograms one night, then come up short when she tried to draw them again the next morning.

These failures felt vaguely neurological. As if her synapses had clogged. She initially blamed them on the sleepless, near-manic excitement of finally being where she wanted to be. She had wished for exactly this, serious philosophy and nothing but, for half her life. Now her mind seemed to be failing. Words started to look strange. She began experiencing "inarticulable atmospheric changes," as she put it—not hallucinations, really, but alterations of temporality, spatiality, depth perception, kinesthetics. Shimmerings in reality's fabric. Sidewalks would feel soft and porous. Audio and visual input would fall out of sync, creating a lag between the movement of a speaker's lips and the words' arrival at Jones' ears. Something was off.

"You look at your hand," as she described it to me later, holding hers up and examining it front and back, "and it looks the same as always. But it's not. It's yours—but it's not. Nothing has changed"—she let her hand drop to her knee—"yet it's different. And that's what gets you. There's nothing to notice; but you can't help but notice."

Another time she found herself staring at the stone wall of a building on campus and realizing that the wall's thick stone possessed two contradictory states. She recognized that the wall was immovable and that, if she punched it, she'd break her hand. Yet she also perceived that the stone was merely a constellation of atomic particles so tenuously bound that, if she blew on it, it would come apart. She experienced this viscerally. She felt the emptiness within the stone.

Initially she found these anomalies less threatening than weird. But as they intensified, the gap between what she was perceiving and what she could understand rationally generated an unbearable cognitive dissonance. How could something feel so wrong but she couldn't say what? She had read up the wazoo about perception, phenomenology, subjectivity, consciousness. She of all people should be able to articulate what she was experiencing. Yet she could not. "Language had betrayed me," she says. "There was nothing you could point to and say, 'This looks different about the world.' There were no terms. I had no fucking idea."

Too much space was opening within and around and below her. She worried she was going mad. She had seen what madness looked like from the outside. When Jones was in her teens, one of her close relatives, an adult she'd always seen frequently, and whom we'll call Alex for privacy reasons, had in early middle age fallen into a state of almost relentless schizophrenia. It transformed Alex from a warm, caring, and open person who was fully engaged with the world into somebody who was isolated from it—somebody who seemed remote, behaved in confusing and alarming ways, and periodically required hospitalization. Jones now started to worry this might be happening to her."



"Reading philosophy helped Jones think. It helped order the disorderly. Yet later, in college, she lit up when she discovered the writers who laid the philosophical foundation for late-20th-century critical psychiatry and madness studies: Michel Foucault, for instance, who wrote about how Western culture, by medicalizing madness, brands the mad as strangers to human nature. Foucault described both the process and the alienating effect of this exclusion-by-definition, or "othering," as it soon came to be known, and how the mad were cut out and cast away, flung into pits of despair and confusion, leaving ghosts of their presence behind.

To Jones, philosophy, not medicine, best explained the reverberations from the madness that had touched her family: the disappearance of the ex-husband; the alienation of Alex, who at times seemed "there but not there," unreachable. Jones today describes the madness in and around her family as a koan, a puzzle that teaches by its resistance to solution, and which forces upon her the question of how to speak for those who may not be able to speak for themselves.

Jones has since made a larger version of this question—of how we think of and treat the mad, and why in the West we usually shunt them aside—her life's work. Most of this work radiates from a single idea: Culture shapes the experience, expression, and outcome of madness. The idea is not that culture makes one mad. It's that culture profoundly influences every aspect of how madness develops and expresses itself, from its onset to its full-blown state, from how the afflicted experience it to how others respond to it, whether it destroys you or leaves you whole.

This idea is not original to Jones. It rose from the observation, first made at least a century ago and well documented now, that Western cultures tend to send the afflicted into a downward spiral rarely seen in less modernized cultures. Schizophrenia actually has a poorer prognosis for people in the West than for those in less urbanized, non-Eurocentric societies. When the director of the World Health Organization's mental-health unit, Shekhar Saxena, was asked last year where he'd prefer to be if he were diagnosed with schizophrenia, he said he'd prefer a city in Ethiopia or Sri Lanka, such as Addis Ababa or Colombo, rather than New York or London, because in the former he could expect to be seen as a productive if eccentric citizen rather than a reject and an outcast.

Over the past 25 years or so, the study of culture's effect on schizophrenia has received increasing attention from philosophers, historians, psychiatrists, anthropologists, and epidemiologists, and it is now edging into the mainstream. In the past five years, Nev Jones has made herself one of this view's most forceful proponents and one of the most effective advocates for changing how Western culture and psychiatry respond to people with psychosis. While still a graduate student at DePaul she founded three different groups to help students with psychosis continue their studies. After graduating in 2014, she expanded her reach first into the highest halls of academe, as a scholar at Stanford University, and then into policy, working with state and private agencies in California and elsewhere on programs for people with psychosis, and with federal agencies to produce toolkits for universities, students, and families about dealing with psychosis emerging during college or graduate study. Now in a new position as an assistant professor at the University of South Florida, she continues to examine—and ask the rest of us to see—how culture shapes madness.

In the United States, the culture's initial reaction to a person's first psychotic episode, embedded most officially in a medical system that sees psychosis and schizophrenia as essentially biological, tends to cut the person off instantly from friends, social networks, work, and their sense of identity. This harm can be greatly reduced, however, when a person's first care comes from the kind of comprehensive, early intervention programs, or EIPs, that Jones works on. These programs emphasize truly early intervention, rather than the usual months-long lag between first symptoms and any help; high, sustained levels of social, educational, and vocational support; and building on the person's experience, ambitions, and strengths to keep them as functional and engaged as possible. Compared to treatment as usual, EIPs lead to markedly better outcomes across the board, create more independence, and seem to create far less trauma for patients and their family and social circles."



"Once his eye was caught, Kraepelin started seeing culture's effects everywhere. In his native Germany, for instance, schizophrenic Saxons were more likely to kill themselves than were Bavarians, who were, in turn, more apt to do violence to others. In a 1925 trip to North America, Kraepelin found that Native Americans with schizophrenia, like Indonesians, didn't build in their heads the elaborate delusional worlds that schizophrenic Europeans did, and hallucinated less.

Kraepelin died in 1926, before he could publish a scholarly version of those findings. Late in his life, he embraced some widely held but horrific ideas about scientific racism and eugenics. Yet he had clearly seen that culture exerted a powerful, even fundamental, effect on the intensity, nature, and duration of symptoms in schizophrenia, and in bipolar disorder and depression. He urged psychiatrists to explore just how culture created such changes.

Even today, few in medicine have heeded this call. Anthropologists, on the other hand, have answered it vigorously over the last couple of decades. To a cultural anthropologist, culture includes the things most of us would expect—movies, music, literature, law, tools, technologies, institutions, and traditions. It also includes a society's predominant ideas, values, stories, interpretations, beliefs, symbols, and framings—everything from how we should dress, greet one another, and prepare and eat food, to what it means to be insane. Madness, in other words, is just one more thing about which a culture constructs and applies ideas that guide thought and behavior.

But what connects these layers of culture to something so seemingly internal as a person's state of mind? The biocultural anthropologist Daniel Lende says that it helps here to think of culture as a series of concentric circles surrounding each of us. For simplicity's sake, let's keep it to two circles around a core, with each circle … [more]
2017  daviddobbs  mentalhealth  psychology  health  culture  madness  nevjones  japan  ethiopia  colombo  addisababa  schizophrenia  society  srilanka  shekharsaxena  philosophy  perception  treatment  medicine  psychosis  media  academia  anthropology  daniellende  pauleugenbleuler  emilkraepelin  danielpaulschreber  edwadsapir  relationships  therapy  tinachanter  namitagoswami  irenehurford  richardnoll  ethanwatters  wolfgangjilek  wolfgangpfeiffer  stigma  banishment  hallucinations  really  but  alterations  of  temporality  time  spatiality  depthperception  kinesthetics  memory  memories  reality  phenomenology  subjectivity  consciousness  donaldwinnicott  alienation  kinship  isolation  tanyaluhrmann 
october 2017 by robertogreco
Dr. Nev Jones on Vimeo
[found after reading:

"The Touch of Madness: Culture profoundly shapes our ideas about mental illness, which is something psychologist Nev Jones knows all too well."
https://psmag.com/magazine/the-touch-of-madness-mental-health-schizophrenia ]
nevjones  academia  psychology  psychosis  schizophrenia  2017  mentalhealth  healthcare  health  ptsd  immigration  support  culture  society  risk 
october 2017 by robertogreco
How children’s self-control has changed in the past 50 years - The Washington Post
"“Kids these days are better at delaying gratification on the marshmallow test,” Protzko writes. “Each year, all else equal, corresponds to an increase in the ability to delay gratification by another six seconds.”

This was something of a surprise. Before running the analysis, Protzko had surveyed 260 experts in the field of cognitive development to see what they predicted would happen.

Over half said they believed that kids' ability to delay gratification had gotten worse over time. Another 32 percent said there'd be no change, while only 16 percent said kids' self-control had improved in the past 50 years.

The experts, it seems, were just as pessimistic about the abilities of today's kids as everyone else.

It's not clear what, exactly, could be causing kids' performance to improve — it's not like they teach the marshmallow test in schools. Kids are improving in other areas too: Protzko notes that IQ scores have increased at a similar rate to the marshmallow test scores, suggesting a possible link between the two.

On a whole host of other measures — substance use, sexual behavior, seat belt use, to name just a few — teenagers today are performing much better than their peers from several decades ago. Many of these measures reflect precisely the sort of gratification-delaying ability that the marshmallow test has been shown to predict.

Given all the good news about kids, Protzko wanted to know why so many experts had such a dour outlook.

Marshmallow test aside, Protzko's just as interested in why so many experts predicted it incorrectly. “How could so many experts in cognitive development believe that ability to delay gratification would decrease?” the paper asks. He calls it the “kids these days” effect: “the specifically incorrect belief that children in the present are substantively different and necessarily worse than children a generation or two ago.”

He notes that elders have been complaining about children's shortcomings since at least 419 B.C., when Greek playwright Aristophanes wrote “The Clouds.”

“It cannot be that society has been in decline due to failing children for over two millennia,” Protzko concludes. “Contrary to historical and present complaints, kids these days appear to be better than we were. A supposed modern culture of instant gratification has not stemmed the march of improvement.”"
sfsh  children  2017  johnprotzkop  kidsthesedays  education  psychology  cognition  gratification  self-control  marshmallowtest 
september 2017 by robertogreco
Maslow’s Hierarchy of Needs vs. The Max Neef Model of Human Scale development
"Maslow wanted to understand what motivated people. To accomplish that, he studied the various needs of people and created a hierarchy out of those needs. The idea was that the needs toward the base of the pyramid are Deficit/Basic Needs (physiological, safety, love/belonging, esteem), while the top of the pyramid holds Growth Needs (self-actualization).

One must satisfy lower level basic needs before progressing on to meet higher level growth needs. Once these needs have been reasonably satisfied, one may be able to reach the highest level called self-actualization.

CRITICISM

The strongest criticism of this theory is based on the way it was formed. To create a definition of self-actualization, Maslow identified just 18 people as self-actualizers and studied their characteristics; this is a very small sample. Second, there are artists and philosophers who do not meet the basic needs yet show signs of self-actualization.

One of the interesting ways of looking at theories that I learned in class was how a person's place and identity impact the work he or she does. Maslow was from the US, a capitalist nation; therefore his model never looks at group dynamics or the social aspect.

Contemporary research by Tay & Diener (2011) has tested Maslow’s theory by analyzing the data of 60,865 participants from 123 countries, representing every major region of the world. The survey was conducted from 2005 to 2010.
Respondents answered questions about six needs that closely resemble those in Maslow’s model: basic needs (food, shelter); safety; social needs (love, support); respect; mastery; and autonomy. They also rated their well-being across three discrete measures: life evaluation (a person’s view of his or her life as a whole), positive feelings (day-to-day instances of joy or pleasure), and negative feelings (everyday experiences of sorrow, anger, or stress).

The results of the study support the view that universal human needs appear to exist regardless of cultural differences. However, the ordering of the needs within the hierarchy was not correct.
“Although the most basic needs might get the most attention when you don’t have them,” Diener explains, “you don’t need to fulfill them in order to get benefits [from the others].” Even when we are hungry, for instance, we can be happy with our friends. “They’re like vitamins,” Diener says about how the needs work independently. “We need them all.”

Source : http://www.simplypsychology.org/maslow.html

vs.

Max Neef Model of Human Scale Development

Manfred Max-Neef is a Chilean economist. He defines the model as a taxonomy of human needs and a process by which communities can identify their "wealths" and "poverties" according to how these needs are satisfied.

He describes needs as being constant through all cultures and across historical time periods. The thing that changes with time and across cultures is the way that these needs are satisfied. According to the model human needs are to be understood as a system i.e. they are interrelated and interactive.

According to Max Neef the fundamental needs of humans are

• subsistence
• protection
• affection
• understanding
• participation
• leisure
• creation
• identity
• freedom

Max-Neef further classifies Satisfiers (ways of meeting needs) as follows.

1. Violators: claim to be satisfying needs, yet in fact make it more difficult to satisfy a need.

2. Pseudo Satisfiers: claim to be satisfying a need, yet in fact have little to no effect on really meeting such a need.

3. Inhibiting Satisfiers: those which over-satisfy a given need, which in turn seriously inhibits the possibility of satisfaction of other needs.

4. Singular Satisfiers: satisfy one particular need only. These are neutral in regard to the satisfaction of other needs.

5. Synergistic Satisfiers: satisfy a given need, while simultaneously contributing to the satisfaction of other needs.

It is interesting to note that Max-Neef came from Chile, which was a socialist nation, and his model was therefore more inclusive, considering society at large.

Hi, this article is a part of a series of articles I am writing while studying Design Led Innovation at Srishti Institute of Art, Design & Technology. They are meant to be reflections on things I learn or read about during this time. I look forward to any feedback or crit that you can provide. :)"
nhakhandelwal  2016  abrahammaslow  manfredmaxneef  psychology  self-actualization  humans  humanneeds  needs  motivation  safety  self-esteem  respect  mastery  autonomy  emotions  humandevelopment  creation  freedom  identity  leisure  understanding  participation  affection  protection  subsistence  classideas  sfsh  chile  culture  systemsthinking  humanscale  scale 
august 2017 by robertogreco
Being rich wrecks your soul. We used to know that. - The Washington Post
"The point is not necessarily that wealth is intrinsically and everywhere evil, but that it is dangerous — that it should be eyed with caution and suspicion, and definitely not pursued as an end in itself; that great riches pose great risks to their owners; and that societies are right to stigmatize the storing up of untold wealth. That’s why Aristotle, for instance, argued that wealth should be sought only for the sake of living virtuously — to manage a household, say, or to participate in the life of the polis. Here wealth is useful but not inherently good; indeed, Aristotle specifically warned that the accumulation of wealth for its own sake corrupts virtue instead of enabling it. For Hindus, working hard to earn money is a duty (dharma), but only when done through honest means and used for good ends. The function of money is not to satiate greed but to support oneself and one’s family. The Koran, too, warns against hoarding money and enjoins Muslims to disperse it to the needy.

Some contemporary voices join this ancient chorus, perhaps none more enthusiastically than Pope Francis. He’s proclaimed that unless wealth is used for the good of society, and above all for the good of the poor, it is an instrument “of corruption and death.” And Francis lives what he teaches: Despite access to some of the sweetest real estate imaginable — the palatial papal apartments are the sort of thing that President Trump’s gold-plated extravagance is a parody of — the pope bunks in a small suite in what is effectively the Vatican’s hostel. In his official state visit to Washington, he pulled up to the White House in a Fiat so sensible that a denizen of Northwest D.C. would be almost embarrassed to drive it. When Francis entered the Jesuit order 59 years ago, he took a vow of poverty, and he’s kept it.

According to many philosophies and faiths, then, wealth should serve only as a steppingstone to some further good and is always fraught with moral danger. We all used to recognize this; it was a commonplace. And this intuition, shared by various cultures across history, stands on firm empirical ground.

Over the past few years, a pile of studies from the behavioral sciences has appeared, and they all say, more or less, “Being rich is really bad for you.” Wealth, it turns out, leads to behavioral and psychological maladies. The rich act and think in misdirected ways.

When it comes to a broad range of vices, the rich outperform everybody else. They are much more likely than the rest of humanity to shoplift and cheat, for example, and they are more apt to be adulterers and to drink a great deal. They are even more likely to take candy that is meant for children. So whatever you think about the moral nastiness of the rich, take that, multiply it by the number of Mercedes and Lexuses that cut you off, and you’re still short of the mark. In fact, those Mercedes and Lexuses are more likely to cut you off than Hondas or Fords: Studies have shown that people who drive expensive cars are more prone to run stop signs and cut off other motorists.

The rich are the worst tax evaders, and, as The Washington Post has detailed, they are hiding vast sums from public scrutiny in secret overseas bank accounts.

They also give proportionally less to charity — not surprising, since they exhibit significantly less compassion and empathy toward suffering people. Studies also find that members of the upper class are worse than ordinary folks at “reading” people’s emotions and are far more likely to be disengaged from the people with whom they are interacting — instead absorbed in doodling, checking their phones or what have you. Some studies go even further, suggesting that rich people, especially stockbrokers and their ilk (such as venture capitalists, whom we once called “robber barons”), are more competitive, impulsive and reckless than medically diagnosed psychopaths. And by the way, those vices do not make them better entrepreneurs; they just have Mommy and Daddy’s bank accounts (in New York or the Cayman Islands) to fall back on when they fail."



"Some will say that we have not entirely forgotten it and that we do complain about wealth today, at least occasionally. Think, they’ll say, about Occupy Wall Street; the blowback after Mitt Romney’s comment about the “47 percent”; how George W. Bush painted John Kerry as out of touch. But think again: By and large, those complaints were not about wealth per se but about corrupt wealth — about wealth “gone wrong” and about unfairness. The idea that there is no way for the vast accumulation of money to “go right” is hardly anywhere to be seen.

Getting here wasn’t straightforward. Wealth has arguably been seen as less threatening to one’s moral health since the Reformation, after which material success was sometimes taken as evidence of divine election. But extreme wealth remained morally suspect, with the rich bearing particular scrutiny and stigmatization during periods like the Gilded Age. This stigma persisted until relatively recently; only in the 1970s did political shifts cause executive salaries to skyrocket and the current, effectively unprecedented inequality in income (and wealth) begin to appear, without any significant public complaint or lament.

The story of how a stigma fades is always murky, but contributing factors are not hard to identify. For one, think tanks have become increasingly partisan over the past several decades, particularly on the right: Certain conservative institutions, enjoying the backing of billionaires such as the Koch brothers, have thrown a ton of money at pseudo-academics and “thought leaders” to normalize and legitimate obscene piles of lucre. They produced arguments that suggest that high salaries naturally flowed from extreme talent and merit, thus baptizing wealth as simply some excellent people’s wholly legitimate rewards. These arguments were happily regurgitated by conservative media figures and politicians, eventually seeping into the broader public and replacing the folk wisdom of yore. But it is hard to argue that a company’s top earners are literally hundreds of times more talented than the lowest-paid employees.

As stratospheric salaries became increasingly common, and as the stigma of wildly disproportionate pay faded, the moral hazards of wealth were largely forgotten. But it’s time to put the apologists for plutocracy back on the defensive, where they belong — not least for their own sake. After all, the Buddha, Aristotle, Jesus, the Koran, Jimmy Stewart, Pope Francis and now even science all agree: If you are wealthy and are reading this, give away your money as fast as you can."
charlesmathewes  evansandsmark  2017  wealth  inequality  behavior  psychology  buddha  aristotle  jesus  koran  jimmystewart  popefrancis  ethics  generosity  vices  fscottfitzgerald  ernesthemingway  tonystark  confucius  austerity  tacitus  opulence  christ  virtue  caution  suspicion  polis  poverty  donaldtrump  jesuits  morality  humanism  cheating  taxevasion  charity  empathy  compassion  disengagement  competition  competitiveness  psychopaths  capitalism  luxury  politics  simplicity  well-being  suicide  ows  occupywallstreet  geogewbush  johnkerry  mittromney  gildedage  kochbrothers 
august 2017 by robertogreco
Why there’s no such thing as a gifted child | Education | The Guardian
"Even Einstein was unexceptional in his youth. Now a new book questions our fixation with IQ and says adults can help almost any child become gifted"



"When Maryam Mirzakhani died at the tragically early age of 40 this month, the news stories talked of her as a genius. The only woman to win the Fields Medal – the mathematical equivalent of a Nobel prize – and a Stanford professor since the age of 31, this Iranian-born academic had been on a roll since she started winning gold medals at maths Olympiads in her teens.

It would be easy to assume that someone as special as Mirzakhani must have been one of those gifted children who excel from babyhood. The ones reading Harry Potter at five or admitted to Mensa not much later. The child that takes maths GCSE while still in single figures, or a rarity such as Ruth Lawrence, who was admitted to Oxford while her contemporaries were still in primary school.

But look closer and a different story emerges. Mirzakhani was born in Tehran, one of three siblings in a middle-class family whose father was an engineer. The only part of her childhood that was out of the ordinary was the Iran-Iraq war, which made life hard for the family in her early years. Thankfully it ended around the time she went to secondary school.

Mirzakhani did go to a highly selective girls’ school but maths wasn’t her interest – reading was. She loved novels and would read anything she could lay her hands on; together with her best friend she would prowl the book stores on the way home from school for works to buy and consume.

As for maths, she did rather poorly at it for the first couple of years in her middle school, but became interested when her elder brother told her about what he’d learned. He shared a famous maths problem from a magazine that fascinated her – and she was hooked. The rest is mathematical history.

Is her background unusual? Apparently not. Most Nobel laureates were unexceptional in childhood. Einstein was slow to talk and was dubbed the dopey one by the family maid. He failed the general part of the entry test to Zurich Polytechnic – though they let him in because of high physics and maths scores. He struggled at work initially, failing to get an academic post and being passed over for promotion at the Swiss Patent Office because he wasn’t good enough at machine technology. But he kept plugging away and eventually rewrote the laws of Newtonian mechanics with his theory of relativity.

Lewis Terman, a pioneering American educational psychologist, set up a study in 1921 following 1,470 Californians, who excelled in the newly available IQ tests, throughout their lives. None ended up as the great thinkers of their age that Terman expected them to become. But he did miss two future Nobel prize winners – Luis Alvarez and William Shockley, both physicists – whom he dismissed from the study as their test scores were not high enough.

There is a canon of research on high performance, built over the last century, that suggests it goes way beyond tested intelligence. On top of that, research is clear that brains are malleable, new neural pathways can be forged, and IQ isn’t fixed. Just because you can read Harry Potter at five doesn’t mean you will still be ahead of your contemporaries in your teens.

According to my colleague, Prof Deborah Eyre, with whom I’ve collaborated on the book Great Minds and How to Grow Them, the latest neuroscience and psychological research suggests most people, unless they are cognitively impaired, can reach standards of performance associated in school with the gifted and talented. However, they must be taught the right attitudes and approaches to their learning and develop the attributes of high performers – curiosity, persistence and hard work, for example – an approach Eyre calls “high performance learning”. Critically, they need the right support in developing those approaches at home as well as at school.

So, is there even such a thing as a gifted child? It is a highly contested area. Prof Anders Ericsson, an eminent education psychologist at Florida State University, is the co-author of Peak: Secrets from the New Science of Expertise. After research going back to 1980 into diverse achievements, from music to memory to sport, he doesn’t think unique and innate talents are at the heart of performance. Deliberate practice, that stretches you every step of the way, and around 10,000 hours of it, is what produces the expert. It’s not a magic number – the highest performers move on to doing a whole lot more, of course, and, like Mirzakhani, often find their own unique perspective along the way.

Ericsson’s memory research is particularly interesting because random students, trained in memory techniques for the study, went on to outperform others thought to have innately superior memories – those you might call gifted.

He got into the idea of researching the effects of deliberate practice because of an incident at school, in which he was beaten at chess by someone who used to lose to him. His opponent had clearly practised.

But it is perhaps the work of Benjamin Bloom, another distinguished American educationist working in the 1980s, that gives the most pause for thought and underscores the idea that family is intrinsically important to the concept of high performance.

Bloom’s team looked at a group of extraordinarily high achieving people in disciplines as varied as ballet, swimming, piano, tennis, maths, sculpture and neurology, and interviewed not only the individuals but their parents, too.

He found a pattern of parents encouraging and supporting their children, in particular in areas they enjoyed themselves. Bloom’s outstanding adults had worked very hard and consistently at something they had become hooked on young, and their parents all emerged as having strong work ethics themselves.

While the jury is out on giftedness being innate and other factors potentially making the difference, what is certain is that the behaviours associated with high levels of performance are replicable and most can be taught – even traits such as curiosity.

Eyre says we know how high performers learn. From that she has developed a high performing learning approach that brings together in one package what she calls the advanced cognitive characteristics, and the values, attitudes and attributes of high performance. She is working on the package with a group of pioneer schools, both in Britain and abroad.

But the system needs to be adopted by families, too, to ensure widespread success across classes and cultures. Research in Britain shows the difference parents make if they take part in simple activities pre-school in the home, supporting reading for example. That support shows through years later in better A-level results, according to the Effective Pre-School, Primary and Secondary study, conducted over 15 years by a team from Oxford and London universities.

Eye-opening spin-off research, which looked in detail at 24 of the 3,000 individuals being studied who were succeeding against the odds, found something remarkable about what was going on at home. Half were on free school meals because of poverty, more than half were living with a single parent, and four in five were living in deprived areas.

The interviews uncovered strong evidence of an adult or adults in the child’s life who valued and supported education, either in the immediate or extended family or in the child’s wider community. Children talked about the need to work hard at school and to listen in class and keep trying. They referenced key adults who had encouraged those attitudes.

Einstein, the epitome of a genius, clearly had curiosity, character and determination. He struggled against rejection in early life but was undeterred. Did he think he was a genius or even gifted? No. He once wrote: “It’s not that I’m so smart, it’s just that I stay with problems longer. Most people say that it is the intellect which makes a great scientist. They are wrong: it is character.”

And what about Mirzakhani? Her published quotations show someone who was curious and excited by what she did and resilient. One comment sums it up. “Of course, the most rewarding part is the ‘Aha’ moment, the excitement of discovery and enjoyment of understanding something new – the feeling of being on top of a hill and having a clear view. But most of the time, doing mathematics for me is like being on a long hike with no trail and no end in sight.”

The trail took her to the heights of original research into mathematics in a cruelly short life. That sounds like unassailable character. Perhaps that was her gift."
sfsh  parenting  gifted  precocity  children  prodigies  2017  curiosity  rejection  resilience  maryammirzakhani  childhood  math  mathematics  reading  slowlearning  lewisterman  iq  iqtests  tests  testing  luisalvarez  williamshockley  learning  howwelearn  deboraheyre  wendyberliner  neuroscience  psychology  attitude  persistence  hardwork  workethic  andersericsson  performance  practice  benjaminbloom  education  ballet  swimming  piano  tennis  sculpture  neurology  encouragement  support  giftedness  behavior  mindset  genius  character  determination  alberteinstein 
july 2017 by robertogreco
The Algorithm That Makes Preschoolers Obsessed With YouTube Kids - The Atlantic
"Surprise eggs and slime are at the center of an online realm that’s changing the way the experts think about human development."



"And here’s where the ouroboros factor comes in: Kids watch the same kinds of videos over and over. Videomakers take notice of what’s most popular, then mimic it, hoping that kids will click on their stuff. When they do, YouTube’s algorithm takes notice, and recommends those videos to kids. Kids keep clicking on them, and keep being offered more of the same. Which means video makers keep making those kinds of videos—hoping kids will click.

This is, in essence, how all algorithms work. It’s how filter bubbles are made. A little bit of computer code tracks what you find engaging—what sorts of videos do you watch most often, and for the longest periods of time?—then sends you more of that kind of stuff. Viewed a certain way, YouTube Kids is offering programming that’s very specifically tailored to what children want to see. Kids are actually selecting it themselves, right down to the second they lose interest and choose to tap on something else. The YouTube app, in other words, is a giant reflection of what kids want. In this way, it opens a special kind of window into a child’s psyche.
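The tracking-and-reinforcing loop the article describes can be illustrated with a toy simulation. This is a hedged sketch only, not YouTube's actual system: the category names, watch times, and exploration rate are all made up for the example.

```python
import random
from collections import defaultdict

# Illustrative average watch time (seconds) per video category for one
# child -- in this made-up example, the child lingers longest on
# surprise-egg videos.
WATCH_TIME = {"surprise_eggs": 60, "slime": 30,
              "nursery_rhymes": 20, "toy_reviews": 10}

def simulate(rounds=1000, explore=0.1, seed=0):
    """Run an engagement-driven recommendation loop."""
    random.seed(seed)
    categories = list(WATCH_TIME)
    totals = defaultdict(float)   # watch time the algorithm has logged
    serves = defaultdict(int)     # times each category was recommended
    for _ in range(rounds):
        if random.random() < explore or not serves:
            # occasional exploration: try a random category
            pick = random.choice(categories)
        else:
            # exploitation: recommend whatever has the best average
            # logged watch time so far
            pick = max(serves, key=lambda c: totals[c] / serves[c])
        serves[pick] += 1
        totals[pick] += WATCH_TIME[pick]   # the "tracking" step
    return serves

counts = simulate()
# Recommendations concentrate on the category the child watches longest:
# more watch time -> more recommendations -> more watch time, i.e. the
# filter-bubble feedback described above.
```

The key design point is the feedback: the only signal the loop uses is its own log of past engagement, so whatever the viewer lingers on is served more and reinforced, exactly the ouroboros pattern the passage describes.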

But what does it reveal?

“Up until very recently, surprisingly few people were looking at this,” says Heather Kirkorian, an assistant professor of human development in the School of Human Ecology at the University of Wisconsin-Madison. “In the last year or so, we’re actually seeing some research into apps and touchscreens. It’s just starting to come out.”

Kids’ videos are among the most watched content in YouTube history. This video, for example, has been viewed more than 2.3 billion times, according to YouTube’s count:

[video: https://www.youtube.com/watch?v=KYniUCGPGLs ]



"The vague weirdness of these videos aside, it’s actually easy to see why kids like them. “Who doesn’t want to get a surprise? That’s sort of how all of us operate,” says Sandra Calvert, the director of the Children’s Digital Media Center at Georgetown University. In addition to surprises being fun, many of the videos are basically toy commercials. (This video of a person pressing sparkly Play-Doh onto chintzy Disney princess figurines has been viewed 550 million times.) And they let kids tap into a whole internet’s worth of plastic eggs and perceived power. They get to choose what they watch. And kids love being in charge, even in superficial ways.

“It’s sort of like rapid-fire channel surfing,” says Michael Rich, a professor of pediatrics at Harvard Medical School and the director of the Center on Media and Child Health. “In many ways YouTube Kids is better suited to the attention span of a young child—just by virtue of its length—than something like a half-hour or hour broadcast program can be.”

Rich and others compare the app to predecessors like Sesame Street, which introduced short segments within a longer program, in part to keep the attention of the young children watching. For decades, researchers have looked at how kids respond to television. Now they’re examining the way children use mobile apps—how many hours they’re spending, which apps they’re using, and so on."
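The click-driven feedback loop the excerpts describe (the algorithm recommends what gets watched, which then gets watched more, so makers produce more of it) can be sketched as a toy preferential-attachment simulation. This is purely illustrative: the theme names and step counts are invented, and nothing here reflects YouTube's actual recommender.

```python
import random
from collections import Counter

def run_feedback_loop(themes, steps=50, seed=0):
    """Toy model of the 'ouroboros' loop: the recommender favors
    already-popular themes, and each click makes them more popular still."""
    rng = random.Random(seed)
    clicks = Counter({t: 1 for t in themes})  # every theme starts equal
    for _ in range(steps):
        # Recommender step: pick a theme with probability proportional
        # to its accumulated clicks (preferential attachment).
        total = sum(clicks.values())
        r = rng.uniform(0, total)
        cum = 0
        for theme, n in clicks.items():
            cum += n
            if r <= cum:
                recommended = theme
                break
        # Viewer clicks what was recommended; makers notice the click
        # and produce more of that theme, feeding the next round.
        clicks[recommended] += 1
    return clicks

history = run_feedback_loop(["surprise eggs", "slime", "colors"])
```

Because each round's pick is weighted by past picks, small early leads tend to compound over repeated runs, which is the "rich-get-richer" dynamic behind filter bubbles.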



"“You have to do what the algorithm wants for you,” says Nathalie Clark, the co-creator of a similarly popular channel, Toys Unlimited, and a former ICU nurse who quit her job to make videos full-time. “You can’t really jump back and forth between themes.”

What she means is, once YouTube’s algorithm has determined that a certain channel is a source of videos about slime, or colors, or shapes, or whatever else—and especially once a channel has had a hit video on a given topic—videomakers stray from that classification at their peril. “Honestly, YouTube picks for you,” she says. “Trending right now is Paw Patrol, so we do a lot of Paw Patrol.”

There are other key strategies for making a YouTube Kids video go viral. Make enough of these things and you start to get a sense of what children want to see, she says. “I wish I could tell you more,” she added. “But I don’t want to introduce competition. And, honestly, nobody really understands it.”

The other thing people don’t yet understand is how growing up in the mobile internet age will change the way children think about storytelling. “There’s a rich set of literature showing kids who are reading more books are more imaginative,” says Calvert, of the Children’s Digital Media Center. “But in the age of interactivity, it’s no longer just consuming what somebody else makes. It’s also making your own thing.”

In other words, the youngest generation of app users is developing new expectations about narrative structure and informational environments. Beyond the thrill a preschooler gets from tapping a screen, or watching The Bing Bong Song video for the umpteenth time, the long-term implications for cellphone-toting toddlers are tangled up with all the other complexities of living in a highly networked on-demand world."
algorithms  adriennelafrance  youtube  2017  children  edtech  attention  nathalieclark  michaelrich  psychology  youtubekids  rachelbar  behavior  toddlers  repetition  storytelling  narrative  preschoolers 
july 2017 by robertogreco
Ravens have paranoid, abstract thoughts about other minds | WIRED UK
"Cementing their status as the most terrifying of all the birds, a new study has found that ravens are able to imagine being spied upon -- a level of abstraction that was previously thought to be unique to humans.

The ability to think abstractly about other minds is singled out by many as a uniquely human trait. Now, a study from the Universities of Houston and Vienna has found that ravens are able to adapt their behaviour by attributing their perceptions to others.

The study, published in Nature Communications, found that if a nearby peephole was open, ravens guarded pockets of food against discovery in response to the sound of other birds -- even if they didn't see another bird. This was not replicated when the peephole was closed, despite hearing the same auditory cues.

According to the study's authors, the discovery "shed[s] a new light on Theory of Mind" -- the ability to attribute mental states to others. A number of studies have found that animals are able to understand what others see -- but only when they can see the head or eyes, which provide gaze cues. This suggests that these animals are responding only to surface cues, and are not experiencing the same abstraction as humans.

The ability to hide food is extremely important to ravens, and they behave completely differently when they feel they are being watched -- hiding food more quickly, for example, and returning less often to a hiding place for fear of revealing the location to a competitor.

The study replicated this behaviour. Two rooms were connected by windows and peepholes, both of which could be opened and closed. The ravens were trained to look through the peepholes to observe human experimenters making stashes of food. Finally, both windows were covered while a single peephole remained open -- and, though no bird was present, the ravens still hid the food as if they were being watched.

"Completing this evolutionary and developmental picture will bring us much closer to figuring out what's really unique about the human mind" —Cameron Buckner, University of Houston

"We showed that ravens can generalise from their own experience using the peephole as a pilferer, and predict that audible competitors could potentially see their caches through the peephole," the authors wrote. "Consequently, we argue that they represent 'seeing' in a way that cannot be reduced to the tracking of gaze cues."

Although ravens may not seem similar to humans, the two species do have something in common -- their social lives. Like humans, ravens go through distinct social phases, from fluid interaction with other birds as adolescents to stable breeding pairs in adults. "There is a time when who is in the pack, who's a friend, who's an enemy can change very rapidly," said Cameron Buckner, lead author of the research. "There are not many other species that demonstrate as much social flexibility. Ravens cooperate well. They can compete well. They maintain long-term, monogamous relationships. It makes them a good place to look for social cognition, because similar social pressures might have driven the evolution of similarly advanced cognitive capacities in very different species."

It's not the only thing ravens can do -- they've also been found to mimic human speech, complete complex logic puzzles and show empathy for fellow birds, which Buckner says could "change our perception of human uniqueness". "Finding that Theory of Mind is present in birds would require us to give up a popular story as to what makes humans special," he said. "Completing this evolutionary and developmental picture will bring us much closer to figuring out what's really unique about the human mind"."
ravens  theoryofmind  corvids  birds  2016  animals  nature  psychology  intelligence 
july 2017 by robertogreco
The History of Ed-Tech: What Went Wrong?
"There’s a popular origin story about education technology: that it was first developed and adopted by progressive educators, those interested in “learning by doing” and committed to schools as democratic institutions. Then, something changed in the 1980s (or so): computers became commonplace, and ed-tech became commodified – built and sold by corporations, not by professors or by universities. Thus the responsibility for acquiring classroom technology and for determining how it would be used shifted from a handful of innovative educators (often buying hardware and software with their own money) to school administration; once computers were networked, the responsibility shifted to IT. The purpose of ed-tech shifted as well – from creative computing to keyboarding, from projects to “productivity.” (And I’ll admit, I’m guilty of having repeated some form of this narrative myself.)

[tweet: "What if the decentralized, open web was a historical aberration, an accident between broadcast models, not an ideal that was won then lost?"
https://twitter.com/ibogost/status/644994975797805056 ]

But what if, to borrow from Ian Bogost, “progressive education technology” – the work of Seymour Papert, for example – was a historical aberration, an accident between broadcast models, not an ideal that was won then lost?

There’s always a danger in nostalgia, when one invents a romanticized past – in this case, a once-upon-a-time when education technology was oriented towards justice and inquiry before it was re-oriented towards test scores and flash cards. But rather than think about “what went wrong,” it might be useful to think about what was wrong all along.

Although Papert was no doubt a pioneer, he wasn’t the first person to recognize the potential for computers in education. And he was hardly alone in the 1960s and 1970s in theorizing or developing educational technologies. There was Patrick Suppes at Stanford, for example, who developed math instruction software for IBM mainframes and who popularized what became known as “computer-assisted instruction.” (Arguably, Papert refers to Suppes’ work in Mindstorms when he refers to “the computer being used to program the child” rather than his own vision of the child programming the computer.)

Indeed, as I’ve argued repeatedly, the history of ed-tech dates at least as far back as the turn of the twentieth century and the foundation of the field of educational psychology. Much of what we see in ed-tech today reflects those origins – the work of psychologist Sidney Pressey, the work of psychologist B. F. Skinner, the work of psychologist Edward Thorndike. It reflects those origins because, as historian Ellen Condliffe Lagemann has astutely observed, “One cannot understand the history of education in the United States during the twentieth century unless one realizes that Edward L. Thorndike won and John Dewey lost.”

Ed-tech has always been more Thorndike than Dewey because education has been more Thorndike than Dewey. That means more instructivism than constructionism. That means more multiple choice tests than projects. That means more surveillance than justice.
(How Thorndike's ed-tech is now being rebranded as “personalization” (and by extension, as progressive education) – now that's an interesting story...)"

[via: ""Edward L. Thorndike won and John Dewey lost" is pretty much the perfect tl;dr version of the history of education."
https://twitter.com/jonbecker/status/884460561584594944

See also: "Or David Snedden won. People forget about him."
https://twitter.com/doxtdatorb/status/884520604287860736 ]
audreywatters  ianbogost  johndewey  seymourpapert  edtech  computers  technology  education  ellencondliffe  edwardthorndike  bfskinner  sidneypressey  psychology  management  administration  it  patricksuppes  constructivism  constructionism  progressive  mindstorms  progressiveeducation  standardization  personalization  instructivism  testing  davidsnedden  history 
july 2017 by robertogreco
DIAGRAM >> The Structure of Boredom
"Part III, the structure of boredom, analogously, is as follows: The self (1) relates to the now or present actuality in the mode of immediate experiencing (2). When that present (3) is symbolized as being devoid of values regarded as necessary for one's existence, one experiences boredom (5). Boredom is the awareness that the essential values through which one fulfills himself are not able to be actualized under these present circumstances. To the degree to which these limited values are elevated to absolutes which appear to be unactualizable (6), one is vulnerable to intensive, depressive, demonic boredom."

[via: https://twitter.com/salrandolph/status/877349051049619457 ]
boredom  diagrams  thomasoden  psychology  theology  1969  now  present  awareness  presence  guilt  future  past  anxiety  responsiveness  imagination  trust  emptiness  meaning  meaningmaking 
june 2017 by robertogreco
Is the U.S. Education System Producing a Society of “Smart Fools”? - Scientific American
[had me until he says more (a new kind of) testing is the answer to the problem]

"At last weekend’s annual meeting of the Association for Psychological Science (APS) in Boston, Cornell University psychologist Robert Sternberg sounded an alarm about the influence of standardized tests on American society. Sternberg, who has studied intelligence and intelligence testing for decades, is well known for his “triarchic theory of intelligence,” which identifies three kinds of smarts: the analytic type reflected in IQ scores; practical intelligence, which is more relevant for real-life problem solving; and creativity. Sternberg offered his views in a lecture associated with receiving a William James Fellow Award from the APS for his lifetime contributions to psychology. He explained his concerns to Scientific American.

[An edited transcript of the interview follows.]

In your talk, you said that IQ tests and college entrance exams like the SAT and ACT are essentially selecting and rewarding “smart fools”—people who have a certain kind of intelligence but not the kind that can help our society make progress against our biggest challenges. What are these tests getting wrong?

Tests like the SAT, ACT, the GRE—what I call the alphabet tests—are reasonably good measures of academic kinds of knowledge, plus general intelligence and related skills. They are highly correlated with IQ tests and they predict a lot of things in life: academic performance to some extent, salary, level of job you will reach to a minor extent—but they are very limited. What I suggested in my talk today is that they may actually be hurting us. Our overemphasis on narrow academic skills—the kinds that get you high grades in school—can be a bad thing for several reasons. You end up with people who are good at taking tests and fiddling with phones and computers, and those are good skills but they are not tantamount to the skills we need to make the world a better place.

What evidence do you see of this harm?

IQ rose 30 points in the 20th century around the world, and in the U.S. that increase is continuing. That’s huge; that’s two standard deviations, which is like the difference between an average IQ of 100 and a gifted IQ of 130. We should be happy about this but the question I ask is: If you look at the problems we have in the world today—climate change, income disparities in this country that probably rival or exceed those of the gilded age, pollution, violence, a political situation that many of us never could have imagined—one wonders, what about all those IQ points? Why aren’t they helping?

What I argue is that intelligence that’s not modulated and moderated by creativity, common sense and wisdom is not such a positive thing to have. What it leads to is people who are very good at advancing themselves, often at other people’s expense. We may not just be selecting the wrong people, we may be developing an incomplete set of skills—and we need to look at things that will make the world a better place.

Do we know how to cultivate wisdom?

Yes we do. A whole bunch of my colleagues and I study wisdom. Wisdom is about using your abilities and knowledge not just for your own selfish ends and for people like you. It’s about using them to help achieve a common good by balancing your own interests with other people’s and with high-order interests through the infusion of positive ethical values.

You know, it’s easy to think of smart people but it’s really hard to think of wise people. I think a reason is that we don’t try to develop wisdom in our schools. And we don’t test for it, so there’s no incentive for schools to pay attention.

Can we test for wisdom and can we teach it?

You learn wisdom through role-modeling. You can start learning that when you are six or seven. But if you start learning what our schools are teaching, which is how to prepare for the next statewide mastery tests, it crowds out of the curriculum the things that used to be essential. If you look at the old McGuffey Readers, they were as much about teaching good values and good ethics and good citizenship as about teaching reading. It’s not so much about teaching what to do but how to reason ethically; to go through an ethical problem and ask: How do I arrive at the right solution?

I don’t always think about putting ethics and reasoning together. What do you mean by that?

Basically, ethical reasoning involves eight steps: seeing that there’s a problem to deal with (say, you see your roommate cheat on an assignment); identifying it as an ethical problem; seeing it as a large enough problem to be worth your attention (it’s not like he’s just one mile over the speed limit); seeing it as personally relevant; thinking about what ethical rules apply; thinking about how to apply them; thinking what are the consequences of acting ethically—because people who act ethically usually don’t get rewarded; and, finally, acting. What I’ve argued is ethical reasoning is really hard. Most people don’t make it through all eight steps.

If ethical reasoning is inherently hard, is there really less of it and less wisdom now than in the past?

We have a guy [representative-elect Greg Gianforte of Montana] who allegedly assaulted a reporter and just got elected to the U.S. House of Representatives—and that’s after a 30-point average increase in IQ. We had violence in campaign rallies. Not only do we not encourage creativity, common sense and wisdom, I think a lot of us don’t even value them anymore. They’re so distant from what’s being taught in schools. Even in a lot of religious institutions we’ve seen a lot of ethical and legal problems arise. So if you’re not learning these skills in school or through religion or your parents, where are you going to learn them? We get people who view the world as being about people like themselves. We get this kind of tribalism.

So where do you see the possibility of pushing back?

If we start testing for these broader kinds of skills, schools will start to teach to them, because they teach to the test. My colleagues and I developed assessments for creativity, common sense and wisdom. We did this with the Rainbow Project, which was sort of experimental when I was at Yale. And then at Tufts, when I was dean of arts and sciences, we started Kaleidoscope, which has been used with tens of thousands of kids for admission to Tufts. They are still using it. But it’s very hard to get institutions to change. It’s not a quick fix. Once you have a system in place, the people who benefit from it rise to the top and then they work very hard to keep it.

Looking at the broader types of admission tests you helped implement—like Kaleidoscope at Tufts, the Rainbow Project at Yale, or Panorama at Oklahoma State—is there any evidence that kids selected for having these broader skills are in any way different from those who just score high on the SAT?

The newly selected kids were different. I think the folks in admissions would say so, at least when we started. We admitted kids who would not have gotten in under the old system—maybe they didn’t quite have the test scores or grades. When I talk about this, I give examples, such as those who wrote really creative essays.

Has there been any longitudinal follow-up of these kids?

We followed them through the first year of college. With Rainbow we doubled prediction [accuracy] for academic performance, and with Kaleidoscope we could predict the quality of extracurricular performance, which the SAT doesn’t do.

Do you think the emphasis on narrow measures like the SAT or GRE is hurting the STEM fields in particular?

I think it is. I think it’s hurting everything. We get scientists who are very good forward incrementers—they are good at doing the next step but they are not the people who change the field. They are not redirectors or reinitiators, who start a field over. And those are the people we need.

Are you hopeful about change?

If one could convince even a few universities and schools to try to follow a different direction, others might follow. If you start encouraging a creative attitude, to defy the crowd and to defy the zeitgeist, and if you teach people to think for themselves and how what they do affects others, I think it’s a no-lose proposition. And these things can be taught and they can be tested."
education  science  social  wisdom  iq  meritocracy  intelligence  2017  psychology  claudiawallis  robertsternberg  performance  creativity  unschooling  deschooling  lcproject  openstudioproject  sfsh  tcsnmy  rainbowproject  power  ethics  reasoning  values  learning  selfishness  gildedage  inequality  climatechange  pollution  violence  testing  standardizedtesting  standardization  sat  gre  act  knowledge  teachingtothetest 
june 2017 by robertogreco
Why Millennials Are Lonely
"We’re getting lonelier.

The General Social Survey found that the number of Americans with no close friends has tripled since 1985. “Zero” is the most common number of confidants, reported by almost a quarter of those surveyed. Likewise, the average number of people Americans feel they can talk to about ‘important matters’ has fallen from three to two.

Mysteriously, loneliness appears most prevalent among millennials. I see two compounding explanations.

First, incredibly, loneliness is contagious. A 2009 study using data collected from roughly 5000 people and their offspring from Framingham, Massachusetts since 1948 found that participants are 52% more likely to be lonely if someone they’re directly connected to (such as a friend, neighbor, coworker or family member) is lonely. People who aren’t lonely tend to then become lonelier if they’re around people who are.

Why? Lonely people are less able to pick up on positive social stimuli, like others’ attention and commitment signals, so they withdraw prematurely – in many cases before they’re actually socially isolated. Their inexplicable withdrawal may, in turn, make their close connections feel lonely too. Lonely people also tend to act “in a less trusting and more hostile fashion,” which may further sever social ties and impart loneliness in others.

This is how, as Dr. Nicholas Christakis told the New York Times in a 2009 article on the Framingham findings, one lonely person can “destabilize an entire social network” like a single thread unraveling a sweater.
If you’re lonely, you transmit loneliness, and then you cut the tie or the other person cuts the tie. But now that person has been affected, and they proceed to behave the same way. There is this cascade of loneliness that causes a disintegration of the social network.

Like other contagions, loneliness is bad for you. Lonely adolescents exhibit more social stress than their non-lonely peers. Individuals who feel lonely also have significantly higher Epstein-Barr virus antibodies (the key player in mononucleosis). Lonely women literally feel hungrier. Finally, feeling lonely increases risk of death by 26% and doubles our risk of dying from heart disease.

But if loneliness is inherently contagious, why has it just recently gotten worse?

The second reason for millennial loneliness is the Internet makes it viral. It’s not a coincidence that loneliness began to surge two years after Apple launched its first commercial personal computer and five years before Tim Berners-Lee invented the World Wide Web.

Ironically, we use the Internet to alleviate our loneliness. Social connection no longer requires a car, phone call or plan – just a click. And it seems to work: World of Warcraft players experience less social anxiety and less loneliness when online than in the real world. The Internet temporarily enhances the social satisfaction and behavior of lonely people, who are more likely to go online when they feel isolated, depressed or anxious.

The Internet provides, as David Brooks wrote in a New York Times column last fall, “a day of happy touch points.”

But the Internet can eventually isolate us and stunt our remaining relationships. Since Robert Putnam’s famous 2000 book Bowling Alone, the breakdown of community and civic society has almost certainly gotten worse. Today, going to a bowling alley alone, Putnam’s central symbol of “social capital deficit,” would actually be definitively social. Instead, we’re “bowling” – and a host of other pseudo-social acts – online.

One reason the Internet makes us lonely is we attempt to substitute real relationships with online relationships. Though we temporarily feel better when we engage others virtually, these connections tend to be superficial and ultimately dissatisfying. Online social contacts are “not an effective alternative for offline social interactions,” sums up one study.

In fact, the very presence of technology can hinder genuine offline connection. Simply having a phone nearby caused pairs of strangers to rate their conversation as less meaningful, their conversation partners as less empathetic and their new relationship as less close than strangers with a notebook nearby instead.

Excessive Internet use also increases feelings of loneliness because it disconnects us from the real world. Research shows that lonely people use the Internet to “feel totally absorbed online” – a state that inevitably subtracts time and energy that could otherwise be spent on social activities and building more fulfilling offline friendships.

Further exacerbating our isolation is society’s tendency to ostracize lonely peers. One famous 1965 study found that when monkeys were confined to a solitary isolation chamber called the "pit of despair" and reintroduced to their colony months later, they were shunned and excluded. The Framingham study suggested that humans may also drive away the lonely, so that “feeling socially isolated can lead to one becoming objectively isolated.”

The more isolated we feel, the more we retreat online, forging a virtual escape from loneliness. This is particularly true for my generation, who learned to self-soothe with technology from a young age. It will only become more true as we flock to freelancing and other means of working alone.

In his controversial 1970 book The Pursuit of Loneliness, sociologist Phillip Slater coined the “Toilet Assumption”: our belief that undesirable feelings and social realities will “simply disappear if we ignore them.” Slater argued that America’s individualism and, in turn, our loneliness, “is rooted in the attempt to deny the reality of human interdependence.” The Internet is perhaps the best example to date of our futile attempt to flush away loneliness.

Instead, we’re stuck with a mounting pile of infectious isolation."
online  internet  socialmedia  loneliness  2017  isolation  social  phillipslater  1970  1965  contagion  psychology  technology  smartphones  robertputnam  2000  web  nicholaschristakis  trust  hostility 
june 2017 by robertogreco
Mindfulness training does not foster empathy, and can even make narcissists worse – Research Digest
"Sharing with others, helping people in need, consoling those who are distressed. All these behaviours can be encouraged by empathy – by understanding what other people are thinking and feeling, and sharing their emotions. Enhance empathy, especially in those who tend to have problems with it – like narcissists – and society as a whole might benefit. So how can it be done?

In fact, the cultivation of empathy is a “presumed benefit” of mindfulness training, note the authors of a new study, published in Self and Identity, designed to investigate this experimentally. People who are “mindfully aware” focus on the present moment, without judgement. So, it’s been argued, they should be better able to resist getting caught up in their own thoughts, freeing them to think more about the mental states of other people. As mindfulness courses are increasingly being offered in schools and workplaces, as well as in mental health settings, it’s important to know what such training can and can’t achieve. The new results suggest it won’t foster empathy – and, worse, it could even backfire.

Anna Ridderinkhof, at the University of Amsterdam, and her colleagues divided 161 adult volunteers into three groups. Each completed questionnaires assessing their levels of narcissistic and also autistic traits. It’s already known that people who score highly on narcissism (who feel superior to others, believe they are entitled to privileges and want to be admired) tend to experience less “affective empathy”. They aren’t as likely to share the emotional state of another person. People who score highly on autistic traits have no problem with affective empathy, but tend to show impairments in “cognitive empathy”. They find it harder to work out what other people are feeling.

One group spent five minutes in a guided mindfulness meditation, in which they were encouraged to focus on the physical sensations of breathing, while observing any thoughts, without judging them. The second group took part in a relaxation exercise (so any effects of stress relief alone could be examined). People in the control group were invited to let their minds wander, and to be immersed in their thoughts and feelings.

After these exercises, the researchers tested the volunteers’ propensity to feel cognitive empathy, via the Reading the Mind in the Eyes test, which involves identifying emotions from photographs of people’s eyes, and they also tested their affective empathy, by analysing how much emotional concern they showed toward a player who was socially rejected in a ball game.

There is some debate about whether a greater capacity for empathy would be helpful for most people. Some researchers, such as Professor Tania Singer, a director at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, even suggest that an “excess” of empathy explains what’s often termed “burnout” in members of caring professions, such as nurses. But Ridderinkhof’s team predicted that mindfulness training would improve empathy in the volunteers who needed it most: in people with high levels of autistic or narcissistic traits.

It didn’t. While there was no overall effect on empathy in the mindfulness group, further analysis revealed that, compared with the control and relaxation groups combined, non-narcissists who completed the mindfulness exercise did show a slight improvement specifically in cognitive empathy, but for narcissistic people, their cognitive empathy was actually reduced. For the people who scored highly on autistic traits, meanwhile, there was no effect on mind-reading accuracy, though there were intriguing signs of greater prosocial behaviour, indicated by an increase in the number of passes of the ball to socially excluded individuals.

Since volunteers were encouraged not to judge any thoughts they had during the mindfulness meditation, this might indeed have helped non-narcissists let go of self-critical thoughts, allowing them to think more about the mental states of others, the researchers suggest. “By contrast, it may have ironically ‘licensed’ narcissistic individuals to focus more exclusively on their self-aggrandising thoughts.” As a result, they may have thought even less about the mental states of others.

Critics may argue that a single five-minute mindfulness meditation exercise is simply not enough, and that improvements in empathy – in non-narcissists, at least – might show up with longer sessions. The research team thinks this is worth exploring, but notes that earlier studies (which lacked a proper control group) found that even five-minute sessions can increase accuracy on a mind-reading test; it was therefore reasonable, they argue, to opt for a brief session in this study.

Future research might also investigate whether alternative approaches – perhaps training the related concept of “compassion” (which involves “feeling for” rather than “feeling with” a person in psychological pain, and is advocated by Singer) – might help narcissists behave more pro-socially."

["Does mindfulness meditation increase empathy? An experiment"
http://www.tandfonline.com/doi/full/10.1080/15298868.2016.1269667 ]
narcissism  mindfulness  meditation  emmayoung  2017  empathy  behavior  psychology  cognitiveempathy  annaridderinkhof 
may 2017 by robertogreco
Cyborgology: What is The History of The Quantified Self a History of?
[from Part 1: https://thesocietypages.org/cyborgology/2017/04/13/what-is-the-history-of-the-quantified-self-a-history-of-part-1/]

"In the past few months, I’ve posted about two works of long-form scholarship on the Quantified Self: Deborah Lupton’s The Quantified Self and Gina Neff and Dawn Nafus’s Self-Tracking. Nafus recently edited a volume of essays on QS (Quantified: Biosensing Technologies in Everyday Life, MIT 2016), but I’d like to take a not-so-brief break from reviewing books to address an issue that has been on my mind recently. Most texts that I read about the Quantified Self (be they traditional scholarship or more informal) refer to a meeting in 2007 at the house of Kevin Kelly as the official start of the QS movement. And while, yes, the name “Quantified Self” was coined by Kelly and his colleague Gary Wolf (the former co-founded Wired, the latter was an editor for the magazine), the practice of self-tracking obviously goes back much further than ten years. Still, most historical references to the practice point to Sanctorius of Padua, who, per an oft-cited study by consultant Melanie Swan, “studied energy expenditure in living systems by tracking his weight versus food intake and elimination for 30 years in the 16th century.” Neff and Nafus cite Benjamin Franklin’s practice of keeping a daily record of his time use. These anecdotal histories, however, don’t give us much in terms of understanding what a history of the Quantified Self is actually a history of.

Briefly, what I would like to prove over the course of a few posts is that at the heart of QS are statistics, anthropometrics, and psychometrics. I recognize that it’s not terribly controversial to suggest that these three technologies (I hesitate to call them “fields” here because of how widely they can be applied), all developed over the course of the nineteenth century, are critical to the way that QS works. Good thing, then, that there is a second half to my argument: as I touched upon briefly in my [shameless plug alert] Theorizing the Web talk last week, these three technologies were also critical to the proliferation of eugenics, that pseudoscientific attempt at strengthening the whole of the human race by breeding out or killing off those deemed deficient.

I don’t think it’s very hard to see an analogous relationship between QS and eugenics: both movements are predicated on anthropometrics and psychometrics, comparisons against norms, and the categorization and classification of human bodies as a result of the use of statistical technologies. But an analogy only gets us so far in seeking to build a history. I don’t think we can just jump from Francis Galton’s ramblings at the turn of one century to Kevin Kelly’s at the turn of the next. So what I’m going to attempt here is a sort of Foucauldian genealogy—from what was left of eugenics after its [rightful, though perhaps not as complete as one would hope] marginalization in the 1940s through to QS and the multi-billion dollar industry the movement has inspired.

I hope you’ll stick around for the full ride—it’s going to take a number of weeks. For now, let’s start with a brief introduction to that bastion of Western exceptionalism: the eugenics movement."

[from Part 2: https://thesocietypages.org/cyborgology/2017/04/20/what-is-the-history-of-the-quantified-self-a-history-of-part-2/

"Here we begin to see an awkward situation in our quest to draw a line from Galton and hard-line eugenics (we will differentiate between hard-line and “reform” eugenics further on) to the quantified self movement. Behaviorism sits diametrically opposed to eugenics for a number of reasons. Firstly, it does not distinguish between human and animal beings—certainly a tenet to which Galton and his like would object, holding that humans are the superior species and that a hierarchy of greatness exists within that species as well. Secondly, behaviorism accepts that outside, environmental influences will change the psychology of a subject. In 1971, Skinner argued that “An experimental analysis shifts the determination of behavior from autonomous man to the environment—an environment responsible both for the evolution of the species and for the repertoire acquired by each member” (214). This stands in direct conflict with the eugenical ideal that physical and psychological makeup is determined by heredity. Indeed, the eugenicist Robert Yerkes, otherwise close with Watson, wholly rejected the behaviorist’s views (Hergenhahn 400). Tracing the quantified self’s behaviorist and self-experimental roots, then, leaves us without a very strong connection to the ideologies driving eugenics. Still, using Pearson as a hint, there may be a better path to follow."]

[from Part 3: https://thesocietypages.org/cyborgology/2017/04/27/what-is-the-history-of-the-quantified-self-a-history-of-part-3/

"The history of Galton and eugenics, then, can be traced into the history of personality tests. Once again, we come up against an awkward transition—this time from personality tests into the Quantified Self. Certainly, shades of Galtonian psychometrics show themselves to be present in QS technologies—that is, the treatment of statistical datasets for the purpose of correlation and prediction. Galton’s word association tests strongly influenced the MBTI, a test that, much like Quantified Self projects, seeks to help a subject make the right decisions in their life, though not through traditional Galtonian statistical tools. The MMPI and 16PFQ serve psychological evaluation purposes. And while some work has been done to suggest that “mental wellness” can be improved through self-tracking (see Kelley et al., Wolf 2009), much of the self-tracking ethos is based on factors that can be adjusted in order to see a correlative change in the subject (Wolf 2009). That is, by tracking my happiness on a daily basis against the amount of coffee I drink or the places I go, I am acknowledging an environmental approach and declaring that my current psychological state is not set by my genealogy. A gap, then, remains between Galtonian personality tests and QS."]

[from Part 4 (Finale): https://thesocietypages.org/cyborgology/2017/05/08/what-is-the-history-of-the-quantified-self-a-history-of-the-finale/

"What is the history of the quantified self a history of? One could point to technological advances in circuitry miniaturization or in big data collection and processing. The proprietary and patented nature of the majority of QS devices precludes certain types of inquiry into their invention and proliferation. But it is not difficult to identify one of QS’s most critical underlying tenets: self-tracking for the purpose of self-improvement through the identification of behavioral and environmental variables critical to one’s physical and psychological makeup. Recognizing the importance of this premise to QS allows us to trace back through the scientific fields which have strongly influenced the QS movement—from both a consumer and product standpoint. Doing so, however, reveals a seeming incommensurability between an otherwise analogous pair: QS and eugenics. A eugenical emphasis on heredity sits in direct conflict with a self-tracker’s belief that a focus on environmental factors could change one’s life for the better—even while both are predicated on statistical analysis, both purport to improve the human stock, and both, as argued by Dale Carrico, make assertions towards what is a “normal” human.

A more complicated relationship between the two is revealed upon attempting this genealogical connection. What I have outlined over the past few weeks is, I hope, only the beginning of such a project. I chose not to produce a rhetorical analysis of the visual and textual language of efficiency in both movements—from that utilized by the likes of Frederick Taylor and his eugenicist protégés, the Gilbreths, to what Christina Cogdell calls “Biological Efficiency and Streamline Design” in her work, Eugenic Design, and into a deep trove of rhetoric around efficiency utilized by market-available QS device marketers. Nor did I aim to produce an exhaustive bibliographic lineage. I did, however, seek to use the strong sense of self-experimentation in QS to work backwards towards the presence of behaviorism in early-twentieth century eugenical rhetoric. Then, moving in the opposite direction, I tracked the proliferation of Galtonian psychometrics into mid-century personality test development and eventually into the risk-management goals of the neoliberal surveillance state. I hope that what I have argued will lead to a more in-depth investigation into each step along this homological relationship. In the grander scheme, I see this project as part of a critical interrogation into the Quantified Self. By throwing into sharp relief the linkages between eugenics and QS, I seek to encourage resistance to fetishizing the latter’s technologies and their output, as well as the potential for meaningful change via those technologies."]
gabischaffzin  quantifiedself  2017  kevinkelly  garywolf  eugenics  anthropometrics  psychometrics  measurement  statistics  heredity  francisgalton  charlesdarwin  adolphequetelet  normal  psychology  pernilsroll-hansen  michelfoucault  majianadesan  self-regulation  marginalization  anthropology  technology  data  personality  henryfairfieldosborn  moralbehaviorism  behaviorism  williamepstein  mitchelldean  neoliberalism  containment  risk  riskassessment  freedom  rehabilitation  responsibility  obligation  dalecarrico  fredericktaylor  christinacogdell  surveillance  nikolasrose  myers-briggs  mbti  katherinebriggs  isabelbriggsmeyers  bellcurve  emilkraepelin  charlesspearman  rymondcattell  personalitytests  allenneuringer  microsoft  self-experimentation  gamification  deborahlupton  johnwatson  robertyerkes  ginaneff  dawnnufus  self-tracking  melanieswan  benjaminfranklin  recordkeeping  foucault 
may 2017 by robertogreco
4 Things Worse Than Not Learning To Read In Kindergarten | HuffPost
"Limited time for creative play. Young children learn by playing. They learn by digging and dancing and building and knocking things down, not by filling out piles of worksheets. And they learn by interacting with other children, solving problems, sharing and cooperating, not by drilling phonics. Mrs. Gantt and Mrs. Floyd created fabulous centers and units that allowed children to learn about everything from houses to trucks to pets to oceans. And they snuck in some reading and math skills that the children didn’t even notice, because they were so busy playing and creating! Teachers today, however, often have to limit (or even eliminate) time for centers and units, because the academic requirements they are forced to meet don’t allow time for creative learning.

Limited physical activity. Few things are more counterproductive than limiting recess and other types of physical play time for children. Children learn better when they move. Parents and teachers know this intuitively, but research also confirms it. Children who have more opportunities to run around and play have better thinking skills and increased brain activity. And don’t assume that young children are naturally active and are getting all of the exercise they need; researchers have found that children as young as three and four are surprisingly inactive. Yet many schools are limiting or even eliminating recess, even for very young children.

Teaching that focuses on standards and testing. Teachers are increasingly under pressure to prepare their students to perform on standardized tests. This means that their focus is shifting from teaching children in ways that match their development and learning styles to “teaching to the test.” As one teacher reported, “I have watched as my job requirements swung away from a focus on children, their individual learning styles, emotional needs, and their individual families, interests and strengths to a focus on testing, assessing and scoring young children...” This shift in focus means that teachers have less time to nurture and develop children as lifelong learners, because they’re required to focus their efforts on standards that are unrealistic for many children.

Frustration and a sense of failure. Children know when they aren’t meeting the expectations of teachers and other adults. What they don’t know, however, is that those expectations often make no sense. And because they don’t know that, they experience frustration and a sense of failure when they don’t measure up. So the boy who thrived in his experiential preschool, but struggles in his academically focused kindergarten, may become frustrated to the point that he “hates school.” And the girl who can’t sit still for 30 minutes and fill out worksheets knows that she’s disappointing her teacher, but doesn’t know that the task isn’t appropriate for her. Which means that many normal children are becoming frustrated – and are being labelled – by an entirely unrealistic system. As one report has bluntly stated, “Most children are eager to meet high expectations, but their tools and skills as learners as well as their enthusiasm for learning suffer when the demands are inappropriate.”"
kindergarten  reading  schools  education  sfsh  literacy  children  2017  play  health  psychology  testing  failure  frustration  readiness  gayegrooverchristmus 
may 2017 by robertogreco