ayjay + thinking   9

Don't Overthink It | Boston Review
This is true of all big personal decisions: we will know what is great about a college education once we have one; we will know what it is like to love our children only after they exist; we will know what living as an immigrant entails, for us, only after we have emigrated. In these cases, our grasp of the target and its value (e.g. married life) is a matter of living rather than thinking. Marriage is itself a learning experience, one that cannot be pre-empted by calculative reasoning, no matter how sophisticated. We cannot take the measure of our lives in advance. 

We can illustrate the difference between the personal and the political with the “crystal ball test.” Suppose Obama and his advisors could have looked into a crystal ball and seen what the results of the raid would be. They would then be able to definitively answer the question of whether they should undertake it, because they would know exactly what to look for as markers of success and failure. Now suppose that I could look into a crystal ball and see myself twenty or forty years after the decision to go to college or emigrate or get married or have children. What do I look for to check whether the undertaking was a success? Do I look to see if she is smiling? Or how wealthy my future self is? Those metrics won’t do. Perhaps my future self does not care to smile all the time; and perhaps she’s less interested in wealth than I currently am. These changes in her might have been connected to her finding some happiness that I can’t (yet) fathom. [...]

Late in the book, Johnson details his own big decision—a move to California—and the tensions between himself and his wife that ensued in the period following the move. Johnson wonders whether he could have anticipated and pre-empted these later struggles by thinking more exhaustively in advance: “I have often looked back at the decision and wondered if we could have approached it in a way that would have done a better job of reconciling our different values from the beginning.” But no matter how much we increase our investment at the front end—perfecting our minds with thinking classes, long ruminations, novel-reading, and moral algebra—we cannot spare ourselves the agony of learning by doing.
thinking  HTT  ethics  psychology 
yesterday by ayjay
How to Pay Attention (course)
This course is an advanced seminar in the anthropology of attention. What makes the anthropology of attention different from other ways of studying attention (e.g. psychology) is that we study it as a social and cultural phenomenon: attention is not just a matter of individual minds selecting objects from environments. Rather, attention is collectively organized and valued. We learn how to pay attention and what to pay attention to from other people; other people make technological and media systems to intentionally organize collective attention. We learn to value certain kinds of attention (e.g. intense focus on work, mindfulness, or multi-tasking) and to criticize others (e.g. absent-mindedness, distraction, intense focus on entertainment) in cultural contexts. So, while we will be experimenting with our own attentions throughout this course, we will remember that our attentions are not really our own. No one pays attention alone.
attention  thinking 
february 2018 by ayjay
Powerless Placebos
Surfing Uncertainty had the the best explanation of the placebo effect I’ve seen. Perceiving the world directly at every moment is too computationally intensive, so instead the brain guesses what the the world is like and uses perception to check and correct its guesses. In a high-bandwidth system like vision, guesses are corrected very quickly and you end up very accurate (except for weird things like ignoring when the word “the” is twice in a row, like it’s been several times in this paragraph already without you noticing). In a low-bandwidth system like pain perception, the original guess plays a pretty big role, with real perception only modulating it to a limited degree (consider phantom limb pain, where the brain guesses that an arm that isn’t there hurts, and nothing can convince it otherwise). Well, if you just saw a truck run over your foot, you have a pretty strong guess that you’re having foot pain. And if you just got a bunch of morphine, you have a pretty strong guess that your pain is better. The real sense-data can modulate it in a Bayesian way, but the sense-data is so noisy that it won’t be weighted highly enough to replace the guess completely.
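The "guess modulated by noisy sense-data" idea is just precision-weighted Bayesian fusion of two estimates. Here is a minimal sketch (not from the article; all numbers and names are illustrative): a high-precision channel like vision lets the data override the prior guess, while a low-precision channel like pain perception leaves the prior guess mostly in charge.

```python
# Precision-weighted fusion of a prior "guess" with noisy sense data.
# Posterior mean of two Gaussian estimates is their precision-weighted average.

def fuse(prior_mean, prior_precision, data_mean, data_precision):
    """Combine a prior guess and sense data, each weighted by its precision."""
    total = prior_precision + data_precision
    return (prior_precision * prior_mean + data_precision * data_mean) / total

# High-bandwidth channel (vision): precise data, so the data dominates.
vision = fuse(prior_mean=0.0, prior_precision=1.0,
              data_mean=10.0, data_precision=100.0)

# Low-bandwidth channel (pain): noisy data, so the prior guess dominates.
# Prior after morphine: "my pain is low" -- a strong guess the data barely moves.
pain = fuse(prior_mean=2.0, prior_precision=10.0,
            data_mean=8.0, data_precision=1.0)

print(vision)  # pulled almost all the way to the sense data
print(pain)    # stays close to the prior guess
```

On this toy model, the placebo is just a shift in `prior_mean`: the belief "I took painkillers" moves the guess, and the noisy pain signal lacks the precision to move it back.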
neuroscience  thinking  knowledge  from instapaper
february 2018 by ayjay
Rebecca Solnit on a Childhood of Reading and Wandering
These linked paths and roads form a circuit of about six miles that I began hiking ten years ago to walk off my angst during a difficult year. I kept coming back to this route for respite from my work and for my work too, because thinking is generally thought of as doing nothing in a production-oriented culture, and doing nothing is hard to do. It’s best done by disguising it as doing something, and the something closest to doing nothing is walking. Walking itself is the intentional act closest to the unwilled rhythms of the body, to breathing and the beating of the heart. It strikes a delicate balance between working and idling, being and doing. It is a bodily labor that produces nothing but thoughts, experiences, arrivals. After all those years of walking to work out other things, it made sense to come back to work close to home, in Thoreau’s sense, and to think about walking.

Walking, ideally, is a state in which the mind, the body, and the world are aligned, as though they were three characters finally in conversation together, three notes suddenly making a chord. Walking allows us to be in our bodies and in the world without being made busy by them. It leaves us free to think without being wholly lost in our thoughts.
walking  thinking  HTT  from instapaper
august 2017 by ayjay
Emotional Intelligence Needs a Rewrite - Issue 51: Limits - Nautilus
Books and articles on emotional intelligence claim that your brain has an inner core that you inherited from reptiles, wrapped in a wild, emotional layer that you inherited from mammals, all enrobed in—and controlled by—a logical layer that is uniquely human. This three-layer view, called the triune brain, has been popular since the 1950s but has no basis in reality. Brains did not evolve in layers. Brains are like companies—they reorganize as they grow in size. The difference between your brain and, say, a chimp or monkey brain has nothing to do with layering and everything to do with microscopic wiring. Decades of neuroscience research now show that no part of your brain is exclusively dedicated to thoughts or emotions. Both are produced by your entire brain as billions of neurons work together. […]

Emotional granularity is a key to emotional intelligence. If your brain can construct many different emotions automatically and make fine distinctions among them, it can tailor your emotions better to your situation. You’re also better equipped to anticipate and perceive emotion in others in the blink of an eye. The more emotions that you know, the more finely your brain can construct emotional meaning automatically from other people’s actions. Even though your brain is always guessing, when it has more options to guess with, the odds are better it will guess appropriately.
emotion  thinking  from instapaper
august 2017 by ayjay
The Invention of Numbers - Education & Culture
Everett chronicles a great deal of evidence suggesting that humans are hardwired in the brain to distinguish one, two, and three, but no more. There are unwritten languages that can mark nouns and verbs as singular and plural but also trial—but none that mark the “four-al” or beyond. Hunter-gatherer people’s languages tend to have “real” numbers for just one, two, three, and four, with four often being something like “two-two.” Note that even in English, we say not “one-th,” “two-th” or “three-th” but have irregular, one-off forms: first, second, and third, where first and second have no sign of one and two and third is only forcedly relatable to three. After that, however, come the predictable fourth, fifth, sixth, and so on. Babies are best at distinguishing one, two, or three things; beyond that, it gets messy. Roman numerals had simple strokes up to three, but then detoured into subtractive complication with the IV for four.
math  thinking  neuroscience  language 
may 2017 by ayjay
The Kekulé Problem - Issue 47: Consciousness - Nautilus
So what are we saying here? That some unknown thinker sat up one night in his cave and said: Wow. One thing can be another thing. Yes. Of course that’s what we are saying. Except that he didnt say it because there was no language for him to say it in. For the time being he had to settle for just thinking it. And when did this take place? Our influential persons claim to have no idea. Of course they dont think that it took place at all. But aside from that. One hundred thousand years ago? Half a million? Longer? Actually a hundred thousand would be a pretty good guess. It dates the earliest known graphics—found in the Blombos Cave in South Africa. These scratchings have everything to do with our chap waking up in his cave. For while it is fairly certain that art preceded language it probably didnt precede it by much. Some influential persons have actually claimed that language could be up to a million years old. They havent explained what we have been doing with it all this time. What we do know—pretty much without question—is that once you have language everything else follows pretty quickly. The simple understanding that one thing can be another thing is at the root of all things of our doing. From using colored pebbles for the trading of goats to art and language and on to using symbolic marks to represent pieces of the world too small to see.
thinking  language  from instapaper
april 2017 by ayjay
Book Review: Seeing Like A State
But psychiatric patients have a metis of dealing with their individual diseases the same way peasants have a metis of dealing with their individual plots of land. My favorite example of this is doctors who learn their patients are taking marijuana, refuse to keep prescribing them their vitally important drugs unless the patient promises to stop, and then get surprised when the patients end up decompensating because the marijuana was keeping them together. I’m not saying smoking marijuana is a good thing. I’m saying that for some people it’s a load-bearing piece of their mental edifice. And if you take it away without any replacement they will fall apart. And they have explained this to you a thousand times and you didn’t believe them.

There are so many fricking patients who respond to sedative medications by becoming stimulated, or stimulant medications by becoming sedated, or who become more anxious whenever they do anti-anxiety exercises, or who hallucinate when placed on some super common medication that has never caused hallucinations in anyone else, or who become suicidal if you try to reassure them that things aren’t so bad, or any other completely perverse and ridiculous violation of the natural order that you can think of. And the only redeeming feature of all of this is that the patients themselves know all of this stuff super-well and are usually happy to tell you if you ask. [...]

Maybe instead of concluding that Scott is too focused on peasant villages, we should conclude that he’s focused on confrontations between a well-educated authoritarian overclass and a totally separate poor underclass. Most modern political issues don’t exactly map onto that – even things like taxes where the rich and the poor are on separate sides don’t have a bimodal distribution. But in cases that are literally about rich people trying to dictate to the poorest of the poor how they should live their lives, maybe this becomes more useful.

Actually, one of the best things the book did to me was make me take cliches about “rich people need to defer to the poor on poverty-related policy ideas” more seriously. This has become so overused that I roll my eyes at it: “Could quantitative easing help end wage stagnation? Instead of asking macroeconomists, let’s ask this 19-year old single mother in the Bronx!” But Scott provides a lot of situations where that was exactly the sort of person they should have asked. He also points out that Tanzanian natives using their traditional farming practices were more productive than European colonists using scientific farming. I’ve had to listen to so many people talk about how “we must respect native people’s different ways of knowing” and “native agriculturalists have a profound respect for the earth that goes beyond logocentric Western ideals” and nobody had ever bothered to tell me before that they actually produced more crops per acre, at least some of the time. That would have put all of the other stuff in a pretty different light.
thinking  from instapaper
march 2017 by ayjay
The One and Only
Ellis, however, when he turns in his concluding chapter from assembling an anthology of passages (fictional and nonfictional) which have helped to construct Shakespeare’s image over the centuries, and addresses instead “what kinds of methods have to be used by those who, with no new information available, feel constrained to produce new biographies,” proves to be both shrewd and perceptive.

There are, he suggests, six basic strategies employed by most of the Shakespeare biographers of our time. These he enumerates as (1) “the argument from absence,” meaning that where there is silence, as with proof of Shakespeare’s Catholicism, there must have been a need for secrecy that effectively proves the case; (2) “minding your language”—the use of what he calls “weasel words” such as “perhaps,” “if,” “probably,” “could have,” “may,” to conceal the fact that we don’t really know; (3) using the plays to reveal distinct features of Shakespeare’s own life and thought; (4) using the sonnets in the same way; (5) shifting the burden onto historical circumstances that apparently elucidate the nature of his private existence; and (6) “the argument from proximity, or joining up the dots,” meaning the deployment of what we know about Shakespeare’s schoolmaster in Stratford, or his relatives and acquaintances, to make the little or nothing actually established about him go a long way.
shakespeare  criticism  thinking  from instapaper
december 2016 by ayjay
