Elephants and Kings: An Environmental History, Trautmann
"Because of their enormous size, elephants have long been irresistible for kings as symbols of their eminence. In early civilizations—such as Egypt, Mesopotamia, the Indus Civilization, and China—kings used elephants for royal sacrifice, spectacular hunts, public display of live captives, or the conspicuous consumption of ivory—all of them tending toward the elephant’s extinction. The kings of India, however, as Thomas R. Trautmann shows in this study, found a use for elephants that actually helped preserve their habitat and numbers in the wild: war.
"Trautmann traces the history of the war elephant in India and the spread of the institution to the west—where elephants took part in some of the greatest wars of antiquity—and Southeast Asia (but not China, significantly), a history that spans 3,000 years and a considerable part of the globe, from Spain to Java. He shows that because elephants eat such massive quantities of food, it was uneconomic to raise them from birth. Rather, in a unique form of domestication, Indian kings captured wild adults and trained them, one by one, through millennia. Kings were thus compelled to protect wild elephants from hunters and elephant forests from being cut down. By taking a wide-angle view of human-elephant relations, Trautmann throws into relief the structure of India’s environmental history and the reasons for the persistence of wild elephants in its forests."
to:NB  books:noted  elephants  ancient_history  india  war
7 hours ago
The Science Of Grading Teachers Gets High Marks | FiveThirtyEight
Errr, if there's a _systematic_ flaw in the research design, which is what the critics are saying, then it's not surprising that applying the same design to a different data set gives similar results! (That's pretty much what it means for the flaw to be systematic.)
8 hours ago
Mellars, P.A.: The Neanderthal Legacy: An Archaeological Perspective from Western Europe. (eBook and Paperback)
"The Neanderthals populated western Europe from nearly 250,000 to 30,000 years ago when they disappeared from the archaeological record. In turn, populations of anatomically modern humans, Homo sapiens, came to dominate the area. Seeking to understand the nature of this replacement, which has become a hotly debated issue, Paul Mellars brings together an unprecedented amount of information on the behavior of Neanderthals. His comprehensive overview ranges from the evidence of tool manufacture and related patterns of lithic technology, through the issues of subsistence and settlement patterns, to the more controversial evidence for social organization, cognition, and intelligence. Mellars argues that previous attempts to characterize Neanderthal behavior as either "modern" or "ape-like" are both overstatements. We can better comprehend the replacement of Neanderthals, he maintains, by concentrating on the social and demographic structure of Neanderthal populations and on their specific adaptations to the harsh ecological conditions of the last glaciation.
"Mellars's approach to these issues is grounded firmly in his archaeological evidence. He illustrates the implications of these findings by drawing from the methods of comparative socioecology, primate studies, and Pleistocene paleoecology. The book provides a detailed review of the climatic and environmental background to Neanderthal occupation in Europe, and of the currently topical issues of the behavioral and biological transition from Neanderthal to fully "modern" populations."
to:NB  books:noted  human_evolution
yesterday
The Rise and Fall of Urban Economies: Lessons from San Francisco and Los Angeles | Michael Storper, Thomas Kemeny, Naji Makarem, and Taner Osman
"Today, the Bay Area is home to the most successful knowledge economy in America, while Los Angeles has fallen progressively farther behind its neighbor to the north and a number of other American metropolises. Yet, in 1970, experts would have predicted that L.A. would outpace San Francisco in population, income, economic power, and influence. The usual factors used to explain urban growth—luck, immigration, local economic policies, and the pool of skilled labor—do not account for the contrast between the two cities and their fates. So what does?
"The Rise and Fall of Urban Economies challenges many of the conventional notions about economic development and sheds new light on its workings. The authors argue that it is essential to understand the interactions of three major components—economic specialization, human capital formation, and institutional factors—to determine how well a regional economy will cope with new opportunities and challenges. Drawing on economics, sociology, political science, and geography, they argue that the economic development of metropolitan regions hinges on previously underexplored capacities for organizational change in firms, networks of people, and networks of leaders. By studying San Francisco and Los Angeles in unprecedented levels of depth, this book extracts lessons for the field of economic development studies and urban regions around the world."

--- I'd be interested to see how they argue that "luck" can't account for the difference. (Also, despite its "fall", the Los Angeles metro area has per-capita incomes higher than at least 90% of the country...)
to:NB  books:noted  economics  cities  san_francisco  los_angeles
yesterday
From Uniform Laws of Large Numbers to Uniform Ergodic Theorems
"The purpose of these lectures is to present three different approaches with their own methods for establishing uniform laws of large numbers and uniform ergodic theorems for dynamical systems. The presentation follows the principle according to which the i.i.d. case is considered first in great detail, and then attempts are made to extend these results to the case of more general dependence structures. The lectures begin (Chapter 1) with a review and description of classic laws of large numbers and ergodic theorems, their connection and interplay, and their infinite dimensional extensions towards uniform theorems with applications to dynamical systems. The first approach (Chapter 2) is of metric entropy with bracketing which relies upon the Blum-DeHardt law of large numbers and Hoffmann-Jørgensen’s extension of it. The result extends to general dynamical systems using the uniform ergodic lemma (or Kingman’s subadditive ergodic theorem). In this context metric entropy and majorizing measure type conditions are also considered. The second approach (Chapter 3) is of Vapnik and Chervonenkis. It relies upon Rademacher randomization (subgaussian inequality) and Gaussian randomization (Sudakov’s minoration) and offers conditions in terms of random entropy numbers. Absolutely regular dynamical systems are shown to support the VC theory using a blocking technique and Eberlein’s lemma. The third approach (Chapter 4) is aimed to cover the wide sense stationary case which is not accessible by the previous two methods. This approach relies upon the spectral representation theorem and offers conditions in terms of the orthogonal stochastic measures which are associated with the underlying dynamical system by means of this theorem. The case of bounded variation is covered, while the case of unbounded variation is left as an open question.
The lectures finish with a supplement in which the role of uniform convergence of reversed martingales towards consistency of statistical models is explained via the concept of Hardy’s regular convergence."
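--- For orientation: the classical law of large numbers says the empirical mean of i.i.d. draws converges to the expectation; the "uniform" versions the lectures are about strengthen this to hold simultaneously over a whole class of functions $\mathcal{F}$ (this is the standard formulation, not a quotation from the book):

```latex
\sup_{f \in \mathcal{F}} \left| \frac{1}{n} \sum_{i=1}^{n} f(X_i) - \mathbb{E}\left[ f(X_1) \right] \right| \longrightarrow 0 \quad \text{almost surely,}
```

with the metric-entropy, VC, and spectral approaches each supplying sufficient conditions on $\mathcal{F}$ (or on the dynamical system generating the $X_i$) for this supremum to vanish.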

--- I got a glance at this once a decade ago in a library, and have been looking for a copy for years.
in_NB  ergodic_theory  vc-dimension  learning_theory  stochastic_processes  to_read  empirical_processes
6 days ago
globalinequality: Did socialism keep capitalism equal?
Some econometric evidence for one of my pet-crank notions. The regression specifications look dubious, however.
9 days ago
Project MUSE - Reading Lucretius in the Renaissance
"In the Renaissance, Epicureanism and other heterodox scientific theories were strongly associated with heresy and atheism, and frequently condemned. Yet, when Lucretius’s Epicurean poem De Rerum Natura reappeared in 1417, these associations did not prevent the poem’s broad circulation. A survey of marginalia in Lucretius manuscripts reveals a characteristic humanist reading agenda, focused on philology and moral philosophy, which facilitated the circulation of such heterodox texts among an audience still largely indifferent to their radical content. Notes in later sixteenth century print copies reveal a transformation in reading methods, and an audience more receptive to heterodox science."

--- Condensation of her forthcoming book (or vice versa).
--- Shorter AP: Sure, experimenting with Lucretius seems harmless enough at the beginning, just philological interest and elegant verse, but all it takes is some cleaned up editions for minds to be warped forever.
10 days ago
Goodbye to the genius who changed the way we think (and you didn’t even know it) - The Washington Post
Scott Page's memorial notice for the late, lamented John Holland. (I am morally certain Scott did not write the headline.)
I was never close to Holland, but he was an inspiration: a figure from the mad old days when things were still so new and formless that no one could tell the difference between genius and inanity, who'd come out of them with genuinely great contributions and an oddly innocent indifference to disciplinary boundaries. I remember how reading and working my way through his _Adaptation in Natural and Artificial Systems_ seemed to open up new worlds...
holland.john  obituaries  cellular_automata  adaptive_behavior  genetic_algorithms  page.scott  cognitive_science  agent-based_models  complexity  to:blog
10 days ago
The Creative Apocalypse That Wasn’t - The New York Times
I'd like to believe this is all right, but there are some points where doubt worries at me. (1) SJ keeps talking about "average" incomes - presumably these are _mean_ incomes, which are going to be very misleading in winner-take-all industries. Median would be much better. (2) This also doesn't speak to the volatility of incomes, which is a huge part of the issue. (3) I also worry about selection bias by just looking at those who in a given year manage to make enough money in the arts to be classified as working artists...
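To see why point (1) matters, here is a toy numerical illustration (all figures invented) of how a winner-take-all income distribution pulls the mean far above the median:

```python
# Toy example: in a winner-take-all market, a single superstar drags
# the mean income far above what the typical participant earns.
import statistics

# hypothetical annual incomes for 100 working artists (invented numbers)
incomes = [10_000] * 90 + [50_000] * 9 + [5_000_000]  # one superstar

print(statistics.mean(incomes))    # → 63500 (pulled up by the superstar)
print(statistics.median(incomes))  # → 10000 (what the typical artist earns)
```

Reporting the "average" here would suggest artists earn six times what 90% of them actually take home.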

--- ETA: some critique here https://pinboard.in/u:cshalizi/b:837728ae56ab
have_read  networked_life  culture_industries  the_work_of_art_in_the_age_of_mechanical_reproduction  johnson.steven  to:blog  intellectual_property
10 days ago
Expectations and Investment
"Using micro data from ... quarterly survey of Chief Financial Officers, we show that corporate investment plans as well as actual investment are well explained by CFOs’ expectations of earnings growth. The information in expectations data is not subsumed by traditional variables, such as Tobin’s Q or discount rates. We also show that errors in CFO expectations of earnings growth are predictable from past earnings and other data, pointing to extrapolative structure of expectations and suggesting that expectations may not be rational. This evidence, like earlier findings in finance, points to the usefulness of data on actual expectations for understanding economic behavior."
to:NB  economics  decision-making  via:jbdelong
12 days ago
Media, markets and institutional change: Evidence from the Protestant Reformation | VOX, CEPR’s Policy Portal
"Internet-based communications technologies appear to be integral to the diffusion of social movements today. This column looks back at the Protestant Reformation – the first mass movement to use the new technology of the printing press to drive social change. It argues that diffusion of the Reformation was not driven by technology alone. Competition and openness in the media were also crucial, and delivered their biggest effects in cities where political freedom was most limited."

--- Looks promising, but needs careful examination
track_down_references  the_printing_press_as_an_agent_of_change  epidemiology_of_representations  reformation  early_modern_european_history  via:henry_farrell
12 days ago
Extracting Low-Dimensional Latent Structure from Time Series in the Presence of Delays
"Noisy, high-dimensional time series observations can often be described by a set of low-dimensional latent variables. Commonly used methods to extract these latent variables typically assume instantaneous relationships between the latent and observed variables. In many physical systems, changes in the latent variables manifest as changes in the observed variables after time delays. Techniques that do not account for these delays can recover a larger number of latent variables than are present in the system, thereby making the latent representation more difficult to interpret. In this work, we introduce a novel probabilistic technique, time-delay gaussian-process factor analysis (TD-GPFA), that performs dimensionality reduction in the presence of a different time delay between each pair of latent and observed variables. We demonstrate how using a gaussian process to model the evolution of each latent variable allows us to tractably learn these delays over a continuous domain. Additionally, we show how TD-GPFA combines temporal smoothing and dimensionality reduction into a common probabilistic framework. We present an expectation/conditional maximization either (ECME) algorithm to learn the model parameters. Our simulations demonstrate that when time delays are present, TD-GPFA is able to correctly identify these delays and recover the latent space. We then applied TD-GPFA to the activity of tens of neurons recorded simultaneously in the macaque motor cortex during a reaching task. TD-GPFA is able to better describe the neural activity using a more parsimonious latent space than GPFA, a method that has been used to interpret motor cortex data but does not account for time delays. More broadly, TD-GPFA can help to unravel the mechanisms underlying high-dimensional time series data by taking into account physical delays in the system."
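--- The core problem is easy to reproduce: a minimal sketch (not the paper's TD-GPFA algorithm, just instantaneous PCA on simulated data) showing how a single delayed latent signal inflates the apparent latent dimensionality:

```python
# One latent signal observed twice, the second copy with a time delay.
# Instantaneous PCA reports two substantial components instead of one.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 20 * np.pi, 2000)
latent = np.sin(t)                  # the single true latent variable
delay = 50                          # delay in samples (chosen for the demo)
x1 = latent + 0.01 * rng.standard_normal(t.size)
x2 = np.roll(latent, delay) + 0.01 * rng.standard_normal(t.size)
X = np.column_stack([x1, x2])
X -= X.mean(axis=0)

# PCA via SVD: fraction of variance along each principal component
_, s, _ = np.linalg.svd(X, full_matrices=False)
var_frac = s**2 / np.sum(s**2)
print(var_frac)  # both components carry substantial variance
```

With no delay the first component would carry essentially all the variance; with the delay, a method blind to time shifts "sees" two latent dimensions where there is one.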
to:NB  to_read  inference_to_latent_objects  dimension_reduction  neural_data_analysis  nonparametrics  time_series  yu.byron
12 days ago
Science Isn’t Broken | FiveThirtyEight
I like the idea of having researchers compete to throw all sorts of different modeling choices at the same data, and the initial example is cool.
13 days ago
Eigenstyle
This is a perfectly nice example. So how sexist am I that I am not going to swap out the cars-and-trucks one in my chapter on PCA for this? (I guess I should at least mention it.)
13 days ago
Purchasing Whiteness: Pardos, Mulattos, and the Quest for Social Mobility in the Spanish Indies | Ann Twinam
"The colonization of Spanish America resulted in the mixing of Natives, Europeans, and Africans and the subsequent creation of a casta system that discriminated against them. Members of mixed races could, however, free themselves from such burdensome restrictions through the purchase of a gracias al sacar—a royal exemption that provided the privileges of Whiteness. For more than a century, the whitening gracias al sacar has fascinated historians. Even while the documents remained elusive, scholars continually mentioned the potential to acquire Whiteness as a provocative marker of the historic differences between Anglo and Latin American treatments of race. Purchasing Whiteness explores the fascinating details of 40 cases of whitening petitions, tracking thousands of pages of ensuing conversations as petitioners, royal officials, and local elites disputed not only whether the state should grant full whiteness to deserving individuals, but whether selective prejudices against the castas should cease."

--- Surely it is only a matter of time before a libertarian proposes reviving this custom for the US.
to:NB  books:noted  race  racism  early_modern_world_history
15 days ago
The Great Beanie Baby Bubble: Mass Delusion and the Dark Side of Cute by Zac Bissonnette - Powell's Books
"There has never been a craze like Beanie Babies. The $5 beanbag animals with names like Seaweed the Otter and Gigi the Poodle drove millions of Americans into a greed-fueled frenzy as they chased the rarest Beanie Babies, whose values escalated weekly in the late 1990s.
"A single Beanie Baby sold for $10,000, and on eBay the animals comprised 10 percent of all sales. Suburban moms stalked UPS trucks to get the latest models, a retired soap opera star lost his kids’ six-figure college funds investing in them, and a New Jersey father sold three million copies of a self-published price guide that predicted what each animal would be worth in ten years. More than any other consumer good in history, Beanie Babies were carried to the height of success by a collective belief that their values would always rise.
"Just as strange as the mass hysteria was the man behind it. From the day he started in the toy industry, after dropping out of college, Ty Warner devoted all his energy to creating what he hoped would be the most perfect stuffed animals the world had ever seen. Sometimes called the "Steve Jobs of plush" by his employees, he obsessed over every detail of every animal. He had no marketing budget and no connections, but he had something more valuable — an intuitive grasp of human psychology that would make him the richest man in the history of toys.
"Through first-ever interviews with former Ty Inc. employees, Warner’s sister, and the two ex-girlfriends who were by his side as he achieved the American dream, The Great Beanie Baby Bubble tells the inspiring yet tragic story of one of America’s most enigmatic self-made tycoons. Bestselling author Zac Bissonnette uncovers Warner’s highly original approach to product development, sales, and marketing that enabled the acquisition of plush animals to activate the same endorphins chased by stock speculators and gamblers.
"Starting with a few Beanie-crazed housewives on a cul-de-sac in Naperville, Illinois, Beanie Babies became the first viral craze of the Internet era. Bissonnette traces their rise from the beginning of the official website — one of the first corporate websites to aggressively engage consumers — to the day when "rare" models became worthless as quickly as they’d once been deemed priceless. He also explores the big questions: Why did grown men and women lose their minds over stuffed animals? Was it something unique about the last years of the American century — or just the weirdest version of the irrational episodes that have happened periodically ever since the Dutch tulip mania of the 1630s?"

--- Permit me to doubt the "genius" bits.
to:NB  books:noted  coveted  market_failures_in_everything  market_bubbles  finance  economics  via:pinboard
15 days ago
Learning by Doing: The Real Connection between Innovation, Wages, and Wealth - James Bessen
"Today’s great paradox is that we feel the impact of technology everywhere—in our cars, our phones, the supermarket, the doctor’s office—but not in our paychecks. In the past, technological advancements dramatically increased wages, but for three decades now, the median wage has remained stagnant. Machines have taken over much of the work of humans, destroying old jobs while increasing profits for business owners. The threat of ever-widening economic inequality looms, but in Learning by Doing, James Bessen argues that increased inequality is not inevitable.
"Workers can benefit by acquiring the knowledge and skills necessary to implement rapidly evolving technologies; unfortunately, this can take years, even decades. Technical knowledge is mostly unstandardized and difficult to acquire, learned through job experience rather than in the classroom. As Bessen explains, the right policies are necessary to provide strong incentives for learning on the job. Politically influential interests have moved policy in the wrong direction recently. Based on economic history as well as analysis of today’s labor markets, his book shows a way to restore broadly shared prosperity."
in_NB  books:noted  economics  class_struggles_in_america  inequality  productivity  computers  innovation  tacit_knowledge  technological_change
16 days ago
The Weekly Ansible, 50 Sci-Fi & Fantasy Works Every Socialist Should...
I am struck by how many of the books are praised for undermining the fantasy genre. I could see recommending them to a comrade if you (1) thought they un-self-consciously enjoyed genre fantasy, and (2) that enjoyment/acceptance hurt the development of their political awareness, but Mieville doesn't really say why "every socialist" should read them. An uncharitable critic could suggest that Mieville is here elevating a personal ambivalence to a universal standard of taste --- but honestly I have no idea _why_ he made such odd suggestions.
literary_criticism  fantasy  science_fiction  mieville.china  via:james-nicoll  socialism
16 days ago
Ancestry - Ann Leckie
I am enough of a Cherryh admirer to have written about her (http://bactra.org/notebooks/cherryh.html), but I have to say that it had never occurred to me that the _Ancillary_ books, which I like very much, were in the same lineage. Perhaps this calls for a re-read.
science_fiction  leckie.ann  cherryh.c.j.
18 days ago
The Business of War | Cambridge University Press
"This is a major new approach to the military revolution and the relationship between warfare and the power of the state in early modern Europe. Whereas previous accounts have emphasised the growth of state-run armies during this period, David Parrott argues instead that the delegation of military responsibility to sophisticated and extensive networks of private enterprise reached unprecedented levels. This included not only the hiring of troops but their equipping, the supply of food and munitions, and the financing of their operations. The book reveals the extraordinary prevalence and capability of private networks of commanders, suppliers, merchants and financiers who managed the conduct of war on land and at sea, challenging the traditional assumption that reliance on mercenaries and the private sector results in corrupt and inefficient military force. In so doing, the book provides essential historical context to contemporary debates about the role of the private sector in warfare."
in_NB  books:noted  early_modern_european_history  mother_courage_raises_the_west  war
18 days ago
Christiansen, F.B.: Theories of Population Variation in Genes and Genomes (eBook, Paperback and Hardcover).
"...an authoritative introduction to both classical and coalescent approaches to population genetics. Written for graduate students and advanced undergraduates by one of the world's leading authorities in the field, the book focuses on the theoretical background of population genetics, while emphasizing the close interplay between theory and empiricism. Traditional topics such as genetic and phenotypic variation, mutation, migration, and linkage are covered and advanced by contemporary coalescent theory, which describes the genealogy of genes in a population, ultimately connecting them to a single common ancestor. Effects of selection, particularly genomic effects, are discussed with reference to molecular genetic variation..."
in_NB  books:noted  genetics  evolutionary_biology
18 days ago
How the Bible Works: An Anthropological Study of Evangelical Biblicism, By Brian Malley, 9780759106659 | Rowman & Littlefield
"What do evangelicals believe when they 'believe in the Bible?' Despite hundreds of English versions that differ in their texts, evangelicals continue to believe that there is a stable text—'the Bible'—which is the authoritative word of God and an essential guide to their everyday lives. To understand this phenomenon of evangelical Biblicism, anthropologist and biblical scholar Brian Malley looks not to the words of the Bible but to the Bible-believing communities. For as Malley demonstrates, it is less the meaning of the words of the Bible itself than how 'the Bible' provides a proper ground for beliefs that matters to evangelicals. Drawing on recent cognitive and social theory and extensive fieldwork in an evangelical church, Malley's book is an invaluable guide for seminarians, social scientists of religion, or for anyone who wants to understand just how the Bible works for American evangelicals."
in_NB  books:noted  ethnography  christianity  religion  anthropology  belief  social_life_of_the_mind
18 days ago
How the Other Half Banks — Mehrsa Baradaran | Harvard University Press
"The United States has two separate banking systems today—one serving the well-to-do and another exploiting everyone else. How the Other Half Banks contributes to the growing conversation on American inequality by highlighting one of its prime causes: unequal credit. Mehrsa Baradaran examines how a significant portion of the population, deserted by banks, is forced to wander through a Wild West of payday lenders and check-cashing services to cover emergency expenses and pay for necessities—all thanks to deregulation that began in the 1970s and continues decades later.
"In an age of corporate megabanks with trillions of dollars in assets, it is easy to forget that America’s banking system was originally created as a public service. Banks have always relied on credit from the federal government, provided on favorable terms so that they could issue low-interest loans. But as banks grew in size and political influence, they shed their social contract with the American people, demanding to be treated as a private industry free from any public-serving responsibility. They abandoned less profitable, low-income customers in favor of wealthier clients and high-yield investments. Fringe lenders stepped in to fill the void. This two-tier banking system has become even more unequal since the 2008 financial crisis.
"Baradaran proposes a solution: reenlisting the U.S. Post Office in its historic function of providing bank services. The post office played an important but largely forgotten role in the creation of American democracy, and it could be deployed again to level the field of financial opportunity."
to:NB  books:noted  banking  class_struggles_in_america  inequality  economics
19 days ago
The Orange Trees of Marrakesh — Stephen Frederic Dale | Harvard University Press
"In his masterwork Muqaddimah, the Arab Muslim Ibn Khaldun (1332–1406), a Tunisian descendant of Andalusian scholars and officials in Seville, developed a method of evaluating historical evidence that allowed him to identify the underlying causes of events. His methodology was derived from Aristotelian notions of nature and causation, and he applied it to create a dialectical model that explained the cyclical rise and fall of North African dynasties. The Muqaddimah represents the world’s first example of structural history and historical sociology. Four centuries before the European Enlightenment, this work anticipated modern historiography and social science.
"In Stephen F. Dale’s The Orange Trees of Marrakesh, Ibn Khaldun emerges as a cultured urban intellectual and professional religious judge who demanded his fellow Muslim historians abandon their worthless tradition of narrative historiography and instead base their works on a philosophically informed understanding of social organizations. His strikingly modern approach to historical research established him as the premodern world’s preeminent historical scholar. It also demonstrated his membership in an intellectual lineage that begins with Plato, Aristotle, and Galen; continues with the Greco-Muslim philosophers al-Farabi, Avicenna, and Averroes; and is renewed with Montesquieu, Hume, Adam Smith, and Durkheim."

--- Isn't the emphasis on the Aristotelian background just what M. Mahdi said in his old book from the 1950s?
in_NB  books:noted  history_of_ideas  ibn_khaldun  lives_of_the_scholars  social_science_methodology
19 days ago
Ashoka in Ancient India — Nayanjot Lahiri | Harvard University Press
"In the third century BCE, Ashoka ruled an empire encompassing much of modern-day India, Pakistan, Afghanistan, and Bangladesh. During his reign, Buddhism proliferated across the South Asian subcontinent, and future generations of Asians came to see him as the ideal Buddhist king. Disentangling the threads of Ashoka’s life from the knot of legend that surrounds it, Nayanjot Lahiri presents a vivid biography of this extraordinary Indian emperor and deepens our understanding of a legacy that extends beyond the bounds of Ashoka’s lifetime and dominion.
"At the center of Lahiri’s account is the complex personality of the Maurya dynasty’s third emperor—a strikingly contemplative monarch, at once ambitious and humane, who introduced a unique style of benevolent governance. Ashoka’s edicts, carved into rock faces and stone pillars, reveal an eloquent ruler who, unusually for the time, wished to communicate directly with his people. The voice he projected was personal, speaking candidly about the watershed events in his life and expressing his regrets as well as his wishes to his subjects.
"Ashoka’s humanity is conveyed most powerfully in his tale of the Battle of Kalinga. Against all conventions of statecraft, he depicts his victory as a tragedy rather than a triumph—a shattering experience that led him to embrace the Buddha’s teachings. Ashoka in Ancient India breathes new life into a towering figure of the ancient world, one who, in the words of Jawaharlal Nehru, “was greater than any king or emperor.”"
to:NB  books:noted  ancient_history  buddhism  lives_of_the_monarchs
19 days ago
Information, Territory, and Networks — Hilde De Weerdt | Harvard University Press
"The occupation of the northern half of the Chinese territories in the 1120s brought about a transformation in political communication in the south that had lasting implications for imperial Chinese history. By the late eleventh century, the Song court no longer dominated the production of information about itself and its territories. Song literati gradually consolidated their position as producers, users, and discussants of court gazettes, official records, archival compilations, dynastic histories, military geographies, and maps. This development altered the relationship between court and literati in political communication for the remainder of the imperial period. Based on a close reading of reader responses to official records and derivatives and on a mapping of literati networks, the author further proposes that the twelfth-century geopolitical crisis resulted in a lasting literati preference for imperial restoration and unified rule."
in_NB  books:noted  medieval_eurasian_history  social_networks  china  political_networks
19 days ago
Why Torture Doesn't Work — Shane O'Mara | Harvard University Press
"Torture is banned because it is cruel and inhumane. But as Shane O’Mara writes in this account of the human brain under stress, another reason torture should never be condoned is because it does not work the way torturers assume it does.
"In countless films and TV shows such as Homeland and 24, torture is portrayed as a harsh necessity. If cruelty can extract secrets that will save lives, so be it. CIA officers and others conducted torture using precisely this justification. But does torture accomplish what its defenders say it does? For ethical reasons, there are no scientific studies of torture. But neuroscientists know a lot about how the brain reacts to fear, extreme temperatures, starvation, thirst, sleep deprivation, and immersion in freezing water, all tools of the torturer’s trade. These stressors create problems for memory, mood, and thinking, and sufferers predictably produce information that is deeply unreliable—and, for intelligence purposes, even counterproductive. As O’Mara guides us through the neuroscience of suffering, he reveals the brain to be much more complex than the brute calculations of torturers have allowed, and he points the way to a humane approach to interrogation, founded in the science of brain and behavior.
"Torture may be effective in forcing confessions, as in Stalin’s Russia. But if we want information that we can depend on to save lives, O’Mara writes, our model should be Napoleon: “It has always been recognized that this way of interrogating men, by putting them to torture, produces nothing worthwhile.”"
in_NB  books:noted  torture  psychology  popular_social_science  our_national_shame
19 days ago
Trusting What You're Told — Paul L. Harris | Harvard University Press
"If children were little scientists who learn best through firsthand observations and mini-experiments, as conventional wisdom holds, how would a child discover that the earth is round—never mind conceive of heaven as a place someone might go after death? Overturning both cognitive and commonplace theories about how children learn, Trusting What You’re Told begins by reminding us of a basic truth: Most of what we know we learned from others.
"Children recognize early on that other people are an excellent source of information. And so they ask questions. But youngsters are also remarkably discriminating as they weigh the responses they elicit. And how much they trust what they are told has a lot to do with their assessment of its source. Trusting What You’re Told opens a window into the moral reasoning of elementary school vegetarians, the preschooler’s ability to distinguish historical narrative from fiction, and the six-year-old’s nuanced stance toward magic: skeptical, while still open to miracles. Paul Harris shares striking cross-cultural findings, too, such as that children in religious communities in rural Central America resemble Bostonian children in being more confident about the existence of germs and oxygen than they are about souls and God.
"We are biologically designed to learn from one another, Harris demonstrates, and this greediness for explanation marks a key difference between human beings and our primate cousins."

--- Well, _scientists_ take most of their knowledge on trust from others...
in_NB  books:noted  social_life_of_the_mind  cognitive_development  cultural_transmission_of_cognitive_tools  cultural_transmission  re:democratic_cognition
19 days ago
The Epic Rhapsode and His Craft — José M. González | Harvard University Press
"The Epic Rhapsode and His Craft studies Homeric performance from archaic to Roman imperial times. It argues that oracular utterance, dramatic acting, and rhetorical delivery powerfully elucidate the practice of epic rhapsodes. Attention to the ways in which these performance domains informed each other over time reveals a shifting dynamic of competition and emulation among rhapsodes, actors, and orators that shaped their texts and their crafts. A diachronic analysis of this web of influences illuminates fundamental aspects of Homeric poetry: its inspiration and composition, the notional fixity of its poetic tradition, and the performance-driven textual fixation and writing of the Homeric poems. It also shows that rhapsodic practice is best understood as an evolving combination of revelation, interpretation, recitation, and dramatic delivery."
in_NB  books:noted  ancient_history  epics
19 days ago
A Natural History of Human Thinking — Michael Tomasello | Harvard University Press
"Tool-making or culture, language or religious belief: ever since Darwin, thinkers have struggled to identify what fundamentally differentiates human beings from other animals. In this much-anticipated book, Michael Tomasello weaves his twenty years of comparative studies of humans and great apes into a compelling argument that cooperative social interaction is the key to our cognitive uniqueness. Once our ancestors learned to put their heads together with others to pursue shared goals, humankind was on an evolutionary path all its own.
"Tomasello argues that our prehuman ancestors, like today’s great apes, were social beings who could solve problems by thinking. But they were almost entirely competitive, aiming only at their individual goals. As ecological changes forced them into more cooperative living arrangements, early humans had to coordinate their actions and communicate their thoughts with collaborative partners. Tomasello’s “shared intentionality hypothesis” captures how these more socially complex forms of life led to more conceptually complex forms of thinking. In order to survive, humans had to learn to see the world from multiple social perspectives, to draw socially recursive inferences, and to monitor their own thinking via the normative standards of the group. Even language and culture arose from the preexisting need to work together. What differentiates us most from other great apes, Tomasello proposes, are the new forms of thinking engendered by our new forms of collaborative and communicative interaction."
in_NB  books:noted  human_evolution  evolutionary_psychology  psychology  collective_cognition  social_life_of_the_mind  tomasello.michael  part_played_by_social_labor_in_the_transition_from_ape_to_man  re:democratic_cognition
19 days ago
A Natural History of Human Morality — Michael Tomasello | Harvard University Press
"A Natural History of Human Morality offers the most detailed account to date of the evolution of human moral psychology. Based on extensive experimental data comparing great apes and human children, Michael Tomasello reconstructs how early humans gradually became an ultra-cooperative and, eventually, a moral species.
"There were two key evolutionary steps, each founded on a new way that individuals could act together as a plural agent “we”. The first step occurred as ecological challenges forced early humans to forage together collaboratively or die. To coordinate these collaborative activities, humans evolved cognitive skills of joint intentionality, ensuring that both partners knew together the normative standards governing each role. To reduce risk, individuals could make an explicit joint commitment that “we” forage together and share the spoils together as equally deserving partners, based on shared senses of trust, respect, and responsibility. The second step occurred as human populations grew and the division of labor became more complex. Distinct cultural groups emerged that demanded from members loyalty, conformity, and cultural identity. In becoming members of a new cultural “we”, modern humans evolved cognitive skills of collective intentionality, resulting in culturally created and objectified norms of right and wrong that everyone in the group saw as legitimate morals for anyone who would be one of “us”.
"As a result of this two-stage process, contemporary humans possess both a second-personal morality for face-to-face engagement with individuals and a group-minded “objective” morality that obliges them to the moral community as a whole."
in_NB  books:noted  human_evolution  evolution_of_cooperation  evolutionary_psychology  moral_psychology  part_played_by_social_labor_in_the_transition_from_ape_to_man  tomasello.michael  collective_cognition  re:democratic_cognition
19 days ago
The Society of Genes — Itai Yanai, Martin Lercher | Harvard University Press
"Nearly four decades ago Richard Dawkins published The Selfish Gene, famously reducing humans to “survival machines” whose sole purpose was to preserve “the selfish molecules known as genes.” How these selfish genes work together to construct the organism, however, remained a mystery. Standing atop a wealth of new research, The Society of Genes now provides a vision of how genes cooperate and compete in the struggle for life.
"Pioneers in the nascent field of systems biology, Itai Yanai and Martin Lercher present a compelling new framework to understand how the human genome evolved and why understanding the interactions among our genes shifts the basic paradigm of modern biology. Contrary to what Dawkins’s popular metaphor seems to imply, the genome is not made of individual genes that focus solely on their own survival. Instead, our genomes comprise a society of genes which, like human societies, is composed of members that form alliances and rivalries.
"In language accessible to lay readers, The Society of Genes uncovers genetic strategies of cooperation and competition at biological scales ranging from individual cells to entire species. It captures the way the genome works in cancer cells and Neanderthals, in sexual reproduction and the origin of life, always underscoring one critical point: that only by putting the interactions among genes at center stage can we appreciate the logic of life."

--- Very nice, except that this is exactly what _Dawkins_ said. I can't remember if the "parliament of genes" metaphor, complete with alliances and rivalries, was in _The Selfish Gene_ or in _The Extended Phenotype_, but this is the line he's been pushing since at least 1982, so this framing does not seem altogether honest to me.
to:NB  books:noted  evolutionary_biology  genetics  evolution_of_cooperation  popular_science  to_be_shot_after_a_fair_trial
19 days ago
Sensitive Matter — Michel Mitov | Harvard University Press
"Life would not exist without sensitive, or soft, matter. All biological structures depend on it, including red blood globules, lung fluid, and membranes. So do industrial emulsions, gels, plastics, liquid crystals, and granular materials. What makes sensitive matter so fascinating is its inherent versatility. Shape-shifting at the slightest provocation, whether a change in composition or environment, it leads a fugitive existence.
"Physicist Michel Mitov brings drama to molecular gastronomy (as when two irreconcilable materials are mixed to achieve the miracle of mayonnaise) and offers answers to everyday questions, such as how does paint dry on canvas, why does shampoo foam better when you “repeat,” and what allows for the controlled release of drugs? Along the way we meet a futurist cook, a scientist with a runaway imagination, and a penniless inventor named Goodyear who added sulfur to latex, quite possibly by accident, and created durable rubber.
"As Mitov demonstrates, even religious ritual is a lesson in the surprising science of sensitive matter. Thrice yearly, the reliquary of St. Januarius is carried down cobblestone streets from the Cathedral to the Church of St. Clare in Naples. If all goes as hoped—and since 1389 it often has—the dried blood contained in the reliquary’s largest vial liquefies on reaching its destination, and Neapolitans are given a reaffirming symbol of renewal."
in_NB  books:noted  physics  condensed_matter  popular_science
19 days ago
The Black Box Society — Frank Pasquale | Harvard University Press
"Every day, corporations are connecting the dots about our personal behavior—silently scrutinizing clues left behind by our work habits and Internet use. The data compiled and portraits created are incredibly detailed, to the point of being invasive. But who connects the dots about what firms are doing with this information? The Black Box Society argues that we all need to be able to do so—and to set limits on how big data affects our lives.
"Hidden algorithms can make (or ruin) reputations, decide the destiny of entrepreneurs, or even devastate an entire economy. Shrouded in secrecy and complexity, decisions at major Silicon Valley and Wall Street firms were long assumed to be neutral and technical. But leaks, whistleblowers, and legal disputes have shed new light on automated judgment. Self-serving and reckless behavior is surprisingly common, and easy to hide in code protected by legal and real secrecy. Even after billions of dollars of fines have been levied, underfunded regulators may have only scratched the surface of this troubling behavior.
"Frank Pasquale exposes how powerful interests abuse secrecy for profit and explains ways to rein them in. Demanding transparency is only the first step. An intelligible society would assure that key decisions of its most important firms are fair, nondiscriminatory, and open to criticism. Silicon Valley and Wall Street need to accept as much accountability as they impose on others."
to:NB  books:noted  data_mining  public_policy  the_wired_ideology  to_teach:data-mining  to_be_shot_after_a_fair_trial
19 days ago
Smart Citizens, Smarter State — Beth Simone Noveck | Harvard University Press
"Government “of the people, by the people, for the people” expresses an ideal that resonates in all democracies. Yet poll after poll reveals deep distrust of institutions that seem to have left “the people” out of the governing equation. Government bureaucracies that are supposed to solve critical problems on their own are a troublesome outgrowth of the professionalization of public life in the industrial age. They are especially ill-suited to confronting today’s complex challenges.
"Offering a far-reaching program for innovation, Smart Citizens, Smarter State suggests that public decisionmaking could be more effective and legitimate if government were smarter—if our institutions knew how to use technology to leverage citizens’ expertise. Just as individuals use only part of their brainpower to solve most problems, governing institutions make far too little use of the skills and experience of those inside and outside of government with scientific credentials, practical skills, and ground-level street smarts. New tools—what Beth Simone Noveck calls technologies of expertise—are making it possible to match the supply of citizen expertise to the demand for it in government.
"Drawing on a wide range of academic disciplines and practical examples from her work as an adviser to governments on institutional innovation, Noveck explores how to create more open and collaborative institutions. In so doing, she puts forward a profound new vision for participatory democracy rooted not in the paltry act of occasional voting or the serendipity of crowdsourcing but in people’s knowledge and know-how."

--- Probably worth reading, despite the "Just as individuals use only part of their brainpower to solve most problems".
to:NB  books:noted  decision-making  public_policy  democracy  collective_cognition  social_life_of_the_mind  institutions  re:democratic_cognition  to_be_shot_after_a_fair_trial
19 days ago
Exposed — Bernard E. Harcourt | Harvard University Press
"Social media compile data on users, retailers mine information on consumers, Internet giants create dossiers of who we know and what we do, and intelligence agencies collect all this plus billions of communications daily. Exploiting our boundless desire to access everything all the time, digital technology is breaking down whatever boundaries still exist between the state, the market, and the private realm. Exposed offers a powerful critique of our new virtual transparence, revealing just how unfree we are becoming and how little we seem to care.
"Bernard Harcourt guides us through our new digital landscape, one that makes it so easy for others to monitor, profile, and shape our every desire. We are building what he calls the expository society—a platform for unprecedented levels of exhibition, watching, and influence that is reconfiguring our political relations and reshaping our notions of what it means to be an individual.
"We are not scandalized by this. To the contrary: we crave exposure and knowingly surrender our privacy and anonymity in order to tap into social networks and consumer convenience—or we give in ambivalently, despite our reservations. But we have arrived at a moment of reckoning. If we do not wish to be trapped in a steel mesh of wireless digits, we have a responsibility to do whatever we can to resist. Disobedience to a regime that relies on massive data mining can take many forms, from aggressively encrypting personal information to leaking government secrets, but all will require conviction and courage."

--- This description seems much too credulous about the effectiveness of data mining to me, but Harcourt has a lot of credit with me from previous work so the last tag applies with force.
to:NB  books:noted  data_mining  national_surveillance_state  social_media  networked_life  harcourt.bernard_e.  to_be_shot_after_a_fair_trial
19 days ago
The Technological Indian — Ross Bassett | Harvard University Press
"In the late 1800s, Indians seemed to be a people left behind by the Industrial Revolution, dismissed as “not a mechanical race.” Today Indians are among the world’s leaders in engineering and technology. In this international history spanning nearly 150 years, Ross Bassett—drawing on a unique database of every Indian to graduate from the Massachusetts Institute of Technology between its founding and 2000—charts their ascent to the pinnacle of high-tech professions.
"As a group of Indians sought a way forward for their country, they saw a future in technology. Bassett examines the tensions and surprising congruences between this technological vision and Mahatma Gandhi’s nonindustrial modernity. India’s first prime minister, Jawaharlal Nehru, sought to use MIT-trained engineers to build an India where the government controlled technology for the benefit of the people. In the private sector, Indian business families sent their sons to MIT, while MIT graduates established India’s information technology industry.
"By the 1960s, students from the Indian Institutes of Technology (modeled on MIT) were drawn to the United States for graduate training, and many of them stayed, as prominent industrialists, academics, and entrepreneurs. The MIT-educated Indian engineer became an integral part of a global system of technology-based capitalism and focused less on India and its problems—a technological Indian created at the expense of a technological India."

--- Much of my family fits that description.
to:NB  books:noted  history_of_technology  india  engineering  lumpentechnocracy  development_policy  mit
19 days ago
Animal Electricity — Robert B. Campenot | Harvard University Press
"Like all cellular organisms, humans run on electricity. Slight imbalances of electric charge across cell membranes result in sensation, movement, awareness, and thinking—nearly everything we associate with being alive. Robert Campenot offers a comprehensive overview of animal electricity, examining its physiological mechanisms as well as the experimental discoveries that form the basis for our modern understanding of nervous systems across the animal kingdom.
"Cells work much like batteries. Concentration gradients of sodium and potassium cause these ions to flow in and out of cells by way of protein channels, creating tiny voltages across the cell membrane. The cellular mechanisms that switch these ion currents on and off drive all the functions associated with animal nervous systems, from nerve impulses and heartbeats to the 600-volt shocks produced by electric eels.
"Campenot’s examination of the nervous system is presented in the context of ideas as they evolved in the past, as well as today’s research and its future implications. The discussion ranges from the pre-Renaissance notion of animal spirits and Galvani’s eighteenth-century discovery of animal electricity, to modern insights into how electrical activity produces learning and how electrical signals in the cortex can be used to connect the brains of paralyzed individuals to limbs or prosthetic devices. Campenot provides the necessary scientific background to make the book highly accessible for general readers while conveying much about the process of scientific discovery."
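--- The "cells work much like batteries" point is easy to make quantitative. As a toy illustration (the ion concentrations below are standard textbook values for a mammalian neuron, not taken from the book), the Nernst equation E = (RT/zF) ln([out]/[in]) gives the familiar equilibrium potentials for potassium and sodium:

```python
import math

def nernst_mV(c_out, c_in, z=1, T=310.0):
    """Nernst equilibrium potential in millivolts.

    c_out, c_in: ion concentrations outside/inside the cell (same units);
    z: ionic charge; T: absolute temperature (310 K = body temperature)."""
    R = 8.314      # gas constant, J/(mol K)
    F = 96485.0    # Faraday constant, C/mol
    return 1000.0 * (R * T) / (z * F) * math.log(c_out / c_in)

# assumed textbook-style concentrations, in mM:
E_K = nernst_mV(5.0, 140.0)    # potassium: roughly -89 mV
E_Na = nernst_mV(145.0, 12.0)  # sodium: roughly +67 mV
```

The tens-of-millivolts scale of these numbers is the "tiny voltages" of the blurb; a nerve impulse is a swing between the potassium and sodium equilibria as the respective channels open and close.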
in_NB  books:noted  history_of_science  biology  biophysics  neuroscience
19 days ago
Afghan Modern — Robert D. Crews | Harvard University Press
"Rugged, remote, riven by tribal rivalries and religious violence, Afghanistan seems to many a country frozen in time and forsaken by the world. Afghan Modern presents a bold challenge to these misperceptions, revealing how Afghans, over the course of their history, have engaged and connected with a wider world and come to share in our modern globalized age.
"Always a mobile people, Afghan travelers, traders, pilgrims, scholars, and artists have ventured abroad for centuries, their cosmopolitan sensibilities providing a compass for navigating a constantly changing world. Robert Crews traces the roots of Afghan globalism to the early modern period, when, as the subjects of sprawling empires, the residents of Kabul, Kandahar, and other urban centers forged linkages with far-flung imperial centers throughout the Middle East and Asia. Focusing on the emergence of an Afghan state out of this imperial milieu, he shows how Afghan nation-making was part of a series of global processes, refuting the usual portrayal of Afghans as pawns in the “Great Game” of European powers and of Afghanistan as a “hermit kingdom.”
"In the twentieth century, the pace of Afghan interaction with the rest of the world dramatically increased, and many Afghan men and women came to see themselves at the center of ideological struggles that spanned the globe. Through revolution, war, and foreign occupations, Afghanistan became even more enmeshed in the global circulation of modern politics, occupying a pivotal position in the Cold War and the tumultuous decades that followed."
in_NB  books:noted  afghanistan  state-building  imperialism  globalization  coveted
19 days ago
Tying Flood Insurance to Flood Risk for Low-Lying Structures in the Floodplains | The National Academies Press
"Floods take a heavy toll on society, costing lives, damaging buildings and property, disrupting livelihoods, and sometimes necessitating federal disaster relief, which has risen to record levels in recent years. The National Flood Insurance Program (NFIP) was created in 1968 to reduce the flood risk to individuals and their reliance on federal disaster relief by making federal flood insurance available to residents and businesses if their community adopted floodplain management ordinances and minimum standards for new construction in flood prone areas. Insurance rates for structures built after a flood plain map was adopted by the community were intended to reflect the actual risk of flooding, taking into account the likelihood of inundation, the elevation of the structure, and the relationship of inundation to damage to the structure. Today, rates are subsidized for one-fifth of the NFIP's 5.5 million policies. Most of these structures are negatively elevated, that is, the elevation of the lowest floor is lower than the NFIP construction standard. Compared to structures built above the base flood elevation, negatively elevated structures are more likely to incur a loss because they are inundated more frequently, and the depths and durations of inundation are greater.
"Tying Flood Insurance to Flood Risk for Low-Lying Structures in the Floodplains studies the pricing of negatively elevated structures in the NFIP. This report reviews current NFIP methods for calculating risk-based premiums for these structures, including risk analysis, flood maps, and engineering data. The report then evaluates alternative approaches for calculating risk-based premiums and discusses the engineering, hydrologic, and property assessment data needed to implement full risk-based premiums. The findings and conclusions of this report will help to improve the accuracy and precision of loss estimates for negatively elevated structures, which in turn will increase the credibility, fairness, and transparency of premiums for policyholders."
books:noted  insurance  public_policy  disasters  floods
21 days ago
Mark Zuckerberg, Let Me Pay for Facebook - The New York Times
I very strongly suspect this will never happen, because it would mean admitting that the ads will _never_ be worth that much, and a mere subscription-model business can't support those valuations. But maybe I'm too cynical.
tufekci.zeynep  social_media  if_youre_not_paying_for_the_service_youre_the_product  to:blog
21 days ago
Hoffman, P.T.: Why Did Europe Conquer the World? (eBook and Hardcover).
"Between 1492 and 1914, Europeans conquered 84 percent of the globe. But why did Europe rise to the top, when for centuries the Chinese, Japanese, Ottomans, and South Asians were far more advanced? Why didn’t these powers establish global dominance? In Why Did Europe Conquer the World?, distinguished economic historian Philip Hoffman demonstrates that conventional explanations—such as geography, epidemic disease, and the Industrial Revolution—fail to provide answers. Arguing instead for the pivotal role of economic and political history, Hoffman shows that if variables had been at all different, Europe would not have achieved critical military innovations, and another power could have become master of the world.
"In vivid detail, Hoffman sheds light on the two millennia of economic, political, and historical changes that set European states on a distinctive path of development and military rivalry. Compared to their counterparts in China, Japan, South Asia, and the Middle East, European leaders—whether chiefs, lords, kings, emperors, or prime ministers—had radically different incentives, which drove them to make war. These incentives, which Hoffman explores using an economic model of political costs and financial resources, resulted in astonishingly rapid growth in Europe’s military sector from the Middle Ages on, and produced an insurmountable lead in gunpowder technology. The consequences determined which states established colonial empires or ran the slave trade, and even which economies were the first to industrialize."
to:NB  books:noted  early_modern_world_history  mother_courage_raises_the_west
22 days ago
Alien abduction insurance - Wikipedia, the free encyclopedia
As Ceglowski says, the premium/payout ratio here seems high, suggesting a real market opportunity.
insurance  psychoceramics  aliens  via:pinboard
23 days ago
Drinking Soylent With The Last Of The California War Boys | MORNING, COMPUTER
"There are doubtless a ton of hot takes about Soylent founder Rob Rhinehart’s recent detailed statement about his current lifestyle and philosophy.  Everyone’s done jokes about Soylent, including me, so we’ll leave that to the side.  One summation of his new statement would be that he’s living the classic 80s cyberpunk lifestyle – living off a single solar panel and a butane burner, wearing clothes made by subsistence-wage workers in China that he throws away when they get dirty, and writing long, confused philosophical screeds that probably largely make sense only in his head.  It would be both pointless and cruel to go after every single example of choplogic and error.  All that should be taken from his statement is that he treats humanity in much the same way he treats food — as something “rotting.”  The guy’s going to be found living in an old bath in Oakland in five years, and we should only feel pity and concern for his well-being.
"Seasteading’s been and gone for the second (third?) time, the secession and Six-State-California guys have been and gone.  It is that time in the cycle where the Libertarian App Future Brothers start living off the grid, buying guns and getting good and weird out there alone in the dark.  I wonder how we’ll look back at this whole period of the last five or ten years.  At how the digital gold rush and the strange pressures of a new, yet accelerated, period of cultural invention cooked a whole new set of mental wounds out of the people swept up in it.  Yes, sure, it gave us sociopaths who prefer humans to be drones and believe that everything is rotting.  But I think, reviewing the era, that we will be sad.  I think we may look back and consider that, one more time, we saw the best minds of our generation destroyed by madness, starving hysterical naked, dragging themselves after an Uber that isn’t actually there because Uber fake most of those little cars you see on the Uber app map."
libertarianism  the_wired_ideology  cultural_criticism  ellis.warren  to:blog
26 days ago
AEAweb: AER (105,8) p. 2695 - In the Name of the Son (and the Daughter): Intergenerational Mobility in the United States, 1850-1940
"This paper estimates historical intergenerational elasticities between fathers and children of both sexes in the United States using a novel empirical strategy. The key insight of our approach is that the information about socioeconomic status conveyed by first names can be used to create pseudo-links across generations. We find that both father-son and father-daughter elasticities were flat during the nineteenth century, increased sharply between 1900 and 1920, and declined slightly thereafter. We discuss the role of regional disparities in economic development, trends in inequality and returns to human capital, and the marriage market in explaining these patterns."

--- I will be interested to see how exactly they do status-estimation-by-first-names, and whether it's convincing. (E.g., once everyone knows that names like "Dwayne" are low-class but "Chet" is high-class, surely the [ambitious] prole will become more likely to name their son "Chet", diluting the signal.)
to:NB  economics  economic_history  inequality  names
26 days ago
AEAweb: AER (105,8) p. 2570 - Back to Fundamentals: Equilibrium in Abstract Economies
"We propose a new abstract definition of equilibrium in the spirit of competitive equilibrium: a profile of alternatives and a public ordering (expressing prestige, price, or a social norm) such that each agent prefers his assigned alternative to all lower-ranked ones. The equilibrium operates in an abstract setting built upon a concept of convexity borrowed from convex geometry. We apply the concept to a variety of convex economies and relate it to Pareto optimality. The "magic" of linear equilibrium prices is put into perspective by establishing an analogy between linear functions in the standard convexity and "primitive orderings" in the abstract convexity."
to:NB  economics  convexity  rubinstein.ariel  mathematics
26 days ago
[1507.01059] Remarks on kernel Bayes' rule
"Kernel Bayes' rule has been proposed as a nonparametric kernel-based method to realize Bayesian inference in reproducing kernel Hilbert spaces. However, we demonstrate both theoretically and experimentally that the prediction result by kernel Bayes' rule is in some cases unnatural. We consider that this phenomenon is in part due to the fact that the assumptions in kernel Bayes' rule do not hold in general."

--- The point about the "kernel posterior" being insensitive to the prior seems the strongest to me; indeed, it is so strong that, despite reading the proof, I am not sure whether it is right or whether there is an algebraic mistake I'm missing.
to:NB  bayesianism  kernel_methods  hilbert_space  computational_statistics  statistics  have_read
26 days ago
[1507.08376] A Joint Graph Inference Case Study: the C.elegans Chemical and Electrical Connectomes
"We investigate joint graph inference for the chemical and electrical connectomes of the Caenorhabditis elegans roundworm. The C. elegans connectomes consist of 253 non-isolated neurons with known functional attributes, and there are two types of synaptic connectomes, resulting in a pair of graphs. We formulate our joint graph inference from the perspectives of seeded graph matching and joint vertex classification. Our results suggest that connectomic inference should proceed in the joint space of the two connectomes, which has significant neuroscientific implications."
to:NB  neuroscience  network_data_analysis  statistics  re:network_differences
27 days ago
[1507.05034] Bootstrap tuning in ordered model selection
"In the problem of model selection for a given family of linear estimators, ordered by their variance, we offer a new "smallest accepted" approach motivated by Lepski's method and multiple testing theory. The procedure selects the smallest model which satisfies an acceptance rule based on comparison with all larger models. The method is completely data-driven and does not use any prior information about the variance structure of the noise: its parameters are adjusted to the underlying, possibly heterogeneous, noise by the so-called "propagation condition" using a wild bootstrap method. The validity of the bootstrap calibration is proved for finite samples with an explicit error bound. We provide a comprehensive theoretical study of the method and describe in detail the set of possible values of the selector $\hat{m}$. We also establish some precise oracle error bounds for the corresponding estimator $\hat{\theta} = \tilde{\theta}_{\hat{m}}$, which apply equally to estimation of the whole parameter vector, some subvector or linear mapping, and estimation of a linear functional."
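--- The "smallest accepted" idea is easy to illustrate on a toy example (my sketch, not the paper's procedure; the polynomial family, the sign-flip bootstrap of residuals from the largest model, and the 0.99 quantile are all simplifying assumptions): among nested polynomial fits, select the smallest degree whose fitted values are within bootstrap-calibrated distance of every larger fit.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
x = np.linspace(-1.0, 1.0, n)
y = 2 * x**3 - x + 0.1 * rng.standard_normal(n)   # true signal is cubic

degrees = range(9)                  # nested models, ordered by complexity
H = {}                              # hat (projection) matrix of each fit
for m in degrees:
    Q, _ = np.linalg.qr(np.vander(x, m + 1, increasing=True))
    H[m] = Q @ Q.T
fits = {m: H[m] @ y for m in degrees}

# wild bootstrap: sign-flip the residuals of the largest model and record
# how large the difference between two fits gets on pure noise
resid = y - fits[max(degrees)]
z = {}
for m in degrees:
    for mp in degrees:
        if mp > m:
            D = H[mp] - H[m]
            diffs = [np.linalg.norm(D @ (resid * rng.choice([-1.0, 1.0], n)))
                     for _ in range(200)]
            z[(m, mp)] = np.quantile(diffs, 0.99)

# "smallest accepted": smallest model consistent with all larger ones
m_hat = max(degrees)
for m in degrees:
    if all(np.linalg.norm(fits[mp] - fits[m]) <= z[(m, mp)]
           for mp in degrees if mp > m):
        m_hat = m
        break
```

Nothing here uses the noise variance: the thresholds adapt to it through the resampled residuals, which is (in caricature) the point of the propagation-condition calibration.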
to:NB  model_selection  bootstrap  statistics
27 days ago
[1507.05185] Fast Sparse Least-Squares Regression with Non-Asymptotic Guarantees
"In this paper, we study a fast approximation method for {\it large-scale high-dimensional} sparse least-squares regression problem by exploiting the Johnson-Lindenstrauss (JL) transforms, which embed a set of high-dimensional vectors into a low-dimensional space. In particular, we propose to apply the JL transforms to the data matrix and the target vector and then to solve a sparse least-squares problem on the compressed data with a {\it slightly larger regularization parameter}. Theoretically, we establish the optimization error bound of the learned model for two different sparsity-inducing regularizers, i.e., the elastic net and the ℓ1 norm. Compared with previous relevant work, our analysis is {\it non-asymptotic and exhibits more insights} on the bound, the sample complexity and the regularization. As an illustration, we also provide an error bound of the {\it Dantzig selector} under JL transforms."
to:NB  regression  random_projections  computational_statistics  statistics  sparsity
27 days ago
[1507.05910] Clustering is Efficient for Approximate Maximum Inner Product Search
"Locality Sensitive Hashing (LSH) techniques have recently become a popular solution for solving the approximate Maximum Inner Product Search (MIPS) problem, which arises in many situations and have in particular been used as a speed-up for the training of large neural probabilistic language models.
"In this paper we propose a new approach for solving approximate MIPS based on a variant of the k-means algorithm. We suggest using spherical k-means which is an algorithm that can efficiently be used to solve the approximate Maximum Cosine Similarity Search (MCSS), and basing ourselves on previous work by Shrivastava and Li we show how it can be adapted for approximate MIPS.
"Our new method compares favorably with LSH-based methods on a simple recall rate test, by providing a more accurate set of candidates for the maximum inner product. The proposed method is thus likely to benefit the wide range of problems with very large search spaces where a robust approximate MIPS heuristic could be of interest, such as for providing a high quality short list of candidate words to speed up the training of neural probabilistic language models."

---- I thought people viewed k-means as a _kind_ of locality-sensitive hashing?
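--- For reference, the reduction the abstract builds on (in the style of Shrivastava and Li, and of later simplifications) is only a few lines: rescale the database into the unit ball, append one coordinate so every augmented vector is unit-norm, and append a zero to the normalized query; cosine search in the augmented space then recovers the exact inner-product maximizer. A minimal numpy sketch, with my own variable names, not anyone's published code:

```python
import numpy as np

def augment(X):
    """Map database vectors so that maximum inner product search (MIPS)
    reduces to maximum cosine similarity search (MCSS): scale all rows
    into the unit ball, then append a coordinate making each row unit-norm."""
    Xs = X / np.linalg.norm(X, axis=1).max()          # all norms <= 1
    slack = 1.0 - np.sum(Xs**2, axis=1)
    extra = np.sqrt(np.clip(slack, 0.0, None))        # clip float error
    return np.hstack([Xs, extra[:, None]])

def augment_query(q):
    """Normalize the query and append a zero coordinate."""
    return np.append(q / np.linalg.norm(q), 0.0)

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 16))
q = rng.normal(size=16)

best = np.argmax(X @ q)                 # exact MIPS answer
Xa, qa = augment(X), augment_query(q)
# augmented vectors are unit-norm, so Xa @ qa is exactly the cosine,
# and it is a positive monotone transform of the original inner product
assert np.argmax(Xa @ qa) == best
```

The extra coordinate contributes nothing to the inner product with the query (which has a zero there), so the cosine ordering coincides with the inner-product ordering.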
to:NB  databases  hashing  clustering  k-means
27 days ago
[1507.00066] Fast Cross-Validation for Incremental Learning
"Cross-validation (CV) is one of the main tools for performance estimation and parameter tuning in machine learning. The general recipe for computing CV estimate is to run a learning algorithm separately for each CV fold, a computationally expensive process. In this paper, we propose a new approach to reduce the computational burden of CV-based performance estimation. As opposed to all previous attempts, which are specific to a particular learning model or problem domain, we propose a general method applicable to a large class of incremental learning algorithms, which are uniquely fitted to big data problems. In particular, our method applies to a wide range of supervised and unsupervised learning tasks with different performance criteria, as long as the base learning algorithm is incremental. We show that the running time of the algorithm scales logarithmically, rather than linearly, in the number of CV folds. Furthermore, the algorithm has favorable properties for parallel and distributed implementation. Experiments with state-of-the-art incremental learning algorithms confirm the practicality of the proposed method."
to:NB  cross-validation  computational_statistics  statistics
27 days ago
[1507.00803] Optimal design of experiments in the presence of network-correlated outcomes
"We consider the problem of how to assign treatment in a randomized experiment, when the correlation among the outcomes is informed by a network available pre-intervention. Working within the potential outcome causal framework, we develop a class of models that posit such a correlation structure among the outcomes, and a strategy for allocating treatment optimally, for the goal of minimizing the integrated mean squared error of the estimated average treatment effect. We provide insights into features of the optimal designs via an analytical decomposition of the mean squared error used for optimization. We illustrate how the proposed treatment allocation strategy improves on allocations that ignore the network structure, with extensive simulations."
to:NB  experimental_design  network_data_analysis  causal_inference  statistics  airoldi.edo  to_read
27 days ago
[1507.00827] Estimating the number of communities in networks by spectral methods
"Community detection is a fundamental problem in network analysis with many methods available to estimate communities. Most of these methods assume that the number of communities is known, which is often not the case in practice. We propose a simple and very fast method for estimating the number of communities based on the spectral properties of certain graph operators, such as the non-backtracking matrix and the Bethe Hessian matrix. We show that the method performs well under several models and a wide range of parameters, and is guaranteed to be consistent under several asymptotic regimes. We compare the new method to several existing methods for estimating the number of communities and show that it is both more accurate and more computationally efficient."
to:NB  community_discovery  model_selection  statistics  spectral_methods  levina.elizaveta
27 days ago
[1507.00964] Non-parametric estimation of Fisher information from real data
"The Fisher Information matrix is a widely used measure for applications ranging from statistical inference, information geometry, experiment design, to the study of criticality in biological systems. Yet there is no commonly accepted non-parametric algorithm to estimate it from real data. In this rapid communication we show how to accurately estimate the Fisher information in a nonparametric way. We also develop a numerical procedure to minimize the errors by choosing the interval of the finite difference scheme necessary to compute the derivatives in the definition of the Fisher information. Our method uses the recently published "Density Estimation using Field Theory" algorithm to compute the probability density functions for continuous densities. We use the Fisher information of the normal distribution to validate our method and as an example we compute the temperature component of the Fisher Information Matrix in the two dimensional Ising model and show that it obeys the expected relation to the heat capacity and therefore peaks at the phase transition at the correct critical temperature."

--- The idea of a non-parametric estimate of Fisher information, which presumes a parametric model, is a little boggling. I should read it, I guess, but perhaps they have some sort of semi-parametric setting in mind?
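--- The finite-difference scheme itself is simple to state: sample from p(·; θ), numerically differentiate the log-density in θ, and average the squared score. A sketch that plugs in the exact normal log-density where the paper would use its nonparametric density estimate, so this only validates the differencing step, not their estimator:

```python
import numpy as np

def fisher_info_fd(logpdf, theta, h=1e-3, n=200_000, rng=None):
    """Finite-difference estimate of scalar Fisher information
    I(theta) = E[(d/dtheta log p(X; theta))^2]: central-difference the
    log-density in theta at points sampled from p(.; theta)."""
    rng = rng or np.random.default_rng(0)
    x = rng.normal(loc=theta, scale=1.0, size=n)   # sample from N(theta, 1)
    score = (logpdf(x, theta + h) - logpdf(x, theta - h)) / (2 * h)
    return float(np.mean(score**2))

# exact N(mu, 1) log-density, standing in for a density estimate
norm_logpdf = lambda x, mu: -0.5 * (x - mu) ** 2 - 0.5 * np.log(2 * np.pi)
est = fisher_info_fd(norm_logpdf, theta=0.0)
# For N(mu, 1), I(mu) = 1, and the Monte Carlo error here is ~0.005
assert abs(est - 1.0) < 0.05
```

For the quadratic normal log-likelihood the central difference is exact, so all the error is Monte Carlo; with estimated densities the choice of h becomes the hard part, which is presumably why the paper spends effort on it.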
to:NB  fisher_information  estimation  nonparametrics  statistics  to_be_shot_after_a_fair_trial
27 days ago
[1507.01173] Model Diagnostics Based on Cumulative Residuals: The R-package gof
"The generalized linear model is widely used in all areas of applied statistics and while correct asymptotic inference can be achieved under misspecification of the distributional assumptions, a correctly specified mean structure is crucial to obtain interpretable results. Usually the linearity and functional form of predictors are checked by inspecting various scatter plots of the residuals, however, the subjective task of judging these can be challenging. In this paper we present an implementation of model diagnostics for the generalized linear model as well as structural equation models, based on aggregates of the residuals where the asymptotic behavior under the null is imitated by simulations. A procedure for checking the proportional hazard assumption in the Cox regression is also implemented."
to:NB  model_checking  regression  linear_regression  statistics  to_teach:modern_regression
27 days ago
[1507.02061] Honest confidence regions and optimality in high-dimensional precision matrix estimation
"We propose methodology for estimation of sparse precision matrices and statistical inference for their low-dimensional parameters in a high-dimensional setting where the number of parameters p can be much larger than the sample size. We show that the novel estimator achieves minimax rates in supremum norm and the low-dimensional components of the estimator have a Gaussian limiting distribution. These results hold uniformly over the class of precision matrices with row sparsity of small order n‾‾√/logp and spectrum uniformly bounded, under sub-Gaussian tail assumption on the margins of the true underlying distribution. Consequently, our results lead to uniformly valid confidence regions for low-dimensional parameters of the precision matrix. Thresholding the estimator leads to variable selection without imposing irrepresentability conditions. The performance of the method is demonstrated in a simulation study."
to:NB  confidence_sets  estimation  high-dimensional_statistics  statistics  van_de_geer.sara
27 days ago
[1507.02284] The Information Sieve
"We introduce a new framework for unsupervised learning of deep representations based on a novel hierarchical decomposition of information. Intuitively, data is passed through a series of progressively fine-grained sieves. Each layer of the sieve recovers a single latent factor that is maximally informative about multivariate dependence in the data. The data is transformed after each pass so that the remaining unexplained information trickles down to the next layer. Ultimately, we are left with a set of latent factors explaining all the dependence in the original data and remainder information consisting of independent noise. We present a practical implementation of this framework for discrete variables and apply it to a variety of tasks including independent component analysis, lossy and lossless compression, and predicting missing values in data."
to:NB  information_theory  inference_to_latent_objects  factor_analysis  clustering  statistics  kith_and_kin  ver_steeg.greg  galstyan.aram
27 days ago
[1507.02608] Understanding consistency in hybrid causal structure learning
"We consider causal structure learning from observational data. The main existing approaches can be classified as constraint-based, score-based and hybrid methods, where the latter combine aspects of both constraint-based and score-based approaches. Hybrid methods often apply a greedy search on a restricted search space, where the restricted space is estimated using a constraint-based method. The restriction on the search space is a fundamental principle of the hybrid methods and makes them computationally efficient. However, this can come at the cost of inconsistency or at least at the cost of a lack of consistency proofs. In this paper, we demonstrate such inconsistency in an explicit example. In spite of the lack of consistency results, many hybrid methods have empirically been shown to outperform consistent score-based methods such as greedy equivalence search (GES).
"We present a consistent hybrid method, called adaptively restricted GES (ARGES). It is a modification of GES, where the restriction on the search space depends on an estimated conditional independence graph and also changes adaptively depending on the current state of the algorithm. Although the adaptive modification is necessary to achieve consistency in general, our empirical results show that it has a relatively minor effect. This provides an explanation for the empirical success of (inconsistent) hybrid methods."
to:NB  causal_discovery  graphical_models  statistics  maathuis.marloes
27 days ago
[1507.02925] Completely random measures for modelling block-structured sparse networks
"Many statistical methods for network data parameterize the edge-probability by attributing latent traits to the vertices such as block structure and assume exchangeability in the sense of the Aldous-Hoover representation theorem. Empirical studies of networks indicates that many large, real-world networks have a power-law distribution of the vertices which in turn implies the number of edges scale slower than quadratically in the number of vertices. These assumptions are fundamentally irreconcilable as the Aldous-Hoover theorem implies quadratic scaling of the number of edges. Recently Caron and Fox (2014) proposed the use of a different notion of exchangeability due to Kallenberg (2009) and obtained a network model which admits power-law behaviour while retaining desirable statistical properties, however this model do not capture latent vertex traits such as block-structure. In this work we re-introduce the use of block-structure for network modelling in the new setting and thereby obtain a model which admits the inference of block-structure and edge inhomogeneity. We derive a simple expression for the likelihood and an efficient sampling method. The obtained model is not significantly more difficult to implement than existing methods and performs well on real network datasets."
27 days ago
[1507.03194] A Review of Nonnegative Matrix Factorization Methods for Clustering
"Nonnegative Matrix Factorization (NMF) was first introduced as a low-rank matrix approximation technique, and has enjoyed a wide area of applications. Although NMF does not seem related to the clustering problem at first, it was shown that they are closely linked. In this report, we provide a gentle introduction to clustering and NMF before reviewing the theoretical relationship between them. We then explore several NMF variants, namely Sparse NMF, Projective NMF, Nonnegative Spectral Clustering and Cluster-NMF, along with their clustering interpretations."
to:NB  low-rank_approximation  clustering  statistics  data_mining
27 days ago
[1507.03293] Tail Analysis without Tail Information : A Worst-case Perspective
"Tail modeling refers to the task of selecting the best probability distributions that describe the occurrences of extreme events. One common bottleneck in this task is that, due to their very nature, tail data are often very limited. The conventional approach uses parametric fitting, but the validity of the choice of a parametric model is usually hard to verify. This paper describes a reasonable alternative that does not require any parametric assumption. The proposed approach is based on a worst-case analysis under the geometric premise of tail convexity, a feature shared by all known parametric tail distributions. We demonstrate that the worst-case convex tail behavior is either extremely light-tailed or extremely heavy-tailed. We also construct low-dimensional nonlinear programs that can both distinguish between the two cases and find the worst-case tail. Numerical results show that the proposed approach gives a competitive performance versus using conventional parametric methods."
to:NB  statistics  heavy_tails
27 days ago
[1507.03984] Sensitivity Analysis Without Assumptions
"Unmeasured confounding may undermine the validity of causal inference with observational studies. Sensitivity analysis provides an attractive way to partially circumvent this issue by assessing the potential influence of unmeasured confounding on the causal conclusions. However, previous sensitivity analysis approaches often make strong and untestable assumptions such as having a confounder that is binary, or having no interaction between the effects of the exposure and the confounder on the outcome, or having only one confounder. Without imposing any assumptions on the confounder or confounders, we derive a bounding factor and a sharp inequality such that the sensitivity analysis parameters must satisfy the inequality if an unmeasured confounder is to explain away the observed effect estimate or reduce it to a particular level. Our approach is easy to implement and involves only two sensitivity parameters. Surprisingly, our bounding factor, which makes no simplifying assumptions, is no more conservative than a number of previous sensitivity analysis techniques that do make assumptions. Our new bounding factor implies not only the traditional Cornfield conditions that both the relative risk of the exposure on the confounder and that of the confounder on the outcome must satisfy, but also a high threshold that the maximum of these relative risks must satisfy. Furthermore, this new bounding factor can be viewed as a measure of the strength of confounding between the exposure and the outcome induced by a confounder."
to:NB  causal_inference  misspecification  partial_identification  statistics  vanderweele.tyler  to_read  re:homophily_and_confounding
27 days ago
[1507.04118] Oracle inequalities for network models and sparse graphon estimation
"Inhomogeneous random graph models encompass many network models such as stochastic block models and latent position models. In this paper, we study two estimators -- the ordinary block constant least squares estimator, and its restricted version. We show that they satisfy oracle inequalities with respect to the block constant oracle. As a consequence, we derive optimal rates of estimation of the probability matrix. Our results cover the important setting of sparse networks. Nonparametric rates for graphon estimation in the L2 norm are also derived when the probability matrix is sampled according to a graphon model. The results shed light on the differences between estimation under the empirical loss (the probability matrix estimation) and under the integrated loss (the graphon estimation)."
to:NB  network_data_analysis  graph_limits  statistics  minimax  estimation  re:smoothing_adjacency_matrices
27 days ago
[1507.04553] Approximate Maximum Likelihood Estimation
"In recent years, methods of approximate parameter estimation have attracted considerable interest in complex problems where exact likelihoods are hard to obtain. In their most basic form, Bayesian methods such as Approximate Bayesian Computation (ABC) involve sampling from the parameter space and keeping those parameters that produce data that fit sufficiently well to the actually observed data. Exploring the whole parameter space, however, makes this approach inefficient in high dimensional problems. This led to the proposal of more sophisticated iterative methods of inference such as particle filters.
"Here, we propose an alternative approach that is based on stochastic gradient methods and applicable both in a frequentist and a Bayesian setting. By moving along a simulated gradient, the algorithm produces a sequence of estimates that will eventually converge either to the maximum likelihood estimate or to the maximum of the posterior distribution, in each case under a set of observed summary statistics. To avoid reaching only a local maximum, we propose to run the algorithm from a set of random starting values.
"As good tuning of the algorithm is important, we explored several tuning strategies, and propose a set of guidelines that worked best in our simulations. We investigate the performance of our approach in simulation studies, and also apply the algorithm to two models with intractable likelihood functions. First, we present an application to inference in the context of queuing systems. We also re-analyze population genetic data and estimate parameters describing the demographic history of Sumatran and Bornean orang-utan populations."
in_NB  statistics  computational_statistics  stochastic_approximation  likelihood  estimation  primates
27 days ago
[1507.02822] Hawkes Processes
"Hawkes processes are a particularly interesting class of stochastic process that have been applied in diverse areas, from earthquake modelling to financial analysis. They are point processes whose defining characteristic is that they 'self-excite', meaning that each arrival increases the rate of future arrivals for some period of time. Hawkes processes are well established, particularly within the financial literature, yet many of the treatments are inaccessible to one not acquainted with the topic. This survey provides background, introduces the field and historical developments, and touches upon all major aspects of Hawkes processes."
to:NB  point_processes  stochastic_processes  time_series  statistics  have_read
27 days ago
[1507.05181] The Mondrian Process for Machine Learning
"This report is concerned with the Mondrian process and its applications in machine learning. The Mondrian process is a guillotine-partition-valued stochastic process that possesses an elegant self-consistency property. The first part of the report uses simple concepts from applied probability to define the Mondrian process and explore its properties.
"The Mondrian process has been used as the main building block of a clever online random forest classification algorithm that turns out to be equivalent to its batch counterpart. We outline a slight adaptation of this algorithm to regression, as the remainder of the report uses regression as a case study of how Mondrian processes can be utilized in machine learning. In particular, the Mondrian process will be used to construct a fast approximation to the computationally expensive kernel ridge regression problem with a Laplace kernel.
"The complexity of random guillotine partitions generated by a Mondrian process and hence the complexity of the resulting regression models is controlled by a lifetime hyperparameter. It turns out that these models can be efficiently trained and evaluated for all lifetimes in a given range at once, without needing to retrain them from scratch for each lifetime value. This leads to an efficient procedure for determining the right model complexity for a dataset at hand.
"The limitation of having a single lifetime hyperparameter will motivate the final Mondrian grid model, in which each input dimension is endowed with its own lifetime parameter. In this model we preserve the property that its hyperparameters can be tweaked without needing to retrain the modified model from scratch."
to:NB  bayesian_nonparametrics  nonparametrics  regression  statistics
27 days ago
