Vellend, M.: The Theory of Ecological Communities (MPB-57) (eBook and Hardcover).
"A plethora of different theories, models, and concepts make up the field of community ecology. Amid this vast body of work, is it possible to build one general theory of ecological communities? What other scientific areas might serve as a guiding framework? As it turns out, the core focus of community ecology—understanding patterns of diversity and composition of biological variants across space and time—is shared by evolutionary biology and its very coherent conceptual framework, population genetics theory. The Theory of Ecological Communities takes this as a starting point to pull together community ecology’s various perspectives into a more unified whole.
"Mark Vellend builds a theory of ecological communities based on four overarching processes: selection among species, drift, dispersal, and speciation. These are analogues of the four central processes in population genetics theory—selection within species, drift, gene flow, and mutation—and together they subsume almost all of the many dozens of more specific models built to describe the dynamics of communities of interacting species. The result is a theory that allows the effects of many low-level processes, such as competition, facilitation, predation, disturbance, stress, succession, colonization, and local extinction to be understood as the underpinnings of high-level processes with widely applicable consequences for ecological communities."
to:NB  books:noted  ecology  evolutionary_biology
10 days ago
Graphical Modeling for Multivariate Hawkes Processes with Nonparametric Link Functions - Eichler - 2016 - Journal of Time Series Analysis - Wiley Online Library
"Hawkes (1971a) introduced a powerful multivariate point process model of mutually exciting processes to explain causal structure in data. In this article, it is shown that the Granger causality structure of such processes is fully encoded in the corresponding link functions of the model. A new nonparametric estimator of the link functions based on a time-discretized version of the point process is introduced by using an infinite order autoregression. Consistency of the new estimator is derived. The estimator is applied to simulated data and to neural spike train data from the spinal dorsal horn of a rat."
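--- For concreteness, a minimal simulation of the simplest member of this family: a univariate Hawkes process with an exponential excitation kernel, drawn via Ogata's thinning algorithm. (The paper's setting is multivariate with nonparametric link functions; this sketch, with parameters of my choosing, just shows the self-exciting mechanism.)

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, horizon, seed=0):
    """Simulate event times on [0, horizon) for a univariate Hawkes process
    with intensity lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta*(t - t_i)),
    using Ogata's thinning algorithm."""
    rng = random.Random(seed)
    events, t = [], 0.0
    while True:
        # The intensity just after t upper-bounds it until the next event,
        # because the exponential kernel only decays between events.
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - s)) for s in events)
        t += rng.expovariate(lam_bar)
        if t >= horizon:
            return events
        lam_t = mu + sum(alpha * math.exp(-beta * (t - s)) for s in events)
        if rng.random() * lam_bar <= lam_t:  # thinning: accept w.p. lam_t/lam_bar
            events.append(t)
```

With `alpha < beta` the process is stable; each event transiently raises the intensity, producing the clustered, "mutually exciting" event streams the model is known for.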
to:NB  time_series  graphical_models  nonparametrics  statistics  point_processes  neural_data_analysis
15 days ago
Turing learning: a metric-free approach to inferring behavior and its application to swarms | SpringerLink
"We propose Turing Learning, a novel system identification method for inferring the behavior of natural or artificial systems. Turing Learning simultaneously optimizes two populations of computer programs, one representing models of the behavior of the system under investigation, and the other representing classifiers. By observing the behavior of the system as well as the behaviors produced by the models, two sets of data samples are obtained. The classifiers are rewarded for discriminating between these two sets, that is, for correctly categorizing data samples as either genuine or counterfeit. Conversely, the models are rewarded for ‘tricking’ the classifiers into categorizing their data samples as genuine. Unlike other methods for system identification, Turing Learning does not require predefined metrics to quantify the difference between the system and its models. We present two case studies with swarms of simulated robots and prove that the underlying behaviors cannot be inferred by a metric-based system identification method. By contrast, Turing Learning infers the behaviors with high accuracy. It also produces a useful by-product—the classifiers—that can be used to detect abnormal behavior in the swarm. Moreover, we show that Turing Learning also successfully infers the behavior of physical robot swarms. The results show that collective behaviors can be directly inferred from motion trajectories of individuals in the swarm, which may have significant implications for the study of animal collectives. Furthermore, Turing Learning could prove useful whenever a behavior is not easily characterizable using metrics, making it suitable for a wide range of applications."

--- Oh FFS. Co-evolutionary learning of classifiers and hard instances was an old idea when I encountered it in graduate school 20+ years ago. (See, e.g., the discussion of Hillis's work in the 1980s in ch. 1 of Mitchell's _Introduction to Genetic Algorithms_ [1996].) I suppose it's possible that the paper acknowledges this is a new implementation of an ancient idea, while the abstract (and the publicity: http://www.defenseone.com/technology/2016/09/new-ai-learns-through-observation-alone-what-means-drone-surveillance/131322/ ) is breathless. It's _possible_.

(Also: anyone who thinks that using classification accuracy means they're doing "metric-free systems identification" fully deserves what will happen to them.)
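--- To make the coevolutionary scheme concrete (and to show how little machinery it requires), here is a toy version on a hypothetical 1-D problem of my own devising: the "system" emits Gaussian samples, models are candidate means, and classifiers are interval detectors. Classifiers are scored on discriminating genuine from counterfeit samples; models are scored on fooling classifiers.

```python
import random

def turing_learning(system_sampler, gens=150, pop=16, seed=0):
    """Toy coevolution of models (candidate means) vs. classifiers
    ((center, width) interval detectors), in the style the abstract describes."""
    rng = random.Random(seed)
    models = [rng.uniform(-5.0, 5.0) for _ in range(pop)]
    clfs = [(rng.uniform(-5.0, 5.0), rng.uniform(0.5, 3.0)) for _ in range(pop)]

    def judge(clf, x):                    # 1 = "genuine", 0 = "counterfeit"
        c, w = clf
        return 1 if abs(x - c) <= w else 0

    for _ in range(gens):
        genuine = [system_sampler(rng) for _ in range(pop)]
        fakes = [rng.gauss(m, 1.0) for m in models]   # one sample per model
        # Classifier fitness: correct calls on genuine and counterfeit data.
        cfit = [sum(judge(c, g) for g in genuine) +
                sum(1 - judge(c, f) for f in fakes) for c in clfs]
        # Model fitness: how many classifiers its sample fools.
        mfit = [sum(judge(c, f) for c in clfs) for f in fakes]
        # Truncation selection plus Gaussian mutation, for both populations.
        best_m = [m for _, m in sorted(zip(mfit, models), reverse=True)[:pop // 2]]
        models = [rng.gauss(rng.choice(best_m), 0.3) for _ in range(pop)]
        best_c = [c for _, c in sorted(zip(cfit, clfs), reverse=True)[:pop // 2]]
        clfs = [(rng.gauss(c, 0.3), abs(rng.gauss(w, 0.1)) + 1e-3)
                for (c, w) in (rng.choice(best_c) for _ in range(pop))]
    return sum(models) / pop              # consensus estimate of the system's mean
```

Note that the classifiers' fitness is, precisely, a classification accuracy, i.e., a metric.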
21 days ago
"One of the deepest ideological divides in contemporary epistemology concerns the relative importance of belief versus credence. A prominent consideration in favor of credence-based epistemology is the ease with which it appears to account for rational action. In contrast, cases with risky payoff structures threaten to break the link between rational belief and rational action. This threat poses a challenge to traditional epistemology, which maintains the theoretical prominence of belief. The core problem, we suggest, is that belief may not be enough to register all aspects of a subject’s epistemic position with respect to any given proposition. We claim this problem can be solved by introducing other doxastic attitudes—genuine representations—that differ in strength from belief. The resulting alternative picture, a kind of doxastic states pluralism, retains the central features of traditional epistemology—most saliently, an emphasis on truth as a kind of objective accuracy—while adequately accounting for rational action."
to:NB  epistemology  decision_theory  rationality
22 days ago
IEEE Xplore Document - Excess-Risk of Distributed Stochastic Learners
"This work studies the learning ability of consensus and diffusion distributed learners from continuous streams of data arising from different but related statistical distributions. Four distinctive features for diffusion learners are revealed in relation to other decentralized schemes even under left-stochastic combination policies. First, closed-form expressions for the evolution of their excess-risk are derived for strongly-convex risk functions under a diminishing step-size rule. Second, using these results, it is shown that the diffusion strategy improves the asymptotic convergence rate of the excess-risk relative to non-cooperative schemes. Third, it is shown that when the in-network cooperation rules are designed optimally, the performance of the diffusion implementation can outperform that of naive centralized processing. Finally, the arguments further show that diffusion outperforms consensus strategies asymptotically, and that the asymptotic excess-risk expression is invariant to the particular network topology. The framework adopted in this work studies convergence in the stronger mean-square-error sense, rather than in distribution, and develops tools that enable a close examination of the differences between distributed strategies in terms of asymptotic behavior, as well as in terms of convergence rates."
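--- In its simplest adapt-then-combine form, the "diffusion" strategy is just a local stochastic-gradient step followed by neighborhood averaging. A toy scalar least-mean-squares version on a ring of nodes (my setup, much simpler than the paper's general risks and combination policies):

```python
import random

def diffusion_lms(n_nodes=5, theta=2.0, steps=4000, mu=0.02, seed=0):
    """Adapt-then-combine diffusion LMS: each node streams its own data
    y = theta * x + noise and cooperates only with its ring neighbors."""
    rng = random.Random(seed)
    w = [0.0] * n_nodes
    for _ in range(steps):
        # Adapt: each node takes a local stochastic-gradient (LMS) step.
        psi = []
        for k in range(n_nodes):
            x = rng.gauss(0, 1)
            y = theta * x + rng.gauss(0, 0.1)
            psi.append(w[k] + mu * x * (y - w[k] * x))
        # Combine: average with ring neighbors (uniform combination weights).
        w = [(psi[k - 1] + psi[k] + psi[(k + 1) % n_nodes]) / 3.0
             for k in range(n_nodes)]
    return w
```

Every node converges to the common parameter despite seeing only its own stream, which is the cooperative gain the excess-risk analysis quantifies.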
to:NB  learning_theory  distributed_systems  statistics  collective_cognition  via:ded-maxim
22 days ago
[1609.00037] Good Enough Practices in Scientific Computing
"We present a set of computing tools and techniques that every researcher can and should adopt. These recommendations synthesize inspiration from our own work, from the experiences of the thousands of people who have taken part in Software Carpentry and Data Carpentry workshops over the past six years, and from a variety of other guides. Unlike some other guides, our recommendations are aimed specifically at people who are new to research computing."
to:NB  to_teach:statcomp  to_teach  scientific_computing  have_read  to:blog
23 days ago
[1606.08650] Approximate Smoothing and Parameter Estimation in High-Dimensional State-Space Models
"We present approximate algorithms for performing smoothing in a class of high-dimensional state-space models via sequential Monte Carlo methods ("particle filters"). In high dimensions, a prohibitively large number of Monte Carlo samples ("particles") -- growing exponentially in the dimension of the state space -- is usually required to obtain a useful smoother. Using blocking strategies as in Rebeschini and Van Handel (2015) (and earlier pioneering work on blocking), we exploit the spatial ergodicity properties of the model to circumvent this curse of dimensionality. We thus obtain approximate smoothers that can be computed recursively in time and in parallel in space. First, we show that the bias of our blocked smoother is bounded uniformly in the time horizon and in the model dimension. We then approximate the blocked smoother with particles and derive the asymptotic variance of idealised versions of our blocked particle smoother to show that variance is no longer adversely affected by the dimension of the model. Finally, we employ our method to successfully perform maximum-likelihood estimation via stochastic gradient-ascent and stochastic expectation--maximisation algorithms in a 100-dimensional state-space model."
to:NB  particle_filters  time_series  statistical_inference_for_stochastic_processes  filtering  stochastic_processes  state-space_models  high-dimensional_statistics  singh.sumeetpal_s.  statistics  re:fitness_sampling
23 days ago
[1508.05906] Chaining, Interpolation, and Convexity
"We show that classical chaining bounds on the suprema of random processes in terms of entropy numbers can be systematically improved when the underlying set is convex: the entropy numbers need not be computed for the entire set, but only for certain "thin" subsets. This phenomenon arises from the observation that real interpolation can be used as a natural chaining mechanism. Unlike the general form of Talagrand's generic chaining method, which is sharp but often difficult to use, the resulting bounds involve only entropy numbers but are nonetheless sharp in many situations in which classical entropy bounds are suboptimal. Such bounds are readily amenable to explicit computations in specific examples, and we discover some old and new geometric principles for the control of chaining functionals as special cases."
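--- For context, the classical bound being improved here is Dudley's entropy integral: for a process $(X_t)_{t \in T}$ that is subgaussian with respect to a metric $d$,

```latex
\mathbb{E}\sup_{t \in T} X_t \;\le\; C \int_0^\infty \sqrt{\log N(T, d, \varepsilon)}\, d\varepsilon,
```

where $N(T,d,\varepsilon)$ is the covering number of $T$ at scale $\varepsilon$. The point of the paper is that when $T$ is convex, the covering numbers need only be controlled on certain thin subsets, closing part of the gap to Talagrand's sharp but unwieldy generic chaining functional.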
to:NB  empirical_processes  learning_theory  approximation  convexity  functional_analysis  van_handel.ramon
23 days ago
[1301.6585] Can local particle filters beat the curse of dimensionality?
"The discovery of particle filtering methods has enabled the use of nonlinear filtering in a wide array of applications. Unfortunately, the approximation error of particle filters typically grows exponentially in the dimension of the underlying model. This phenomenon has rendered particle filters of limited use in complex data assimilation problems. In this paper, we argue that it is often possible, at least in principle, to develop local particle filtering algorithms whose approximation error is dimension-free. The key to such developments is the decay of correlations property, which is a spatial counterpart of the much better understood stability property of nonlinear filters. For the simplest possible algorithm of this type, our results provide under suitable assumptions an approximation error bound that is uniform both in time and in the model dimension. More broadly, our results provide a framework for the investigation of filtering problems and algorithms in high dimension."
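--- A minimal version of the idea in code: run a bootstrap particle filter on a d-dimensional model, but resample each block of coordinates independently, using only that block's observation likelihood, so that the weight variance is governed by the block size rather than by d. The bias this introduces at block boundaries is what decay of correlations controls. (Toy linear-Gaussian local dynamics of my choosing, not the paper's general setting.)

```python
import math
import random

def block_particle_filter(ys, d, block, n_part=200, seed=0):
    """Blocked bootstrap particle filter for a toy model with local dynamics
    X_t[j] = 0.9 * X_{t-1}[j] + N(0,1) and observations Y_t[j] = X_t[j] + N(0,1).
    Returns the filtering mean for each coordinate after the last observation."""
    rng = random.Random(seed)
    parts = [[rng.gauss(0, 1) for _ in range(d)] for _ in range(n_part)]
    for y in ys:
        # Propagate every particle through the local dynamics.
        parts = [[0.9 * x + rng.gauss(0, 1) for x in p] for p in parts]
        new_parts = [[0.0] * d for _ in range(n_part)]
        for b0 in range(0, d, block):
            bl = range(b0, min(b0 + block, d))
            # Blockwise weights: only this block's coordinates contribute.
            logw = [sum(-0.5 * (y[j] - p[j]) ** 2 for j in bl) for p in parts]
            m = max(logw)
            w = [math.exp(lw - m) for lw in logw]
            tot = sum(w)
            for i in range(n_part):
                # Resample this block's coordinates independently of the rest.
                r, acc, k = rng.random() * tot, 0.0, 0
                for k, wk in enumerate(w):
                    acc += wk
                    if acc >= r:
                        break
                for j in bl:
                    new_parts[i][j] = parts[k][j]
        parts = new_parts
    return [sum(p[j] for p in parts) / n_part for j in range(d)]
```

A plain particle filter would instead weight whole particles by the product of all d likelihood terms, which is where the exponential-in-dimension weight degeneracy comes from.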
to:NB  filtering  van_handel.ramon  particle_filters  stochastic_processes  high-dimensional_statistics
23 days ago
[1308.4117] Comparison Theorems for Gibbs Measures
"The Dobrushin comparison theorem is a powerful tool to bound the difference between the marginals of high-dimensional probability distributions in terms of their local specifications. Originally introduced to prove uniqueness and decay of correlations of Gibbs measures, it has been widely used in statistical mechanics as well as in the analysis of algorithms on random fields and interacting Markov chains. However, the classical comparison theorem requires validity of the Dobrushin uniqueness criterion, essentially restricting its applicability in most models to a small subset of the natural parameter space. In this paper we develop generalized Dobrushin comparison theorems in terms of influences between blocks of sites, in the spirit of Dobrushin-Shlosman and Weitz, that substantially extend the range of applicability of the classical comparison theorem. Our proofs are based on the analysis of an associated family of Markov chains. We develop in detail an application of our main results to the analysis of sequential Monte Carlo algorithms for filtering in high dimension."
to:NB  statistical_mechanics  stochastic_processes  ergodic_theory  mixing  van_handel.ramon
23 days ago
The price of complexity in financial networks
"Financial institutions form multilayer networks by engaging in contracts with each other and by holding exposures to common assets. As a result, the default probability of one institution depends on the default probability of all of the other institutions in the network. Here, we show how small errors on the knowledge of the network of contracts can lead to large errors in the probability of systemic defaults. From the point of view of financial regulators, our findings show that the complexity of financial networks may decrease the ability to mitigate systemic risk, and thus it may increase the social cost of financial crises."
to:NB  networks  economics  financial_markets  risk_assessment
24 days ago
How chimpanzees cooperate in a competitive world
to:NB  evolution_of_cooperation  primates
24 days ago
Semantic representations in the temporal pole predict false memories
"Recent advances in neuroscience have given us unprecedented insight into the neural mechanisms of false memory, showing that artificial memories can be inserted into the memory cells of the hippocampus in a way that is indistinguishable from true memories. However, this alone is not enough to explain how false memories can arise naturally in the course of our daily lives. Cognitive psychology has demonstrated that many instances of false memory, both in the laboratory and the real world, can be attributed to semantic interference. Whereas previous studies have found that a diverse set of regions show some involvement in semantic false memory, none have revealed the nature of the semantic representations underpinning the phenomenon. Here we use fMRI with representational similarity analysis to search for a neural code consistent with semantic false memory. We find clear evidence that false memories emerge from a similarity-based neural code in the temporal pole, a region that has been called the “semantic hub” of the brain. We further show that each individual has a partially unique semantic code within the temporal pole, and this unique code can predict idiosyncratic patterns of memory errors. Finally, we show that the same neural code can also predict variation in true-memory performance, consistent with an adaptive perspective on false memory. Taken together, our findings reveal the underlying structure of neural representations of semantic knowledge, and how this semantic structure can both enhance and distort our memories."
to:NB  neuroscience  psychology  memory  fmri
24 days ago
Intellectual Pursuits of Nicolas Rashevsky - The Queer Duck | Maya Shmailov | Springer
"Who was Nicolas Rashevsky? To answer that question, this book draws on Rashevsky’s unexplored personal archival papers and shares interviews with his family, students and friends, as well as discussions with biologists and mathematical biologists, to flesh out and complete the picture.
"“Most modern-day biologists have never heard of Rashevsky. Why?” In what constitutes the first detailed biography of theoretical physicist Nicolas Rashevsky (1899-1972), spanning key aspects of his long scientific career, the book captures Rashevsky’s ways of thinking about the place mathematical biology should have in biology and his personal struggle for the acceptance of his views. It brings to light the tension between mathematicians, theoretical physicists and biologists when it comes to the introduction of physico-mathematical tools into biology. Rashevsky’s successes and failures in his efforts to establish mathematical biology as a subfield of biology provide an important test case for understanding the role of theory (in particular mathematics) in understanding the natural world.
"With the biological sciences moving towards new vistas of inter- and multi-disciplinary collaborations and research programs, the book will appeal to a wide readership ranging from historians, sociologists, and ethnographers of American science and culture to students and general readers with an interest in the history of the life sciences, mathematical biology and the social construction of science."

--- Rashevsky has long seemed to me to be a key player in the secret intellectual history of the 20th century, someone who influenced and encouraged all sorts of people who made more famous (and perhaps more lasting) contributions...
to:NB  books:noted  history_of_science  lives_of_the_scientists  rashevsky.nicolas
24 days ago
[1510.04740] Semiparametric theory and empirical processes in causal inference
"In this paper we review important aspects of semiparametric theory and empirical processes that arise in causal inference problems. We begin with a brief introduction to the general problem of causal inference, and go on to discuss estimation and inference for causal effects under semiparametric models, which allow parts of the data-generating process to be unrestricted if they are not of particular interest (i.e., nuisance functions). These models are very useful in causal problems because the outcome process is often complex and difficult to model, and there may only be information available about the treatment process (at best). Semiparametric theory gives a framework for benchmarking efficiency and constructing estimators in such settings. In the second part of the paper we discuss empirical process theory, which provides powerful tools for understanding the asymptotic behavior of semiparametric estimators that depend on flexible nonparametric estimators of nuisance functions. These tools are crucial for incorporating machine learning and other modern methods into causal inference analyses. We conclude by examining related extensions and future directions for work in semiparametric causal inference."
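--- The standard textbook example (not spelled out in the abstract): under ignorability, the efficient influence function for the average treatment effect $\psi = \mathbb{E}[Y(1) - Y(0)]$ is

```latex
\varphi(O) \;=\; \frac{A}{\pi(X)}\bigl(Y - \mu_1(X)\bigr)
\;-\; \frac{1-A}{1-\pi(X)}\bigl(Y - \mu_0(X)\bigr)
\;+\; \mu_1(X) - \mu_0(X) - \psi,
```

where $\pi(X) = \Pr(A=1 \mid X)$ and $\mu_a(X) = \mathbb{E}[Y \mid A=a, X]$ are the nuisance functions. Solving the corresponding estimating equation, with the nuisances fit by flexible (e.g., machine-learning) methods, gives the doubly robust AIPW estimator; the empirical-process conditions in the second part of the paper are what license plugging in such flexible estimates.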
to:NB  statistics  causal_inference  nonparametrics  empirical_processes  to_read  kith_and_kin  kennedy.edward_h.
29 days ago
Measuring Paradigmaticness of Disciplines Using Text | Sociological Science
"In this paper, we describe new methods that use the text of publications to measure the paradigmaticness of disciplines. Drawing on the text of published articles in the Web of Science, we build samples of disciplinary discourse. Using these language samples, we measure the two core concepts of paradigmaticness—consensus and rapid discovery (Collins 1994)—and show the relative positioning of eight example disciplines on each of these measures. Our measures show consistent differences between the “hard” sciences and “soft” social sciences. Deviations in the expected ranking of disciplines within the sciences and social sciences suggest new interpretations of the hierarchy of disciplines, directions for future research, and further insight into the developments in disciplinary structure and discourse that shape paradigmaticness."
to:NB  sociology_of_science  text_mining  via:kjhealy
4 weeks ago
Asymmetric Information and Intermediation Chains
"We propose a parsimonious model of bilateral trade under asymmetric information to shed light on the prevalence of intermediation chains that stand between buyers and sellers in many decentralized markets. Our model features a classic problem in economics where an agent uses his market power to inefficiently screen a privately informed counterparty. Paradoxically, involving moderately informed intermediaries also endowed with market power can improve trade efficiency. Long intermediation chains in which each trader's information set is similar to those of his direct counterparties limit traders' incentives to post prices that reduce trade volume and jeopardize gains to trade."
to:NB  economics  market_failures_in_everything  economics_of_imperfect_information
4 weeks ago
Robust Social Decisions
"We propose and operationalize normative principles to guide social decisions when individuals potentially have imprecise and heterogeneous beliefs, in addition to conflicting tastes or interests. To do so, we adapt the standard Pareto principle to those preference comparisons that are robust to belief imprecision and characterize social preferences that respect this robust principle. We also characterize a suitable restriction of this principle. The former principle provides stronger guidance when it can be satisfied; when it cannot, the latter always provides minimal guidance."
to:NB  decision_theory  social_choice  risk_vs_uncertainty  re:knightian_uncertainty
4 weeks ago
Interactive R On-Line
"IROL was developed by the team of Howard Seltman (email feedback), Rebecca Nugent, Sam Ventura, Ryan Tibshirani, and Chris Genovese at the Department of Statistics at Carnegie Mellon University."

--- I mark this as "to_teach:statcomp", but of course the point is to have people go through this _before_ that course, so the class can cover more interesting stuff.
R  kith_and_kin  seltman.howard  nugent.rebecca  genovese.christopher  ventura.samuel  tibshirani.ryan  to_teach:statcomp
4 weeks ago
shinyTex
"ShinyTex is a system for authoring interactive World Wide Web applications (apps) which includes the full capabilities of the R statistical language, particularly in the context of Technology Enhanced Learning (TEL). It uses a modified version of the LaTeX syntax that is standard for document creation among mathematicians and statisticians. It is built on the Shiny platform, an extension of R designed by RStudio to produce web apps. The goal is to provide an easy to use TEL authoring environment with excellent mathematical and statistical support using only free software. ShinyTex authoring can be performed on Windows, OS X, and Linux. Users may view the app on any system with a standard web browser."
R  latex  kith_and_kin  seltman.howard
4 weeks ago
Red to Blue | DCCC
For those of us in securely-Democratic districts...
us_politics
4 weeks ago
Factor Modelling for High-Dimensional Time Series: Inference and Model Selection - Chan - 2016 - Journal of Time Series Analysis - Wiley Online Library
"Analysis of high-dimensional time series data is of increasing interest among different fields. This article studies high-dimensional time series from a dimension reduction perspective using factor modelling. Statistical inference is conducted using eigen-analysis of a certain non-negative definite matrix related to autocovariance matrices of the time series, which is applicable to fixed or increasing dimension. When the dimension goes to infinity, the rate of convergence and limiting distributions of estimated factors are established. Using the limiting distributions of estimated factors, a high-dimensional final prediction error criterion is proposed to select the number of factors. Asymptotic properties of the criterion are illustrated by simulation studies and real applications."
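--- The eigen-analysis in question works, in its simplest form, like this: build a non-negative definite matrix from lagged autocovariances (which white observation noise does not contaminate) and read the factor loading space off its top eigenvectors. A sketch in the spirit of that approach; the actual estimator, rates, and the final-prediction-error criterion for choosing the number of factors are in the paper.

```python
import numpy as np

def estimate_factors(Y, r, k0=2):
    """Given a T x p series Y, estimate r factor loading directions from
    M = sum_{k=1..k0} Sigma_hat(k) Sigma_hat(k)^T, where Sigma_hat(k) is the
    lag-k sample autocovariance matrix. Returns (loadings, factor scores)."""
    T, p = Y.shape
    Yc = Y - Y.mean(axis=0)
    M = np.zeros((p, p))
    for k in range(1, k0 + 1):
        S = Yc[k:].T @ Yc[:-k] / (T - k)   # lag-k sample autocovariance
        M += S @ S.T                        # non-negative definite by construction
    vals, vecs = np.linalg.eigh(M)          # eigenvalues in ascending order
    A = vecs[:, -r:]                        # top-r eigenvectors span the loadings
    return A, Y @ A
```

Because serially uncorrelated noise contributes nothing to the population lag-k autocovariances, the top eigenvectors of M recover the factor loading space even when the idiosyncratic noise is sizable.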
to:NB  time_series  factor_analysis  high-dimensional_statistics  statistics
5 weeks ago
Conjuring Asia | East Asian History | Cambridge University Press
"The promise of magic has always commanded the human imagination, but the story of industrial modernity is usually seen as a process of disenchantment. Drawing on the writings and performances of the so-called 'Golden Age Magicians' from the turn of the twentieth century, Chris Goto-Jones unveils the ways in which European and North American encounters with (and representations of) Asia - the fabled Mystic East - worked to re-enchant experiences of the modern world. Beginning with a reconceptualization of the meaning of 'modern magic' itself - moving beyond conventional categories of 'real' and 'fake' magic - Goto-Jones' acclaimed book guides us on a magical mystery tour around India, China and Japan, showing us levitations and decapitations, magic duels and bullet catches, goldfish bowls and paper butterflies. In the end, this mesmerizing book reveals Orientalism as a kind of magic in itself, casting a spell over Western culture that leaves it transformed even today."
to:NB  books:noted  magic  orientalism  modernity  history
5 weeks ago
Critically evaluating the theory and performance of Bayesian analysis of macroevolutionary mixtures
"Bayesian analysis of macroevolutionary mixtures (BAMM) has recently taken the study of lineage diversification by storm. BAMM estimates the diversification-rate parameters (speciation and extinction) for every branch of a study phylogeny and infers the number and location of diversification-rate shifts across branches of a tree. Our evaluation of BAMM reveals two major theoretical errors: (i) the likelihood function (which estimates the model parameters from the data) is incorrect, and (ii) the compound Poisson process prior model (which describes the prior distribution of diversification-rate shifts across branches) is incoherent. Using simulation, we demonstrate that these theoretical issues cause statistical pathologies; posterior estimates of the number of diversification-rate shifts are strongly influenced by the assumed prior, and estimates of diversification-rate parameters are unreliable. Moreover, the inability to correctly compute the likelihood or to correctly specify the prior for rate-variable trees precludes the use of Bayesian approaches for testing hypotheses regarding the number and location of diversification-rate shifts using BAMM."
to:NB  phylogenetics  statistics  re:phil-of-bayes_paper
5 weeks ago
Constraint, natural selection, and the evolution of human body form

--- Not obvious to me how they can pick out the constraints here (maybe assuming equal within-group covariances, and assuming they reflect constraints?), but presumably that's addressed in the paper.
to:NB  human_evolution  evolutionary_biology
5 weeks ago
Unpaid, stressed, and confused: patients are the health care system's free labor - Vox
I'll just add that in my experience, this work often falls on the healthy spouse or children of a seriously ill person.
5 weeks ago
Spiritual Despots: Modern Hinduism and the Genealogies of Self-Rule, Scott
"Historians of religion have examined at length the Protestant Reformation and the liberal idea of the self-governing individual that arose from it. In Spiritual Despots, J. Barton Scott reveals an unexamined piece of this story: how Protestant technologies of asceticism became entangled with Hindu spiritual practices to create an ideal of the “self-ruling subject” crucial to both nineteenth-century reform culture and early twentieth-century anticolonialism in India. Scott uses the quaint term “priestcraft” to track anticlerical polemics that vilified religious hierarchy, celebrated the individual, and endeavored to reform human subjects by freeing them from external religious influence. By drawing on English, Hindi, and Gujarati reformist writings, Scott provides a panoramic view of precisely how the specter of the crafty priest transformed religion and politics in India.
"Through this alternative genealogy of the self-ruling subject, Spiritual Despots demonstrates that Hindu reform movements cannot be understood solely within the precolonial tradition, but rather need to be read alongside other movements of their period. The book’s focus moves fluidly between Britain and India—engaging thinkers such as James Mill, Keshub Chunder Sen, Max Weber, Karsandas Mulji, Helena Blavatsky, M. K. Gandhi, and others—to show how colonial Hinduism shaped major modern discourses about the self. Throughout, Scott sheds much-needed light on how the rhetoric of priestcraft and practices of worldly asceticism played a crucial role in creating a new moral and political order for twentieth-century India and demonstrates the importance of viewing the emergence of secularism through the colonial encounter."
to:NB  books:noted  india  cultural_exchange  rhetorical_self-fashioning  religion  history_of_religion  asceticism
5 weeks ago
After the Map: Cartography, Navigation, and the Transformation of Territory in the Twentieth Century, Rankin
"For most of the twentieth century, maps were indispensable. They were how governments understood, managed, and defended their territory, and during the two world wars they were produced by the hundreds of millions. Cartographers and journalists predicted the dawning of a “map-minded age,” where increasingly state-of-the-art maps would become everyday tools. By the century’s end, however, there had been a decisive shift in mapping practices, as the dominant methods of land surveying and print publication were increasingly displaced by electronic navigation systems.
"In After the Map, William Rankin argues that although this shift did not render traditional maps obsolete, it did radically change our experience of geographic knowledge, from the God’s-eye view of the map to the embedded subjectivity of GPS. Likewise, older concerns with geographic truth and objectivity have been upstaged by a new emphasis on simplicity, reliability, and convenience. After the Map shows how this change in geographic perspective is ultimately a transformation of the nature of territory, both social and political."

--- Some of these claims seem just bizarre, especially that last sentence.
books:noted  to:NB  maps  to_be_shot_after_a_fair_trial
5 weeks ago
Segregation: A Global History of Divided Cities, Nightingale
"When we think of segregation, what often comes to mind is apartheid South Africa, or the American South in the age of Jim Crow—two societies fundamentally premised on the concept of the separation of the races. But as Carl H. Nightingale shows us in this magisterial history, segregation is everywhere, deforming cities and societies worldwide.
"Starting with segregation’s ancient roots, and what the archaeological evidence reveals about humanity’s long-standing use of urban divisions to reinforce political and economic inequality, Nightingale then moves to the world of European colonialism. It was there, he shows, that segregation based on color—and eventually on race—took hold; the British East India Company, for example, split Calcutta into “White Town” and “Black Town.” As we follow Nightingale’s story around the globe, we see that division replicated from Hong Kong to Nairobi, Baltimore to San Francisco, and more. The turn of the twentieth century saw the most aggressive segregation movements yet, as white communities almost everywhere set to rearranging whole cities along racial lines. Nightingale focuses closely on two striking examples: Johannesburg, with its state-sponsored separation, and Chicago, in which the goal of segregation was advanced by the more subtle methods of real estate markets and housing policy.
"For the first time ever, the majority of humans live in cities, and nearly all those cities bear the scars of segregation. This unprecedented, ambitious history lays bare our troubled past, and sets us on the path to imagining the better, more equal cities of the future."
to:NB  books:noted  cities  racism  world_history
6 weeks ago
Radium and the Secret of Life, Campos
"Before the hydrogen bomb indelibly associated radioactivity with death, many chemists, physicians, botanists, and geneticists believed that radium might hold the secret to life. Physicists and chemists early on described the wondrous new element in lifelike terms such as “decay” and “half-life,” and made frequent references to the “natural selection” and “evolution” of the elements. Meanwhile, biologists of the period used radium in experiments aimed at elucidating some of the most basic phenomena of life, including metabolism and mutation.
"From the creation of half-living microbes in the test tube to charting the earliest histories of genetic engineering, Radium and the Secret of Life highlights previously unknown interconnections between the history of the early radioactive sciences and the sciences of heredity. Equating the transmutation of radium with the biological transmutation of living species, biologists saw in metabolism and mutation properties that reminded them of the new element. These initially provocative metaphoric links between radium and life proved remarkably productive and ultimately led to key biological insights into the origin of life, the nature of heredity, and the structure of the gene. Radium and the Secret of Life recovers a forgotten history of the connections between radioactivity and the life sciences that existed long before the dawn of molecular biology."
to:NB  books:noted  radioactivity  molecular_biology  genetics  physics  biology  history_of_science  history_of_physics
6 weeks ago
Dogon Restudied: A Field Evaluation of the Work of Marcel Griaule [and Comments and Replies] on JSTOR
"This restudy of the Dogon of Mali asks whether the texts produced by Marcel Griaule depict a society that is recognizable to the researcher and to the Dogon today and answers the question more or less in the negative. The picture of Dogon religion presented in _Dieu d'eau_ and _Le renard pale_ proved impossible to replicate in the field, even as the shadowy remnant of a largely forgotten past. The reasons for this, it is suggested, lie in the particular field situation of Griaule's research, including features of the ethnographer's approach, the political setting, the experience and predilections of the informants, and the values of Dogon culture."

--- This is an extraordinary story of a cluster of (mostly) good intentions producing horribly skewed results, which then took on a bizarre life of their own.
to:NB  ethnography  epidemiology_of_representations  scholarly_misconstruction_of_reality  to:blog  natural_history_of_truthiness
6 weeks ago
How political idealism leads us astray - Vox
The book sounds like warmed-over Popper, but then I suppose he does need that every so often. (I say this as someone who imprinted _very_ thoroughly on _The Open Society and Its Enemies_.)
6 weeks ago
Learning Minimal Latent Directed Information Polytrees
"We propose an approach for learning latent directed polytrees as long as there exists an appropriately defined discrepancy measure between the observed nodes. Specifically, we use our approach for learning directed information polytrees where samples are available from only a subset of processes. Directed information trees are a new type of probabilistic graphical model that represents the causal dynamics among a set of random processes in a stochastic system. We prove that the approach is consistent for learning minimal latent directed trees. We analyze the sample complexity of the learning task when the empirical estimator of mutual information is used as the discrepancy measure."
to:NB  to_read  information_theory  causal_inference  causal_discovery  statistics  chow-liu_trees  coleman.todd
6 weeks ago
Ours to Hack and to Own, ed. Scholz and Schneider - OR Books
"Here, for the first time in one volume, are some of the most cogent thinkers and doers on the subject of the cooptation of the Internet, and how we can resist and reverse the process. The activists who have put together Ours to Hack and to Own argue for a new kind of online economy: platform cooperativism, which combines the rich heritage of cooperatives with the promise of 21st-century technologies, free from monopoly, exploitation, and surveillance.
"The on-demand economy is reversing the rights and protections workers fought for centuries to win. Ordinary Internet users, meanwhile, retain little control over their personal data. While promising to be the great equalizers, online platforms have often exacerbated social inequalities. Can the Internet be owned and governed differently? What if Uber drivers set up their own platform, or if a city’s residents controlled their own version of Airbnb? This book shows that another kind of Internet is possible—and that, in a new generation of online platforms, it is already taking shape."
to:NB  books:noted  workers_cooperatives  networked_life
6 weeks ago
Bail, C.A.: Terrified: How Anti-Muslim Fringe Organizations Became Mainstream. (eBook, Paperback and Hardcover)
"In July 2010, Terry Jones, the pastor of a small fundamentalist church in Florida, announced plans to burn two hundred Qur’ans on the anniversary of the September 11 attacks. Though he ended up canceling the stunt in the face of widespread public backlash, his threat sparked violent protests across the Muslim world that left at least twenty people dead. In Terrified, Christopher Bail demonstrates how the beliefs of fanatics like Jones are inspired by a rapidly expanding network of anti-Muslim organizations that exert profound influence on American understanding of Islam.
"Bail traces how the anti-Muslim narrative of the political fringe has captivated large segments of the American media, government, and general public, validating the views of extremists who argue that the United States is at war with Islam and marginalizing mainstream Muslim-Americans who are uniquely positioned to discredit such claims. Drawing on cultural sociology, social network theory, and social psychology, he shows how anti-Muslim organizations gained visibility in the public sphere, commandeered a sense of legitimacy, and redefined the contours of contemporary debate, shifting it ever outward toward the fringe. Bail illustrates his pioneering theoretical argument through a big-data analysis of more than one hundred organizations struggling to shape public discourse about Islam, tracing their impact on hundreds of thousands of newspaper articles, television transcripts, legislative debates, and social media messages produced since the September 11 attacks. The book also features in-depth interviews with the leaders of these organizations, providing a rare look at how anti-Muslim organizations entered the American mainstream."
to:NB  books:noted  islamophobia  running_dogs_of_reaction  social_movements  whats_gone_wrong_with_america  the_continuing_crises
6 weeks ago
How Multiple Imputation Makes a Difference
"Political scientists increasingly recognize that multiple imputation represents a superior strategy for analyzing missing data to the widely used method of listwise deletion. However, there has been little systematic investigation of how multiple imputation affects existing empirical knowledge in the discipline. This article presents the first large-scale examination of the empirical effects of substituting multiple imputation for listwise deletion in political science. The examination focuses on research in the major subfield of comparative and international political economy (CIPE) as an illustrative example. Specifically, I use multiple imputation to reanalyze the results of almost every quantitative CIPE study published during a recent five-year period in International Organization and World Politics, two of the leading subfield journals in CIPE. The outcome is striking: in almost half of the studies, key results “disappear” (by conventional statistical standards) when reanalyzed."
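
--- As a reminder of the mechanics being substituted in: after fitting the model on each of $m$ imputed datasets, Rubin's rules pool the per-imputation estimates, and it is the between-imputation variance term that widens the standard errors and can make fragile results "disappear". A minimal sketch of just the pooling step (all numbers hypothetical):

```python
import random
import statistics

random.seed(0)

# Pretend we fit the same regression on m = 5 imputed datasets and got one
# coefficient estimate and its sampling variance from each (made-up values).
m = 5
estimates = [random.gauss(0.5, 0.1) for _ in range(m)]  # q_hat from each dataset
variances = [0.01] * m                                  # within-imputation variances

# Rubin's rules for combining the m analyses:
q_bar = statistics.mean(estimates)          # pooled point estimate
W = statistics.mean(variances)              # average within-imputation variance
B = statistics.variance(estimates)          # between-imputation variance (ddof=1)
T = W + (1 + 1 / m) * B                     # total variance of pooled estimate
se = T ** 0.5
print(q_bar, se)
```

The extra $(1 + 1/m)B$ term is why a coefficient that clears conventional significance thresholds under listwise deletion can fail to clear them after imputation.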
to:NB  have_skimmed  re:ADAfaEPoV  missing_data  statistics  political_science  via:henry_farrell
6 weeks ago
teachers are laborers, not merchants – Fredrik deBoer
"Here’s the model that the constant “online education will replace physical colleges” types advance: education is about gaining knowledge; knowledge is stored in the heads of teachers; schooling is the transfer of that knowledge from the teacher’s head to the student’s head; physical facilities are expensive, but online equivalents are cheap; therefore someone will build an Amazon that cuts out the overhead of the physical campus and connects students to teachers in the online space or, alternatively, cuts teachers out altogether and just transfers the information straight into the brains of the student.
"The basic failure here is the basic model of transfer of information, like teachers are merchants who sell discrete products known as knowledge or skills. In fact education is far more a matter of labor, of teachers working to push that information into the heads of students, or more accurately, to compel students to push it into their own heads. And this work is fundamentally social, and requires human accountability, particularly for those who lack prerequisite skills.
"I’ve said this before: if education was really about access to information, then anyone with a library card could have skipped college well before the internet. The idea that the internet suddenly made education obsolete because it freed information from being hidden away presumes that information was kept under lock and key. But, you know, books exist and are pretty cheap and they contain information. Yet if you have a class of undergraduates sit in a room for an hour twice a week with some chemistry textbooks, I can tell you that most of them aren’t going to learn a lot of chemistry. The printing press did not make teachers obsolete, and neither has the internet."
6 weeks ago
Temporal Evolution of Social Innovation: What Matters? : SIAM Journal on Applied Dynamical Systems: Vol. 15, No. 3 (Society for Industrial and Applied Mathematics)
"Variations in patterns of innovation propagation found across complex networks are governed by the preference for an innovation and the topology (or connectivity pattern) of a network. This paper incorporates the interplay of these two features, which has received scant attention so far, in a simple model to study the temporal evolution of innovation in a social network. An individual upon interaction with an acceptor in the neighborhood progresses from an uninformed state to being informed before accepting the innovation, with a probability $\lambda$ that specifies the preference for innovation. Using only one intermediate information acquisition stage, the model concisely brings out a variety of patterns. Time taken to attain maximum velocity in a class of connectivity $k$ and population $N_k$ depends on $\lambda^{-2}k^{-2}N_{k}^{1/2}$. More importantly, we establish the lower bound that the average connectivity of a random network having minimum connectivity as low as 2 can attain and still be able to overtake the corresponding innovation emergence in a scale-free network. We show computationally and analytically the conditions in which the propagation in random networks may lead or lag behind that in scale-free networks. Hierarchical propagation is evident across connectivity classes within scale-free networks, as well as across random networks with distinct values of $k$ and population. For highly preferred innovations, however, the hierarchy within scale-free networks tends to be insignificant. We verify, using stochastic dominance, an uncertainty in class contributions in the upper range of connectivity. This makes innovation hard to administer in finite size networks."

--- This is almost what's needed, except there should be variation in susceptibility, correlated with degree. (Perhaps that would work out as amplifying or reducing effective degree, with constant susceptibility?)
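
--- A minimal simulation of the kind of model described, assuming a three-state chain (uninformed → informed → acceptor) on a stand-in ring lattice; the network, the update rule, and the parameter values are my simplifications, not the paper's. Degree-correlated susceptibility would amount to giving each node its own $\lambda$:

```python
import random

random.seed(1)

# Nodes are Uninformed, Informed, or Acceptors. An uninformed node becomes
# informed on contact with an accepting neighbor; an informed node accepts
# with probability lam per step (lam plays the role of the paper's lambda).
n, k, lam = 200, 4, 0.3
# k-regular ring lattice as a stand-in network
nbrs = {i: [(i + d) % n for d in range(-k // 2, k // 2 + 1) if d != 0]
        for i in range(n)}

state = {i: "U" for i in range(n)}
state[0] = "A"  # seed a single acceptor

for _ in range(200):
    new = dict(state)
    for i in range(n):
        if state[i] == "U" and any(state[j] == "A" for j in nbrs[i]):
            new[i] = "I"
        elif state[i] == "I" and random.random() < lam:
            new[i] = "A"
    state = new

print(sum(s == "A" for s in state.values()))  # acceptors after 200 steps
```

Tracking the acceptor count per time step would give the adoption-velocity curves whose timing the abstract characterizes.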
to:NB  to_read  diffusion_of_innovations  epidemic_models  social_networks  re:do-institutions-evolve
6 weeks ago
Ancestors, Territoriality, and Gods - A Natural History of Religion | Ina Wunn | Springer
"This book sets out to explain how and why religion came into being. Today this question is as fascinating as ever, especially since religion has moved to the centre of socio-political relationships. In contrast to the current, but incomplete approaches from disciplines such as cognitive science and psychology, the present authors adopt a new approach, equally manifest and constructive, that explains the origins of religion based strictly on behavioural biology. They employ accepted research results that remove all need for speculation. Decisive factors for the earliest demonstrations of religion are thus territorial behaviour and ranking, coping with existential fears, and conflict solution with the help of rituals. These in turn, in a process of cultural evolution, are shown to be the roots of the historical and contemporary religions."

--- Because "existential fears" is clearly a non-psychological concept.
to:NB  books:noted  religion  psychoceramica
7 weeks ago
A Delayed Review of This Changes Everything: Capitalism vs the Climate by Naomi Klein
"The view that capitalism is a style of thinking, progress is a myth, and political contestation is irrelevant to “true” social change belongs not just to this one book but to all the commentators who found nothing to criticize. That’s the real problem."
book_reviews  climate_change  progressive_forces  dorman.peter  have_read  via:?
7 weeks ago
Does Peer Review Work? An Experiment of Experimentalism by Daniel E. Ho :: SSRN
"Ensuring the accuracy and consistency of highly decentralized and discretionary decision making is a core challenge for the administrative state. The widely influential school of “democratic experimentalism” posits that peer review provides a way forward, but systematic evidence remains limited. This Article provides the first empirical study of the feasibility and effects of peer review as a governance mechanism, based on a unique randomized controlled trial conducted with the largest health department in Washington State (Public Health-Seattle and King County). We randomly assigned half of the food safety inspection staff to engage in an intensive peer review process for a 4-month period. Pairs of inspectors jointly visited establishments, separately assessed health code violations, and deliberated about divergences on health code implementation. Our findings are threefold. First, observing identical conditions, inspectors disagreed 60% of the time. These joint inspection results in turn helped to pinpoint challenging code items and to efficiently develop training and guidance documents during weekly sessions. Second, analyzing over 28,000 independently conducted inspections across the peer review and control groups, we find that the intervention caused an increase in violations detected and scored by 17-19%. Third, peer review appeared to decrease variability across inspectors, thereby improving the consistency of inspections. As a result of this trial, King County has now instituted peer review as a standard practice. Our study has rich implications for the feasibility, promise, practice, and pitfalls of peer review, democratic experimentalism, and the administrative state."
to:NB  to_read  peer_review  public_policy  regulation  re:democratic_cognition  experimental_sociology
7 weeks ago
Campbell, J.: Polarized: Making Sense of a Divided America. (eBook and Hardcover)
"Many continue to believe that the United States is a nation of political moderates. In fact, it is a nation divided. It has been so for some time and has grown more so. This book provides a new and historically grounded perspective on the polarization of America, systematically documenting how and why it happened.
"Polarized presents commonsense benchmarks to measure polarization, draws data from a wide range of historical sources, and carefully assesses the quality of the evidence. Through an innovative and insightful use of circumstantial evidence, it provides a much-needed reality check to claims about polarization. This rigorous yet engaging and accessible book examines how polarization displaced pluralism and how this affected American democracy and civil society.
"Polarized challenges the widely held belief that polarization is the product of party and media elites, revealing instead how the American public in the 1960s set in motion the increase of polarization. American politics became highly polarized from the bottom up, not the top down, and this began much earlier than often thought. The Democrats and the Republicans are now ideologically distant from each other and about equally distant from the political center. Polarized also explains why the parties are polarized at all, despite their battle for the decisive median voter. No subject is more central to understanding American politics than political polarization, and no other book offers a more in-depth and comprehensive analysis of the subject than this one."
to:NB  books:noted  us_politics  political_science  whats_gone_wrong_with_america
7 weeks ago
[1608.00607] Configuring Random Graph Models with Fixed Degree Sequences
"Random graph null models have found widespread application in diverse research communities analyzing network datasets. The most popular family of random graph null models, called configuration models, are defined as uniform distributions over a space of graphs with a fixed degree sequence. Commonly, properties of an empirical network are compared to properties of an ensemble of graphs from a configuration model in order to quantify whether empirical network properties are meaningful or whether they are instead a common consequence of the particular degree sequence. In this work we study the subtle but important decisions underlying the specification of a configuration model, and investigate the role these choices play in graph sampling procedures and a suite of applications. We place particular emphasis on the importance of specifying the appropriate graph labeling---stub-labeled or vertex-labeled---under which to consider a null model, a choice that closely connects the study of random graphs to the study of random contingency tables. We show that the choice of graph labeling is inconsequential for studies of simple graphs, but can have a significant impact on analyses of multigraphs or graphs with self-loops. The importance of these choices is demonstrated through a series of three in-depth vignettes, analyzing three different network datasets under many different configuration models and observing substantial differences in study conclusions under different models. We argue that in each case, only one of the possible configuration models is appropriate. While our work focuses on undirected static networks, it aims to guide the study of directed networks, dynamic networks, and all other network contexts that are suitably studied through the lens of random graph null models."
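
--- The stub-labeled ensemble the authors emphasize is easy to sample directly: give each node one "stub" per unit of degree and match stubs uniformly at random, keeping any self-loops and parallel edges that result. A sketch (the degree sequence is just an example):

```python
import random

random.seed(42)

# Stub-labeled configuration model by direct stub matching. A uniform random
# perfect matching of the stubs is a uniform draw from the stub-labeled
# ensemble; self-loops and multi-edges are allowed, which is exactly the
# regime where the paper shows the labeling choice matters.
deg = [3, 3, 2, 2, 1, 1]                 # degree sequence; sum must be even
stubs = [i for i, d in enumerate(deg) for _ in range(d)]
random.shuffle(stubs)
edges = [(stubs[2 * j], stubs[2 * j + 1]) for j in range(len(stubs) // 2)]

print(len(edges))                        # sum(deg) // 2 edges, multi-edges included
self_loops = sum(u == v for u, v in edges)

# The common fix-up -- collapsing to a simple graph by dropping loops and
# parallel edges -- changes the sampled distribution: the surviving simple
# graphs are no longer uniform, one of the subtleties at issue here.
simple = {frozenset(e) for e in edges if e[0] != e[1]}
```

For simple graphs the stub-labeled and vertex-labeled ensembles agree, which is the paper's "inconsequential" case; the divergence appears once loops and multi-edges are in play.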
to:NB  network_data_analysis  statistics  null_models  to_teach:baby-nets
7 weeks ago
Headley, J.M.: The Europeanization of the World: On the Origins of Human Rights and Democracy. (eBook, Paperback and Hardcover)
"The Europeanization of the World puts forward a defense of Western civilization and the unique gifts it has bequeathed to the world-in particular, human rights and constitutional democracy-at a time when many around the globe equate the West with hubris and thinly veiled imperialism. John Headley argues that the Renaissance and the Reformation provided the effective currents for the development of two distinctive political ideas. The first is the idea of a common humanity, derived from antiquity, developed through natural law, and worked out in the new emerging global context to provide the basis for today's concept of universal human rights. The second is the idea of political dissent, first posited in the course of the Protestant Reformation and later maturing in the politics of the British monarchy.
"Headley traces the development and implications of this first idea from antiquity to the present. He examines the English revolution of 1688 and party government in Britain and America into the early nineteenth century. And he challenges the now--common stance in historical studies of moral posturing against the West. Headley contends that these unique ideas are Western civilization's most precious export, however presently distorted. Certainly European culture has its dark side--Auschwitz is but one example. Yet as Headley shows, no other civilization in history has bequeathed so sustained a tradition of universalizing aspirations as the West. The Europeanization of the World makes an argument that is controversial but long overdue. Written by one of our preeminent scholars of the Renaissance and Reformation, this elegantly reasoned book is certain to spark a much-needed reappraisal of the Western tradition."
to:NB  books:noted  democracy  human_rights  modernity  to_be_shot_after_a_fair_trial
7 weeks ago
Governed by a Spirit of Opposition: The Origins of American Political Practice in Colonial Philadelphia
"During the colonial era, ordinary Philadelphians played an unusually active role in political life. Because the city lacked a strong central government, private individuals working in civic associations of their own making shouldered broad responsibility for education, poverty relief, church governance, fire protection, and even taxation and military defense. These organizations dramatically expanded the opportunities for white men—rich and poor alike—to shape policies that immediately affected their communities and their own lives.
"In Governed by a Spirit of Opposition, Jessica Choppin Roney explains how allowing people from all walks of life to participate in political activities amplified citizen access and democratic governance. Merchants, shopkeepers, carpenters, brewers, shoemakers, and silversmiths served as churchwardens, street commissioners, constables, and Overseers of the Poor. They volunteered to fight fires, organized relief for the needy, contributed money toward the care of the sick, took up arms in defense of the community, raised capital for local lending, and even interjected themselves in Indian diplomacy. Ultimately, Roney suggests, popular participation in charity, schools, the militia, and informal banks empowered people in this critically important colonial city to overthrow the existing government in 1776 and re-envision the parameters of democratic participation.
"Governed by a Spirit of Opposition argues that the American Revolution did not occasion the birth of commonplace political activity or of an American culture of voluntary association. Rather, the Revolution built upon a long history of civic engagement and a complicated relationship between the practice of majority-rule and exclusionary policy-making on the part of appointed and self-selected constituencies."
to:NB  books:noted  american_history  civil_society  self-organization  heard_the_talk  where_by_"heard_the_talk"_i_mean_"heard_it_explained_over_drinks"  institutions  re:democratic_cognition  democracy
7 weeks ago
The Rise of Modern Science Explained | History Science and Technology | Cambridge University Press
"For centuries, laymen and priests, lone thinkers and philosophical schools in Greece, China, the Islamic world and Europe reflected with wisdom and perseverance on how the natural world fits together. As a rule, their methods and conclusions, while often ingenious, were misdirected when viewed from the perspective of modern science. In the 1600s thinkers such as Galileo, Kepler, Descartes, Bacon and many others gave revolutionary new twists to traditional ideas and practices, culminating in the work of Isaac Newton half a century later. It was as if the world was being created anew. But why did this recreation begin in Europe rather than elsewhere? This book caps H. Floris Cohen's career-long effort to find answers to this classic question. Here he sets forth a rich but highly accessible account of what, against many odds, made it happen and why."
to:NB  books:noted  history_of_science  comparative_history  scientific_revolution
9 weeks ago
Struck, P.T.: Divination and Human Nature: A Cognitive History of Intuition in Classical Antiquity. (eBook and Hardcover)
"Divination and Human Nature casts a new perspective on the rich tradition of ancient divination—the reading of divine signs in oracles, omens, and dreams. Popular attitudes during classical antiquity saw these readings as signs from the gods while modern scholars have treated such beliefs as primitive superstitions. In this book, Peter Struck reveals instead that such phenomena provoked an entirely different accounting from the ancient philosophers. These philosophers produced subtle studies into what was an odd but observable fact—that humans could sometimes have uncanny insights—and their work signifies an early chapter in the cognitive history of intuition.
"Examining the writings of Plato, Aristotle, the Stoics, and the Neoplatonists, Struck demonstrates that they all observed how, setting aside the charlatans and swindlers, some people had premonitions defying the typical bounds of rationality. Given the wide differences among these ancient thinkers, Struck notes that they converged on seeing this surplus insight as an artifact of human nature, projections produced under specific conditions by our physiology. For the philosophers, such unexplained insights invited a speculative search for an alternative and more naturalistic system of cognition.
"Recovering a lost piece of an ancient tradition, Divination and Human Nature illustrates how philosophers of the classical era interpreted the phenomena of divination as a practice closer to intuition and instinct than magic."
to:NB  books:noted  divination  history_of_ideas  philosophy  ancient_history  re:evidence-based_haruspicy
9 weeks ago
Common Property | Boston Review
Social insurance as rents from a share in (fairly literally) the commonwealth.
political_philosophy  welfare_state  hayek.f.a._von  paine.thomas  anderson.elizabeth  have_read
9 weeks ago
Politics and Institutionalism: Explaining Durability and Change on JSTOR
"From the complex literatures on "institutionalisms" in political science and sociology, various components of institutional change are identified: mutability, contradiction, multiplicity, containment and diffusion, learning and innovation, and mediation. This exercise results in a number of clear prescriptions for the analysis of politics and institutional change: disaggregate institutions into schemas and resources; decompose institutional durability into processes of reproduction, disruption, and response to disruption; and, above all, appreciate the multiplicity and heterogeneity of the institutions that make up the social world. Recent empirical work on identities, interests, alternatives, and political innovation illustrates how political scientists and sociologists have begun to document the consequences of institutional contradiction and multiplicity and to trace the workings of institutional containment, diffusion, and mediation."
to:NB  to_read  institutions  social_theory  diffusion_of_innovations  re:do-institutions-evolve  via:henry_farrell
9 weeks ago
Transformative Treatments
"Contemporary social-scientific research seeks to identify specific causal mechanisms for outcomes of theoretical interest. Experiments that randomize populations to treatment and control conditions are the “gold standard” for causal inference. We identify, describe, and analyze the problem posed by transformative treatments. Such treatments radically change treated individuals in a way that creates a mismatch in populations, but this mismatch is not empirically detectable at the level of counterfactual dependence. In such cases, the identification of causal pathways is underdetermined in a previously unrecognized way. Moreover, if the treatment is indeed transformative it breaks the inferential structure of the experimental design. Transformative treatments are not curiosities or “corner cases”, but are plausible mechanisms in a large class of events of theoretical interest, particularly ones where deliberate randomization is impractical and quasi-experimental designs are sought instead. They cast long-running debates about treatment and selection effects in a new light, and raise new methodological challenges."

--- After skimming, I'm left spluttering "but, but, _every_ intervention creates a new population!", so I am probably missing something fundamental, and should do more than just skim.
to:NB  causality  causal_inference  barely-comprehensible_metaphysics  healy.kieran  have_skimmed
9 weeks ago
[1607.05506] Distribution-dependent concentration inequalities for tighter generalization bounds
"We prove several distribution-dependent extensions of Hoeffding and McDiarmid's inequalities with (difference-) unbounded and hierarchically (difference-) bounded functions. For this purpose, several assumptions about the probabilistic boundedness and bounded differences are introduced. Our approaches improve the previous concentration inequalities' bounds, and achieve tight bounds in some exceptional cases where the original inequalities cannot hold. Furthermore, we discuss the potential applications of our extensions in VC dimension and Rademacher complexity. Then we obtain generalization bounds for (difference-) unbounded loss functions and tighten the existing generalization bounds."
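
--- For reference, the distribution-free Hoeffding bound that these extensions sharpen, checked empirically. The choice of Uniform[0,1] is arbitrary; its small variance is precisely the sort of structure a distribution-dependent bound could exploit to tighten the right-hand side:

```python
import math
import random

random.seed(0)

# Hoeffding: for i.i.d. X_i in [0, 1] with mean mu,
#   P(|X_bar - mu| >= t) <= 2 * exp(-2 * n * t**2).
n, t, trials = 100, 0.1, 5000
mu = 0.5  # mean of Uniform[0, 1]

exceed = 0
for _ in range(trials):
    xbar = sum(random.random() for _ in range(n)) / n
    exceed += abs(xbar - mu) >= t
freq = exceed / trials
bound = 2 * math.exp(-2 * n * t * t)
print(freq, bound)  # observed deviation frequency vs. the Hoeffding bound
```

The observed frequency sits far below the bound here, since the bound ignores the Uniform's variance of 1/12; that slack is the room the paper's distribution-dependent inequalities aim to close.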
to:NB  deviation_inequalities  probability  learning_theory
9 weeks ago
[1604.01575] Clustering implies geometry in networks
"Network models with latent geometry have been used successfully in many applications in network science and other disciplines, yet it is usually impossible to tell if a given real network is geometric, meaning if it is a typical element in an ensemble of random geometric graphs. Here we identify structural properties of networks that guarantee that random graphs having these properties are geometric. Specifically we show that random graphs in which expected degree and clustering of every node are fixed to some constants are equivalent to random geometric graphs on the real line, if clustering is sufficiently strong. Large numbers of triangles, homogeneously distributed across all nodes as in real networks, are thus a consequence of network geometricity. The methods we use to prove this are quite general and applicable to other network ensembles, geometric or not, and to certain problems in quantum gravity."
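
--- The direction of intuition is easy to see in simulation: a random geometric graph on the line is triangle-rich, because the neighborhoods of two nearby points largely overlap. A sketch (parameters arbitrary):

```python
import random

random.seed(0)

# Random geometric graph on [0, 1]: connect points within distance r.
n, r = 300, 0.05
xs = sorted(random.random() for _ in range(n))
adj = [set() for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        if xs[j] - xs[i] > r:   # points are sorted, so we can stop early
            break
        adj[i].add(j)
        adj[j].add(i)

# Global clustering coefficient = 3 * (# triangles) / (# connected triples).
# Summing common neighbors over edges counts each triangle three times.
three_tri = sum(len(adj[i] & adj[j]) for i in range(n) for j in adj[i] if j > i)
triples = sum(len(a) * (len(a) - 1) // 2 for a in adj)
print(three_tri / triples)  # near 3/4, the 1-d geometric value; ER would be ~ degree/n
```

The paper's contribution runs the other way: sufficiently strong, homogeneous clustering at fixed expected degrees forces equivalence with this kind of one-dimensional geometric ensemble.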
to:NB  network_data_analysis  network_formation  latent_space_network_models
9 weeks ago
