Spiral precipitation patterns in confined chemical gardens
"Chemical gardens are mineral aggregates that grow in three dimensions with plant-like forms and share properties with self-assembled structures like nanoscale tubes, brinicles, or chimneys at hydrothermal vents. The analysis of their shapes remains a challenge, as their growth is influenced by osmosis, buoyancy, and reaction–diffusion processes. Here we show that chemical gardens grown by injection of one reactant into the other in confined conditions feature a wealth of new patterns including spirals, flowers, and filaments. The confinement decreases the influence of buoyancy, reduces the spatial degrees of freedom, and allows analysis of the patterns by tools classically used to analyze 2D patterns. Injection moreover allows the study in controlled conditions of the effects of variable concentrations on the selected morphology. We illustrate these innovative aspects by characterizing quantitatively, with a simple geometrical model, a new class of self-similar logarithmic spirals observed in a large zone of the parameter space."

- D'Arcy Thompson would've been all over this.
to:NB  pattern_formation  self-organization  physics  chemistry  reaction-diffusion 
4 hours ago
Forecasting Nonstationary Time Series: From Theory to Algorithms
"Generalization bounds for time series prediction and other non-i.i.d. learning scenarios that can be found in the machine learning and statistics literature assume that observations come from a (strictly) stationary distribution. The first bounds for completely non-stationary setting were proved in [6]. In this work we present an extension of these results and derive novel algorithms for forecasting non-stationary time series. Our experimental results show that our algorithms significantly outperform standard autoregressive models commonly used in practice."

--- Assumes mixing but not stationarity.
to:NB  to_read  mixing  learning_theory  re:your_favorite_dsge_sucks  re:XV_for_mixing  time_series 
4 hours ago
[1412.3730] Inconsistency of Bayesian Inference for Misspecified Linear Models, and a Proposal for Repairing It
"We empirically show that Bayesian inference can be inconsistent under misspecification in simple linear regression problems, both in a model averaging/selection and in a Bayesian ridge regression setting. We use the standard linear model, which assumes homoskedasticity, whereas the data are heteroskedastic, and observe that the posterior puts its mass on ever more high-dimensional models as the sample size increases. To remedy the problem, we equip the likelihood in Bayes' theorem with an exponent called the learning rate, and we propose the Safe Bayesian method to learn the learning rate from the data. SafeBayes tends to select small learning rates as soon as the standard posterior is not `cumulatively concentrated', and its results on our data are quite encouraging."
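--- The "learning rate" device is just a tempered likelihood: raise the likelihood to a power eta <= 1 in Bayes' rule before normalizing. A minimal sketch for a conjugate normal-mean model, where tempering simply scales the likelihood precision; the model, variable names, and numbers here are mine for illustration, not the paper's:

```python
import numpy as np

def tempered_posterior(y, eta, prior_mean=0.0, prior_var=1.0, noise_var=1.0):
    """Posterior for a normal mean with known noise variance, with the
    likelihood raised to a learning rate eta (eta = 1 is standard Bayes).
    Tempering multiplies the likelihood's contribution to the precision
    by eta, so smaller eta means a wider, more prior-dominated posterior."""
    n = len(y)
    post_prec = 1.0 / prior_var + eta * n / noise_var
    post_mean = (prior_mean / prior_var + eta * np.sum(y) / noise_var) / post_prec
    return post_mean, 1.0 / post_prec

y = np.array([0.9, 1.1, 1.0, 0.8])
m1, v1 = tempered_posterior(y, eta=1.0)   # standard Bayes
m2, v2 = tempered_posterior(y, eta=0.25)  # small learning rate: wider posterior
```

Shrinking eta pulls the posterior mean back toward the prior and inflates the posterior variance, which is how SafeBayes buys robustness to misspecification.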
to:NB  to_read  linear_regression  bayesianism  bayesian_consistency  misspecification  statistics  grunwald.peter 
4 hours ago
Spatiotemporal transcriptomics reveals the evolutionary history of the endoderm germ layer : Nature : Nature Publishing Group
"The concept of germ layers has been one of the foremost organizing principles in developmental biology, classification, systematics and evolution for 150 years (refs 1, 2, 3). Of the three germ layers, the mesoderm is found in bilaterian animals but is absent in species in the phyla Cnidaria and Ctenophora, which has been taken as evidence that the mesoderm was the final germ layer to evolve1, 4, 5. The origin of the ectoderm and endoderm germ layers, however, remains unclear, with models supporting the antecedence of each as well as a simultaneous origin4, 6, 7, 8, 9. Here we determine the temporal and spatial components of gene expression spanning embryonic development for all Caenorhabditis elegans genes and use it to determine the evolutionary ages of the germ layers. The gene expression program of the mesoderm is induced after those of the ectoderm and endoderm, thus making it the last germ layer both to evolve and to develop. Strikingly, the C. elegans endoderm and ectoderm expression programs do not co-induce; rather the endoderm activates earlier, and this is also observed in the expression of endoderm orthologues during the embryology of the frog Xenopus tropicalis, the sea anemone Nematostella vectensis and the sponge Amphimedon queenslandica. Querying the phylogenetic ages of specifically expressed genes reveals that the endoderm comprises older genes. Taken together, we propose that the endoderm program dates back to the origin of multicellularity, whereas the ectoderm originated as a secondary germ layer freed from ancestral feeding functions."
to:NB  bioinformatics  developmental_biology  evolutionary_biology  biology 
5 hours ago
Network dynamics of the brain and influence of the epileptic seizure onset zone
"The human brain is a dynamic networked system. Patients with partial epileptic seizures have focal regions that periodically diverge from normal brain network dynamics during seizures. We studied the evolution of brain connectivity before, during, and after seizures with graph-theoretic techniques on continuous electrocorticographic (ECoG) recordings (5.4 ± 1.7 d per patient, mean ± SD) from 12 patients with temporal, occipital, or frontal lobe partial onset seizures. Each electrode was considered a node in a graph, and edges between pairs of nodes were weighted by their coherence within a frequency band. The leading eigenvector of the connectivity matrix, which captures network structure, was tracked over time and clustered to uncover a finite set of brain network states. Across patients, we found that (i) the network connectivity is structured and defines a finite set of brain states, (ii) seizures are characterized by a consistent sequence of states, (iii) a subset of nodes is isolated from the network at seizure onset and becomes more connected with the network toward seizure termination, and (iv) the isolated nodes may identify the seizure onset zone with high specificity and sensitivity. To localize a seizure, clinicians visually inspect seizures recorded from multiple intracranial electrode contacts, a time-consuming process that may not always result in definitive localization. We show that network metrics computed from all ECoG channels capture the dynamics of the seizure onset zone as it diverges from normal overall network structure. This suggests that a state space model can be used to help localize the seizure onset zone in ECoG recordings."
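--- A rough sketch of the pipeline as I read it: a pairwise connectivity matrix per time window, its leading eigenvector as the network snapshot, then clustering the snapshots into a finite set of "states". Everything below is made up for illustration (toy data, and absolute correlation standing in for the paper's band-limited coherence):

```python
import numpy as np

def leading_eigvec(window):
    """Connectivity snapshot for one window (channels x samples):
    weight edges by absolute pairwise correlation (a stand-in for the
    paper's within-band coherence), zero the diagonal, and take the
    leading eigenvector of the symmetric connectivity matrix."""
    W = np.abs(np.corrcoef(window))
    np.fill_diagonal(W, 0.0)
    vals, vecs = np.linalg.eigh(W)   # eigh sorts eigenvalues ascending
    v = vecs[:, -1]
    return v if v.sum() >= 0 else -v  # fix the arbitrary sign

def two_means(X, iters=20):
    """Tiny 2-means, to cluster eigenvector snapshots into 'states'."""
    c = X[:2].copy()
    for _ in range(iters):
        lab = ((X[:, None, :] - c[None, :, :]) ** 2).sum(-1).argmin(1)
        for k in range(2):
            if (lab == k).any():
                c[k] = X[lab == k].mean(0)
    return lab

rng = np.random.default_rng(0)
data = rng.standard_normal((6, 10, 512))   # 6 windows, 10 channels
vecs = np.vstack([leading_eigvec(w) for w in data])
states = two_means(vecs)
```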
to:NB  functional_connectivity  neural_data_analysis  neuroscience  network_data_analysis  re:network_differences 
7 hours ago
Phylogenetic reconstruction of Bantu kinship challenges Main Sequence Theory of human social evolution
"Kinship provides the fundamental structure of human society: descent determines the inheritance pattern between generations, whereas residence rules govern the location a couple moves to after they marry. In turn, descent and residence patterns determine other key relationships such as alliance, trade, and marriage partners. Hunter-gatherer kinship patterns are viewed as flexible, whereas agricultural societies are thought to have developed much more stable kinship patterns as they expanded during the Holocene. Among the Bantu farmers of sub-Saharan Africa, the ancestral kinship patterns present at the beginning of the expansion are hotly contested, with some arguing for matrilineal and matrilocal patterns, whereas others maintain that any kind of lineality or sex-biased dispersal only emerged much later. Here, we use Bayesian phylogenetic methods to uncover the history of Bantu kinship patterns and trace the interplay between descent and residence systems. The results suggest a number of switches in both descent and residence patterns as Bantu farming spread, but that the first Bantu populations were patrilocal with patrilineal descent. Across the phylogeny, a change in descent triggered a switch away from patrifocal kinship, whereas a change in residence triggered a switch back from matrifocal kinship. These results challenge “Main Sequence Theory,” which maintains that changes in residence rules precede change in other social structures. We also indicate the trajectory of kinship change, shedding new light on how this fundamental structure of society developed as farming spread across the globe during the Neolithic."

- How much does the prior drive this result?
to:NB  cultural_evolution  anthropology  phylogenetics 
7 hours ago
A general construction for parallelizing Metropolis−Hastings algorithms
"Markov chain Monte Carlo methods (MCMC) are essential tools for solving many modern-day statistical and computational problems; however, a major limitation is the inherently sequential nature of these algorithms. In this paper, we propose a natural generalization of the Metropolis−Hastings algorithm that allows for parallelizing a single chain using existing MCMC methods. We do so by proposing multiple points in parallel, then constructing and sampling from a finite-state Markov chain on the proposed points such that the overall procedure has the correct target density as its stationary distribution. Our approach is generally applicable and straightforward to implement. We demonstrate how this construction may be used to greatly increase the computational speed and statistical efficiency of a variety of existing MCMC methods, including Metropolis-Adjusted Langevin Algorithms and Adaptive MCMC. Furthermore, we show how it allows for a principled way of using every integration step within Hamiltonian Monte Carlo methods; our approach increases robustness to the choice of algorithmic parameters and results in increased accuracy of Monte Carlo estimates with little extra computational cost."
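--- A minimal sketch of the simplest member of this family, for an independence proposal q: draw N proposals in parallel, pool them with the current state, and resample from the pool with probability proportional to the importance weights pi/q. This special case is known to leave pi invariant; the toy target, proposal, and numbers are mine:

```python
import numpy as np

def parallel_mh_step(x, log_pi, log_q, sample_q, N, rng):
    """One step of a parallelized MH-style chain: N proposals are
    drawn (in principle, in parallel), pooled with the current state,
    and the next state is sampled from the pool with probability
    proportional to the importance weight pi/q of each point."""
    pool = np.concatenate([[x], sample_q(N, rng)])
    logw = log_pi(pool) - log_q(pool)
    w = np.exp(logw - logw.max())          # stabilize before normalizing
    return rng.choice(pool, p=w / w.sum())

# Toy target: standard normal; independence proposal: N(0, 3^2).
rng = np.random.default_rng(0)
log_pi = lambda x: -0.5 * x ** 2
log_q = lambda x: -0.5 * (x / 3.0) ** 2
sample_q = lambda n, rng: 3.0 * rng.standard_normal(n)

x, draws = 0.0, []
for _ in range(2000):
    x = parallel_mh_step(x, log_pi, log_q, sample_q, N=8, rng=rng)
    draws.append(x)
```

The paper's construction is more general (it handles dependent proposals, MALA, HMC, etc. by building a finite-state chain on the proposed points), but the invariance argument in this special case is the same normalize-and-resample idea.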
to:NB  to_read  monte_carlo  parallelism  computational_statistics  statistics 
7 hours ago
Socialize Uber | The Nation
I like the proposal, but it's not "socialization"; it's a workers' cooperative, which is just as much private property as the current arrangement, just run more democratically and cutting out the parasitic middle-men.
modest_proposals  economics  corporations  the_wired_ideology 
[1401.3841] Narrative Planning: Balancing Plot and Character
"Narrative, and in particular storytelling, is an important part of the human experience. Consequently, computational systems that can reason about narrative can be more effective communicators, entertainers, educators, and trainers. One of the central challenges in computational narrative reasoning is narrative generation, the automated creation of meaningful event sequences. There are many factors -- logical and aesthetic -- that contribute to the success of a narrative artifact. Central to this success is its understandability. We argue that the following two attributes of narratives are universal: (a) the logical causal progression of plot, and (b) character believability. Character believability is the perception by the audience that the actions performed by characters do not negatively impact the audience's suspension of disbelief. Specifically, characters must be perceived by the audience to be intentional agents. In this article, we explore the use of refinement search as a technique for solving the narrative generation problem -- to find a sound and believable sequence of character actions that transforms an initial world state into a world state in which goal propositions hold. We describe a novel refinement search planning algorithm -- the Intent-based Partial Order Causal Link (IPOCL) planner -- that, in addition to creating causally sound plot progression, reasons about character intentionality by identifying possible character goals that explain their actions and creating plan structures that explain why those characters commit to their goals. We present the results of an empirical evaluation that demonstrates that narrative plans generated by the IPOCL algorithm support audience comprehension of character intentions better than plans generated by conventional partial-order planners."
to:NB  narrative  optimization  via:vaguery 
6 days ago
[1401.5856] Narrative Planning: Compilations to Classical Planning
"A model of story generation recently proposed by Riedl and Young casts it as planning, with the additional condition that story characters behave intentionally. This means that characters have perceivable motivation for the actions they take. I show that this condition can be compiled away (in more ways than one) to produce a classical planning problem that can be solved by an off-the-shelf classical planner, more efficiently than by Riedl and Young's specialised planner."
to:NB  narrative  optimization  via:vaguery 
6 days ago
Lurching Toward Happiness in America | The MIT Press
"The promise of America has long been conceived as the promise of happiness. Being American is all about the opportunity to pursue one’s own bliss. But what is the good life, and are we getting closer to its attainment? In the cacophony of competing conceptions of the good, technological interventions that claim to help us achieve it, and rancorous debate over government’s role in securing it for us, every step toward happiness seems to come with at least one step back.
"In Lurching Toward Happiness in America, acclaimed sociologist Claude Fischer explores the data, the myths, and history to understand how far America has come in delivering on its promise. Are Americans getting lonelier? Is the gender revolution over? Does income shape the way Americans see their life prospects? In the end, Fischer paints a broad picture of what Americans say they want. And, as he considers how close they are to achieving that goal, he also suggests what might finally get them there."
to:NB  books:noted  sociology  something_about_america  moral_psychology 
6 days ago
Investigating the Psychological World | The MIT Press
"This book considers scientific method in the behavioral sciences, with particular reference to psychology. Psychologists learn about research methods and use them to conduct their research, but their training teaches them little about the nature of scientific method itself. In Investigating the Psychological World, Brian Haig fills this gap. Drawing on behavioral science methodology, the philosophy of science, and statistical theory, Haig constructs a broad theory of scientific method that has particular relevance for the behavioral sciences. He terms this account of method the abductive theory of method (ATOM) in recognition of the importance it assigns to explanatory reasoning. ATOM offers the framework for a coherent treatment of a range of quantitative and qualitative behavioral research methods, giving equal treatment to data-analytic methods and methods of theory construction.
"Haig draws on the new experimentalism in the philosophy of science to reconstruct the process of phenomena detection as it applies to psychology; he considers the logic and purpose of exploratory factor analysis; he discusses analogical modeling as a means of theory development; and he recommends the use of inference to the best explanation for evaluating theories in psychology. Finally, he outlines the nature of research problems, discusses the nature of the abductive method, and describes applications of the method to grounded theory method and clinical reasoning. The book will be of interest not only to philosophers of science but also to psychological researchers who want to deepen their conceptual understanding of research methods and methodological concerns."

--- "Abduction" is always a warning sign.
to:NB  books:noted  psychology  social_science_methodology  methodological_advice  to_be_shot_after_a_fair_trial 
6 days ago
[1412.2309] Visual Causal Feature Learning
"We react to what we see, but what exactly is it that we react to? What are the visual causes of behavior? Can we identify such causes from raw image data? If the visual features are causes, how can we manipulate them? Here we provide a rigorous definition of the visual cause of a behavior that is broadly applicable to the visually driven behavior in humans, animals, neurons, robots and other perceiving systems. Our framework generalizes standard accounts of causal learning to settings in which the causal variables need to be constructed from microvariables (raw image pixels in this case). We prove the Causal Coarsening Theorem, which allows us to gain causal knowledge from observational data with minimal experimental effort. The theorem provides a connection to standard inference techniques in machine learning that identify features of an image that correlate with, but may not cause, the target behavior. Finally, we propose an active learning scheme to learn a manipulator function that performs optimal manipulations on the image to automatically identify the visual cause of a target behavior. We illustrate our inference and learning algorithms in experiments based on both synthetic and real data. To our knowledge, our account is the first demonstration of true causal feature learning in the literature."

--- Where by "heard the talk" I mean Frederick explained it while he was visiting...
to:NB  heard_the_talk  classifiers  machine_learning  causal_inference  kith_and_kin  eberhardt.frederick  have_read  to:blog 
7 days ago
A Natural History of Natural Theology | The MIT Press
"Questions about the existence and attributes of God form the subject matter of natural theology, which seeks to gain knowledge of the divine by relying on reason and experience of the world. Arguments in natural theology rely largely on intuitions and inferences that seem natural to us, occurring spontaneously—at the sight of a beautiful landscape, perhaps, or in wonderment at the complexity of the cosmos—even to a nonphilosopher. In this book, Helen De Cruz and Johan De Smedt examine the cognitive origins of arguments in natural theology. They find that although natural theological arguments can be very sophisticated, they are rooted in everyday intuitions about purpose, causation, agency, and morality. Using evidence and theories from disciplines including the cognitive science of religion, evolutionary ethics, evolutionary aesthetics, and the cognitive science of testimony, they show that these intuitions emerge early in development and are a stable part of human cognition.
"De Cruz and De Smedt analyze the cognitive underpinnings of five well-known arguments for the existence of God: the argument from design, the cosmological argument, the moral argument, the argument from beauty, and the argument from miracles. Finally, they consider whether the cognitive origins of these natural theological arguments should affect their rationality."
to:NB  books:noted  cognitive_science  psychology  theology  natural_history_of_truthiness  anima_naturaliter_christina_vel_superstitiosam 
8 days ago
Advanced Structured Prediction | The MIT Press
"The goal of structured prediction is to build machine learning models that predict relational information that itself has structure, such as being composed of multiple interrelated parts. These models, which reflect prior knowledge, task-specific relations, and constraints, are used in fields including computer vision, speech recognition, natural language processing, and computational biology. They can carry out such tasks as predicting a natural language sentence, or segmenting an image into meaningful components.
"These models are expressive and powerful, but exact computation is often intractable. A broad research effort in recent years has aimed at designing structured prediction models and approximate inference and learning procedures that are computationally efficient. This volume offers an overview of this recent research in order to make the work accessible to a broader research community. The chapters, by leading researchers in the field, cover a range of topics, including research trends, the linear programming relaxation approach, innovations in probabilistic modeling, recent theoretical progress, and resource-aware learning."
to:NB  books:noted  relational_learning  machine_learning  structured_data  statistics  computational_statistics  coveted 
8 days ago
The Geometry of Meaning | The MIT Press
"In The Geometry of Meaning, Peter Gärdenfors proposes a theory of semantics that bridges cognitive science and linguistics and shows how theories of cognitive processes, in particular concept formation, can be exploited in a general semantic model. He argues that our minds organize the information involved in communicative acts in a format that can be modeled in geometric or topological terms—in what he terms conceptual spaces, extending the theory he presented in an earlier book by that name.
"Many semantic theories consider the meanings of words as relatively stable and independent of the communicative context. Gärdenfors focuses instead on how various forms of communication establish a system of meanings that becomes shared between interlocutors. He argues that these “meetings of mind” depend on the underlying geometric structures, and that these structures facilitate language learning. Turning to lexical semantics, Gärdenfors argues that a unified theory of word meaning can be developed by using conceptual spaces. He shows that the meaning of different word classes can be given a cognitive grounding, and offers semantic analyses of nouns, adjectives, verbs, and prepositions. He also presents models of how the meanings of words are composed to form new meanings and of the basic semantic role of sentences. Finally, he considers the future implications of his theory for robot semantics and the Semantic Web."
to:NB  books:noted  semantics  cognitive_science  linguistics  social_life_of_the_mind  geometry 
8 days ago
[1412.1897] Deep Neural Networks are Easily Fooled: High Confidence Predictions for Unrecognizable Images
"Deep neural networks (DNNs) have recently been achieving state-of-the-art performance on a variety of pattern-recognition tasks, most notably visual classification problems. Given that DNNs are now able to classify objects in images with near-human-level performance, questions naturally arise as to what differences remain between computer and human vision. A recent study revealed that changing an image (e.g. of a lion) in a way imperceptible to humans can cause a DNN to label the image as something else entirely (e.g. mislabeling a lion a library). Here we show a related result: it is easy to produce images that are completely unrecognizable to humans, but that state-of-the-art DNNs believe to be recognizable objects with 99.99% confidence (e.g. labeling with certainty that white noise static is a lion). Specifically, we take convolutional neural networks trained to perform well on either the ImageNet or MNIST datasets and then find images with evolutionary algorithms or gradient ascent that DNNs label with high confidence as belonging to each dataset class. It is possible to produce images totally unrecognizable to human eyes that DNNs believe with near certainty are familiar objects. Our results shed light on interesting differences between human vision and current DNNs, and raise questions about the generality of DNN computer vision."

--- The pictures really have to be seen to be believed.
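--- The recipe, stripped to its essentials, is gradient ascent in image space on a fixed classifier's confidence in a target class. A toy sketch with a linear "classifier" standing in for the convnet (all names and numbers here are mine, and the paper also uses evolutionary search rather than only gradients):

```python
import numpy as np

def fool_by_gradient_ascent(w, b, x0, lr=0.5, steps=200):
    """Gradient ascent on the input x (not the weights!) to maximize
    a fixed logistic classifier's confidence p(y=1|x).  w, b define
    the classifier; x0 is the starting 'image', e.g. uniform noise.
    Pixels are clipped to [0, 1] after every step."""
    x = x0.copy()
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(w @ x + b)))
        grad = (1.0 - p) * w                  # d log p(y=1|x) / dx
        x = np.clip(x + lr * grad, 0.0, 1.0)  # stay in pixel range
    return x, p

rng = np.random.default_rng(0)
w = rng.standard_normal(64)            # frozen toy "classifier"
x0 = rng.uniform(size=64)              # "white noise" starting image
x_fooled, conf = fool_by_gradient_ascent(w, 0.0, x0)
```

The point the paper makes is that the resulting x_fooled need not look like anything to a human: high confidence only says the input sits deep inside the classifier's decision region, not that it resembles the training class.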
to:NB  have_read  machine_learning  classifiers  neural_networks  to:blog 
8 days ago
Political Language in Economics
"Does political ideology influence economic research? We rely upon purely inductive methods in natural language processing and machine learning to examine patterns of implicit political ideology in economic articles. Using observed political behavior of economists and the phrases from their academic articles, we construct a high-dimensional predictor of political ideology by article, economist, school, and journal. In addition to field, journal, and editor ideology, we look at the correlation of author ideology with magnitudes of reported policy relevant elasticities. Overall our results suggest that there is substantial sorting by ideology into fields, departments, and methodologies, and that political ideology influences the results of economic research."
in_NB  economics  ideology  political_economy  text_mining  naidu.suresh  jelveh.zubin  to:blog  topic_models  linear_regression  to_teach:data-mining 
8 days ago
[1405.5594] From Finite Automata to Regular Expressions and Back--A Summary on Descriptional Complexity
"The equivalence of finite automata and regular expressions dates back to the seminal paper of Kleene on events in nerve nets and finite automata from 1956. In the present paper we tour a fragment of the literature and summarize results on upper and lower bounds on the conversion of finite automata to regular expressions and vice versa. We also briefly recall the known bounds for the removal of spontaneous transitions (epsilon-transitions) on non-epsilon-free nondeterministic devices. Moreover, we report on recent results on the average case descriptional complexity bounds for the conversion of regular expressions to finite automata and brand new developments on the state elimination algorithm that converts finite automata to regular expressions."
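--- The state elimination algorithm they survey is easy to sketch: repeatedly delete an intermediate state q, splicing every path i -> q -> j into the regex R_iq (R_qq)* R_qj. A toy version, assuming (my simplification, to keep the final wrap-up formula trivial) that the accept state has no outgoing edges:

```python
def state_elimination(n, edges, start, accept):
    """Convert a finite automaton to a regular expression by state
    elimination.  `edges` maps (i, j) -> regex string for states
    0..n-1.  Assumes accept has no outgoing edges, so at the end only
    a possible start self-loop and the start->accept edge remain."""
    R = dict(edges)
    def get(i, j):
        return R.get((i, j))
    def union(a, b):
        if a is None: return b
        if b is None: return a
        return f"({a}|{b})"
    for q in range(n):
        if q in (start, accept):
            continue
        loop = f"({get(q, q)})*" if get(q, q) else ""
        ins = [(i, r) for (i, j), r in R.items() if j == q and i != q]
        outs = [(j, r) for (i, j), r in R.items() if i == q and j != q]
        for i, rin in ins:
            for j, rout in outs:
                # splice i -> q -> j into a direct i -> j edge
                R[(i, j)] = union(get(i, j), rin + loop + rout)
        R = {k: v for k, v in R.items() if q not in k}
    loop = f"({get(start, start)})*" if get(start, start) else ""
    return loop + (get(start, accept) or "")

# a*b over two states: 0 -a-> 0 (self-loop), 0 -b-> 1 (accept)
regex = state_elimination(2, {(0, 0): "a", (0, 1): "b"}, start=0, accept=1)
```

The descriptional-complexity issue the paper discusses shows up immediately: each elimination can multiply subexpressions, so a bad elimination order blows the regex up exponentially.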
to:NB  automata_theory  regular_expressions  theoretical_computer_science  via:vaguery 
8 days ago
Why America Is Not a New Rome | The MIT Press
"America’s post–Cold War strategic dominance and its pre-recession affluence inspired pundits to make celebratory comparisons to ancient Rome at its most powerful. Now, with America no longer perceived as invulnerable, engaged in protracted fighting in Iraq and Afghanistan, and suffering the worst economic downturn since the Great Depression, comparisons are to the bloated, decadent, ineffectual later Empire. In Why America Is Not a New Rome, Vaclav Smil looks at these comparisons in detail, going deeper than the facile analogy-making of talk shows and glossy magazine articles. He finds profound differences.
"Smil, a scientist and a lifelong student of Roman history, focuses on several fundamental concerns: the very meaning of empire; the actual extent and nature of Roman and American power; the role of knowledge and innovation; and demographic and economic basics--population dynamics, illness, death, wealth, and misery. America is not a latter-day Rome, Smil finds, and we need to understand this in order to look ahead without the burden of counterproductive analogies. Superficial similarities do not imply long-term political, demographic, or economic outcomes identical to Rome’s."

--- I'd be extra interested if he goes into why people are drawn to this bad analogy.
to:NB  books:noted  something_about_america  uses_of_the_past  roman_empire  smil.vaclav  debunking 
10 days ago
Inventions That Didn't Change the World by Julie Halls - Powell's Books
"a fascinating visual tour through some of the most bizarre inventions registered with the British authorities in the nineteenth century. In an era when Britain was the workshop of the world, design protection (nowadays patenting) was all the rage, and the apparently lenient approval process meant that all manner of bizarre curiosities were painstakingly recorded, in beautiful color illustrations and well-penned explanatory text, alongside the genuinely great inventions of the period. Irreverent commentary contextualizes each submission as well as taking a humorous view on how each has stood the test of time. This book introduces such gems as a ventilating top hat; an artificial leech; a design for an aerial machine adapted for the arctic regions; an anti-explosive alarm whistle; a tennis racket with ball-picker; and a currant-cleaning machine. Here is everything the end user could possibly require for a problem he never knew he had. Organized by area of application--industry, clothing, transportation, medical, health and safety, the home, and leisure--the book reveals the concerns of a bygone era giddy with the possibilities of a newly industrialized world."
books:noted  history_of_technology  the_garden_of_forking_paths_weighs_like_a_nightmare_on_the_brains_of_the_living 
10 days ago
Grimmer, J., Westwood, S., and Messing, S.: The Impression of Influence: Legislator Communication, Representation, and Democratic Accountability. (eBook, Paperback and Hardcover)
"Constituents often fail to hold their representatives accountable for federal spending decisions—even though those very choices have a pervasive influence on American life. Why does this happen? Breaking new ground in the study of representation, The Impression of Influence demonstrates how legislators skillfully inform constituents with strategic communication and how this facilitates or undermines accountability. Using a massive collection of Congressional texts and innovative experiments and methods, the book shows how legislators create an impression of influence through credit claiming messages.
"Anticipating constituents’ reactions, legislators claim credit for programs that elicit a positive response, making constituents believe their legislator is effectively representing their district. This spurs legislators to create and defend projects popular with their constituents. Yet legislators claim credit for much more—they announce projects long before they begin, deceptively imply they deserve credit for expenditures they had little role in securing, and boast about minuscule projects. Unfortunately, legislators get away with seeking credit broadly because constituents evaluate the actions that are reported, rather than the size of the expenditures."
to:NB  books:noted  us_politics  congress  political_science  rhetorical_self-fashioning  natural_history_of_truthiness 
11 days ago
Opium for the Masses? Conflict-Induced Narcotics Production in Afghanistan
"To explain the rise in Afghan opium production, we explore how rising conflicts change the incentives of farmers. Conflicts make illegal opportunities more profitable as they increase the perceived lawlessness and destroy infrastructure crucial to alternative crops. Exploiting a unique data set, we show that Western hostile casualties, our proxy for conflict, have a strong impact on subsequent local opium production. Using the period after the planting season as a placebo test, we show that conflict has a strong effect before but no effect after planting, indicating causality."

- Replication data available?
to:NB  to_read  war  afghanistan  causal_inference  drugs  time_series  statistics  to_teach:undergrad-ADA  economics 
11 days ago
Probabilistic cognition in two indigenous Mayan groups
"Is there a sense of chance shared by all individuals, regardless of their schooling or culture? To test whether the ability to make correct probabilistic evaluations depends on educational and cultural guidance, we investigated probabilistic cognition in preliterate and prenumerate Kaqchikel and K’iche’, two indigenous Mayan groups, living in remote areas of Guatemala. Although the tested individuals had no formal education, they performed correctly in tasks in which they had to consider prior and posterior information, proportions and combinations of possibilities. Their performance was indistinguishable from that of Mayan school children and Western controls. Our results provide evidence for the universal nature of probabilistic cognition."
to:NB  to_read  cognitive_development  cognitive_science  psychology  cultural_differences 
11 days ago
Rethinking natural altruism: Simple reciprocal interactions trigger children’s benevolence
"A very simple reciprocal activity elicited high degrees of altruism in 1- and 2-y-old children, whereas friendly but nonreciprocal activity yielded little subsequent altruism. In a second study, reciprocity with one adult led 1- and 2-y-olds to provide help to a new person. These results question the current dominant claim that social experiences cannot account for early occurring altruistic behavior. A third study, with preschool-age children, showed that subtle reciprocal cues remain potent elicitors of altruism, whereas a fourth study with preschoolers showed that even a brief reciprocal experience fostered children’s expectation of altruism from others. Collectively, the studies suggest that simple reciprocal interactions are a potent trigger of altruism for young children, and that these interactions lead children to believe that their relationships are characterized by mutual care and commitment."

- Contributed, so who knows?
to:NB  evolution_of_cooperation  experimental_psychology  psychology 
11 days ago
Rapid climate change did not cause population collapse at the end of the European Bronze Age
"The impact of rapid climate change on contemporary human populations is of global concern. To contextualize our understanding of human responses to rapid climate change it is necessary to examine the archeological record during past climate transitions. One episode of abrupt climate change has been correlated with societal collapse at the end of the northwestern European Bronze Age. We apply new methods to interrogate archeological and paleoclimate data for this transition in Ireland at a higher level of precision than has previously been possible. We analyze archeological 14C dates to demonstrate dramatic population collapse and present high-precision proxy climate data, analyzed through Bayesian methods, to provide evidence for a rapid climatic transition at ca. 750 calibrated years B.C. Our results demonstrate that this climatic downturn did not initiate population collapse and highlight the nondeterministic nature of human responses to past climate change."
to:NB  archaeology  ancient_history  climate_change  statistics  to_read 
11 days ago
Amazonian landscapes and the bias in field studies of forest structure and biomass
"Tropical forests convert more atmospheric carbon into biomass each year than any terrestrial ecosystem on Earth, underscoring the importance of accurate tropical forest structure and biomass maps for the understanding and management of the global carbon cycle. Ecologists have long used field inventory plots as the main tool for understanding forest structure and biomass at landscape-to-regional scales, under the implicit assumption that these plots accurately represent their surrounding landscape. However, no study has used continuous, high-spatial-resolution data to test whether field plots meet this assumption in tropical forests. Using airborne LiDAR (light detection and ranging) acquired over three regions in Peru, we assessed how representative a typical set of field plots are relative to their surrounding host landscapes. We uncovered substantial mean biases (9–98%) in forest canopy structure (height, gaps, and layers) and aboveground biomass in both lowland Amazonian and montane Andean landscapes. Moreover, simulations reveal that an impractical number of 1-ha field plots (from 10 to more than 100 per landscape) are needed to develop accurate estimates of aboveground biomass at landscape scales. These biases should temper the use of plots for extrapolations of forest dynamics to larger scales, and they demonstrate the need for a fundamental shift to high-resolution active remote sensing techniques as a primary sampling tool in tropical forest biomass studies. The potential decrease in the bias and uncertainty of remotely sensed estimates of forest structure and biomass is a vital step toward successful tropical forest conservation and climate-change mitigation policy."
to:NB  ecology  surveys  statistics 
11 days ago
Business culture and dishonesty in the banking industry
"Trust in others’ honesty is a key component of the long-term performance of firms, industries, and even whole countries1, 2, 3, 4. However, in recent years, numerous scandals involving fraud have undermined confidence in the financial industry5, 6, 7. Contemporary commentators have attributed these scandals to the financial sector’s business culture8, 9, 10, but no scientific evidence supports this claim. Here we show that employees of a large, international bank behave, on average, honestly in a control condition. However, when their professional identity as bank employees is rendered salient, a significant proportion of them become dishonest. This effect is specific to bank employees because control experiments with employees from other industries and with students show that they do not become more dishonest when their professional identity or bank-related items are rendered salient. Our results thus suggest that the prevailing business culture in the banking industry weakens and undermines the honesty norm, implying that measures to re-establish an honest culture are very important."

--- Comment is superfluous, except that as a customer, I really want to know which bank.
to:NB  experimental_economics  experimental_psychology  trust  evolution_of_cooperation  institutions 
11 days ago
An enteric virus can replace the beneficial function of commensal bacteria
"Intestinal microbial communities have profound effects on host physiology1. Whereas the symbiotic contribution of commensal bacteria is well established, the role of eukaryotic viruses that are present in the gastrointestinal tract under homeostatic conditions is undefined2, 3. Here we demonstrate that a common enteric RNA virus can replace the beneficial function of commensal bacteria in the intestine. Murine norovirus (MNV) infection of germ-free or antibiotic-treated mice restored intestinal morphology and lymphocyte function without inducing overt inflammation and disease. The presence of MNV also suppressed an expansion of group 2 innate lymphoid cells observed in the absence of bacteria, and induced transcriptional changes in the intestine associated with immune development and type I interferon (IFN) signalling. Consistent with this observation, the IFN-α receptor was essential for the ability of MNV to compensate for bacterial depletion. Importantly, MNV infection offset the deleterious effect of treatment with antibiotics in models of intestinal injury and pathogenic bacterial infection. These data indicate that eukaryotic viruses have the capacity to support intestinal homeostasis and shape mucosal immunity, similarly to commensal bacteria."

--- I see no way in which this could provide the premise for a very squicky SF/horror story about a well-intentioned medical intervention going horribly awry.
to:NB  viruses  biology  your_body_is_an_ecosystem 
11 days ago
Knowns and unknowns for psychophysiological endophenotypes: Integration and response to commentaries - Iacono - 2014 - Psychophysiology
"We review and summarize seven molecular genetic studies of 17 psychophysiological endophenotypes that comprise this special issue of Psychophysiology, address criticisms raised in accompanying Perspective and Commentary pieces, and offer suggestions for future research. Endophenotypes are polygenic, and possibly influenced by rare genetic variants. Because they are not simpler genetically than clinical phenotypes, they are unlikely to assist gene discovery for psychiatric disorder. Once genetic variants for clinical phenotypes are identified, associated endophenotypes are likely to provide valuable insights into the psychological and neural mechanisms important to disorder pathology. This special issue provides a foundation for informed future steps in endophenotype genetics, including the formation of large sample consortia capable of fleshing out the many genetic variants contributing to individual differences in psychophysiological measures."
to:NB  psychology  human_genetics  genomics 
11 days ago
The Hellenistic Far East - Rachel Mairs - University of California Press
"In the aftermath of Alexander the Great’s conquests in the late fourth century B.C., Greek garrisons and settlements were established across Central Asia, through Bactria (modern-day Afghanistan) and into India. Over the next three hundred years, these settlements evolved into multiethnic, multilingual communities as much Greek as they were indigenous. To explore the lives and identities of the inhabitants of the Graeco-Bactrian and Indo-Greek kingdoms, Rachel Mairs marshals a variety of evidence, from archaeology, to coins, to documentary and historical texts. Looking particularly at the great city of Ai Khanoum, the only extensively excavated Hellenistic period urban site in Central Asia, Mairs explores how these ancient people lived, communicated, and understood themselves. Significant and original, The Hellenistic Far East will highlight Bactrian studies as an important part of our understanding of the ancient world."
in_NB  books:noted  ancient_history  afghanistan  hellenistic_era  imperialism  central_asia  cultural_exchange 
12 days ago
[1302.5847] Characterizing Branching Processes from Sampled Data
"Branching processes model the evolution of populations of agents that randomly generate offsprings. These processes, more patently Galton-Watson processes, are widely used to model biological, social, cognitive, and technological phenomena, such as the diffusion of ideas, knowledge, chain letters, viruses, and the evolution of humans through their Y-chromosome DNA or mitochondrial RNA. A practical challenge of modeling real phenomena using a Galton-Watson process is the offspring distribution, which must be measured from the population. In most cases, however, directly measuring the offspring distribution is unrealistic due to lack of resources or the death of agents. So far, researchers have relied on informed guesses to guide their choice of offspring distribution. In this work we propose two methods to estimate the offspring distribution from real sampled data. Using a small sampled fraction of the agents and instrumented with the identity of the ancestors of the sampled agents, we show that accurate offspring distribution estimates can be obtained by sampling as little as 14% of the population."
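- For the easy case the paper contrasts itself with: if the tree were fully observed, the MLE of the offspring distribution is just the empirical distribution of per-individual offspring counts. A minimal sketch of that baseline (the paper's harder sampled-data setting needs the authors' estimators instead):

```python
from collections import Counter

import numpy as np


def offspring_mle(children):
    """MLE of a Galton-Watson offspring distribution from a fully
    observed tree: the empirical distribution of offspring counts.
    `children` maps each observed individual to its number of offspring."""
    counts = Counter(children.values())
    n = sum(counts.values())
    kmax = max(counts)
    # p[k] = estimated probability of having k offspring
    return np.array([counts.get(k, 0) / n for k in range(kmax + 1)])
```

The mean of the returned vector is the estimated branching ratio, which determines sub- vs. super-criticality.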
in_NB  statistics  branching_processes  stochastic_processes  statistical_inference_for_stochastic_processes 
12 days ago
[1209.6254] Diagnostics for Respondent-driven Sampling
"Respondent-driven sampling (RDS) is a widely used method for sampling from hard-to-reach human populations, especially groups most at-risk for HIV/AIDS. Data are collected through a peer-referral process in which current sample members harness existing social networks to recruit additional sample members. RDS has proven to be a practical method of data collection in many difficult settings and has been adopted by leading public health organizations around the world. Unfortunately, inference from RDS data requires many strong assumptions because the sampling design is not fully known and is partially beyond the control of the researcher. In this paper, we introduce diagnostic tools for most of the assumptions underlying RDS inference. We also apply these diagnostics in a case study of 12 populations at increased risk for HIV/AIDS. We developed these diagnostics to enable RDS researchers to better understand their data and to encourage future statistical research on RDS."
in_NB  statistics  surveys  model_checking  network_data_analysis  respondent-driven_sampling  gile.krista_j.  salganik.matthew_j. 
13 days ago
Modeling Individual Expertise in Group Judgments
"Group judgments are often—implicitly or explicitly—influenced by their members’ individual expertise. However, given that expertise is seldom recognized fully and that some distortions may occur (bias, correlation, etc.), it is not clear that differential weighting is an epistemically advantageous strategy with respect to straight averaging. Our paper characterizes a wide set of conditions under which differential weighting outperforms straight averaging and embeds the results into the multidisciplinary group decision-making literature."
to:NB  to_read  collective_cognition  re:democratic_cognition  decision_theory 
13 days ago
AER (104,12) p. 3921 - The Effects of Poor Neonatal Health on Children's Cognitive Development
"We make use of a new data resource -- merged birth and school records for all children born in Florida from 1992 to 2002 -- to study the relationship between birth weight and cognitive development. Using singletons as well as twin and sibling fixed effects models, we find that the effects of early health on cognitive development are essentially constant through the school career; that these effects are similar across a wide range of family backgrounds; and that they are invariant to measures of school quality. We conclude that the effects of early health on adult outcomes are therefore set very early."

- The _economic_ content of this is not exactly obvious.
to:NB  cognitive_development  statistics 
14 days ago
Seeing the Light: The Social Logic of Personal Discovery, DeGloma
"When people discover a life-changing truth, they typically ally with a new community. Individuals then use these autobiographical stories to shape their stances on highly controversial issues such as childhood abuse, war and patriotism, political ideology, human sexuality, and religion. Thus, while such stories are seemingly very personal, they also have a distinctly social nature. Tracing a wide variety of narratives through nearly three thousand years of history, Seeing the Light uncovers the common threads of such stories and reveals the crucial, little-recognized social logic of personal discovery."

- Well, _publishing stories_ is a social act, sure; what about people who "discover life-changing truths" but just quietly change their own lives?
to:NB  books:noted  moral_psychology  rhetorical_self-fashioning 
14 days ago
The Romantic Machine: Utopian Science and Technology after Napoleon, Tresch
"Focusing on a set of celebrated technologies, including steam engines, electromagnetic and geophysical instruments, early photography, and mass-scale printing, Tresch looks at how new conceptions of energy, instrumentality, and association fueled such diverse developments as fantastic literature, popular astronomy, grand opera, positivism, utopian socialism, and the Revolution of 1848. He shows that those who attempted to fuse organicism and mechanism in various ways, including Alexander von Humboldt and Auguste Comte, charted a road not taken that resonates today."

- Cf. Barzun's _Classic, Romantic, and Modern_?
to:NB  books:noted  history_of_ideas  romanticism 
14 days ago
Juvenescence: A Cultural History of Our Age, Harrison
"Drawing on the scientific concept of neoteny, or the retention of juvenile characteristics through adulthood, and extending it into the cultural realm, Harrison argues that youth is essential for culture’s innovative drive and flashes of genius. At the same time, however, youth—which Harrison sees as more protracted than ever—is a luxury that requires the stability and wisdom of our elders and the institutions. “While genius liberates the novelties of the future,” Harrison writes, “wisdom inherits the legacies of the past, renewing them in the process of handing them down.”"
to:NB  books:noted  cultural_criticism  to_be_shot_after_a_fair_trial 
14 days ago
The Greatest Shows on Earth: A History of the Circus, Simon
"Traveling back to the circus’s early days, Linda Simon takes us to eighteenth-century hippodromes in Great Britain and intimate one-ring circuses in nineteenth-century Paris, where Toulouse-Lautrec and Picasso became enchanted with aerialists and clowns. She introduces us to P. T. Barnum, James Bailey, and the enterprising Ringling Brothers and reveals how they created the golden age of American circuses. Moving forward to the whimsical Circus Oz in Australia and to New York City’s Big Apple Circus and the grand spectacle of Cirque du Soleil, she shows how the circus has transformed in recent years..."
books:noted  history  circus 
14 days ago
How TED (Really) Works — The Message — Medium
"Division of labor is the magic that lets us get on an airplane, confident that the pilots are well-slept, the engine inspected, and the tower on top of the scheduling landings. I don’t want my pilot to be one-of-a-kind, or have her own theory of airplane flying. Standardization makes the Internet run, and allow generativity to flourish on top of it. Standardization, proper regulation and competency make the world possible, and make it a magical place. Otherwise, everything would be a struggle, and creativity could not thrive. From our food to our health, we trust everything to strangers, and everything works to the degree that they can take care of their end, and do it well. It’s on that base we innovate, and create the unexpected. It’s also what underlies the creativity unleashed by successful technologies: base aspects are standardized so people can run with the rest.
"How to structure things so that people are properly trained, credentialed and have the right incentives is a whole complex field, and it sounds mundane and less sexy than studying 21st century virality, but it is essential. It’s why you can walk onto a big stage without having looked at a mirror, because you know you can trust the person whose job it is to take care of this. And the opposite is the anxiety we feel in situations where institutions don’t function as well, and where one keeps having to acquire competencies just to take care of basic functions: most places on the planet. The inability to trust this division of labor is among the most tiring aspects of living in less developed countries. The trauma is micro—but it adds up."
division_of_labor  modernity  sociology  institutions  have_read  tufekci.zeynep  to:blog 
15 days ago
[1410.8260] Selecting the number of principal components: estimation of the true rank of a noisy matrix
"Principal component analysis (PCA) is a well-known tool in multivariate statistics. One big challenge in using the method is the choice of the number of components. In this paper, we propose an exact distribution-based method for this purpose: our approach is related to the adaptive regression framework of Taylor et al. (2013). Assuming Gaussian noise, we use the conditional distribution of the eigenvalues of a Wishart matrix as our test statistic, and derive exact hypothesis tests and confidence intervals for the true singular values. In simulation studies we find that our proposed method compares well to the proposal of Kritchman & Nadler (2008), which uses the asymptotic distribution of singular values based on the Tracy-Widom laws."
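- Not the paper's exact conditional test, but Horn's parallel analysis is a simple permutation baseline for the same rank-selection problem: keep the components whose singular values exceed those of column-permuted (signal-destroyed) copies of the data. A sketch:

```python
import numpy as np


def parallel_analysis_rank(X, n_perm=50, quantile=0.95, seed=0):
    """Estimate the number of principal components by comparing the
    singular values of (centered) X to a permutation null that scrambles
    each column independently, destroying cross-column structure."""
    rng = np.random.default_rng(seed)
    s = np.linalg.svd(X - X.mean(0), compute_uv=False)
    null = np.empty((n_perm, len(s)))
    for b in range(n_perm):
        Xp = np.column_stack([rng.permutation(col) for col in X.T])
        null[b] = np.linalg.svd(Xp - Xp.mean(0), compute_uv=False)
    thresh = np.quantile(null, quantile, axis=0)
    k = 0
    while k < len(s) and s[k] > thresh[k]:
        k += 1
    return k
```

Unlike the Taylor-style conditional approach in the paper, this gives no exact p-values, but it needs no Gaussian-noise assumption either.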
to:NB  principal_components  model_selection  statistics  tibshirani.robert 
15 days ago
[1410.7690] Trend Filtering on Graphs
"We introduce a family of adaptive estimators on graphs, based on penalizing the ℓ1 norm of discrete graph differences. This generalizes the idea of trend filtering [Kim et al. (2009), Tibshirani (2014)], used for univariate nonparametric regression, to graphs. Analogous to the univariate case, graph trend filtering exhibits a level of local adaptivity unmatched by the usual ℓ2-based graph smoothers. It is also defined by a convex minimization problem that is readily solved (e.g., by fast ADMM or Newton algorithms). We demonstrate the merits of graph trend filtering through examples and theory."
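- For k=0 the problem is the graph fused lasso: minimize 0.5||y - beta||^2 + lam * sum over edges of |beta_i - beta_j|. A minimal dense-numpy ADMM sketch of that special case (generic generalized-lasso ADMM, not the authors' fast algorithms):

```python
import numpy as np


def graph_trend_filter(y, edges, lam, rho=1.0, n_iter=500):
    """ADMM for k=0 graph trend filtering (graph fused lasso):
    minimize 0.5*||y - beta||^2 + lam * sum_{(i,j) in E} |beta_i - beta_j|."""
    n, m = len(y), len(edges)
    D = np.zeros((m, n))                      # oriented incidence matrix
    for k, (i, j) in enumerate(edges):
        D[k, i], D[k, j] = 1.0, -1.0
    z, u = np.zeros(m), np.zeros(m)
    A = np.eye(n) + rho * D.T @ D             # fixed beta-update system
    for _ in range(n_iter):
        beta = np.linalg.solve(A, y + rho * D.T @ (z - u))
        Dbeta = D @ beta
        # soft-thresholding = proximal operator of the l1 edge penalty
        z = np.sign(Dbeta + u) * np.maximum(np.abs(Dbeta + u) - lam / rho, 0.0)
        u = u + Dbeta - z
    return beta
```

On a chain graph this reduces to ordinary 1-D fused-lasso denoising, which makes the local adaptivity easy to see: within-block differences get shrunk to zero while large jumps survive.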
to:NB  network_data_analysis  smoothing  statistics  nonparametrics  splines  kith_and_kin  tibshirani.ryan_j.  sharpnack.james 
15 days ago
[1405.0558] The Falling Factorial Basis and Its Statistical Applications
"We study a novel spline-like basis, which we name the "falling factorial basis", bearing many similarities to the classic truncated power basis. The advantage of the falling factorial basis is that it enables rapid, linear-time computations in basis matrix multiplication and basis matrix inversion. The falling factorial functions are not actually splines, but are close enough to splines that they provably retain some of the favorable properties of the latter functions. We examine their application in two problems: trend filtering over arbitrary input points, and a higher-order variant of the two-sample Kolmogorov-Smirnov test."
to:NB  have_read  splines  nonparametrics  statistics  two-sample_tests  kith_and_kin  tibshirani.ryan_j. 
15 days ago
The Superiority of Economists
"In this essay, we investigate the dominant position of economics within the network of the social sciences in the United States. We begin by documenting the relative insularity of economics, using bibliometric data. Next we analyze the tight management of the field from the top down, which gives economics its characteristic hierarchical structure. Economists also distinguish themselves from other social scientists through their much better material situation (many teach in business schools, have external consulting activities), their more individualist worldviews, and in the confidence they have in their discipline’s ability to fix the world’s problems. Taken together, these traits constitute what we call the superiority of economists, where economists’ objective supremacy is intimately linked with their subjective sense of authority and entitlement. While this superiority has certainly fueled economists’ practical involvement and their considerable influence over the economy, it has also exposed them more to conflicts of interests, political critique, even derision."
to:NB  have_read  economics  sociology_of_science  to:blog 
15 days ago
[1411.1715] Network Cross-Validation for Determining the Number of Communities in Network Data
"The stochastic block model and its variants have been a popular tool in analyzing large network data with community structures. Model selection for these network models, such as determining the number of communities, has been a challenging statistical inference task. In this paper we develop an efficient cross-validation approach to determine the number of communities, as well as to choose between the regular stochastic block model and the degree corrected block model. Our method, called network cross-validation, is based on a block-wise edge splitting technique, combined with an integrated step of community recovery using sub-blocks of the adjacency matrix. The solid performance of our method is supported by theoretical analysis of the sub-block parameter estimation, and is demonstrated in extensive simulations and a data example. Extensions to more general network models are also discussed."
to_read  cross-validation  community_discovery  network_data_analysis  statistics  kith_and_kin  lei.jing  re:XV_for_networks  in_NB  to:blog 
16 days ago
Diagrams as Vehicles of Scientific Reasoning
"Diagrams are ubiquitous in scientific talks, papers, and textbooks. Although diagrams are clearly a tool for communicating experimental procedures, empirical results, relations between causal factors, and mechanistic explanations, they are also key vehicles of reasoning—diagrams provide essential tools for exploring variations in experimental design, identifying new explanatory relations in experimental data, and advancing and revising mechanistic models. They also play a crucial role in the design of computational models that show how an identified mechanism would behave under a variety of conditions (including alterations to the environment and to the mechanism).
"This interdisciplinary workshop seeks to expand our understanding of the ways in which diagrams contribute to science through analysis of diagrams used in actual scientific research and theoretical accounts and experimental investigations of the ways scientists construct or reason with diagrams. We invite papers from both philosophers and historians of science, cognitive scientists who study diagrams, and scientists who use them. Hotel accommodations for three nights will be provided for those whose papers are accepted, but we are not able to cover travel costs. "
conferences  visual_display_of_quantitative_information  cognition  natural_born_cyborgs 
16 days ago
Multiscale Modeling and Emergence
"There has been much interest of late in issues of emergence and reduction in the philosophy of science literature. The battle line is largely drawn between reductive "bottom-up" modeling and "top-down" modeling employing so-called "phenomenological" theories. This workshop aims to examine the nature and plausibility of structuring the debate in this way. We bring physicists and mathematicians together with philosophers interested in modeling systems across scales. Multiscale models are beginning to succeed in showing how to upscale from statistical/atomistic models to continuum/hydrodynamic models. A proper understanding of the mathematics involved in such multiscale modeling should show how overly simplified the philosophical debates have been and should refocus the debate on questions of explaining the (relative) autonomy of upper scale models and theories."
emergence  conferences  macro_from_micro  philosophy_of_science  pittsburgh 
16 days ago
Cohen and Ruths: "Classifying Political Orientation on Twitter: It’s Not Easy!" (ICWSM, 2013)
"Numerous papers have reported great success at inferring the political orientation of Twitter users. This paper has some unfortunate news to deliver: while past work has been sound and often methodologically novel, we have discovered that reported accuracies have been systemically overoptimistic due to the way in which validation datasets have been collected, reporting accuracy levels nearly 30% higher than can be expected in populations of general Twitter users. Using careful and novel data collection and annotation techniques, we collected three different sets of Twitter users, each characterizing a different degree of political engagement on Twitter - from politicians (highly politically vocal) to "normal" users (those who rarely discuss politics). Applying standard techniques for inferring political orientation, we show that methods which previously reported greater than 90% inference accuracy, actually achieve barely 65% accuracy on normal users. We also show that classifiers cannot be used to classify users outside the narrow range of political orientation on which they were trained. While a sobering finding, our results quantify and call attention to overlooked problems in the latent attribute inference literature that, no doubt, extend beyond political orientation inference: the way in which datasets are assembled and the transferability of classifiers."
to:NB  political_science  text_mining  social_media  classifiers  social_measurement 
16 days ago
The ecology of religious beliefs
"Although ecological forces are known to shape the expression of sociality across a broad range of biological taxa, their role in shaping human behavior is currently disputed. Both comparative and experimental evidence indicate that beliefs in moralizing high gods promote cooperation among humans, a behavioral attribute known to correlate with environmental harshness in nonhuman animals. Here we combine fine-grained bioclimatic data with the latest statistical tools from ecology and the social sciences to evaluate the potential effects of environmental forces, language history, and culture on the global distribution of belief in moralizing high gods (n = 583 societies). After simultaneously accounting for potential nonindependence among societies because of shared ancestry and cultural diffusion, we find that these beliefs are more prevalent among societies that inhabit poorer environments and are more prone to ecological duress. In addition, we find that these beliefs are more likely in politically complex societies that recognize rights to movable property. Overall, our multimodel inference approach predicts the global distribution of beliefs in moralizing high gods with an accuracy of 91%, and estimates the relative importance of different potential mechanisms by which this spatial pattern may have arisen. The emerging picture is neither one of pure cultural transmission nor of simple ecological determinism, but rather a complex mixture of social, cultural, and environmental influences. Our methods and findings provide a blueprint for how the increasing wealth of ecological, linguistic, and historical data can be leveraged to understand the forces that have shaped the behavior of our own species."
to:NB  statistics  causal_inference  religion  evolution_of_cooperation  to_be_shot_after_a_fair_trial  historical_materialism 
22 days ago
Effects of temperature and precipitation variability on the risk of violence in sub-Saharan Africa, 1980–2012
"Ongoing debates in the academic community and in the public policy arena continue without clear resolution about the significance of global climate change for the risk of increased conflict. Sub-Saharan Africa is generally agreed to be the region most vulnerable to such climate impacts. Using a large database of conflict events and detailed climatological data covering the period 1980–2012, we apply a multilevel modeling technique that allows for a more nuanced understanding of a climate–conflict link than has been seen heretofore. In the aggregate, high temperature extremes are associated with more conflict; however, different types of conflict and different subregions do not show consistent relationship with temperature deviations. Precipitation deviations, both high and low, are generally not significant. The location and timing of violence are influenced less by climate anomalies (temperature or precipitation variations from normal) than by key political, economic, and geographic factors. We find important distinctions in the relationship between temperature extremes and conflict by using multiple methods of analysis and by exploiting our time-series cross-sectional dataset for disaggregated analyses."

- Last tag inspired by the supposed existence of replication R code.
to:NB  war  violence  instrumental_variables  social_science_methodology  statistics  causal_inference  hierarchical_statistical_models  political_science  to_teach:undergrad-ADA 
22 days ago
[1411.4040] Kernel Density Estimation on Symmetric Spaces
"We investigate a natural variant of kernel density estimation on a large class of symmetric spaces and prove a minimax rate of convergence as fast as the minimax rate on Euclidean space. We make neither compactness assumptions on the space nor Holder-class assumptions on the densities. A main tool used in proving the convergence rate is the Helgason-Fourier transform, a generalization of the Fourier transform for semisimple Lie groups modulo maximal compact subgroups. This paper obtains a simplified formula in the special case when the symmetric space is the 2-dimensional hyperboloid."
in_NB  density_estimation  statistics  nonparametrics  kith_and_kin  asta.dena  statistics_on_manifolds 
22 days ago
The War on Learning | The MIT Press
"Behind the lectern stands the professor, deploying course management systems, online quizzes, wireless clickers, PowerPoint slides, podcasts, and plagiarism-detection software. In the seats are the students, armed with smartphones, laptops, tablets, music players, and social networking. Although these two forces seem poised to do battle with each other, they are really both taking part in a war on learning itself. In this book, Elizabeth Losh examines current efforts to “reform” higher education by applying technological solutions to problems in teaching and learning. She finds that many of these initiatives fail because they treat education as a product rather than a process. Highly touted schemes—video games for the classroom, for example, or the distribution of iPads—let students down because they promote consumption rather than intellectual development.
"Losh analyzes recent trends in postsecondary education and the rhetoric around them, often drawing on first-person accounts. In an effort to identify educational technologies that might actually work, she looks at strategies including MOOCs (massive open online courses), the gamification of subject matter, remix pedagogy, video lectures (from Randy Pausch to “the Baked Professor”), and educational virtual worlds. Finally, Losh outlines six basic principles of digital learning and describes several successful university-based initiatives. Her book will be essential reading for campus decision makers—and for anyone who cares about education and technology."
to:NB  books:noted  education  academia  pedagogy 
22 days ago
Rational Action | The MIT Press
"During World War II, the Allied military forces faced severe problems integrating equipment, tactics, and logistics into successful combat operations. To help confront these problems, scientists and engineers developed new means of studying which equipment designs would best meet the military’s requirements and how the military could best use the equipment it had on hand. By 1941 they had also begun to gather and analyze data from combat operations to improve military leaders’ ordinary planning activities. In Rational Action, William Thomas details these developments, and how they gave rise during the 1950s to a constellation of influential new fields—which he terms the “sciences of policy”—that included operations research, management science, systems analysis, and decision theory.
"Proponents of these new sciences embraced a variety of agendas. Some aimed to improve policymaking directly, while others theorized about how one decision could be considered more rational than another. Their work spanned systems engineering, applied mathematics, nuclear strategy, and the philosophy of science, and it found new niches in universities, in businesses, and at think tanks such as the RAND Corporation. The sciences of policy also took a prominent place in epic narratives told about the relationships among science, state, and society in an intellectual culture preoccupied with how technology and reason would shape the future. Thomas follows all these threads to illuminate and make new sense of the intricate relationships among scientific analysis, policymaking procedure, and institutional legitimacy at a crucial moment in British and American history."
to:NB  books:noted  decision_theory  statistics  economics  history_of_science  cold_war  WWII  optimization 
22 days ago
Distributed Algorithms | The MIT Press
"This book offers students and researchers a guide to distributed algorithms that emphasizes examples and exercises rather than the intricacies of mathematical models. It avoids mathematical argumentation, often a stumbling block for students, teaching algorithmic thought rather than proofs and logic. This approach allows the student to learn a large number of algorithms within a relatively short span of time. Algorithms are explained through brief, informal descriptions, illuminating examples, and practical exercises. The examples and exercises allow readers to understand algorithms intuitively and from different perspectives. Proof sketches, arguing the correctness of an algorithm or explaining the idea behind fundamental results, are also included. An appendix offers pseudocode descriptions of many algorithms.
"Distributed algorithms are performed by a collection of computers that send messages to each other or by multiple software threads that use the same shared memory. The algorithms presented in the book are for the most part “classics,” selected because they shed light on the algorithmic design of distributed systems or on key issues in distributed computing and concurrent programming."
to:NB  programming  theoretical_computer_science  distributed_systems 
22 days ago
Cultural Evolution | The MIT Press
"Over the past few decades, a growing body of research has emerged from a variety of disciplines to highlight the importance of cultural evolution in understanding human behavior. Wider application of these insights, however, has been hampered by traditional disciplinary boundaries. To remedy this, in this volume leading researchers from theoretical biology, developmental and cognitive psychology, linguistics, anthropology, sociology, religious studies, history, and economics come together to explore the central role of cultural evolution in different aspects of human endeavor.
"The contributors take as their guiding principle the idea that cultural evolution can provide an important integrating function across the various disciplines of the human sciences, as organic evolution does for biology. The benefits of adopting a cultural evolutionary perspective are demonstrated by contributions on social systems, technology, language, and religion. Topics covered include enforcement of norms in human groups, the neuroscience of technology, language diversity, and prosociality and religion. The contributors evaluate current research on cultural evolution and consider its broader theoretical and practical implications, synthesizing past and ongoing work and sketching a roadmap for future cross-disciplinary efforts."
to:NB  books:noted  cultural_evolution 
22 days ago
[1411.5279] What Teachers Should Know about the Bootstrap: Resampling in the Undergraduate Statistics Curriculum
"I have three goals in this article: (1) To show the enormous potential of bootstrapping and permutation tests to help students understand statistical concepts including sampling distributions, standard errors, bias, confidence intervals, null distributions, and P-values. (2) To dig deeper, understand why these methods work and when they don't, things to watch out for, and how to deal with these issues when teaching. (3) To change statistical practice---by comparing these methods to common t tests and intervals, we see how inaccurate the latter are; we confirm this with asymptotics. n >= 30 isn't enough---think n >= 5000. Resampling provides diagnostics, and more accurate alternatives. Sadly, the common bootstrap percentile interval badly under-covers in small samples; there are better alternatives. The tone is informal, with a few stories and jokes."

--- Not sure the slagging on the "basic bootstrap interval" is really justified, but otherwise, really good.
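For concreteness, here is a minimal sketch in Python/numpy (variable names mine) of the two intervals at issue: the percentile interval just reads off quantiles of the bootstrap distribution, while the "basic" interval reflects those quantiles around the observed estimate.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(size=30)          # small, skewed sample
B = 5000
boot_means = np.array([rng.choice(x, size=x.size, replace=True).mean()
                       for _ in range(B)])

# Percentile interval: quantiles of the bootstrap distribution itself
lo_q, hi_q = np.percentile(boot_means, [2.5, 97.5])
percentile_ci = (lo_q, hi_q)

# Basic (reverse-percentile) interval: reflect those quantiles
# around the observed mean
basic_ci = (2 * x.mean() - hi_q, 2 * x.mean() - lo_q)
```

With a skewed population and n = 30, the two intervals disagree noticeably, which is exactly the small-sample regime the abstract is worried about.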
to:NB  statistics  bootstrap  via:civilstat  to:blog 
23 days ago
When economic models are unable to fit the data | VOX, CEPR’s Policy Portal
Shorter: if your model claims to include all the relevant variables and throwing more covariates into your regression improves your fit, you have a problem. (But I would be shocked if they are really doing an adequate job of accounting for specification-search and model-selection issues here.)
track_down_references  economics  model_selection  misspecification  goodness-of-fit  econometrics  statistics  baby_steps  to:blog 
23 days ago
Hall of Mirrors: The Great Depression, the Great Recession, and the Uses-And Misuses-Of History by Barry Eichengreen - Powell's Books
"There have been two global financial crises in the past century: the Great Depression of the 1930s and the Great Recession that began in 2008. Both featured loose credit, precarious real estate and stock market bubbles, suspicious banking practices, an inflexible monetary system, and global imbalances; both had devastating economic consequences. In both cases, people in the prosperous decade preceding the crash believed they were living in a post-volatility economy, one that had tamed the cycle of boom and bust. When the global financial system began to totter in 2008, policymakers were able to draw on the lessons of the Great Depression in order to prevent a repeat, but their response was still inadequate to prevent massive economic turmoil on a global scale.
"In Hall of Mirrors, renowned economist Barry Eichengreen provides the first book-length analysis of the two crises and their aftermaths. Weaving together the narratives of the 30s and recent years, he shows how fear of another Depression greatly informed the policy response after the Lehman Brothers collapse, with both positive and negative results. On the positive side, institutions took the opposite paths that they had during the Depression; government increased spending and cut taxes, and central banks reduced interest rates, flooded the market with liquidity, and coordinated international cooperation. This in large part prevented the bank failures, 25% unemployment rate, and other disasters that characterized the Great Depression. But they all too often hewed too closely and too literally to the lessons of the Depression, seeing it as a mirror rather than focusing on the core differences. Moreover, in their haste to differentiate themselves from their forbears, today's policymakers neglected the constructive but ultimately futile steps that the Federal Reserve took in the 1930s. While the rapidly constructed policies of late 2008 did succeed in staving off catastrophe in the years after, policymakers, institutions, and society as a whole were too eager to get back to normal, even when that meant stunting the recovery via harsh austerity policies and eschewing necessary long-term reforms. The result was a grindingly slow recovery in the US and a devastating recession in Europe."
to:NB  books:noted  economics  economic_history  macroeconomics  great_depression  financial_crisis_of_2007--  via:jbdelong 
23 days ago
[1409.4813] Identification of core-periphery structure in networks
"Many networks can be usefully decomposed into a dense core plus an outlying, loosely-connected periphery. Here we propose an algorithm for performing such a decomposition on empirical network data using methods of statistical inference. Our method fits a generative model of core-periphery structure to observed data using a combination of an expectation--maximization algorithm for calculating the parameters of the model and a belief propagation algorithm for calculating the decomposition itself. We find the method to be efficient, scaling easily to networks with a million or more nodes and we test it on a range of networks, including real-world examples as well as computer-generated benchmarks, for which it successfully identifies known core-periphery structure with low error rate. We also demonstrate that the method is immune from the detectability transition observed in the related community detection problem, which prevents the detection of community structure when that structure is too weak. There is no such transition for core-periphery structure, which is detectable, albeit with some statistical error, no matter how weak it is."
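The paper's actual machinery is EM plus belief propagation; as a much cruder illustration of what "fitting a generative core-periphery model" means, here is a naive alternating-maximization caricature (the toy network and all names are mine, and this is emphatically not the authors' algorithm).

```python
import numpy as np

def fit_core_periphery(A, iters=30):
    """Crude alternating fit of a two-block ("core" = 1, "periphery" = 0)
    Bernoulli model: estimate block edge densities, then greedily relabel
    each node by its Bernoulli log-likelihood.  A caricature of the real
    EM + belief-propagation method."""
    n = A.shape[0]
    deg = A.sum(axis=1)
    z = (deg > np.median(deg)).astype(int)   # init: high-degree nodes -> core
    for _ in range(iters):
        p = np.empty((2, 2))
        for r in range(2):
            for s in range(2):
                mask = np.outer(z == r, z == s)
                np.fill_diagonal(mask, False)
                p[r, s] = np.clip(A[mask].mean() if mask.any() else 0.5,
                                  1e-6, 1 - 1e-6)
        new_z = z.copy()
        for i in range(n):
            a = np.delete(A[i], i)
            zj = np.delete(z, i)
            ll = [(a * np.log(p[r, zj])
                   + (1 - a) * np.log(1 - p[r, zj])).sum() for r in range(2)]
            new_z[i] = int(np.argmax(ll))
        if (new_z == z).all():
            break
        z = new_z
    return z

# Planted example: 10 dense core nodes, 30 sparsely connected periphery nodes
rng = np.random.default_rng(1)
n = 40
P = np.full((n, n), 0.05)
P[:10, :] = P[:, :10] = 0.3
P[:10, :10] = 0.8
A = np.triu((rng.random((n, n)) < P).astype(int), 1)
A = A + A.T
z = fit_core_periphery(A)
```

The real method's soft (BP) updates and careful parameter estimation are what buy the million-node scaling and the absence of a detectability transition.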
to:NB  network_data_analysis  statistics  em_algorithm  kith_and_kin  newman.mark 
25 days ago
Maximum likelihood inference of reticulate evolutionary histories
"Hybridization plays an important role in the evolution of certain groups of organisms, adaptation to their environments, and diversification of their genomes. The evolutionary histories of such groups are reticulate, and methods for reconstructing them are still in their infancy and have limited applicability. We present a maximum likelihood method for inferring reticulate evolutionary histories while accounting simultaneously for incomplete lineage sorting. Additionally, we propose methods for assessing confidence in the amount of reticulation and the topology of the inferred evolutionary history. Our method obtains accurate estimates of reticulate evolutionary histories on simulated datasets. Furthermore, our method provides support for a hypothesis of a reticulate evolutionary history inferred from a set of house mouse (Mus musculus) genomes. As evidence of hybridization in eukaryotic groups accumulates, it is essential to have methods that infer reticulate evolutionary histories. The work we present here allows for such inference and provides a significant step toward putting phylogenetic networks on par with phylogenetic trees as a model of capturing evolutionary relationships."
to:NB  evolutionary_biology  statistics  phylogenetics 
25 days ago
Spatial embedding of structural similarity in the cerebral cortex
"Recent anatomical tracing studies have yielded substantial amounts of data on the areal connectivity underlying distributed processing in cortex, yet the fundamental principles that govern the large-scale organization of cortex remain unknown. Here we show that functional similarity between areas as defined by the pattern of shared inputs or outputs is a key to understanding the areal network of cortex. In particular, we report a systematic relation in the monkey, human, and mouse cortex between the occurrence of connections from one area to another and their similarity distance. This characteristic relation is rooted in the wiring distance dependence of connections in the brain. We introduce a weighted, spatially embedded random network model that robustly gives rise to this structure, as well as many other spatial and topological properties observed in cortex. These include features that were not accounted for in any previous model, such as the wide range of interareal connection weights. Connections in the model emerge from an underlying distribution of spatially embedded axons, thereby integrating the two scales of cortical connectivity—individual axons and interareal pathways—into a common geometric framework. These results provide insights into the origin of large-scale connectivity in cortex and have important implications for theories of cortical organization."
to:NB  neuroscience  networks  network_formation 
25 days ago
Decreased segregation of brain systems across the healthy adult lifespan
"Healthy aging has been associated with decreased specialization in brain function. This characterization has focused largely on describing age-accompanied differences in specialization at the level of neurons and brain areas. We expand this work to describe systems-level differences in specialization in a healthy adult lifespan sample (n = 210; 20–89 y). A graph-theoretic framework is used to guide analysis of functional MRI resting-state data and describe systems-level differences in connectivity of individual brain networks. Young adults’ brain systems exhibit a balance of within- and between-system correlations that is characteristic of segregated and specialized organization. Increasing age is accompanied by decreasing segregation of brain systems. Compared with systems involved in the processing of sensory input and motor output, systems mediating “associative” operations exhibit a distinct pattern of reductions in segregation across the adult lifespan. Of particular importance, the magnitude of association system segregation is predictive of long-term memory function, independent of an individual’s age."
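On my reading, the segregation measure here is (mean within-system correlation minus mean between-system correlation), normalized by the within-system mean; a sketch (consult the paper for the exact definition and preprocessing):

```python
import numpy as np

def system_segregation(R, labels):
    """(mean within-system r - mean between-system r) / (mean within-system r),
    computed over the upper triangle of a correlation matrix R whose nodes
    carry system labels.  My reading of the measure, not a verified port."""
    labels = np.asarray(labels)
    i, j = np.triu_indices_from(R, k=1)
    same = labels[i] == labels[j]
    w = R[i, j][same].mean()       # mean within-system correlation
    b = R[i, j][~same].mean()      # mean between-system correlation
    return (w - b) / w

# Toy check: two "systems", within-r = 0.6, between-r = 0.1
labels = np.array([0, 0, 0, 1, 1, 1])
R = np.where(labels[:, None] == labels[None, :], 0.6, 0.1)
np.fill_diagonal(R, 1.0)
seg = system_segregation(R, labels)    # (0.6 - 0.1) / 0.6
```

Decreasing segregation with age then means the within/between gap shrinks.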
to:NB  neuroscience  functional_connectivity  re:network_differences 
25 days ago
Frontiers of Test Validity Theory: Measurement, Causation, and Meaning (Paperback) - Routledge
"This book examines test validity in the behavioral, social, and educational sciences by exploring three fundamental problems: measurement, causation and meaning. Psychometric and philosophical perspectives receive attention along with unresolved issues. The authors explore how measurement is conceived from both the classical and modern perspectives. The importance of understanding the underlying concepts as well as the practical challenges of test construction and use receive emphasis throughout. The book summarizes the current state of the test validity theory field. Necessary background on test theory and statistics is presented as a conceptual overview where needed.
"Each chapter begins with an overview of key material reviewed in previous chapters, concludes with a list of suggested readings, and features boxes with examples that connect theory to practice. These examples reflect actual situations that occurred in psychology, education, and other disciplines in the US and around the globe, bringing theory to life. Critical thinking questions related to the boxed material engage and challenge readers. A few examples include:
"What is the difference between intelligence and IQ?
"Can people disagree on issues of value but agree on issues of test validity?
"Is it possible to ask the same question in two different languages?
"The first part of the book contrasts theories of measurement as applied to the validity of behavioral science measures. The next part considers causal theories of measurement in relation to alternatives such as behavior domain sampling, and then unpacks the causal approach in terms of alternative theories of causation. The final section explores the meaning and interpretation of test scores as it applies to test validity. Each set of chapters opens with a review of the key theories and literature and concludes with a review of related open questions in test validity theory.
"Researchers, practitioners and policy makers interested in test validity or developing tests appreciate the book's cutting edge review of test validity. The book also serves as a supplement in graduate or advanced undergraduate courses on test validity, psychometrics, testing or measurement taught in psychology, education, sociology, social work, political science, business, criminal justice and other fields. The book does not assume a background in measurement."
to:NB  books:noted  psychometrics  social_measurement  statistics  causality  borsboom.denny 
26 days ago
Empirical Model Discovery and Theory Evaluation: Automatic Selection Methods in Econometrics | The MIT Press
"Economic models of empirical phenomena are developed for a variety of reasons, the most obvious of which is the numerical characterization of available evidence, in a suitably parsimonious form. Another is to test a theory, or evaluate it against the evidence; still another is to forecast future outcomes. Building such models involves a multitude of decisions, and the large number of features that need to be taken into account can overwhelm the researcher. Automatic model selection, which draws on recent advances in computation and search algorithms, can create, and then empirically investigate, a vastly wider range of possibilities than even the greatest expert. In this book, leading econometricians David Hendry and Jurgen Doornik report on their several decades of innovative research on automatic model selection.
"After introducing the principles of empirical model discovery and the role of model selection, Hendry and Doornik outline the stages of developing a viable model of a complicated evolving process. They discuss the discovery stages in detail, considering both the theory of model selection and the performance of several algorithms. They describe extensions to tackling outliers and multiple breaks, leading to the general case of more candidate variables than observations. Finally, they briefly consider selecting models specifically for forecasting."
books:noted  econometrics  statistics  model_selection  model_discovery  in_NB 
26 days ago
The Worst Job I Ever Had | STANTON AND DELIVER
The thing I find implausible about this (very well-told) story is the absence of _live_ insects.
insects  gross  labor  moral_psychology  adolescence  via:unfogged  have_read  to:blog 
26 days ago
Developing Scaffolds in Evolution, Culture, and Cognition | The MIT Press
""Scaffolding" is a concept that is becoming widely used across disciplines. This book investigates common threads in diverse applications of scaffolding, including theoretical biology, cognitive science, social theory, science and technology studies, and human development. Despite its widespread use, the concept of scaffolding is often given short shrift; the contributors to this volume, from a range of disciplines, offer a more fully developed analysis of scaffolding that highlights the role of temporal and temporary resources in development, broadly conceived, across concepts of culture, cognition, and evolution.
"The book emphasizes reproduction, repeated assembly, and entrenchment of heterogeneous relations, parts, and processes as a complement to neo-Darwinism in the developmentalist tradition of conceptualizing evolutionary change. After describing an integration of theoretical perspectives that can accommodate different levels of analysis and connect various methodologies, the book discusses multilevel organization; differences (and reciprocality) between individuals and institutions as units of analysis; and perspectives on development that span brains, careers, corporations, and cultural cycles."
to:NB  developmental_biology  cognitive_development  books:noted 
27 days ago
Götze: On the Rate of Convergence in the Multivariate CLT
"Berry-Esseen theorems are proved in the multidimensional central limit theorem without using Fourier methods. An effective and simple estimate of the error in the CLT for sums and convex sets using Stein's method and induction is derived. Furthermore, the error in the CLT for multivariate functions of independent random elements is estimated extending results of van Zwet and Friedrich to the multivariate case."
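For comparison, the classical univariate Berry-Esseen bound, which results like these extend to convex sets in higher dimensions, reads (with X_1, X_2, ... i.i.d., mean zero, variance sigma^2, S_n their partial sum, and C an absolute constant):

```latex
\sup_{x \in \mathbb{R}}
\left| \Pr\!\left( \frac{S_n}{\sigma\sqrt{n}} \le x \right) - \Phi(x) \right|
\;\le\; \frac{C\,\mathbb{E}|X_1|^3}{\sigma^3 \sqrt{n}}
```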
to:NB  probability  central_limit_theorem 
27 days ago
Memes in Digital Culture | The MIT Press
"In December 2012, the exuberant video “Gangnam Style” became the first YouTube clip to be viewed more than one billion times. Thousands of its viewers responded by creating and posting their own variations of the video--“Mitt Romney Style,” “NASA Johnson Style,” “Egyptian Style,” and many others. “Gangnam Style” (and its attendant parodies, imitations, and derivations) is one of the most famous examples of an Internet meme: a piece of digital content that spreads quickly around the web in various iterations and becomes a shared cultural experience. In this book, Limor Shifman investigates Internet memes and what they tell us about digital culture.
"Shifman discusses a series of well-known Internet memes—including “Leave Britney Alone,” the pepper-spraying cop, LOLCats, Scumbag Steve, and Occupy Wall Street’s “We Are the 99 Percent.” She offers a novel definition of Internet memes: digital content units with common characteristics, created with awareness of each other, and circulated, imitated, and transformed via the Internet by many users. She differentiates memes from virals; analyzes what makes memes and virals successful; describes popular meme genres; discusses memes as new modes of political participation in democratic and nondemocratic regimes; and examines memes as agents of globalization.
"Memes, Shifman argues, encapsulate some of the most fundamental aspects of the Internet in general and of the participatory Web 2.0 culture in particular. Internet memes may be entertaining, but in this book Limor Shifman makes a compelling argument for taking them seriously."
to:NB  books:noted  epidemiology_of_representations  networked_life  computer_networks_as_provinces_of_the_commonwealth_of_letters 
27 days ago
ReCombinatorics: The Algorithmics of Ancestral Recombination Graphs and Explicit Phylogenetic Networks | The MIT Press
"In this book, Dan Gusfield examines combinatorial algorithms to construct genealogical and exact phylogenetic networks, particularly ancestral recombination graphs (ARGs). The algorithms produce networks (or information about networks) that serve as hypotheses about the true genealogical history of observed biological sequences and can be applied to practical biological problems.
"Phylogenetic trees have been the traditional means to represent evolutionary history, but there is a growing realization that networks rather than trees are often needed, most notably for recent human history. This has led to the development of ARGs in population genetics and, more broadly, to phylogenetic networks. ReCombinatorics offers an in-depth, rigorous examination of current research on the combinatorial, graph-theoretic structure of ARGs and explicit phylogenetic networks, and algorithms to reconstruct or deduce information about those networks.
"ReCombinatorics, a groundbreaking contribution to the emerging field of phylogenetic networks, connects and unifies topics in population genetics and phylogenetics that have traditionally been discussed separately and considered to be unrelated. It covers the necessary combinatorial and algorithmic background material; the various biological phenomena; the mathematical, population genetic, and phylogenetic models that capture the essential elements of these phenomena; the combinatorial and algorithmic problems that derive from these models; the theoretical results that have been obtained; related software that has been developed; and some empirical testing of the software on simulated and real biological data."
to:NB  books:noted  evolutionary_biology  historical_genetics  bioinformatics  cultural_evolution 
27 days ago