Hirschler, Cover: A Finite Memory Test of the Irrationality of the Parameter of a Coin
"Let X1,X2,... be a Bernoulli sequence with parameter p. An algorithm ... is found such that [a function of the data is 0] all but a finite number of times with probability one if p is rational, and [that same function is one] all but a finite number of times with probability one if p is irrational (and not in a given null set of irrationals). Thus, an 8-state memory with a time-varying algorithm makes only a finite number of mistakes with probability one on determining the rationality of the parameter of a coin. Thus, determining the rationality of the Bernoulli parameter p does not depend on infinite memory of the data."

--- I would not have thought this was possible.
to:NB  statistics  cover.thomas  hypothesis_testing  automata_theory 
[1602.00721] Concentration of measure without independence: a unified approach via the martingale method
"The concentration of measure phenomenon may be summarized as follows: a function of many weakly dependent random variables that is not too sensitive to any of its individual arguments will tend to take values very close to its expectation. This phenomenon is most completely understood when the arguments are mutually independent random variables, and there exist several powerful complementary methods for proving concentration inequalities, such as the martingale method, the entropy method, and the method of transportation inequalities. The setting of dependent arguments is much less well understood. This chapter focuses on the martingale method for deriving concentration inequalities without independence assumptions. In particular, we use the machinery of so-called Wasserstein matrices to show that the Azuma-Hoeffding concentration inequality for martingales with almost surely bounded differences, when applied in a sufficiently abstract setting, is powerful enough to recover and sharpen several known concentration results for nonproduct measures. Wasserstein matrices provide a natural formalism for capturing the interplay between the metric and the probabilistic structures, which is fundamental to the concentration phenomenon."
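--- A minimal numerical illustration (my own sketch, not from the chapter): a ±1 random walk is a martingale with increments bounded by 1, so the Azuma-Hoeffding inequality the abstract mentions bounds its upper tail by exp(-t²/2n), and a simulation should sit below that bound.

```python
import math
import random

random.seed(42)
n, t, trials = 100, 30, 20000

# A +/-1 random walk is a martingale whose increments are bounded by 1,
# so Azuma-Hoeffding gives P(S_n >= t) <= exp(-t^2 / (2n)).
hits = sum(
    1 for _ in range(trials)
    if sum(random.choice((-1, 1)) for _ in range(n)) >= t
)
empirical = hits / trials
azuma_bound = math.exp(-t ** 2 / (2 * n))
# the empirical tail frequency falls below the (conservative) bound
```

(The bound is loose here, as Azuma-Hoeffding usually is; the point of the chapter is that the same martingale machinery extends well beyond independent increments.)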
to:NB  concentration_of_measure  martingales  stochastic_processes  kontorovich.aryeh  raginsky.maxim  kith_and_kin 
Freedman: On Tail Probabilities for Martingales
"Watch a martingale with uniformly bounded increments until it first crosses the horizontal line of height $a$. The sum of the conditional variances of the increments given the past, up to the crossing, is an intrinsic measure of the crossing time. Simple and fairly sharp upper and lower bounds are given for the Laplace transform of this crossing time, which show that the distribution is virtually the same as that for the crossing time of Brownian motion, even in the tail. The argument can be adapted to extend inequalities of Bernstein and Kolmogorov to the dependent case, proving the law of the iterated logarithm for martingales. The argument can also be adapted to prove Levy's central limit theorem for martingales. The results can be extended to martingales whose increments satisfy a growth condition."
to:NB  deviation_inequalities  martingales  probability  re:AoS_project 
[1602.00355] A Spectral Series Approach to High-Dimensional Nonparametric Regression
"A key question in modern statistics is how to make fast and reliable inferences for complex, high-dimensional data. While there has been much interest in sparse techniques, current methods do not generalize well to data with nonlinear structure. In this work, we present an orthogonal series estimator for predictors that are complex aggregate objects, such as natural images, galaxy spectra, trajectories, and movies. Our series approach ties together ideas from kernel machine learning, and Fourier methods. We expand the unknown regression on the data in terms of the eigenfunctions of a kernel-based operator, and we take advantage of orthogonality of the basis with respect to the underlying data distribution, P, to speed up computations and tuning of parameters. If the kernel is appropriately chosen, then the eigenfunctions adapt to the intrinsic geometry and dimension of the data. We provide theoretical guarantees for a radial kernel with varying bandwidth, and we relate smoothness of the regression function with respect to P to sparsity in the eigenbasis. Finally, using simulated and real-world data, we systematically compare the performance of the spectral series approach with classical kernel smoothing, k-nearest neighbors regression, kernel ridge regression, and state-of-the-art manifold and local regression methods."
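--- The basic recipe is easy to sketch (my own toy version with an arbitrary RBF kernel and bandwidth, not the authors' estimator): build the kernel matrix on the data, take its top eigenvectors as an empirical orthogonal basis adapted to P, and regress the response on that basis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-1, 1, n)
y = np.sin(3 * x) + 0.1 * rng.standard_normal(n)

# Empirical kernel operator: its eigenvectors approximate the kernel's
# eigenfunctions with respect to the data distribution P.
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * 0.3 ** 2)) / n
vals, vecs = np.linalg.eigh(K)

J = 10                       # truncation level (a tuning parameter)
Phi = vecs[:, -J:]           # top-J eigenvectors; columns are orthonormal
beta = Phi.T @ y             # series coefficients, cheap because of orthogonality
yhat = Phi @ beta            # fitted regression in the spectral basis
mse = float(np.mean((yhat - y) ** 2))
```

The orthogonality of the basis is what makes the coefficient step a single matrix multiply, which is the computational speed-up the abstract alludes to.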
to:NB  have_read  statistics  regression  nonparametrics  sparsity  kernel_methods  kith_and_kin  lee.ann 
Bottlenecks, A New Theory of Equal Opportunity // Reviews // Notre Dame Philosophical Reviews // University of Notre Dame
This seems intriguing, but a lot of work would have to be done by way of distinguishing capacities which are worth developing and those which are not. E.g., some societies would offer many more opportunities to develop talents for theft, fraud, bloody vengeance and/or boot-licking ingratiation with bosses even than our own. Eliminating sanitation and vaccinations would give us all the opportunity to develop our capacities to deal with the early and random death of friends and family. Etc., etc.
to:NB  books:noted  book_reviews  inequality  equality_of_opportunity  political_philosophy  institutions 
Negishi welfare weights in integrated assessment models: the mathematics of global inequality - Springer
"In a global climate policy debate fraught with differing understandings of right and wrong, the importance of making transparent the ethical assumptions used in climate-economics models cannot be overestimated. Negishi weighting is a key ethical assumption in climate-economics models, but it is virtually unknown to most model users. Negishi weights freeze the current distribution of income between world regions; without this constraint, IAMs that maximize global welfare would recommend an equalization of income across regions as part of their policy advice. With Negishi weights in place, these models instead recommend a course of action that would be optimal only in a world in which global income redistribution cannot and will not take place. This article describes the Negishi procedure and its origin in theoretical and applied welfare economics, and discusses the policy implications of the presentation and use of Negishi-weighted model results, as well as some alternatives to Negishi weighting in climate-economics models."

Ungated: http://sei-us.org/Publications_PDF/SEI-Stanton2010_ClimaticChange_Negishi.pdf
to:NB  economics  economic_policy  cost-benefit_analysis  moral_philosophy  inequality  climate_change  have_read  via:jbdelong 
Credential Privilege or Cumulative Advantage? Prestige, Productivity, and Placement in the Academic Sociology Job Market by Spencer Headworth, Jeremy Freese :: SSRN
"Using data on the population of US sociology doctorates over a five-year period, we examine different predictors of placement in research-oriented, tenure-track academic sociology jobs. More completely than prior studies, we document the enormous relationship between PhD institution and job placement that has, in part, prompted a popular metaphor that academic job allocation processes are like a caste system. Yet we also find comparable relationships between PhD program and both graduate student publishing and awards. Overall, we find results more consistent with PhD prestige operating indirectly through mediating achievements or as a quality signal than as a “pure prestige” effect. We suggest sociologists think of stratification in their profession as not requiring exceptionalist historical metaphors, but rather as involving the same ordinary but powerful processes of cumulative advantage that pervade contemporary life."
to:NB  sociology  sociology_of_science  inequality  cumulative_advantage  freese.jeremy  academia  science_as_a_social_process 
2 days ago
AEAweb: JEP (30,1) p. 53 - The New Role for the World Bank
"The World Bank was founded to address what we would today call imperfections in international capital markets. Its founders thought that countries would borrow from the Bank temporarily until they grew enough to borrow commercially. Some critiques and analyses of the Bank are based on the assumption that this continues to be its role. For example, some argue that the growth of private capital flows to the developing world has rendered the Bank irrelevant. However, we will argue that modern analyses should proceed from the premise that the World Bank's central goal is and should be to reduce extreme poverty, and that addressing failures in global capital markets is now of subsidiary importance. In this paper, we discuss what the Bank does: how it spends money, how it influences policy, and how it presents its mission. We argue that the role of the Bank is now best understood as facilitating international agreements to reduce poverty, and we examine implications of this perspective."
to:NB  world_bank  development_economics  political_economy 
4 days ago
AEAweb: JEP (30,1) p. 77 - The World Bank: Why It Is Still Needed and Why It Still Disappoints
"Does the World Bank still have an important role to play? How might it fulfill that role? The paper begins with a brief account of how the Bank works. It then argues that, while the Bank is no longer the primary conduit for capital from high-income to low-income countries, it still has an important role in supplying the public good of development knowledge—a role that is no less pressing today than ever. This argument is not a new one. In 1996, the Bank's President at the time, James D. Wolfensohn, laid out a vision for the "knowledge bank," an implicit counterpoint to what can be called the "lending bank." The paper argues that the past rhetoric of the "knowledge bank" has not matched the reality. An institution such as the World Bank—explicitly committed to global poverty reduction—should be more heavily invested in knowing what is needed in its client countries as well as in international coordination. It should be consistently arguing for well-informed pro-poor policies in its member countries, tailored to the needs of each country, even when such policies are unpopular with the powers-that-be. It should also be using its financial weight, combined with its analytic and convening powers, to support global public goods. In all this, there is a continuing role for lending, but it must be driven by knowledge—both in terms of what gets done and how it is geared to learning. The paper argues that the Bank disappoints in these tasks but that it could perform better."
to:NB  world_bank  development_economics  economics  political_economy 
4 days ago
AEAweb: JEP (30,1) p. 185 - Power Laws in Economics: An Introduction
"Many of the insights of economics seem to be qualitative, with many fewer reliable quantitative laws. However a series of power laws in economics do count as true and nontrivial quantitative laws—and they are not only established empirically, but also understood theoretically. I will start by providing several illustrations of empirical power laws having to do with patterns involving cities, firms, and the stock market. I summarize some of the theoretical explanations that have been proposed. I suggest that power laws help us explain many economic phenomena, including aggregate economic fluctuations. I hope to clarify why power laws are so special, and to demonstrate their utility. In conclusion, I list some power-law-related economic enigmas that demand further exploration."
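--- For concreteness, here is the standard Hill estimator recovering the tail exponent from Pareto draws (my own toy check, not from the paper):

```python
import math
import random

random.seed(7)
alpha = 2.0                    # true tail exponent: P(X > x) = x^(-alpha)
n, k = 5000, 500

# Inverse-CDF sampling from a Pareto(alpha) distribution on [1, infinity)
xs = sorted((random.random() ** (-1.0 / alpha) for _ in range(n)), reverse=True)

# Hill estimator built from the k largest order statistics
hill = k / sum(math.log(xs[i] / xs[k]) for i in range(k))
# hill should land close to the true alpha = 2
```

(Choosing k is the usual headache; the estimator is only consistent when the tail is genuinely power-law beyond the chosen threshold, which is exactly the kind of empirical claim the paper is about.)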
to:NB  heavy_tails  economics  to_be_shot_after_a_fair_trial  gabaix.xavier 
4 days ago
Needham, A.: Power Lines: Phoenix and the Making of the Modern Southwest. (eBook and Hardcover)
"In 1940, Phoenix was a small, agricultural city of sixty-five thousand, and the Navajo Reservation was an open landscape of scattered sheepherders. Forty years later, Phoenix had blossomed into a metropolis of 1.5 million people and the territory of the Navajo Nation was home to two of the largest strip mines in the world. Five coal-burning power plants surrounded the reservation, generating electricity for export to Phoenix, Los Angeles, and other cities. Exploring the postwar developments of these two very different landscapes, Power Lines tells the story of the far-reaching environmental and social inequalities of metropolitan growth, and the roots of the contemporary coal-fueled climate change crisis.
"Andrew Needham explains how inexpensive electricity became a requirement for modern life in Phoenix—driving assembly lines and cooling the oppressive heat. Navajo officials initially hoped energy development would improve their lands too, but as ash piles marked their landscape, air pollution filled the skies, and almost half of Navajo households remained without electricity, many Navajos came to view power lines as a sign of their subordination in the Southwest. Drawing together urban, environmental, and American Indian history, Needham demonstrates how power lines created unequal connections between distant landscapes and how environmental changes associated with suburbanization reached far beyond the metropolitan frontier. Needham also offers a new account of postwar inequality, arguing that residents of the metropolitan periphery suffered similar patterns of marginalization as those faced in America’s inner cities.
"Telling how coal from Indian lands became the fuel of modernity in the Southwest, Power Lines explores the dramatic effects that this energy system has had on the people and environment of the region."
to:NB  books:noted  american_history  american_southwest  native_american_history  electricity  20th_century_history  pollution 
9 days ago
The Economist's Tale: A Consultant Encounters Hunger and the World Bank | Zed Books
"What really happens when the World Bank imposes its policies on a country? This is an insider's view of one aid-made crisis. Peter Griffiths was at the interface between government and the Bank. In this ruthlessly honest, day by day account of a mission he undertook in Sierra Leone, he uses his diary to tell the story of how the World Bank, obsessed with the free market, imposed a secret agreement on the government, banning all government food imports or subsidies. The collapsing economy meant that the private sector would not import. Famine loomed. No ministry, no state marketing organization, no aid organization could reverse the agreement. It had to be a top-level government decision, whether Sierra Leone could afford to annoy minor World Bank officials. This is a rare and important portrait of the aid world which insiders will recognize, but of which the general public seldom get a glimpse."
in_NB  books:noted  economics  development_economics  political_economy  world_bank  via:crooked_timber 
9 days ago
AEAweb: AER (95,3) p. 546 - The Rise of Europe: Atlantic Trade, Institutional Change, and Economic Growth
"The rise of Western Europe after 1500 is due largely to growth in countries with access to the Atlantic Ocean and with substantial trade with the New World, Africa, and Asia via the Atlantic. This trade and the associated colonialism affected Europe not only directly, but also indirectly by inducing institutional change. Where "initial" political institutions (those established before 1500) placed significant checks on the monarchy, the growth of Atlantic trade strengthened merchant groups by constraining the power of the monarchy, and helped merchants obtain changes in institutions to protect property rights. These changes were central to subsequent economic growth."
to:NB  economics  economic_history  institutions  economic_growth  to_teach:undergrad-ADA  via:jbdelong  have_read 
10 days ago
Kutz, C.: On War and Democracy (eBook and Hardcover).
"On War and Democracy provides a richly nuanced examination of the moral justifications democracies often invoke to wage war. In this compelling and provocative book, Christopher Kutz argues that democratic principles can be both fertile and toxic ground for the project of limiting war’s violence. Only by learning to view war as limited by our democratic values—rather than as a tool for promoting them—can we hope to arrest the slide toward the borderless, seemingly endless democratic "holy wars" and campaigns of remote killings we are witnessing today, and to stop permanently the use of torture and secret law.
"Kutz shows how our democratic values, understood incautiously and incorrectly, can actually undermine the goal of limiting war. He helps us better understand why we are tempted to believe that collective violence in the name of politics can be legitimate when individual violence is not. In doing so, he offers a bold new account of democratic agency that acknowledges the need for national defense and the promotion of liberty abroad while limiting the temptations of military intervention. Kutz demonstrates why we must address concerns about the means of waging war—including remote war and surveillance—and why we must create institutions to safeguard some nondemocratic values, such as dignity and martial honor, from the threat of democratic politics."
to:NB  books:noted  war  democracy  political_philosophy  the_continuing_crises  to_be_shot_after_a_fair_trial 
12 days ago
Cities, Business, and the Politics of Urban Violence in Latin America | Eduardo Moncada
"This book analyzes and explains the ways in which major developing world cities respond to the challenge of urban violence. The study shows how the political projects that cities launch to confront urban violence are shaped by the interaction between urban political economies and patterns of armed territorial control. It introduces business as a pivotal actor in the politics of urban violence, and argues that how business is organized within cities and its linkages to local governments impacts whether or not business supports or subverts state efforts to stem and prevent urban violence. A focus on city mayors finds that the degree to which politicians rely upon clientelism to secure and maintain power influences whether they favor responses to violence that perpetuate or weaken local political exclusion. The book builds a new typology of patterns of armed territorial control within cities, and shows that each poses unique challenges and opportunities for confronting urban violence. The study develops sub-national comparative analyses of puzzling variation in the institutional outcomes of the politics of urban violence across Colombia's three principal cities—Medellin, Cali, and Bogota—and over time within each. The book's main findings contribute to research on violence, crime, citizen security, urban development, and comparative political economy. The analysis demonstrates that the politics of urban violence is a powerful new lens on the broader question of who governs in major developing world cities."
to:NB  books:noted  political_economy  cities  violence  crime  colombia 
13 days ago
Convergence of One-parameter Operator Semigroups | Abstract Analysis | Cambridge University Press
"This book presents a detailed and contemporary account of the classical theory of convergence of semigroups. The author demonstrates the far-reaching applications of this theory using real examples from various branches of pure and applied mathematics, with a particular emphasis on mathematical biology. These examples also serve as short, non-technical introductions to biological concepts. The book may serve as a useful reference, containing a significant number of new results ranging from the analysis of fish populations to signalling pathways in living cells."
in_NB  books:noted  mathematics  analysis  stochastic_processes  markov_models  biology  re:almost_none 
13 days ago
Mathematical Foundations of Infinite-Dimensional Statistical Models | Statistical Theory and Methods | Cambridge University Press
"In nonparametric and high-dimensional statistical models, the classical Gauss-Fisher-Le Cam theory of the optimality of maximum likelihood estimators and Bayesian posterior inference does not apply, and new foundations and ideas have been developed in the past several decades. This book gives a coherent account of the statistical theory in infinite-dimensional parameter spaces. The mathematical foundations include self-contained 'mini-courses' on the theory of Gaussian and empirical processes, on approximation and wavelet theory, and on the basic theory of function spaces. The theory of statistical inference in such models – hypothesis testing, estimation and confidence sets – is then presented within the minimax paradigm of decision theory. This includes the basic theory of convolution kernel and projection estimation, but also Bayesian nonparametrics and nonparametric maximum likelihood estimation. In a final chapter the theory of adaptive inference in nonparametric models is developed, including Lepski's method, wavelet thresholding, and adaptive inference for self-similar functions."
in_NB  books:noted  statistics  estimation  nonparametrics 
13 days ago
On Algorithmic Communism - The Los Angeles Review of Books
I will be... very interested... in their analysis of the computational complexity of economic planning.
books:noted  book_reviews  progressive_forces  to_be_shot_after_a_fair_trial 
19 days ago
Taking Text and Structure Really Seriously
An argument that, taking the text of the Constitution literally, no one who wasn't a citizen when it was adopted is eligible to become president.
have_read  law  ha_ha_only_serious  us_politics  balkin.jack_m.  via:kjhealy 
25 days ago
Brownian Motion as a Limit to Physical Measuring Processes: A Chapter in the History of Noise from the Physicists’ Point of View
"In this paper, we examine the history of the idea that noise presents a fundamental limit to physical measuring processes. This idea had its origins in research aimed at improving the accuracy of instruments for electrical measurements. Out of these endeavors, the Swedish physicist Gustaf A. Ising formulated a general conclusion concerning the nature of physical measurements, namely that there is a definite limit to the ultimate sensitivity of measuring instruments beyond which we cannot advance, and that this limit is determined by Brownian motion. Ising’s conclusion agreed with experiments and received widespread recognition, but his way of modeling the system was contested by his contemporaries. With the more embracing notion of noise that developed during and after World War II, Ising’s conclusion was reinterpreted as showing that noise puts a limit on physical measurement processes. Hence, physicists in particular saw the work as an indication that noise is of practical relevance for their enterprise."
to:NB  stochastic_processes  physics_of_information  measurement  statistical_mechanics  history_of_physics 
28 days ago
Two Mathematical Approaches to Random Fluctuations
"Physicists and mathematicians in the early twentieth century had established a research program on various random fluctuations. Historical reviews have portrayed this development as a linear progress toward a unified conceptual framework. In this paper, I argue that two approaches were at work. One operated in the “time domain,” as it aimed to formulate the diffusion-type equation for the probability density function and its solutions. The other operated in the “frequency domain,” as it focused on the spectral analysis of the fluctuation. The time-domain analysis was marshaled by statistical physicists, while the frequency-domain analysis was promoted by engineering researchers."

--- Of course, the two are equivalent...
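--- Equivalent in the sense of the Wiener-Khinchin theorem: the autocovariance (time domain) and the power spectrum (frequency domain) are a Fourier pair. In the finite, circular form this is an exact identity, easy to check numerically (my own check, nothing from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(256)

power = np.abs(np.fft.fft(x)) ** 2     # frequency-domain description
acov = np.fft.ifft(power).real         # circular autocovariance (time domain)

# Wiener-Khinchin, circular form: the FFT of the autocovariance returns
# the power spectrum exactly (up to floating-point error). Note also that
# acov[0] equals sum(x**2), i.e. Parseval's identity.
assert np.allclose(np.fft.fft(acov).real, power)
```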
to:NB  stochastic_processes  history_of_science  history_of_physics  fourier_analysis 
28 days ago
Radar, Modems, and Air Defense Systems: Noise as a Data Communication Problem in the 1950s
"The modem was created in the context of a US strategic automatic air defense network to transmit data from radar stations over large distances over the existing telephone system. A significant early challenge was how to minimize noise, which was conceptualized in terms of echo effects, impulse noise, and phase distortion. The approaches to solving the noise problem varied with the techniques of modulation and demodulation used by early modems. The problem of minimizing noise was crucial to the development of the modem, and the focus the military placed on automatic air defense spawned decades of work into the further refinement of modem technology."
to:NB  history_of_technology  computer_networks  signal_processing  cold_war 
28 days ago
The Physics of Forgetting: Thermodynamics of Information at IBM 1959–1982
"The origin and history of Landauer’s principle is traced through the development of the thermodynamics of computation at IBM from 1959 to 1982. This development was characterized by multiple conceptual shifts: memory came to be seen not as information storage, but as delayed information transmission; information itself was seen not as a disembodied logical entity, but as participating in the physical world; and logical irreversibility was connected with physical, thermodynamic, irreversibility. These conceptual shifts were characterized by an ambivalence opposing strong metaphysical claims to practical considerations. Three sorts of practical considerations are discussed. First, these conceptual shifts engaged materials central to IBM’s business practice. Second, arguments for metaphysical certainties were made with reference to the practical functioning of typical computers. Third, arguments for metaphysical certainties were made in the context of establishing the thermodynamics of information as a sub-discipline of physics."
to:NB  physics_of_information  thermodynamics  information_theory  landauers_principle  history_of_physics  statistical_mechanics 
28 days ago
Inference in finite state space non parametric Hidden Markov Models and applications - Springer
"Hidden Markov models (HMMs) are intensively used in various fields to model and classify data observed along a line (e.g. time). The fit of such models strongly relies on the choice of emission distributions that are most often chosen among some parametric family. In this paper, we prove that finite state space non parametric HMMs are identifiable as soon as the transition matrix of the latent Markov chain has full rank and the emission probability distributions are linearly independent. This general result allows the use of semi- or non-parametric emission distributions. Based on this result we present a series of classification problems that can be tackled out of the strict parametric framework. We derive the corresponding inference algorithms. We also illustrate their use on few biological examples, showing that they may improve the classification performances."
to:NB  markov_models  state-space_models  statistics  nonparametrics  time_series 
29 days ago
Does data splitting improve prediction? - Springer
"Data splitting divides data into two parts. One part is reserved for model selection. In some applications, the second part is used for model validation but we use this part for estimating the parameters of the chosen model. We focus on the problem of constructing reliable predictive distributions for future observed values. We judge the predictive performance using log scoring. We compare the full data strategy with the data splitting strategy for prediction. We show how the full data score can be decomposed into model selection, parameter estimation and data reuse costs. Data splitting is preferred when data reuse costs are high. We investigate the relative performance of the strategies in four simulation scenarios. We introduce a hybrid estimator that uses one part for model selection but both parts for estimation. We argue that a split data analysis is preferred to a full data analysis for prediction with some exceptions."

--- Ungated: http://arxiv.org/abs/1301.2983
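--- The split strategy itself is simple to sketch (my own toy version, selecting polynomial degree by AIC rather than the authors' log-scoring setup): part one picks the model, part two estimates its parameters, so the selection step never touches the data used for estimation.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
x = rng.uniform(-1, 1, n)
y = 1 + 2 * x + 0.5 * rng.standard_normal(n)   # truth is a degree-1 polynomial

xa, ya = x[:n // 2], y[:n // 2]   # part 1: model selection only
xb, yb = x[n // 2:], y[n // 2:]   # part 2: parameter estimation only

def aic(deg):
    # Gaussian AIC for a degree-deg polynomial fit on the selection half
    resid = ya - np.polyval(np.polyfit(xa, ya, deg), xa)
    return len(ya) * np.log(np.mean(resid ** 2)) + 2 * (deg + 1)

chosen = min(range(1, 8), key=aic)
coef = np.polyfit(xb, yb, chosen)  # refit on data the selection step never saw
```

The "data reuse cost" of the paper is what the full-data strategy pays for letting the same observations both choose the model and estimate it; splitting eliminates it at the price of smaller samples for each task.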
in_NB  statistics  regression  prediction  model_selection  faraway.j.j.  re:ADAfaEPoV  to_teach:undergrad-ADA  have_read  to_teach:mreg 
29 days ago
[1510.07389] The Human Kernel
"Bayesian nonparametric models, such as Gaussian processes, provide a compelling framework for automatic statistical modelling: these models have a high degree of flexibility, and automatically calibrated complexity. However, automating human expertise remains elusive; for example, Gaussian processes with standard kernels struggle on function extrapolation problems that are trivial for human learners. In this paper, we create function extrapolation problems and acquire human responses, and then design a kernel learning framework to reverse engineer the inductive biases of human learners across a set of behavioral experiments. We use the learned kernels to gain psychological insights and to extrapolate in human-like ways that go beyond traditional stationary and polynomial kernels. Finally, we investigate Occam's razor in human and Gaussian process based function learning."
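--- The extrapolation failure they start from is easy to reproduce (my own sketch, a vanilla GP predictive mean with an arbitrary RBF bandwidth and noise level): far outside the data, a stationary-kernel GP reverts to its prior mean of zero, no matter how obviously periodic the training function is to a human.

```python
import numpy as np

xtr = np.linspace(0.0, 1.0, 20)
ytr = np.sin(2 * np.pi * xtr)      # a periodic pattern humans extrapolate easily
xte = np.array([2.0, 3.0])         # far outside the training range

def rbf(a, b, ell=0.2):
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ell ** 2))

# Standard GP regression predictive mean, with noise variance 0.01
alpha = np.linalg.solve(rbf(xtr, xtr) + 0.01 * np.eye(len(xtr)), ytr)
mean = rbf(xte, xtr) @ alpha
# the stationary RBF prior forgets the pattern: predictions collapse to ~0
```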
to:NB  kernel_methods  nonparametrics  regression  statistics  via:arsyed  cognitive_science 
4 weeks ago
Quantifying Life: A Symbiosis of Computation, Mathematics, and Biology, Kondrashov
"Since the time of Isaac Newton, physicists have used mathematics to describe the behavior of matter of all sizes, from subatomic particles to galaxies. In the past three decades, as advances in molecular biology have produced an avalanche of data, computational and mathematical techniques have also become necessary tools in the arsenal of biologists. But while quantitative approaches are now providing fundamental insights into biological systems, the college curriculum for biologists has not caught up, and most biology majors are never exposed to the computational and probabilistic mathematical approaches that dominate in biological research.
"With Quantifying Life, Dmitry A. Kondrashov offers an accessible introduction to the breadth of mathematical modeling used in biology today. Assuming only a foundation in high school mathematics, Quantifying Life takes an innovative computational approach to developing mathematical skills and intuition. Through lessons illustrated with copious examples, mathematical and programming exercises, literature discussion questions, and computational projects of various degrees of difficulty, students build and analyze models based on current research papers and learn to implement them in the R programming language. This interplay of mathematical ideas, systematically developed programming skills, and a broad selection of biological research topics makes Quantifying Life an invaluable guide for seasoned life scientists and the next generation of biologists alike."

--- Mineable for examples?
books:noted  biology  programming  modeling  to_teach:statcomp  to_teach:complexity-and-inference 
4 weeks ago
Coevolution of Life on Hosts: Integrating Ecology and History, Clayton, Bush, Johnson
"For most, the mere mention of lice forces an immediate hand to the head and recollection of childhood experiences with nits, medicated shampoos, and traumatic haircuts. But for a certain breed of biologist, lice make for fascinating scientific fodder, especially enlightening in the study of coevolution. In this book, three leading experts on host-parasite relationships demonstrate how the stunning coevolution that occurs between such species in microevolutionary, or ecological, time generates clear footprints in macroevolutionary, or historical, time. By integrating these scales, Coevolution of Life on Hosts offers a comprehensive understanding of the influence of coevolution on the diversity of all life.
"Following an introduction to coevolutionary concepts, the authors combine experimental and comparative host-parasite approaches for testing coevolutionary hypotheses to explore the influence of ecological interactions and coadaptation on patterns of diversification and codiversification among interacting species. Ectoparasites—a diverse assemblage of organisms that ranges from herbivorous insects on plants, to monogenean flatworms on fish, and feather lice on birds—are powerful models for the study of coevolution because they are easy to observe, mark, and count. As lice on birds and mammals are permanent parasites that spend their entire lifecycles on the bodies of their hosts, they are ideally suited to generating a synthetic overview of coevolution—and, thereby, offer an exciting framework for integrating the concepts of coadaptation and codiversification."
in_NB  books:noted  evolutionary_biology 
4 weeks ago
High-Stakes Schooling: What We Can Learn from Japan's Experiences with Testing, Accountability, and Education Reform, Bjork
"If there is one thing that describes the trajectory of American education, it is this: more high-stakes testing. In the United States, the debates surrounding this trajectory can be so fierce that it feels like we are in uncharted waters. As Christopher Bjork reminds us in this study, however, we are not the first to make testing so central to education: Japan has been doing it for decades. Drawing on Japan’s experiences with testing, overtesting, and recent reforms to relax educational pressures, he sheds light on the best path forward for US schools.
"Bjork asks a variety of important questions related to testing and reform: Does testing overburden students? Does it impede innovation and encourage conformity? Can a system anchored by examination be reshaped to nurture creativity and curiosity? How should any reforms be implemented by teachers? Each chapter explores questions like these with careful attention to the actual effects policies have had on schools in Japan and other Asian settings, and each draws direct parallels to issues that US schools currently face. Offering a wake-up call for American education, Bjork ultimately cautions that the accountability-driven practice of standardized testing might very well exacerbate the precise problems it is trying to solve. "
in_NB  books:noted  education  standardized_testing  japan 
4 weeks ago
Semantic Properties of Diagrams and Their Cognitive Potentials, Shimojima
"Why are diagrams sometimes so useful, facilitating our understanding and thinking, while at other times they can be unhelpful and even misleading? Drawing on a comprehensive survey of modern research in philosophy, logic, artificial intelligence, cognitive psychology, and graphic design, Semantic Properties of Diagrams and Their Cognitive Potentials reveals the systematic reasons for this dichotomy, showing that the cognitive functions of diagrams are rooted in the characteristic ways they carry information. In analyzing the logical mechanisms behind the relative efficacy of diagrammatic representation, Atsushi Shimojima provides deep insight into the crucial question: What makes a diagram a diagram?"
books:noted  visual_display_of_quantitative_information  cognition  diagrams 
4 weeks ago
[1512.07942] Multi-Level Cause-Effect Systems
"We present a domain-general account of causation that applies to settings in which macro-level causal relations between two systems are of interest, but the relevant causal features are poorly understood and have to be aggregated from vast arrays of micro-measurements. Our approach generalizes that of Chalupka et al. (2015) to the setting in which the macro-level effect is not specified. We formalize the connection between micro- and macro-variables in such situations and provide a coherent framework describing causal relations at multiple levels of analysis. We present an algorithm that discovers macro-variable causes and effects from micro-level measurements obtained from an experiment. We further show how to design experiments to discover macro-variables from observational micro-variable data. Finally, we show that under specific conditions, one can identify multiple levels of causal structure. Throughout the article, we use a simulated neuroscience multi-unit recording experiment to illustrate the ideas and the algorithms."
to:NB  to_read  causality  causal_inference  macro_from_micro  eberhardt.frederick  kith_and_kin  re:what_is_a_macrostate 
4 weeks ago
Ferreirós, J.: Mathematical Knowledge and the Interplay of Practices (eBook and Hardcover).
"This book presents a new approach to the epistemology of mathematics by viewing mathematics as a human activity whose knowledge is intimately linked with practice. Charting an exciting new direction in the philosophy of mathematics, José Ferreirós uses the crucial idea of a continuum to provide an account of the development of mathematical knowledge that reflects the actual experience of doing math and makes sense of the perceived objectivity of mathematical results.
"Describing a historically oriented, agent-based philosophy of mathematics, Ferreirós shows how the mathematical tradition evolved from Euclidean geometry to the real numbers and set-theoretic structures. He argues for the need to take into account a whole web of mathematical and other practices that are learned and linked by agents, and whose interplay acts as a constraint. Ferreirós demonstrates how advanced mathematics, far from being a priori, is based on hypotheses, in contrast to elementary math, which has strong cognitive and practical roots and therefore enjoys certainty.
"Offering a wealth of philosophical and historical insights, Mathematical Knowledge and the Interplay of Practices challenges us to rethink some of our most basic assumptions about mathematics, its objectivity, and its relationship to culture and science."

--- *ahem* P. Kitcher, _The Nature of Mathematical Knowledge_ 1983 *ahem*
to:NB  books:noted  mathematics  philosophy_of_science 
4 weeks ago
Phys. Rev. Lett. 115, 268501 (2015) - Teleconnection Paths via Climate Network Direct Link Detection
"Teleconnections describe remote connections (typically thousands of kilometers) of the climate system. These are of great importance in climate dynamics as they reflect the transportation of energy and climate change on global scales (like the El Niño phenomenon). Yet, the path of influence propagation between such remote regions, and weighting associated with different paths, are only partially known. Here we propose a systematic climate network approach to find and quantify the optimal paths between remotely distant interacting locations. Specifically, we separate the correlations between two grid points into direct and indirect components, where the optimal path is found based on a minimal total cost function of the direct links. We demonstrate our method using near surface air temperature reanalysis data, on identifying cross-latitude teleconnections and their corresponding optimal paths. The proposed method may be used to quantify and improve our understanding regarding the emergence of climate patterns on global scales."

--- I am going to be pleasantly astonished if this amounts to more than the graph lasso, or even if there is any acknowledgment of prior art at all.
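(For concreteness, the graph lasso alluded to here is itself a method for separating direct from indirect correlations. A minimal sketch of the core idea, on made-up chain-structured Gaussian data --- nothing here is taken from the paper: pairs that are only indirectly correlated have near-zero entries in the inverse covariance (precision) matrix, and the graphical lasso, e.g. scikit-learn's `GraphicalLassoCV`, adds an l1 penalty to make those zeros exact.)

```python
# Illustrative toy example, not the paper's method: direct vs. indirect
# correlation via the precision matrix, the idea behind the graph(ical)
# lasso. The data are a made-up Gaussian chain X0 -> X1 -> X2 -> X3,
# so X0 and X3 are correlated only through the intermediate variables.
import numpy as np

rng = np.random.default_rng(0)
n, p = 5000, 4
X = np.empty((n, p))
X[:, 0] = rng.normal(size=n)
for j in range(1, p):
    X[:, j] = 0.8 * X[:, j - 1] + rng.normal(size=n)

# Marginal correlation between the chain's endpoints is clearly nonzero...
corr = np.corrcoef(X, rowvar=False)
# ...but the corresponding precision entry (the "direct link") is near
# zero, much smaller than that of the truly adjacent pair (0, 1); an l1
# penalty, as in the graphical lasso, would shrink it to exactly zero.
prec = np.linalg.inv(np.cov(X, rowvar=False))

print(corr[0, 3])
print(abs(prec[0, 3]) < abs(prec[0, 1]))
```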
to:NB  statistics  time_series  graphical_models  time_series_connections 
4 weeks ago
Phys. Rev. Lett. 115, 260602 (2015) - On-Chip Maxwell's Demon as an Information-Powered Refrigerator
"We present an experimental realization of an autonomous Maxwell’s demon, which extracts microscopic information from a system and reduces its entropy by applying feedback. It is based on two capacitively coupled single-electron devices, both integrated on the same electronic circuit. This setup allows a detailed analysis of the thermodynamics of both the demon and the system as well as their mutual information exchange. The operation of the demon is directly observed as a temperature drop in the system. We also observe a simultaneous temperature rise in the demon arising from the thermodynamic cost of generating the mutual information."
to:NB  physics_of_information  maxwells_demon  physics  statistical_mechanics  information_theory  thermodynamics 
4 weeks ago
Herbst, E.P. and Schorfheide, F.: Bayesian Estimation of DSGE Models (eBook and Hardcover).
"Dynamic stochastic general equilibrium (DSGE) models have become one of the workhorses of modern macroeconomics and are extensively used for academic research as well as forecasting and policy analysis at central banks. This book introduces readers to state-of-the-art computational techniques used in the Bayesian analysis of DSGE models. The book covers Markov chain Monte Carlo techniques for linearized DSGE models, novel sequential Monte Carlo methods that can be used for parameter inference, and the estimation of nonlinear DSGE models based on particle filter approximations of the likelihood function. The theoretical foundations of the algorithms are discussed in depth, and detailed empirical applications and numerical illustrations are provided. The book also gives invaluable advice on how to tailor these algorithms to specific applications and assess the accuracy and reliability of the computations."
to:NB  books:noted  econometrics  macroeconomics  time_series  estimation  statistics  re:your_favorite_dsge_sucks 
4 weeks ago
Card, D. and Krueger, A.B.: Myth and Measurement: The New Economics of the Minimum Wage. (Twentieth-Anniversary Edition) (eBook and Paperback)
"David Card and Alan B. Krueger have already made national news with their pathbreaking research on the minimum wage. Here they present a powerful new challenge to the conventional view that higher minimum wages reduce jobs for low-wage workers. In a work that has important implications for public policy as well as for the direction of economic research, the authors put standard economic theory to the test, using data from a series of recent episodes, including the 1992 increase in New Jersey's minimum wage, the 1988 rise in California's minimum wage, and the 1990-91 increases in the federal minimum wage. In each case they present a battery of evidence showing that increases in the minimum wage lead to increases in pay, but no loss in jobs.
"A distinctive feature of Card and Krueger's research is the use of empirical methods borrowed from the natural sciences, including comparisons between the "treatment" and "control" groups formed when the minimum wage rises for some workers but not for others. In addition, the authors critically reexamine the previous literature on the minimum wage and find that it, too, lacks support for the claim that a higher minimum wage cuts jobs. Finally, the effects of the minimum wage on family earnings, poverty outcomes, and the stock market valuation of low-wage employers are documented. Overall, this book calls into question the standard model of the labor market that has dominated economists' thinking on the minimum wage. In addition, it will shift the terms of the debate on the minimum wage in Washington and in state legislatures throughout the country.
"With a new preface discussing new data, Myth and Measurement continues to shift the terms of the debate on the minimum wage."

--- I can only hope the new preface also discusses the immortal "theory is evidence too".
to:NB  books:noted  economics  imperfect_competition 
4 weeks ago
Pritchard, D.: Epistemic Angst: Radical Skepticism and the Groundlessness of Our Believing. (eBook and Hardcover)
"Epistemic Angst offers a completely new solution to the ancient philosophical problem of radical skepticism—the challenge of explaining how it is possible to have knowledge of a world external to us.
"Duncan Pritchard argues that the key to resolving this puzzle is to realize that it is composed of two logically distinct problems, each requiring its own solution. He then puts forward solutions to both problems. To that end, he offers a new reading of Wittgenstein’s account of the structure of rational evaluation and demonstrates how this provides an elegant solution to one aspect of the skeptical problem. Pritchard also revisits the epistemological disjunctivist proposal that he developed in previous work and shows how it can effectively handle the other aspect of the problem. Finally, he argues that these two antiskeptical positions, while superficially in tension with each other, are not only compatible but also mutually supporting."

--- I realize there is an irony to being skeptical about this largely on the grounds that all attempts to solve the problem for the last 2000+ years have failed. (Uncle Davey would've understood.)
to:NB  books:noted  epistemology 
4 weeks ago
Antràs, P.: Global Production: Firms, Contracts, and Trade Structure. (eBook and Hardcover)
"Global Production is the first book to provide a fully comprehensive overview of the complicated issues facing multinational companies and their global sourcing strategies. Few international trade transactions today are based on the exchange of finished goods; rather, the majority of transactions are dominated by sales of individual components and intermediary services. Many firms organize global production around offshoring parts, components, and services to producers in distant countries, and contracts are drawn up specific to the parties and distinct legal systems involved. Pol Antràs examines the contractual frictions that arise in the international system of production and how these frictions influence the world economy.
"Antràs discusses the inevitable complications that develop in contract negotiation and execution. He provides a unified framework that sheds light on the factors helping global firms determine production locations and other organizational choices. Antràs also implements a series of systematic empirical tests, based on recent data from the U.S. Customs and Census Offices, which demonstrate the relevance of contractual factors in global production decisions."
to:NB  books:noted  economics  globalization 
4 weeks ago
The Website Obesity Crisis
"Let me start by saying that beautiful websites come in all sizes and page weights. I love big websites packed with images. I love high-resolution video. I love sprawling Javascript experiments or well-designed web apps.
"This talk isn't about any of those. It's about mostly-text sites that, for unfathomable reasons, are growing bigger with every passing year.
"While I'll be using examples to keep the talk from getting too abstract, I’m not here to shame anyone, except some companies (Medium) that should know better and are intentionally breaking the web."
internet  design  ceglowski.maciej 
4 weeks ago
Models, robustness, and non-causal explanation: a foray into cognitive science and biology - Springer
"This paper is aimed at identifying how a model’s explanatory power is constructed and identified, particularly in the practice of template-based modeling (Humphreys, Philos Sci 69:1–11, 2002; Extending ourselves: computational science, empiricism, and scientific method, 2004), and what kinds of explanations models constructed in this way can provide. In particular, this paper offers an account of non-causal structural explanation that forms an alternative to causal–mechanical accounts of model explanation that are currently popular in philosophy of biology and cognitive science. Clearly, defences of non-causal explanation are far from new (e.g. Batterman, Br J Philos Sci 53:21–38, 2002a; The devil in the details: asymptotic reasoning in explanation, reduction, and emergence, 2002b; Pincock, Noûs 41:253–275, 2007; Mathematics and scientific representation 2012; Rice, Noûs. doi:10.​1111/​nous.​12042, 2013; Biol Philos 27:685–703, 2012), so the targets here are focused on a particular type of robust phenomenon and how strong invariance to interventions can block a range of causal explanations. By focusing on a common form of model construction, the paper also ties functional or computational style explanations found in cognitive science and biology more firmly with explanatory practices across model-based science in general."
to:NB  philosophy_of_science  explanation  explanation_by_mechanisms  modeling 
5 weeks ago
Scientific understanding: truth or dare? - Springer
"It is often claimed—especially by scientific realists—that science provides understanding of the world only if its theories are (at least approximately) true descriptions of reality, in its observable as well as unobservable aspects. This paper critically examines this ‘realist thesis’ concerning understanding. A crucial problem for the realist thesis is that (as study of the history and practice of science reveals) understanding is frequently obtained via theories and models that appear to be highly unrealistic or even completely fictional. So we face the dilemma of either giving up the realist thesis that understanding requires truth, or allowing for the possibility that in many if not all practical cases we do not have scientific understanding. I will argue that the first horn is preferable: the link between understanding and truth can be severed. This becomes a live option if we abandon the traditional view that scientific understanding is a special type of knowledge. While this view implies that understanding must be factive, I avoid this implication by identifying understanding with a skill rather than with knowledge. I will develop the idea that understanding phenomena consists in the ability to use a theory to generate predictions of the target system’s behavior. This implies that the crucial condition for understanding is not truth but intelligibility of the theory, where intelligibility is defined as the value that scientists attribute to the theoretical virtues that facilitate the construction of models of the phenomena. I will show, first, that my account accords with the way practicing scientists conceive of understanding, and second, that it allows for the use of idealized or fictional models and theories in achieving understanding."
to:NB  philosophy_of_science  explanation  modeling 
5 weeks ago
The Political Economy of Agricultural Statistics and Input Subsidies: Evidence from India, Nigeria and Malawi - Jerven - 2013 - Journal of Agrarian Change - Wiley Online Library
"The political economy of agricultural policies – why certain interventions may be preferred by political leaders rather than others – is well recognized. This paper explores a perspective that has previously been neglected: the political economy of the agricultural statistics. In developing economies, the data on agricultural production are weak. Because these data are assembled using competing methods and assumptions, the final series are subject to political pressure, particularly when the government is subsidizing agricultural inputs. This paper draws on debates on the evidence of a Green Revolution in India and the arguments on the effect of withdrawing fertilizer subsidies during structural adjustment in Nigeria, and finally the paper presents new data on the effect of crop data subsidies in Malawi. The recent agricultural census (2006/7) indicates a maize output of 2.1 million metric tonnes, compared to the previously widely circulated figures of 3.4 million metric tonnes. The paper suggests that ‘data’ are themselves a product of agricultural policies."
to:NB  statistics  economics  political_economy  evidence_based  science_as_a_social_process  social_measurement  india  nigeria  malawi 
5 weeks ago
Statistical Modeling: A Fresh Approach
"Statistical Modeling: A Fresh Approach introduces and illuminates the statistical reasoning used in modern research throughout the natural and social sciences, medicine, government, and commerce. It emphasizes the use of models to untangle and quantify variation in observed data. By a deft and concise use of computing coupled with an innovative geometrical presentation of the relationship among variables, A Fresh Approach reveals the logic of statistical inference and empowers the reader to use and understand techniques such as analysis of covariance that appear widely in published research but are hardly ever found in introductory texts.
"Recognizing the essential role the computer plays in modern statistics, A Fresh Approach provides a complete and self-contained introduction to statistical computing using the powerful (and free) statistics package R."
in_NB  books:noted  statistics  regression  R  re:ADAfaEPoV 
5 weeks ago
[Reviews] | Kennan Kvetches, by Andrew J. Bacevich | Harper's Magazine
The combination of a messianic self-image with contempt for those whom he wished to save, and the track record of bad forecasts, are entirely legitimate grounds for attacking him as a public figure. But surely it's a bit harsh to condemn someone for thoughts expressed in the privacy of a diary? I agree they are not good thoughts, but many people have bad thoughts and feel better for giving them expression in a way which doesn't hurt anyone else --- like writing them down in a diary. Did he, as a public person, act on these?
running_dogs_of_reaction  american_hegemony  via:?  have_read  to:blog 
5 weeks ago
Articulating the World: Conceptual Understanding and the Scientific Image, Rouse
"Naturalism as a guiding philosophy for modern science both disavows any appeal to the supernatural or anything else transcendent to nature, and repudiates any philosophical or religious authority over the workings and conclusions of the sciences. A longstanding paradox within naturalism, however, has been the status of scientific knowledge itself, which seems, at first glance, to be something that transcends and is therefore impossible to conceptualize within scientific naturalism itself.
"In Articulating the World, Joseph Rouse argues that the most pressing challenge for advocates of naturalism today is precisely this: to understand how to make sense of a scientific conception of nature as itself part of nature, scientifically understood. Drawing upon recent developments in evolutionary biology and the philosophy of science, Rouse defends naturalism in response to this challenge by revising both how we understand our scientific conception of the world and how we situate ourselves within it."
in_NB  books:noted  philosophy_of_science  epistemology 
6 weeks ago
How Not to Network a Nation | The MIT Press
"Between 1959 and 1989, Soviet scientists and officials made numerous attempts to network their nation—to construct a nationwide computer network. None of these attempts succeeded, and the enterprise had been abandoned by the time the Soviet Union fell apart. Meanwhile, ARPANET, the American precursor to the Internet, went online in 1969. Why did the Soviet network, with top-level scientists and patriotic incentives, fail while the American network succeeded? In How Not to Network a Nation, Benjamin Peters reverses the usual cold war dualities and argues that the American ARPANET took shape thanks to well-managed state subsidies and collaborative research environments and the Soviet network projects stumbled because of unregulated competition among self-interested institutions, bureaucrats, and others. The capitalists behaved like socialists while the socialists behaved like capitalists.
"After examining the midcentury rise of cybernetics, the science of self-governing systems, and the emergence in the Soviet Union of economic cybernetics, Peters complicates this uneasy role reversal while chronicling the various Soviet attempts to build a “unified information network.” Drawing on previously unknown archival and historical materials, he focuses on the final, and most ambitious of these projects, the All-State Automated System of Management (OGAS), and its principal promoter, Viktor M. Glushkov. Peters describes the rise and fall of OGAS—its theoretical and practical reach, its vision of a national economy managed by network, the bureaucratic obstacles it encountered, and the institutional stalemate that killed it. Finally, he considers the implications of the Soviet experience for today’s networked world."
in_NB  books:noted  computer_networks  ussr  internet  the_present_before_it_was_widely_distributed 
6 weeks ago
The Polythink Syndrome: U.S. Foreign Policy Decisions on 9/11, Afghanistan, Iraq, Iran, Syria, and ISIS | Alex Mintz and Carly Wayne
"Why do presidents and their advisors often make sub-optimal decisions on military intervention, escalation, de-escalation, and termination of conflicts?
"The leading concept of group dynamics, groupthink, offers one explanation: policy-making groups make sub-optimal decisions due to their desire for conformity and uniformity over dissent, leading to a failure to consider other relevant possibilities. But presidential advisory groups are often fragmented and divisive. This book therefore scrutinizes polythink, a group decision-making dynamic whereby different members in a decision-making unit espouse a plurality of opinions and divergent policy prescriptions, resulting in a disjointed decision-making process or even decision paralysis.
"The book analyzes eleven national security decisions, including the national security policy designed prior to the terrorist attacks of 9/11, the decisions to enter into and withdraw from Afghanistan and Iraq, the 2007 "surge" decision, the crisis over the Iranian nuclear program, the UN Security Council decision on the Syrian Civil War, the faltering Kerry Peace Process in the Middle East, and the U.S. decision on military operations against ISIS.
"Based on the analysis of these case studies, the authors address implications of the polythink phenomenon, including prescriptions for avoiding and/or overcoming it, and develop strategies and tools for what they call Productive Polythink. The authors also show the applicability of polythink to business, industry, and everyday decisions."

--- I am... intrigued, for want of a better word, by the idea that the problem with the US decision to invade Iraq was that too many people considered too many different options.
in_NB  books:noted  the_continuing_crises  us_military  american_hegemony  us-iraq_war  decision-making  social_life_of_the_mind  re:democratic_cognition 
6 weeks ago
Beasts and Gods: How Democracy Changed Its Meaning and Lost Its Purpose, Fuller
"Democracy is sold to us on its ability to deliver equal opportunity, and to give every citizen an equal voice. Yet time and again we see that this is not the case: power and spoils alike flow to the few, while the many are left with no recourse. What is wrong with democracy?
"Nothing, says Roslyn Fuller: what we have simply isn’t democracy—it’s a perversion of it, created by poorly designed electoral systems, weak campaign laws, and broad limitations on participation and representation at nearly every level. Backing her argument with copious empirical data analyzing a wide variety of voting methods across twenty nations, Fuller makes her conclusion irrefutable: if we want true democracy, we have to return to the philosophical insights that originally underpinned it, and thoroughly reexamine the goals and methods of democracy and democratic participation. A radical, damning, yet at the same time fiercely hopeful work, Beasts and Gods aims to reconfigure the very foundations of modern society."
in_NB  books:noted  political_philosophy  democracy 
6 weeks ago
Masters of Uncertainty: Weather Forecasters and the Quest for Ground Truth, Daipha
"Though we commonly make them the butt of our jokes, weather forecasters are in fact exceptionally good at managing uncertainty. They consistently do a better job calibrating their performance than stockbrokers, physicians, or other decision-making experts precisely because they receive feedback on their decisions in near real time. Following forecasters in their quest for truth and accuracy, therefore, holds the key to the analytically elusive process of decision making as it actually happens.
"In Masters of Uncertainty, Phaedra Daipha develops a new conceptual framework for the process of decision making, after spending years immersed in the life of a northeastern office of the National Weather Service. Arguing that predicting the weather will always be more craft than science, Daipha shows how forecasters have made a virtue of the unpredictability of the weather. Impressive data infrastructures and powerful computer models are still only a substitute for the real thing outside, and so forecasters also enlist improvisational collage techniques and an omnivorous appetite for information to create a locally meaningful forecast on their computer screens. Intent on capturing decision making in action, Daipha takes the reader through engrossing firsthand accounts of several forecasting episodes (hits and misses) and offers a rare fly-on-the-wall insight into the process and challenges of producing meteorological predictions come rain or come shine. Combining rich detail with lucid argument, Masters of Uncertainty advances a theory of decision making that foregrounds the pragmatic and situated nature of expert cognition and casts into new light how we make decisions in the digital age."
in_NB  books:noted  prediction  decision-making  meteorology  sociology_of_science 
6 weeks ago
How Reason Almost Lost Its Mind: The Strange Career of Cold War Rationality, Erickson, Klein, Daston
"In the United States at the height of the Cold War, roughly between the end of World War II and the early 1980s, a new project of redefining rationality commanded the attention of sharp minds, powerful politicians, wealthy foundations, and top military brass. Its home was the human sciences—psychology, sociology, political science, and economics, among others—and its participants enlisted in an intellectual campaign to figure out what rationality should mean and how it could be deployed.
"How Reason Almost Lost Its Mind brings to life the people—Herbert Simon, Oskar Morgenstern, Herman Kahn, Anatol Rapoport, Thomas Schelling, and many others—and places, including the RAND Corporation, the Center for Advanced Study in the Behavioral Sciences, the Cowles Commission for Research and Economics, and the Council on Foreign Relations, that played a key role in putting forth a “Cold War rationality.” Decision makers harnessed this picture of rationality—optimizing, formal, algorithmic, and mechanical—in their quest to understand phenomena as diverse as economic transactions, biological evolution, political elections, international relations, and military strategy. The authors chronicle and illuminate what it meant to be rational in the age of nuclear brinkmanship."
in_NB  books:noted  economics  game_theory  rationality  social_science_methodology  history_of_science  cold_war 
6 weeks ago
Tunnel Visions: The Rise and Fall of the Superconducting Super Collider, Riordan, Hoddeson, Kolb
"Starting in the 1950s, US physicists dominated the search for elementary particles; aided by the association of this research with national security, they held this position for decades. In an effort to maintain their hegemony and track down the elusive Higgs boson, they convinced President Reagan and Congress to support construction of the multibillion-dollar Superconducting Super Collider project in Texas—the largest basic-science project ever attempted. But after the Cold War ended and the estimated SSC cost surpassed ten billion dollars, Congress terminated the project in October 1993.
"Drawing on extensive archival research, contemporaneous press accounts, and over one hundred interviews with scientists, engineers, government officials, and others involved, Tunnel Visions tells the riveting story of the aborted SSC project. The authors examine the complex, interrelated causes for its demise, including problems of large-project management, continuing cost overruns, and lack of foreign contributions. In doing so, they ask whether Big Science has become too large and expensive, including whether academic scientists and their government overseers can effectively manage such an enormous undertaking."
to:NB  books:noted  history_of_science  particle_physics 
6 weeks ago
[no title]
--- One of the nice things about this paper is that it's a good example of a sociology of science which is not in any way debunking, and, as such, blends seamlessly into an _internal_ critique ("how could we have done better?")
to:NB  to_read  sociology  sociology_of_science  inequality  class_struggles_in_america  economics 
6 weeks ago