Thermodynamics as a science of symmetry - Springer
"A new interpretation of thermodynamics is advanced; thermodynamics is the study of those properties of macroscopic matter that follow from the symmetry properties of physical laws, mediated through the statistics of large systems."
to:NB  have_read  physics  statistical_mechanics  thermodynamics  re:what_is_a_macrostate 
17 days ago
The Causal Devolution
"This article discusses causal analysis as a paradigm for explanation in sociology. It begins with a detailed analysis of causality statements in Durkheim's Le suicide. It then discusses the history of causality assumptions in sociological writing since the 1930s, with brief remarks about the related discipline of econometrics. The author locates the origins of causal argument in a generation of brilliant and brash young sociologists with a model and a mission and then briefly considers the history of causality concepts in modern philosophy. The article closes with reflections on the problems created for sociology by the presumption that causal accounting is the epitome of explanation within the discipline. It is argued that sociology should spend more effort on (and should better reward) descriptive work."
to:NB  have_read  causality  causal_inference  sociology  social_science_methodology  abbott.andrew 
21 days ago
Forecasting High Tide: Predicting Times of Elevated Activity in Online Social Media
"Social media provides a powerful platform for influencers to broadcast content to a large audience of followers. In order to reach the greatest number of users, an important first step is to identify times when a large portion of a target population is active on social media, which requires modeling the behavior of those individuals. We propose three methods for behavior modeling: a simple seasonality approach based on time-of-day and day-of-week, an autoregressive approach based on aggregate fluctuations from seasonality, and an aggregation-of-individuals approach based on modeling the behavior of individual users. We test these methods on data collected from a set of users on Twitter in 2011 and 2012. We find that the performance of the methods at predicting times of high activity depends strongly on the tradeoff between true and false positives, with no method dominating. Our results highlight the challenges and opportunities involved in modeling complex social systems, and demonstrate how influencers interested in forecasting potential user engagement can use complexity modeling to make better decisions."
to:NB  time_series  prediction  statistics  social_media  kith_and_kin  girvan.michelle  darmon.david  rand.william 
4 weeks ago
Identifying Formal and Informal Influence in Technology Adoption with Network Externalities
"Firms introducing network technologies (whose benefits depend on who installs the technology) need to understand which user characteristics confer the greatest network benefits on other potential adopters. To examine which adopter characteristics matter, I use the introduction of a video-messaging technology in an investment bank. I use data on its 2,118 employees, their adoption decisions, and their 2.4 million subsequent calls. The video-messaging technology can also be used to watch TV. Exogenous shocks to the benefits of watching TV are used to identify the causal (network) externality of one individual user's adoption on others' adoption decisions. I allow this network externality to vary in size with a variety of measures of informal and formal influence. I find that adoption by either managers or workers in “boundary spanner” positions has a large impact on the adoption decisions of employees who wish to communicate with them. Adoption by ordinary workers has a negligible impact. This suggests that firms should target those who derive their informal influence from occupying key boundary-spanning positions in communication networks, in addition to those with sources of formal influence, when launching a new network technology."
to:NB  causal_inference  instrumental_variables  diffusion_of_innovations  statistics  social_influence  social_networks  re:homophily_and_confounding  to_be_shot_after_a_fair_trial  via:mcfowland 
4 weeks ago
teaching mathematics - Oxford Blog
"I've tried to make this a more positive piece, about some of the things I think children should learn about mathematics in primary school, along with a random collection of ideas for actual teaching."
mathematics  education  yee.danny  probably_utopian  still_good_ideas  kith_and_kin 
4 weeks ago
Born Pupils? Natural Pedagogy and Cultural Pedagogy
"The theory of natural pedagogy is an important focus of research on the evolution and development of cultural learning. It proposes that we are born pupils; that human children genetically inherit a package of psychological adaptations that make them receptive to teaching. In this article, I first examine the components of the package—eye contact, contingencies, infant-directed speech, gaze cuing, and rational imitation—asking in each case whether current evidence indicates that the component is a reliable feature of infant behavior and a genetic adaptation for teaching. I then discuss three fundamental insights embodied in the theory: Imitation is not enough for cumulative cultural inheritance, the extra comes from blind trust, and tweaking is a powerful source of cognitive change. Combining the results of the empirical review with these insights, I argue that human receptivity to teaching is founded on nonspecific genetic adaptations for social bonding and social learning and acquires its species- and functionally specific features through the operation of domain-general processes of learning in sociocultural contexts. We engage, not in natural pedagogy, but in cultural pedagogy."
to:NB  cultural_transmission_of_cognitive_tools  evolutionary_psychology  education  cultural_evolution  cultural_transmission  psychology  via:rvenkat  re:democratic_cognition  re:do-institutions-evolve 
4 weeks ago
Why is it so hard to increase learning?
Some of this is disputable. (For instance, the affection for group projects, and the failure to notice that, when teachers genuinely know more than students, "360 degree" reviews are very dubious.) But its heart is in the right place.
teaching  pedagogy  education  academia  via:?  have_read 
4 weeks ago
[1602.03253] A Kernelized Stein Discrepancy for Goodness-of-fit Tests and Model Evaluation
"We derive a new discrepancy statistic for measuring differences between two probability distributions based on a novel combination of Stein's method and the reproducing kernel Hilbert space theory. We apply our result to test how well a probabilistic model fits a set of observations, and derive a new class of powerful goodness-of-fit tests that are widely applicable for complex and high dimensional distributions, even for those with computationally intractable normalization constants. Both theoretical and empirical properties of our methods are studied thoroughly."
to:NB  statistics  kernel_methods  hilbert_space  goodness-of-fit  jordan.michael_i.  to_read  via:rvenkat 
4 weeks ago
Random Graphs, Geometry and Asymptotic Structure | Cambridge University Press
"The theory of random graphs is a vital part of the education of any researcher entering the fascinating world of combinatorics. However, due to their diverse nature, the geometric and structural aspects of the theory often remain an obscure part of the formative study of young combinatorialists and probabilists. Moreover, the theory itself, even in its most basic forms, is often considered too advanced to be part of undergraduate curricula, and those who are interested usually learn it mostly through self-study, covering a lot of its fundamentals but little of the more recent developments. This book provides a self-contained and concise introduction to recent developments and techniques for classical problems in the theory of random graphs. Moreover, it covers geometric and topological aspects of the theory and introduces the reader to the diversity and depth of the methods that have been devised in this context."
to:NB  books:noted  graph_theory  stochastic_processes  mathematics 
4 weeks ago
[1510.04342] Estimation and Inference of Heterogeneous Treatment Effects using Random Forests
"Many scientific and engineering challenges---ranging from personalized medicine to customized marketing recommendations---require an understanding of treatment effect heterogeneity. In this paper, we develop a non-parametric causal forest for estimating heterogeneous treatment effects that extends Breiman's widely used random forest algorithm. Given a potential outcomes framework with unconfoundedness, we show that causal forests are pointwise consistent for the true treatment effect, and have an asymptotically Gaussian and centered sampling distribution. We also discuss a practical method for constructing asymptotic confidence intervals for the true treatment effect that are centered at the causal forest estimates. Our theoretical results rely on a generic Gaussian theory for a large family of random forest algorithms, to our knowledge, this is the first set of results that allows any type of random forest, including classification and regression forests, to be used for provably valid statistical inference. In experiments, we find causal forests to be substantially more powerful than classical methods based on nearest-neighbor matching, especially as the number of covariates increases."
to:NB  decision_trees  causal_inference  statistics  regression  ensemble_methods  athey.susan 
4 weeks ago
[1504.01132] Recursive Partitioning for Heterogeneous Causal Effects
"In this paper we study the problems of estimating heterogeneity in causal effects in experimental or observational studies and conducting inference about the magnitude of the differences in treatment effects across subsets of the population. In applications, our method provides a data-driven approach to determine which subpopulations have large or small treatment effects and to test hypotheses about the differences in these effects. For experiments, our method allows researchers to identify heterogeneity in treatment effects that was not specified in a pre-analysis plan, without concern about invalidating inference due to multiple testing. In most of the literature on supervised machine learning (e.g. regression trees, random forests, LASSO, etc.), the goal is to build a model of the relationship between a unit's attributes and an observed outcome. A prominent role in these methods is played by cross-validation which compares predictions to actual outcomes in test samples, in order to select the level of complexity of the model that provides the best predictive power. Our method is closely related, but it differs in that it is tailored for predicting causal effects of a treatment rather than a unit's outcome. The challenge is that the "ground truth" for a causal effect is not observed for any individual unit: we observe the unit with the treatment, or without the treatment, but not both at the same time. Thus, it is not obvious how to use cross-validation to determine whether a causal effect has been accurately predicted. We propose several novel cross-validation criteria for this problem and demonstrate through simulations the conditions under which they perform better than standard methods for the problem of causal effects. We then apply the method to a large-scale field experiment re-ranking results on a search engine."
to:NB  decision_trees  regression  causal_inference  cross-validation  statistics  imbens.guido  athey.susan 
4 weeks ago
What works? - The Long and Short
Shorter D^2: the problem with evidence-based policy is the evidential basis.
(This is obviously correct, and may prompt another communication from the editorial board of the JEBH.)
dsquared  evidence_based  science_as_a_social_process  statistics  to:blog 
5 weeks ago
Reproducing Statistical Results - Annual Review of Statistics and Its Application, 2(1):1
"The reproducibility of statistical findings has become a concern not only for statisticians, but for all researchers engaged in empirical discovery. Section 2 of this article identifies key reasons statistical findings may not replicate, including power and sampling issues; misapplication of statistical tests; the instability of findings under reasonable perturbations of data or models; lack of access to methods, data, or equipment; and cultural barriers such as researcher incentives and rewards. Section 3 discusses five proposed remedies for these replication failures: improved prepublication and postpublication validation of findings; the complete disclosure of research steps; assessment of the stability of statistical findings; providing access to digital research objects, in particular data and software; and ensuring these objects are legally reusable."
to:NB  statistics  stodden.victoria  kith_and_kin  via:rvenkat 
5 weeks ago
Learning Deep Generative Models - Annual Review of Statistics and Its Application, 2(1):361
"Building intelligent systems that are capable of extracting high-level representations from high-dimensional sensory data lies at the core of solving many artificial intelligence–related tasks, including object recognition, speech perception, and language understanding. Theoretical and biological arguments strongly suggest that building such systems requires models with deep architectures that involve many layers of nonlinear processing. In this article, we review several popular deep learning models, including deep belief networks and deep Boltzmann machines. We show that (a) these deep generative models, which contain many layers of latent variables and millions of parameters, can be learned efficiently, and (b) the learned high-level feature representations can be successfully applied in many application domains, including visual object recognition, information retrieval, classification, and regression tasks."
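
--- The building block of both deep belief networks and deep Boltzmann machines is the restricted Boltzmann machine, typically trained layer-by-layer with contrastive divergence. A minimal CD-1 update in numpy, for binary units, with all the engineering (momentum, schedules, stacking) omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cd1_update(V, W, b, c, lr=0.05):
    """One contrastive-divergence (CD-1) step for a binary RBM.
    V: minibatch (n x d_vis); W: d_vis x d_hid; b, c: visible/hidden biases."""
    ph = sigmoid(V @ W + c)                          # P(h = 1 | v)
    h = (rng.random(ph.shape) < ph).astype(float)    # sample hidden units
    pv = sigmoid(h @ W.T + b)                        # reconstruction P(v = 1 | h)
    v1 = (rng.random(pv.shape) < pv).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    n = V.shape[0]
    W += lr * (V.T @ ph - v1.T @ ph1) / n            # approximate gradient
    b += lr * (V - v1).mean(axis=0)
    c += lr * (ph - ph1).mean(axis=0)
```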
to:NB  statistics  neural_networks  salakhutdinov.ruslan 
5 weeks ago
Agent-Based Models and Microsimulation - Annual Review of Statistics and Its Application, 2(1):259
"Agent-based models (ABMs) are computational models used to simulate the actions and interactions of agents within a system. Usually, each agent has a relatively simple set of rules for how he or she responds to his or her environment and to other agents. These models are used to gain insight into the emergent behavior of complex systems with many agents, in which the emergent behavior depends upon the micro-level behavior of the individuals. ABMs are widely used in many fields, and this article reviews some of those applications. However, as relatively little work has been done on statistical inference for such models, this article also points out some of those gaps and recent strategies to address them."
to:NB  agent-based_models  statistics  simulation  banks.david 
5 weeks ago
[1602.04938] "Why Should I Trust You?": Explaining the Predictions of Any Classifier
"Despite widespread adoption, machine learning models remain mostly black boxes. Understanding the reasons behind predictions is, however, quite important in assessing trust in a model. Trust is fundamental if one plans to take action based on a prediction, or when choosing whether or not to deploy a new model. Such understanding further provides insights into the model, which can be used to turn an untrustworthy model or prediction into a trustworthy one.
"In this work, we propose LIME, a novel explanation technique that explains the predictions of any classifier in an interpretable and faithful manner, by learning an interpretable model locally around the prediction. We further propose a method to explain models by presenting representative individual predictions and their explanations in a non-redundant way, framing the task as a submodular optimization problem. We demonstrate the flexibility of these methods by explaining different models for text (e.g. random forests) and image classification (e.g. neural networks). The usefulness of explanations is shown via novel experiments, both simulated and with human subjects. Our explanations empower users in various scenarios that require trust: deciding if one should trust a prediction, choosing between models, improving an untrustworthy classifier, and detecting why a classifier should not be trusted."
to:NB  machine_learning  classifiers  explanation  guestrin.carlos  to_read  model_checking 
5 weeks ago
Secrecy at Work: The Hidden Architecture of Organizational Life | Jana Costas and Christopher Grey
"Secrecy is endemic within organizations, woven into the fabric of our lives at work. Yet, until now, we've had an all-too-limited understanding of this powerful organizational force. Secrecy is a part of work, and keeping secrets is a form of work. But also, secrecy creates a social order—a hidden architecture within our organizations. Drawing on previously overlooked texts, as well as well-known classics, Jana Costas and Christopher Grey identify three forms of secrecy: formal secrecy, as we see in the case of trade and state secrets based on law and regulation; informal secrecy based on networks and trust; and public or open secrecy, where what is known goes undiscussed. Animated with evocative examples from scholarship, current events, and works of fiction, this framework presents a bold reimagining of organizational life."
to:NB  books:noted  sociology  organizations  corporations  secrecy 
5 weeks ago
Our Overworked Security Bureaucracy : Democracy Journal
I seriously doubt that our current situation is really a more complex environment than dealing with a genuinely rival super-power, multiple decaying European empires (with which we were allied), their opponents and successors (which we wanted to court against the Soviets), _and_ much slower communications (and so more uncertainty).
But that's separate from whether the security bureaucracy isn't over-taxed in unhelpful ways; in particular, by not having time to think.
american_hegemony  us_military  our_decrepit_institutions  have_read 
6 weeks ago
Identifying Personal Genomes by Surname Inference | Science
"Sharing sequencing data sets without identifiers has become a common practice in genomics. Here, we report that surnames can be recovered from personal genomes by profiling short tandem repeats on the Y chromosome (Y-STRs) and querying recreational genetic genealogy databases. We show that a combination of a surname with other types of metadata, such as age and state, can be used to triangulate the identity of the target. A key feature of this technique is that it entirely relies on free, publicly accessible Internet resources. We quantitatively analyze the probability of identification for U.S. males. We further demonstrate the feasibility of this technique by tracing back with high probability the identities of multiple participants in public sequencing projects."
to:NB  privacy  genetics  statistics  via:arthegall  genomics 
6 weeks ago
Attribution of Extreme Weather Events in the Context of Climate Change | The National Academies Press
"As climate has warmed over recent years, a new pattern of more frequent and more intense weather events has unfolded across the globe. Climate models simulate such changes in extreme events, and some of the reasons for the changes are well understood. Warming increases the likelihood of extremely hot days and nights, favors increased atmospheric moisture that may result in more frequent heavy rainfall and snowfall, and leads to evaporation that can exacerbate droughts.
"Even with evidence of these broad trends, scientists cautioned in the past that individual weather events couldn’t be attributed to climate change. Now, with advances in understanding the climate science behind extreme events and the science of extreme event attribution, such blanket statements may not be accurate. The relatively young science of extreme event attribution seeks to tease out the influence of human-cause climate change from other factors, such as natural sources of variability like El Niño, as contributors to individual extreme events.
"Event attribution can answer questions about how much climate change influenced the probability or intensity of a specific type of weather event. As event attribution capabilities improve, they could help inform choices about assessing and managing risk, and in guiding climate adaptation strategies. This report examines the current state of science of extreme weather attribution, and identifies ways to move the science forward to improve attribution capabilities."
to:NB  books:noted  climatology  climate_change  probability  causal_inference  statistics  explanation 
6 weeks ago
Blockwise bootstrap of the estimated empirical process based on psi -weakly dependent observations - Springer
"The distributional convergence of the bootstrapped estimated empirical process is shown and bootstrap consistency in the sup-norm for test statistics based on that process. Bootstrapping the estimated empirical process has up to now been considered by assuming independence of the observations, where we give up this assumption now and allow the observations to be ψ-weakly dependent in the sense of Doukhan and Louhichi (Stoch Proc Appl 84:313–342, 1999). Due to the fact that no model assumptions on the original process are made, a nonparametric blockwise bootstrap procedure is used, which has previously been used in empirical process theory based on mixing observations. Here, we succeeded in proving that assuming l=o(n) and l→∞ as only conditions for the blocklength is sufficient to show convergence of the bootstrap process to the same limit as for the original process under H0, which is the weakest condition that has been imposed in that context up to now."
to:NB  empirical_processes  bootstrap  mixing  stochastic_processes  statistical_inference_for_stochastic_processes 
6 weeks ago
Gambetta, D. and Hertog, S.: Engineers of Jihad: The Curious Connection between Violent Extremism and Education. (eBook and Hardcover)
"The violent actions of a few extremists can alter the course of history, yet there persists a yawning gap between the potential impact of these individuals and what we understand about them. In Engineers of Jihad, Diego Gambetta and Steffen Hertog uncover two unexpected facts, which they imaginatively leverage to narrow that gap: they find that a disproportionate share of Islamist radicals come from an engineering background, and that Islamist and right-wing extremism have more in common than either does with left-wing extremism, in which engineers are absent while social scientists and humanities students are prominent.
"Searching for an explanation, they tackle four general questions about extremism: Under which socioeconomic conditions do people join extremist groups? Does the profile of extremists reflect how they self-select into extremism or how groups recruit them? Does ideology matter in sorting who joins which group? Lastly, is there a mindset susceptible to certain types of extremism?
"Using rigorous methods and several new datasets, they explain the link between educational discipline and type of radicalism by looking at two key factors: the social mobility (or lack thereof) for engineers in the Muslim world, and a particular mindset seeking order and hierarchy that is found more frequently among engineers. Engineers' presence in some extremist groups and not others, the authors argue, is a proxy for individual traits that may account for the much larger question of selective recruitment to radical activism.
"Opening up markedly new perspectives on the motivations of political violence, Engineers of Jihad yields unexpected answers about the nature and emergence of extremism."
to:NB  books:noted  engineers  terrorism  gambetta.diego 
7 weeks ago
Axtell, J.: Wisdom’s Workshop: The Rise of the Modern University. (eBook and Hardcover)
"When universities began in the Middle Ages, Pope Gregory IX described them as "wisdom's special workshop." He could not have foreseen how far these institutions would travel and develop. Tracing the eight-hundred-year evolution of the elite research university from its roots in medieval Europe to its remarkable incarnation today, Wisdom's Workshop places this durable institution in sweeping historical perspective. In particular, James Axtell focuses on the ways that the best American universities took on Continental influences, developing into the finest expressions of the modern university and enviable models for kindred institutions worldwide. Despite hand-wringing reports to the contrary, the venerable university continues to renew itself, becoming ever more indispensable to society in the United States and beyond.
"Born in Europe, the university did not mature in America until the late nineteenth century. Once its heirs proliferated from coast to coast, their national role expanded greatly during World War II and the Cold War. Axtell links the legacies of European universities and Tudor-Stuart Oxbridge to nine colonial and hundreds of pre–Civil War colleges, and delves into how U.S. universities were shaped by Americans who studied in German universities and adapted their discoveries to domestic conditions and goals. The graduate school, the PhD, and the research imperative became and remain the hallmarks of the American university system and higher education institutions around the globe."
books:noted  academia  education  universities  to:NB 
7 weeks ago
Statistically controlling for confounding constructs is harder than you think
"Social scientists often seek to demonstrate that a construct has incremental validity over and above other related constructs. However, these claims are typically supported by measurement- level models that fail to consider the effects of measurement (un)reliability. We use intuitive examples, Monte Carlo simulations, and a novel analytical framework to demonstrate that common strategies for establishing incremental construct validity using multiple regression analysis exhibit extremely high Type I error rates under parameter regimes common in many psychological domains. Counterintuitively, we find that error rates are highest—in some cases approaching 100%—when sample sizes are large and reliability is moderate. Our findings suggest that a potentially large proportion of incremental validity claims made in the literature are spurious. We present a web application (http://jakewestfall.org/ivy/) that readers can use to explore the statistical properties of these and other incremental validity arguments. We conclude by reviewing SEM-based statistical approaches that appropriately control the Type I error rate when attempting to establish incremental validity."
to:NB  have_read  measurement  social_measurement  social_science_methodology  psychometrics  econometrics  graphical_models  statistics  to_teach:undergrad-ADA  re:ADAfaEPoV  yarkoni.tal  to:blog 
8 weeks ago
PLOS ONE: Likelihood of Null Effects of Large NHLBI Clinical Trials Has Increased over Time
"We explore whether the number of null results in large National Heart Lung, and Blood Institute (NHLBI) funded trials has increased over time.
"We identified all large NHLBI supported RCTs between 1970 and 2012 evaluating drugs or dietary supplements for the treatment or prevention of cardiovascular disease. Trials were included if direct costs >$500,000/year, participants were adult humans, and the primary outcome was cardiovascular risk, disease or death. The 55 trials meeting these criteria were coded for whether they were published prior to or after the year 2000, whether they registered in clinicaltrials.gov prior to publication, used active or placebo comparator, and whether or not the trial had industry co-sponsorship. We tabulated whether the study reported a positive, negative, or null result on the primary outcome variable and for total mortality.
"17 of 30 studies (57%) published prior to 2000 showed a significant benefit of intervention on the primary outcome in comparison to only 2 among the 25 (8%) trials published after 2000 (χ2=12.2,df= 1, p=0.0005). There has been no change in the proportion of trials that compared treatment to placebo versus active comparator. Industry co-sponsorship was unrelated to the probability of reporting a significant benefit. Pre-registration in clinical trials.gov was strongly associated with the trend toward null findings.
"The number NHLBI trials reporting positive results declined after the year 2000. Prospective declaration of outcomes in RCTs, and the adoption of transparent reporting standards, as required by clinicaltrials.gov, may have contributed to the trend toward null findings."

--- It _could_ be that all the useful treatments were found before 2000...
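
--- The headline test checks out, assuming the 2x2 table implied by the abstract (positive vs. not, by publication era); scipy's default Yates correction for 2x2 tables reproduces the reported statistic:

```python
from scipy.stats import chi2_contingency

table = [[17, 13],   # pre-2000: positive, not positive
         [2, 23]]    # post-2000
chi2, p, dof, _ = chi2_contingency(table)   # Yates-corrected by default
print(round(chi2, 1), dof, round(p, 4))     # 12.2 1 0.0005
```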
statistics  experiments  medicine  clinical_trials  re:neutral_model_of_inquiry 
8 weeks ago
Surabaya Beat: A Fairy Tale of Ships, Trade and Travels in Indonesia, Presser
"From 2012 to 2014, Swiss photographer and seaman Beat Presser traveled the vast Indonesia archipelago by boat, amassing an extensive collection of photographs that capture the complexity and beauty of life on the country’s tens of thousands of islands and the surrounding ocean waters.
"Surabaya Beat draws on Presser’s photographs to reconstruct his travels. Through high winds and heavy storms, he sails as part of the crew aboard traditional Indonesian pinisi, carrying fresh fruit bound for the port cities and the world-famous floating markets while learning from the local seamen about the country through its transport and trade. Along the way, he encounters fellow travelers, from tourists to oil exporters, and even officers of the Coast Guard who lend him a motorcycle and later invite him to join them for a dive. Presser’s masterful black-and-white photography lends a timeless tone to these and many other encounters in this popular travel destination that remains nevertheless a source of mystery to most outside Southeast Asia. Alongside more than one hundred photographs and his own writings, Presser has assembled short stories and poems by some of Indonesia’s most promising writers.
"Photographic opportunities abound in Indonesia, and Surabaya Beat represents Presser’s personal photographic vision that will entrance fans of photography and anyone looking to learn more about this incredible, vibrant country."
to:NB  books:noted  photos  travelers'_tales 
8 weeks ago
Diploma Mills
"The most significant shift in higher education over the past two decades has been the emergence of for-profit colleges and universities. These online and storefront institutions lure students with promises of fast degrees and "guaranteed" job placement, but what they deliver is often something quite different. In this provocative history of for-profit higher education, historian and educational researcher A. J. Angulo tells the remarkable and often sordid story of these "diploma mills," which target low-income and nontraditional students while scooping up a disproportionate amount of federal student aid.
"Tapping into a little-known history with big implications, Angulo takes readers on a lively journey that begins with the apprenticeship system of colonial America and ends with today’s politically savvy $35 billion multinational for-profit industry. He traces the transformation of nineteenth-century reading and writing schools into "commercial" and "business" colleges, explores the early twentieth century’s move toward professionalization and progressivism, and explains why the GI Bill prompted a surge of new for-profit institutions. He also shows how well-founded concerns about profit-seeking in higher education have evolved over the centuries and argues that financial gaming and maneuvering by these institutions threatens to destabilize the entire federal student aid program.
"This is the first sweeping narrative history to explain why for-profits have mattered to students, taxpayers, lawmakers, and the many others who have viewed higher education as part of the American dream. Diploma Mills speaks to today’s concerns by shedding light on unmistakable conflicts of interest long associated with this scandal-plagued class of colleges and universities."
to:NB  books:noted  education  academia  our_decrepit_institutions  market_failures_in_everything  fraud 
8 weeks ago
Information Geometry and Its Applications | Shun-ichi Amari | Springer
"This is the first comprehensive book on information geometry, written by the founder of the field. It begins with an elementary introduction to dualistic geometry and proceeds to a wide range of applications, covering information science, engineering, and neuroscience. It consists of four parts, which on the whole can be read independently. A manifold with a divergence function is first introduced, leading directly to dualistic structure, the heart of information geometry. This part (Part I) can be apprehended without any knowledge of differential geometry. An intuitive explanation of modern differential geometry then follows in Part II, although the book is for the most part understandable without modern differential geometry. Information geometry of statistical inference, including time series analysis and semiparametric estimation (the Neyman–Scott problem), is demonstrated concisely in Part III. Applications addressed in Part IV include hot current topics in machine learning, signal processing, optimization, and neural networks. The book is interdisciplinary, connecting mathematics, information sciences, physics, and neurosciences, inviting readers to a new world of information and geometry. This book is highly recommended to graduate students and researchers who seek new mathematical methods and tools useful in their own fields.'
to:NB  books:noted  information_geometry  information_theory  geometry  statistics  amari.shun-ichi 
8 weeks ago
Cultural Phylogenetics - Concepts and Applications | Larissa Mendoza Straffon | Springer
"This book explores the potential and challenges of implementing evolutionary phylogenetic methods in archaeological research, by discussing key concepts and presenting concrete applications of these approaches.
"The volume is divided into two parts: The first covers the theoretical and conceptual implications of using evolution-based models in the sociocultural domain, illustrates the sorts of questions that these methods can help answer, and invites the reader to reflect on the opportunities and limitations of these perspectives. The second part comprises case studies that address relevant empirical issues, such as inferring patterns and rates of cultural transmission, detecting selective pressures in cultural evolution, and explaining the nature of cultural variation."
to:NB  books:noted  archaeology  cultural_evolution  phylogenetics  re:do-institutions-evolve 
8 weeks ago
Statistical Challenges in Assessing and Fostering the Reproducibility of Scientific Results: Summary of a Workshop | The National Academies Press
"Questions about the reproducibility of scientific research have been raised in numerous settings and have gained visibility through several high-profile journal and popular press articles. Quantitative issues contributing to reproducibility challenges have been considered (including improper data measurement and analysis, inadequate statistical expertise, and incomplete data, among others), but there is no clear consensus on how best to approach or to minimize these problems.
"A lack of reproducibility of scientific results has created some distrust in scientific findings among the general public, scientists, funding agencies, and industries. While studies fail for a variety of reasons, many factors contribute to the lack of perfect reproducibility, including insufficient training in experimental design, misaligned incentives for publication and the implications for university tenure, intentional manipulation, poor data management and analysis, and inadequate instances of statistical inference.
"The workshop summarized in this report was designed not to address the social and experimental challenges but instead to focus on the latter issues of improper data management and analysis, inadequate statistical expertise, incomplete data, and difficulties applying sound statistic inference to the available data. Many efforts have emerged over recent years to draw attention to and improve reproducibility of scientific work. This report uniquely focuses on the statistical perspective of three issues: the extent of reproducibility, the causes of reproducibility failures, and the potential remedies for these failures."
to:NB  to_read  books:noted  statistics  science  science_as_a_social_process  kith_and_kin 
8 weeks ago
The Market for Financial Adviser Misconduct by Mark Egan, Gregor Matvos, Amit Seru :: SSRN
"We construct a novel database containing the universe of financial advisers in the United States from 2005 to 2015, representing approximately 10% of employment of the finance and insurance sector. Roughly 7% of advisers have misconduct records. Prior offenders are five times as likely to engage in new misconduct as the average financial adviser. Firms discipline misconduct: approximately half of financial advisers lose their job after misconduct. The labor market partially undoes firm-level discipline: of these advisers, 44% are reemployed in the financial services industry within a year. Reemployment is not costless. Following misconduct, advisers face longer unemployment spells, and move to less reputable firms, with a 10% reduction in compensation. Additionally, firms that hire these advisers also have higher rates of prior misconduct themselves. We find similar results for advisers of dissolved firms, in which all advisers are forced to find new employment independent of past misconduct or performance. Firms that persistently engage in misconduct coexist with firms that have clean records. We show that differences in consumer sophistication may be partially responsible for this phenomenon: misconduct is concentrated in firms with retail customers and in counties with low education, elderly populations, and high incomes. Our findings suggest that some firms "specialize" in misconduct and cater to unsophisticated consumers, while others use their reputation to attract sophisticated consumers."
fraud  finance  market_failures_in_everything  to:NB  class_struggles_in_america  via:? 
8 weeks ago
Nudges Aren’t Enough for Problems Like Retirement Savings - The New York Times
'"The single biggest contribution of behavioral economics to public policy is taking this flawed approach to retirement savings and making it a little bit more viable,” Mr. Loewenstein told me. “The downside is that if we make it just sufficiently viable, people won’t recognize how bankrupt the concept is.”' --- Yay, George!
public_policy  behavioral_economics  re:anti-nudge  to:blog 
8 weeks ago
School Finance Reform and the Distribution of Student Achievement
"We study the impact of post-1990 school finance reforms, during the so-called "adequacy" era, on gaps in spending and achievement between high-income and low-income school districts. Using an event study design, we find that reform events--court orders and legislative reforms--lead to sharp, immediate, and sustained increases in absolute and relative spending in low-income school districts. Using representative samples from the National Assessment of Educational Progress, we also find that reforms cause gradual increases in the relative achievement of students in low-income school districts, consistent with the goal of improving educational opportunity for these students. The implied effect of school resources on educational achievement is large."

- Last tag depends on replication data, which might not be available.
to:NB  education  inequality  us_politics  causal_inference  to_teach:undergrad-ADA  via:jbdelong 
8 weeks ago
Hive Mind: How Your Nation’s IQ Matters So Much More Than Your Own | Garett Jones
"Over the last few decades, economists and psychologists have quietly documented the many ways in which a person's IQ matters. But, research suggests that a nation's IQ matters so much more.
"As Garett Jones argues in Hive Mind, modest differences in national IQ can explain most cross-country inequalities. Whereas IQ scores do a moderately good job of predicting individual wages, information processing power, and brain size, a country's average score is a much stronger bellwether of its overall prosperity.
"Drawing on an expansive array of research from psychology, economics, management, and political science, Jones argues that intelligence and cognitive skill are significantly more important on a national level than on an individual one because they have "positive spillovers." On average, people who do better on standardized tests are more patient, more cooperative, and have better memories. As a result, these qualities—and others necessary to take on the complexity of a modern economy—become more prevalent in a society as national test scores rise. What's more, when we are surrounded by slightly more patient, informed, and cooperative neighbors we take on these qualities a bit more ourselves. In other words, the worker bees in every nation create a "hive mind" with a power all its own. Once the hive is established, each individual has only a tiny impact on his or her own life.
"Jones makes the case that, through better nutrition and schooling, we can raise IQ, thereby fostering higher savings rates, more productive teams, and more effective bureaucracies. After demonstrating how test scores that matter little for individuals can mean a world of difference for nations, the book leaves readers with policy-oriented conclusions and hopeful speculation: Whether we lift up the bottom through changing the nature of work, institutional improvements, or freer immigration, it is possible that this period of massive global inequality will be a short season by the standards of human history if we raise our global IQ."

--- The last tag applies with vehemence.
to:NB  books:noted  iq  cognitive_development  economics  economic_growth  to_be_shot_after_a_fair_trial 
8 weeks ago
The Colonial Origins of Ethnic Violence in India | Ajay Verghese
"The neighboring north Indian districts of Jaipur and Ajmer are identical in language, geography, and religious and caste demography. But when the famous Babri Mosque in Ayodhya was destroyed in 1992, Jaipur burned while Ajmer remained peaceful; when the state clashed over low-caste affirmative action quotas in 2008, Ajmer's residents rioted while Jaipur's citizens stayed calm. What explains these divergent patterns of ethnic conflict across multiethnic states? Using archival research and elite interviews in five case studies spanning north, south, and east India, as well as a quantitative analysis of 589 districts, Ajay Verghese shows that the legacies of British colonialism drive contemporary conflict.
"Because India served as a model for British colonial expansion into parts of Africa and Southeast Asia, this project links Indian ethnic conflict to violent outcomes across an array of multiethnic states, including cases as diverse as Nigeria and Malaysia. The Colonial Origins of Ethnic Violence in India makes important contributions to the study of Indian politics, ethnicity, conflict, and historical legacies."
to:NB  books:noted  imperialism  india  violence  political_science 
8 weeks ago
The Problem With Evidence-Based Policies by Ricardo Hausmann - Project Syndicate
These are good points, but let's think about how his thought experiment differs from just doing nothing and letting people do whatever....
evidence_based  experiments  social_science_methodology  have_read  to:blog  via:henry_farrell  re:democratic_cognition 
8 weeks ago
Information, Inequality, and Mass Polarization: Ideology in Advanced Democracies
"Growing polarization in the American Congress is closely related to rising income inequality. Yet there has been no corresponding polarization of the U.S. electorate, and across advanced democracies, mass polarization is negatively related to income inequality. To explain this puzzle, we propose a comparative political economy model of mass polarization in which the same institutional factors that generate income inequality also undermine political information. We explain why more voters then place themselves in the ideological center, hence generating a negative correlation between mass polarization and inequality. We confirm these conjectures on individual-level data for 20 democracies, and we then show that democracies cluster into two types: one with high inequality, low mass polarization, and polarized and right-shifted elites (e.g., the United States); and the other with low inequality and high mass polarization with left-shifted elites (e.g., Sweden). This division reflects long-standing differences in educational systems, the role of unions, and social networks."

--- Replication data available?
political_economy  political_science  social_networks  unions  political_parties  inequality  class_struggles_in_america  whats_gone_wrong_with_america  re:democratic_cognition  democracy  via:henry_farrell  to_read  to_teach:undergrad-ADA 
9 weeks ago
Cat Basis Pursuit
Will the statistical machine learning reading group meet on 1 April?
machine_learning  cats  funny:geeky  principal_components  sparsity 
9 weeks ago
Groundwater Histories of Iran & the Mediterranean | Dissertation Reviews
"A review of Hidden Waters: Groundwater Histories of Iran and the Mediterranean, by Abigail E. Schade.
"Abigail Schade’s dissertation is an examination of different techniques of accessing and using groundwater in different regions, and also of past literatures that have examined these techniques from various standpoints and with diverse underlying assumptions. The thesis is presented in five chapters, and each deals with a markedly different situation, varying across literature, space and time, and giving us a wide and thorough overview of different methods in, and changing perceptions of, the exploitation of groundwater resources. The dissertation thus examines historical practices relating to human exploitation of physical spaces, but also with the perception and projection of those spaces, opening with a careful consideration of the geographer Paul Ward English’s work on the spread of qanats in the old world. There is also a substantial appendix giving an English translation of a crucial eleventh-century Arabic text on extracting groundwater, and there are a good number of images and maps throughout the thesis where visual illustration is required."
history_of_science  history_of_technology  hydraulics  books:noted 
9 weeks ago
Losing Afghanistan: An Obituary for the Intervention | Noah Coburn
"The U.S.-led intervention in Afghanistan mobilized troops, funds, and people on an international level not seen since World War II. Hundreds of thousands of individuals and tens of billions of dollars flowed into the country. But what was gained for Afghanistan—or for the international community that footed the bill? Why did development money not lead to more development? Why did a military presence make things more dangerous?
"Through the stories of four individuals—an ambassador, a Navy SEAL, a young Afghan businessman, and a wind energy engineer—Noah Coburn weaves a vivid account of the challenges and contradictions of life during the intervention. Looking particularly at the communities around Bagram Airbase, this ethnography considers how Afghans viewed and attempted to use the intervention and how those at the base tried to understand the communities around them. These compelling stories step outside the tired paradigms of 'unruly' Afghan tribes, an effective Taliban resistance, and a corrupt Karzai government to show how the intervention became an entity unto itself, one doomed to collapse under the weight of its own bureaucracy and contradictory intentions."
to:NB  books:noted  afghanistan  us_military  ethnography  the_continuing_crises 
9 weeks ago
The melting-away of North Atlantic social democracy - Equitable Growth
"We as a civilization could decide that we are not willing to let money talk so loudly in politics. We could keep our politics from being one of establishing monopoly after monopoly and rent-extraction chokepoint after rent-extraction chokepoint. If we manage that, then the forecasts of Keynes (1936) and Rognlie (2015 will come true, and a rise in wealth accumulation will carry with it a fall in the rate of profit, and a highly-productive not-too-unequal society.
"But right now money talks very loudly indeed. And I leave the Piketty debate more depressed about our ability to keep it from talking so loudly. What makes me more depressed? The Piketty debate itself does: The eagerness of so-many economists to aggressively make so many shoddy arguments that Piketty does not know what he is talking about–that makes me think that Piketty does indeed know what he is talking about."
inequality  political_economy  economics  piketty.thomas  delong.brad  class_struggles_in_america 
9 weeks ago
Earth and other unlikely worlds: Strange Gifts From The Gods
"Truly advanced technologies aspire to the condition of 2001: A Space Odyssey’s black monoliths. Pursuing cryptic plans of their own, changing and manipulating us in unknown, unpredictable ways. Strange gifts of the gods, indistinguishable from magic. All we can do is hope to appease them by cargo-cult ceremonies that borrow gestures and language from science. Already, many machines in daily use are imprinted with a warning that echoes the curses sometimes set on ancient Egyptian tombs: Warranty Void If Opened."
mcauley.paul  the_singularity_has_happened  the_re-enchantment_of_the_world 
10 weeks ago
I Am The New Person You Have To Know About Now | ClickHole
This is too good a premise for a horror story not to have been used multiple times, but I cannot bring to mind any examples.
funny:because_its_true  popular_culture  epidemiology_of_representations 
10 weeks ago
[1511.02976] Dynamic fluctuations in global brain network topology characterize functional states during rest and behavior
"Higher brain function relies upon the ability to flexibly integrate information across specialized communities of macroscopic brain regions, but it is unclear how this mechanism manifests over time. Here we characterized patterns of time-resolved functional connectivity using resting state and task fMRI data from a large cohort of unrelated subjects. Our results demonstrate that dynamic fluctuations in network structure during the resting state reflect transitions between states of integrated and segregated network topology. These patterns were altered during task performance, demonstrating a higher level of network integration that tracked with the complexity of the task and correlated with effective behavioral performance. Replication analysis demonstrated that these results were reproducible across sessions, sample populations and datasets. Together these results provide insight into the brain's coordination between integration and segregation and highlight key principles underlying the reorganization of the network architecture of the brain during both rest and behavior."
to:NB  to_read  functional_connectivity  neural_data_analysis  neuroscience  network_data_analysis  re:network_differences  poldrack.russell 
10 weeks ago
Two Revolutions
"The tension is that, increasingly, people who come in to the world of social science wanting to work with data tend to have little or no prior experience with text-based, command-line, file-system-dependent tools. In many cases, they do not have much experience multi-tasking in a windowing environment, either, at least in the sense of making applications work together in the service of a single goal. To be clear, this is not something to blame users for, and neither is it something to complain about in misguided nostalgia for the command line. Rather, it is an aspect of how computer use is changing at a very large scale. The coding and data analysis tools we have are powerful and for the most part meant to allow work to be opened up and inspected. But the way they work clearly run against the grain of where everyday, end-use computing is going, which is to hide many implementation details and focus on single-purpose tasks. Again, specialized tools are necessarily specialized. The net result for the social sciences in the short to medium term, I think, is that we will have a suite of powerful tools that enable an amazing variety of scientific activity, developed in the open and mostly available for free. But it will get harder to teach people how to use them."

--- This is in complete agreement with my experiences teaching advanced undergraduate statistics, at a very tech-y school. (With the wrinkle that I have a sub-population of students who have been doing stuff at the command line since their early teens.)
statistics  teaching  statistical_computing  healy.kieran 
10 weeks ago
Homicide in Eighteenth-Century Scotland: Numbers and Theories - Edinburgh University Press
"The purpose of this article is to address the lacuna in our knowledge of the extent of interpersonal violence in eighteenth-century Scotland, with particular reference to homicide, and in doing so use these findings to examine the theoretical and empirical issues that have dominated historical discourse regarding this phenomenon over the last few decades. Essentially, it seeks to challenge widely held explanations for the alleged long-term decline in homicide, arguing that incidences of murder in the eighteenth century were affected more by political tensions and socio-economic dislocation than by cultural changes in taste and manners. It also criticises the methodological weaknesses evident in longitudinal studies of homicide and tries to resolve them in two ways: firstly, by adjusting the homicide rate to take account of the rises and falls in population in the period 1700–1799; and, secondly, by providing national data rather than relying on extrapolating national trends from local or regional studies. Finally, it is argued that the main assumptions of historians working in the field of homicide studies are in the light of evidence for Scotland in need of revision as data from there provide little support for a linear fall in the level of homicides, or a link with shifts in sentiment and/or taste as put forward by those influenced by the civilising theories of Norbert Elias."

--- Smoothing over time, with a generalized additive model (though only one predictor variable, so really a spline + a fancy link function). Perhaps usable as an example.
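
--- In that spirit, a sketch of the model for teaching purposes: Poisson regression of homicide counts on a spline in year, with log population as an offset, so what gets smoothed is effectively the per-capita rate. (`scot` is an assumed data frame with columns `year`, `homicides`, and `population`; the article's data are not reproduced here.)

```python
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

# scot: assumed DataFrame with columns year, homicides, population
fit = smf.glm("homicides ~ bs(year, df=6)",     # patsy B-spline in year
              data=scot,
              family=sm.families.Poisson(),     # counts, log link
              offset=np.log(scot["population"])).fit()
rate = fit.fittedvalues / scot["population"]    # smoothed per-capita rate
```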
to:NB  to_read  violence  statistics  early_modern_european_history  the_civilizing_process  scotland  to_teach:undergrad-ADA 
10 weeks ago
It’s not Cyberspace anymore. — Data & Society: Points — Medium
Oh my, yes. (I was just enough older than boyd to think Barlow was being a bit silly, but that clearly we were on the same side.) The whole thing is worth reading, but this bit really resonated:

"We built the Internet hoping that the world would come. The world did, but the dream that drove so many of us in the early days isn’t the dream of those who are shaping the Internet today. Now what?"
networked_life  the_wired_ideology  to:blog  boyd.danah 
11 weeks ago
Weighted Sums of Random Kitchen Sinks: Replacing minimization with randomization in learning
"Randomized neural networks are immortalized in this AI Koan: _In the days when Sussman was a novice, Minsky once came to him as he sat hacking at the PDP-6. What are you doing?'' asked Minsky. I am training a randomly wired neural net to play tic-tac-toe,'' Sussman replied. Why is the net wired randomly?'' asked Minsky. Sussman replied, I do not want it to have any preconceptions of how to play.'' Minsky then shut his eyes. Why do you close your eyes?'' Sussman asked his teacher. So that the room will be empty,'' replied Minsky. At that moment, Sussman was enlightened._ We analyze shallow random networks with the help of concentration of measure inequalities. Specifically, we consider architectures that compute a weighted sum of their inputs after passing them through a bank of arbitrary randomized nonlinearities. We identify conditions under which these networks exhibit good classification performance, and bound their test error in terms of the size of the dataset and the number of random nonlinearities."
to:NB  have_read  random_projections  kernel_methods  prediction  computational_statistics  statistics  classifiers 
11 weeks ago
Fastfood - Computing Hilbert Space Expansions in loglinear time | ICML 2013 | JMLR W&CP
"Fast nonlinear function classes are crucial for nonparametric estimation, such as in kernel methods. This paper proposes an improvement to random kitchen sinks that offers significantly faster computation in log-linear time without sacrificing accuracy. Furthermore, we show how one may adjust the regularization properties of the kernel simply by changing the spectral distribution of the projection matrix. We provide experimental results which show that even for for moderately small problems we already achieve two orders of magnitude faster computation and three orders of magnitude lower memory footprint."
to:NB  regression  nonparametrics  computational_statistics  hilbert_space  kernel_methods  smola.alex  le.quoc  prediction  statistics  random_projections 
11 weeks ago
AWS Service Terms
"57.10 Acceptable Use; Safety-Critical Systems. Your use of the Lumberyard Materials must comply with the AWS Acceptable Use Policy. The Lumberyard Materials are not intended for use with life-critical or safety-critical systems, such as use in operation of medical equipment, automated transportation systems, autonomous vehicles, aircraft or air traffic control, nuclear facilities, manned spacecraft, or military use in connection with live combat. However, this restriction will not apply in the event of the occurrence (certified by the United States Centers for Disease Control or successor body) of a widespread viral infection transmitted via bites or contact with bodily fluids that causes human corpses to reanimate and seek to consume living human flesh, blood, brain or nerve tissue and is likely to result in the fall of organized civilization."
funny:geeky  well_slightly_funny  zombies  amazon 
11 weeks ago
The Koch Effect: The Impact of a Cadre-Led Network on American Politics
"Presidential election years attract attention to the rhetoric, personalities, and agendas of contending White House aspirants, but these headlines do not reflect the ongoing political shifts that will confront whoever moves into the White House in 2017. Earthquakes and erosions have remade the U.S. political terrain, reconfiguring the ground on which politicians and social groups must maneuver, and it is important to make sure that narrow and short-term analyses do not blind us to this shifting terrain. In this paper, we draw from research on changes since 2000 in the organizational universes surrounding the Republican and Democratic parties to highlight a major emergent force in U.S. politics: the recently expanded “Koch network” that coordinates big money funders, idea producers, issue advocates, and innovative constituency-building efforts in an ongoing effort to pull the Republican Party and agendas of U.S. politics sharply to the right. We review the major components and evolution of the Koch network and explore how it has reshaped American politics and policy agendas, focusing especially on implications for right-tilted partisan polarization and rising economic inequality."
to:NB  to_read  political_science  us_politics  inequality  political_networks  vast_right-wing_conspiracy  skocpol.theda  class_struggles_in_america 
11 weeks ago
[1601.00934] Confidence Intervals for Projections of Partially Identified Parameters
"This paper proposes a bootstrap-based procedure to build confidence intervals for single components of a partially identified parameter vector, and for smooth functions of such components, in moment (in)equality models. The extreme points of our confidence interval are obtained by maximizing/minimizing the value of the component (or function) of interest subject to the sample analog of the moment (in)equality conditions properly relaxed. The novelty is that the amount of relaxation, or critical level, is computed so that the component of θ, instead of θ itself, is uniformly asymptotically covered with prespecified probability. Calibration of the critical level is based on repeatedly checking feasibility of linear programming problems, rendering it computationally attractive. Computation of the extreme points of the confidence interval is based on a novel application of the response surface method for global optimization, which may prove of independent interest also for applications of other methods of inference in the moment (in)equalities literature. The critical level is by construction smaller (in finite sample) than the one used if projecting confidence regions designed to cover the entire parameter vector θ. Hence, our confidence interval is weakly shorter than the projection of established confidence sets (Andrews and Soares, 2010), if one holds the choice of tuning parameters constant. We provide simple conditions under which the comparison is strict. Our inference method controls asymptotic coverage uniformly over a large class of data generating processes. Our assumptions and those used in the leading alternative approach (a profiling based method) are not nested. We explain why we employ some restrictions that are not required by other methods and provide examples of models for which our method is uniformly valid but profiling based methods are not."
to:NB  statistics  confidence_sets  partial_identification  bootstrap 
11 weeks ago
[1601.07460] Information-theoretic lower bounds on learning the structure of Bayesian networks
"In this paper, we study the information theoretic limits of learning the structure of Bayesian networks from data. We show that for Bayesian networks on continuous as well as discrete random variables, there exists a parameterization of the Bayesian network such that, the minimum number of samples required to learn the "true" Bayesian network grows as (m), where m is the number of variables in the network. Further, for sparse Bayesian networks, where the number of parents of any variable in the network is restricted to be at most l for l≪m, the minimum number of samples required grows as (llogm). We discuss conditions under which these limits are achieved. For Bayesian networks over continuous variables, we obtain results for Gaussian regression and Gumbel Bayesian networks. While for the discrete variables, we obtain results for Noisy-OR, Conditional Probability Table (CPT) based Bayesian networks and Logistic regression networks. Finally, as a byproduct, we also obtain lower bounds on the sample complexity of feature selection in logistic regression and show that the bounds are sharp."
to:NB  graphical_models  model_discovery  statistics  information_theory 
11 weeks ago