[1710.05468] Generalization in Deep Learning
"This paper explains why deep learning can generalize well, despite large capacity and possible algorithmic instability, nonrobustness, and sharp minima, effectively addressing an open problem in the literature. Based on our theoretical insight, this paper also proposes a family of new regularization methods. Its simplest member was empirically shown to improve base models and achieve state-of-the-art performance on MNIST and CIFAR-10 benchmarks. Moreover, this paper presents both data-dependent and data-independent generalization guarantees with improved convergence rates. Our results suggest several new open areas of research."
to:NB  learning_theory  to_read 
yesterday
A Pottery Barn rule for scientific journals – The Hardest Science
"Proposed: Once a journal has published a study, it becomes responsible for publishing direct replications of that study. Publication is subject to editorial review of technical merit but is not dependent on outcome. Replications shall be published as brief reports in an online supplement, linked from the electronic version of the original."

--- I like this proposal very much.
why_oh_why_cant_we_have_a_better_academic_publishing_system  science_as_a_social_process  have_read 
yesterday
How Much Should We Trust Ideal Point Estimates from Surveys?
Ans.: Not at all.
Commentary, in no particular order:
(1) Justin Gross, in his 2010 Ph.D. thesis, looked at the stability of NOMINATE scores in Senate roll-call voting, across various sub-sets of votes. His main interest there was in getting at uncertainty in statements like "the 32nd most conservative senator", but some of his findings would also apply to CV for that context. (I don't know if Justin ever published that separately.)
(2) I suspect there is something funny about the way they are doing CV, because when they simulate from 1D ideal-point models, it doesn't favor 1 dimension! I think it might be better to make the validation set be distributed across both respondents and items (as in Dabbs & Junker on network CV, https://arxiv.org/abs/1605.03000), but I am not sure. (There is a quick sketch of what I mean below.)
(3) Regressing changes in average likelihood to see which sub-groups benefit most from an ideal point model is just weird. At the very least, I'd use log-likelihood, and there should be some way of getting at this by estimating a joint model, with both covariates and latent ideal points.
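To make the suggestion in (2) concrete, here is a minimal sketch (my own illustration, nothing from the paper) of a cell-level split: hold out individual (respondent, item) entries of the response matrix, scattered over both rows and columns, fit the ideal-point model on the remaining cells, and score the held-out cells by their log-likelihood, comparing across numbers of dimensions.

import numpy as np

def cellwise_folds(Y, n_folds=5, seed=0):
    """Assign each observed (respondent, item) cell of a response matrix Y
    (np.nan marking missing entries) to one of n_folds folds, so that each
    fold's held-out cells come from (nearly) every respondent and are
    scattered over the items."""
    rng = np.random.default_rng(seed)
    obs = np.argwhere(~np.isnan(Y))              # indices of observed cells
    folds = np.empty(len(obs), dtype=int)
    for i in np.unique(obs[:, 0]):               # stratify by respondent
        idx = rng.permutation(np.flatnonzero(obs[:, 0] == i))
        folds[idx] = np.arange(len(idx)) % n_folds
    return obs, folds

# usage: for each fold k, fit the model on obs[folds != k], then compute the
# held-out log-likelihood on obs[folds == k]; do this for 1, 2, ... dimensions.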
to:NB  public_opinion  inference_to_latent_objects  statistics  have_read  via:henry_farrell  cross-validation 
7 days ago
When and how to satisfice: an experimental investigation | SpringerLink
"This paper is about satisficing behaviour. Rather tautologically, this is when decision-makers are satisfied with achieving some objective, rather than in obtaining the best outcome. The term was coined by Simon (Q J Econ 69:99–118, 1955), and has stimulated many discussions and theories. Prominent amongst these theories are models of incomplete preferences, models of behaviour under ambiguity, theories of rational inattention, and search theories. Most of these, however, seem to lack an answer to at least one of two key questions: when should the decision-maker (DM) satisfice; and how should the DM satisfice. In a sense, search models answer the latter question (in that the theory tells the DM when to stop searching), but not the former; moreover, usually the question as to whether any search at all is justified is left to a footnote. A recent paper by Manski (Theory Decis. doi:10.1007/s11238-017-9592-1, 2017) fills the gaps in the literature and answers the questions: when and how to satisfice? He achieves this by setting the decision problem in an ambiguous situation (so that probabilities do not exist, and many preference functionals can therefore not be applied) and by using the Minimax Regret criterion as the preference functional. The results are simple and intuitive. This paper reports on an experimental test of his theory. The results show that some of his propositions (those relating to the ‘how’) appear to be empirically valid while others (those relating to the ‘when’) are less so."

--- I am continually impressed by economists' resistance to Simon's (very simple) points about computational complexity. (But: two cheers for actually doing something empirical.)
to:NB  decision-making  decision_theory  bounded_rationality 
8 days ago
Block Bootstrap for the Empirical Process of Long-Range Dependent Data - Tewes - 2017 - Journal of Time Series Analysis - Wiley Online Library
"We consider the bootstrapped empirical process of long-range dependent data. It is shown that this process converges to a semi-degenerate limit, where the random part of this limit is always Gaussian. Thus the bootstrap might fail when the original empirical process accomplishes a noncentral limit theorem. However, even in this case our results can be used to estimate a nuisance parameter that appears in the limit of many nonparametric tests under long memory. Moreover, we develop a new resampling procedure for goodness-of-fit tests and a test for monotonicity of transformations."
to:NB  stochastic_processes  time_series  statistics  bootstrap  empirical_processes  long-range_dependence 
8 days ago
Automated conjecturing III | SpringerLink
"Discovery in mathematics is a prototypical intelligent behavior, and an early and continuing goal of artificial intelligence research. We present a heuristic for producing mathematical conjectures of a certain typical form and demonstrate its utility. Our program conjectures relations that hold between properties of objects (property-relation conjectures). These objects can be of a wide variety of types. The statements are true for all objects known to the program, and are the simplest statements which are true of all these objects. The examples here include new conjectures for the hamiltonicity of a graph, a well-studied property of graphs. While our motivation and experiments have been to produce mathematical conjectures—and to contribute to mathematical research—other kinds of interesting property-relation conjectures can be imagined, and this research may be more generally applicable to the development of intelligent machinery."
to:NB  heuristics  artificial_intelligence  mathematics 
9 days ago
Live Work Work Work Die | Corey Pein | Macmillan
"At the height of the startup boom, journalist Corey Pein set out for Silicon Valley with little more than a smartphone and his wits. His goal: to learn how such an overhyped industry could possibly sustain itself as long as it has. Determined to cut through the clichés of big tech—the relentless optimism, the mandatory enthusiasm, and the earnest, incessant repetition of vacuous buzzwords—Pein decided that he would need to take an approach as unorthodox as the companies he would soon be covering. To truly understand the delirious reality of a Silicon Valley entrepreneur, he knew, he would have to inhabit that perspective—he would have to become an entrepreneur. Thus he begins his journey—skulking through gimmicky tech conferences, pitching his over-the-top business ideas to investors, and interviewing a cast of outrageous characters: cyborgs and con artists, Teamsters and transhumanists, jittery hackers and naive upstart programmers whose entire lives are managed by their employers—who work endlessly and obediently, never thinking to question their place in the system.
"In showing us this frantic world, Pein challenges the positive, feel-good self-image that the tech tycoons have crafted—as nerdy and benevolent creators of wealth and opportunity—revealing their self-justifying views and their insidious visions for the future. Indeed, as Pein shows, Silicon Valley is awash in disreputable ideas: Google executive and futurist Raymond Kurzweil has a side business peddling dietary supplements and has for years pushed the outlandish notion that human beings are destined to merge with computers and live forever in some kind of digital cosmic hive mind. Peter Thiel, the billionaire venture capitalist affiliated with PayPal and Facebook, is now an important advisor to President Trump and has subsidized a prolific blogger known by the pen name Mencius Moldbug who writes approvingly of ideas like eugenics and dictatorship. And Moldbug is not alone. There is, in fact, a small but influential—and growing—group of techies with similarly absurd and extremist beliefs who call themselves the “neoreactionary” vanguard of a “Dark Enlightenment.”
"Vivid and incisive, Live Work Work Work Die is a troubling portrait of a self-obsessed industry bent on imposing its disturbing visions on the rest of us."

--- By the author of the instant-classic "Mouthbreathing Machiavellis Dream of a Silicon Reich" (https://thebaffler.com/latest/mouthbreathing-machiavellis).
to:NB  books:noted  coveted  the_wired_ideology  nerdworld  running_dogs_of_reaction 
10 days ago
Envisioning the Data Science Discipline: The Undergraduate Perspective: Interim Report | The National Academies Press
"The need to manage, analyze, and extract knowledge from data is pervasive across industry, government, and academia. Scientists, engineers, and executives routinely encounter enormous volumes of data, and new techniques and tools are emerging to create knowledge out of these data, some of them capable of working with real-time streams of data. The nation’s ability to make use of these data depends on the availability of an educated workforce with necessary expertise. With these new capabilities have come novel ethical challenges regarding the effectiveness and appropriateness of broad applications of data analyses.
"The field of data science has emerged to address the proliferation of data and the need to manage and understand it. Data science is a hybrid of multiple disciplines and skill sets, draws on diverse fields (including computer science, statistics, and mathematics), encompasses topics in ethics and privacy, and depends on specifics of the domains to which it is applied. Fueled by the explosion of data, jobs that involve data science have proliferated and an array of data science programs at the undergraduate and graduate levels have been established. Nevertheless, data science is still in its infancy, which suggests the importance of envisioning what the field might look like in the future and what key steps can be taken now to move data science education in that direction.
"This study will set forth a vision for the emerging discipline of data science at the undergraduate level. This interim report lays out some of the information and comments that the committee has gathered and heard during the first half of its study, offers perspectives on the current state of data science education, and poses some questions that may shape the way data science education evolves in the future. The study will conclude in early 2018 with a final report that lays out a vision for future data science education."
to:NB  to_read  books:noted  statistics  education  to_be_shot_after_a_fair_trial 
10 days ago
Assessing benefits, costs, and disparate racial impacts of confrontational proactive policing
"Effective policing in a democratic society must balance the sometime conflicting objectives of public safety and community trust. This paper uses a formal model of optimal policing to explore how society might reasonably resolve the tension between these two objectives as well as evaluate disparate racial impacts. We do so by considering the social benefits and costs of confrontational types of proactive policing, such as stop, question, and frisk. Three features of the optimum that are particularly relevant to policy choices are explored: (i) the cost of enforcement against the innocent, (ii) the baseline level of crime rate without confrontational enforcement, and (iii) differences across demographic groups in the optimal rate of enforcement."
to:NB  police  political_economy  to_be_shot_after_a_fair_trial  to_read 
13 days ago
Trumpism and American Democracy: History, Comparison, and the Predicament of Liberal Democracy in the United States by Robert C. Lieberman, Suzanne Mettler, Thomas B. Pepinsky, Kenneth M. Roberts, Richard Valelly :: SSRN
"In the eyes of many citizens, activists, pundits, and scholars, American democracy appears under threat. Concern about President Trump and the future of American politics may be found among both conservatives and progressives; among voters, activists, and elites; and among many scholars and analysts of American and comparative politics. What is the nature of the Trumpism as a political phenomenon? And how much confidence should we have at present in the capacity of American institutions to withstand this threat?
"In this essay, we argue that answering these questions and understanding what is uniquely threatening to democracy at the present moment requires looking beyond the contemporary particulars of Donald Trump and his presidency. Instead, it demands a historical and comparative perspective on American politics. Drawing on a range of insights from the fields of comparative politics and American political development, we argue that President Trump’s election in 2016 represents the intersection of three streams in American politics: polarized two-party presidentialism; a polity fundamentally divided over membership and status in the political community, in ways structured by race and economic inequality; and the erosion of democratic norms at the elite and mass levels. The current political circumstance is an existential threat to American democratic order because of the interactive effects of institutions, identity, and norm-breaking in American politics."
us_politics  democracy  institutions  political_science  via:?  trump.donald 
13 days ago
[1607.00653] node2vec: Scalable Feature Learning for Networks
"Prediction tasks over nodes and edges in networks require careful effort in engineering features used by learning algorithms. Recent research in the broader field of representation learning has led to significant progress in automating prediction by learning the features themselves. However, present feature learning approaches are not expressive enough to capture the diversity of connectivity patterns observed in networks. Here we propose node2vec, an algorithmic framework for learning continuous feature representations for nodes in networks. In node2vec, we learn a mapping of nodes to a low-dimensional space of features that maximizes the likelihood of preserving network neighborhoods of nodes. We define a flexible notion of a node's network neighborhood and design a biased random walk procedure, which efficiently explores diverse neighborhoods. Our algorithm generalizes prior work which is based on rigid notions of network neighborhoods, and we argue that the added flexibility in exploring neighborhoods is the key to learning richer representations. We demonstrate the efficacy of node2vec over existing state-of-the-art techniques on multi-label classification and link prediction in several real-world networks from diverse domains. Taken together, our work represents a new way for efficiently learning state-of-the-art task-independent representations in complex networks."
to:NB  to_read  network_data_analysis  leskovec.jure 
13 days ago
Large deviation principle for epidemic models | Journal of Applied Probability | Cambridge Core
"We consider a general class of epidemic models obtained by applying the random time changes of Ethier and Kurtz (2005) to a collection of Poisson processes and we show the large deviation principle for such models. We generalise the approach followed by Dolgoarshinnykh (2009) in the case of the SIR epidemic model. Thanks to an additional assumption which is satisfied in many examples, we simplify the recent work of Kratz and Pardoux (2017)."
to:NB  large_deviations  epidemic_models  stochastic_processes  to_read  re:almost_none  re:do-institutions-evolve 
13 days ago
[1609.04212] Formalizing Neurath's Ship: Approximate Algorithms for Online Causal Learning
"Higher-level cognition depends on the ability to learn models of the world. We can characterize this at the computational level as a structure-learning problem with the goal of best identifying the prevailing causal relationships among a set of relata. However, the computational cost of performing exact Bayesian inference over causal models grows rapidly as the number of relata increases. This implies that the cognitive processes underlying causal learning must be substantially approximate. A powerful class of approximations that focuses on the sequential absorption of successive inputs is captured by the Neurath's ship metaphor in philosophy of science, where theory change is cast as a stochastic and gradual process shaped as much by people's limited willingness to abandon their current theory when considering alternatives as by the ground truth they hope to approach. Inspired by this metaphor and by algorithms for approximating Bayesian inference in machine learning, we propose an algorithmic-level model of causal structure learning under which learners represent only a single global hypothesis that they update locally as they gather evidence. We propose a related scheme for understanding how, under these limitations, learners choose informative interventions that manipulate the causal system to help elucidate its workings. We find support for our approach in the analysis of four experiments."

--- This sounds interesting, but does nothing to alleviate my usual issue with Griffiths's stuff, which is that I don't see what _Bayesianism_ adds, that you couldn't just get from some sort of evolutionary optimization (cf. http://bactra.org/weblog/601.html and especially http://bactra.org/weblog/796.html) --- but then we're back in the world of Holland et al.'s _Induction_ (not that that's bad: http://bactra.org/reviews/hhnt-induction/).
to:NB  to_read  causal_inference  via:vaguery  to_be_shot_after_a_fair_trial 
14 days ago
Biological Clocks, Rhythms, and Oscillations: The Theory of Biological Timekeeping | The MIT Press
"All areas of biology and medicine contain rhythms, and these behaviors are best understood through mathematical tools and techniques. This book offers a survey of mathematical, computational, and analytical techniques used for modeling biological rhythms, gathering these methods for the first time in one volume. Drawing on material from such disciplines as mathematical biology, nonlinear dynamics, physics, statistics, and engineering, it presents practical advice and techniques for studying biological rhythms, with a common language.
"The chapters proceed with increasing mathematical abstraction. Part I, on models, highlights the implicit assumptions and common pitfalls of modeling, and is accessible to readers with basic knowledge of differential equations and linear algebra. Part II, on behaviors, focuses on simpler models, describing common properties of biological rhythms that range from the firing properties of squid giant axon to human circadian rhythms. Part III, on mathematical techniques, guides readers who have specific models or goals in mind. Sections on “frontiers” present the latest research; “theory” sections present interesting mathematical results using more accessible approaches than can be found elsewhere. Each chapter offers exercises. Commented MATLAB code is provided to help readers get practical experience.
"The book, by an expert in the field, can be used as a textbook for undergraduate courses in mathematical biology or graduate courses in modeling biological rhythms and as a reference for researchers."
to:NB  books:noted  mathematics  biology 
14 days ago
Perturbations, Optimization, and Statistics | The MIT Press
"In nearly all machine learning, decisions must be made given current knowledge. Surprisingly, making what is believed to be the best decision is not always the best strategy, even when learning in a supervised learning setting. An emerging body of work on learning under different rules applies perturbations to decision and learning procedures. These methods provide simple and highly efficient learning rules with improved theoretical guarantees. This book describes perturbation-based methods developed in machine learning to augment novel optimization methods with strong statistical guarantees, offering readers a state-of-the-art overview.
"Chapters address recent modeling ideas that have arisen within the perturbations framework, including Perturb & MAP, herding, and the use of neural networks to map generic noise to distribution over highly structured data. They describe new learning procedures for perturbation models, including an improved EM algorithm and a learning algorithm that aims to match moments of model samples to moments of data. They discuss understanding the relation of perturbation models to their traditional counterparts, with one chapter showing that the perturbations viewpoint can lead to new algorithms in the traditional setting. And they consider perturbation-based regularization in neural networks, offering a more complete understanding of dropout and studying perturbations in the context of deep neural networks."
to:NB  books:noted  computational_statistics  optimization  stochastic_approximation  learning_theory  statistics 
14 days ago
Coding Literacy | The MIT Press
"The message from educators, the tech community, and even politicians is clear: everyone should learn to code. To emphasize the universality and importance of computer programming, promoters of coding for everyone often invoke the concept of “literacy,” drawing parallels between reading and writing code and reading and writing text. In this book, Annette Vee examines the coding-as-literacy analogy and argues that it can be an apt rhetorical frame. The theoretical tools of literacy help us understand programming beyond a technical level, and in its historical, social, and conceptual contexts. Viewing programming from the perspective of literacy and literacy from the perspective of programming, she argues, shifts our understandings of both. Computer programming becomes part of an array of communication skills important in everyday life, and literacy, augmented by programming, becomes more capacious.
"Vee examines the ways that programming is linked with literacy in coding literacy campaigns, considering the ideologies that accompany this coupling, and she looks at how both writing and programming encode and distribute information. She explores historical parallels between writing and programming, using the evolution of mass textual literacy to shed light on the trajectory of code from military and government infrastructure to large-scale businesses to personal use. Writing and coding were institutionalized, domesticated, and then established as a basis for literacy. Just as societies demonstrated a “literate mentality” regardless of the literate status of individuals, Vee argues, a “computational mentality” is now emerging even though coding is still a specialized skill."
to:NB  books:noted  literacy  programming  to_be_shot_after_a_fair_trial 
15 days ago
Invisible Mind | The MIT Press
"In Invisible Mind, Lasana Harris takes a social neuroscience approach to explaining the worst of human behavior. How can a person take part in racially motivated violence and then tenderly cradle a baby or lovingly pet a puppy? Harris argues that our social cognition—the ability to infer the mental states of another agent—is flexible. That is, we can either engage or withhold social cognition. If we withhold social cognition, we dehumanize the other person. Integrating theory from a range of disciplines—social, developmental, and cognitive psychology, evolutionary anthropology, philosophy, economics, and law—with neuroscience data, Harris explores how and why we engage or withhold social cognition. He examines research in these different disciplines and describes biological processes that underlie flexible social cognition, including brain, genetic, hormonal, and physiological mechanisms.
"After laying out the philosophical and theoretical terrain, Harris explores examples of social cognitive ability in nonhumans and explains the evolutionary staying power of this trait. He addresses two motives for social cognition—prediction and explanation—and reviews cases of anthropomorphism (extending social cognition to entities without mental states) and dehumanization (withholding it from people with mental states). He discusses the relation of social cognition to the human/nonhuman distinction and to the evolution of sociality. He considers the importance of social context and, finally, he speculates about the implications of flexible social cognition in such arenas for human interaction as athletic competition and international disputes."
to:NB  books:noted  social_psychology  moral_psychology 
15 days ago
[1406.0423] Targeted Maximum Likelihood Estimation using Exponential Families
"Targeted maximum likelihood estimation (TMLE) is a general method for estimating parameters in semiparametric and nonparametric models. Each iteration of TMLE involves fitting a parametric submodel that targets the parameter of interest. We investigate the use of exponential families to define the parametric submodel. This implementation of TMLE gives a general approach for estimating any smooth parameter in the nonparametric model. A computational advantage of this approach is that each iteration of TMLE involves estimation of a parameter in an exponential family, which is a convex optimization problem for which software implementing reliable and computationally efficient methods exists. We illustrate the method in three estimation problems, involving the mean of an outcome missing at random, the parameter of a median regression model, and the causal effect of a continuous exposure, respectively. We conduct a simulation study comparing different choices for the parametric submodel, focusing on the first of these problems. To the best of our knowledge, this is the first study investigating robustness of TMLE to different specifications of the parametric submodel. We find that the choice of submodel can have an important impact on the behavior of the estimator in finite samples."
to:NB  statistics  estimation  nonparametrics  causal_inference  exponential_families 
16 days ago
Is Capitalism Obsolete? — Giacomo Corneo | Harvard University Press
"After communism collapsed in the former Soviet Union, capitalism seemed to many observers like the only game in town, and questioning it became taboo for academic economists. But the financial crisis, chronic unemployment, and the inexorable rise of inequality have resurrected the question of whether there is a feasible and desirable alternative to capitalism. Against this backdrop of growing disenchantment, Giacomo Corneo presents a refreshingly antidogmatic review of economic systems, taking as his launching point a fictional argument between a daughter indignant about economic injustice and her father, a professor of economics.
"Is Capitalism Obsolete? begins when the daughter’s angry complaints prompt her father to reply that capitalism cannot responsibly be abolished without an alternative in mind. He invites her on a tour of tried and proposed economic systems in which production and consumption obey noncapitalistic rules. These range from Plato’s Republic to diverse modern models, including anarchic communism, central planning, and a stakeholder society. Some of these alternatives have considerable strengths. But daunting problems arise when the basic institutions of capitalism—markets and private property—are suppressed. Ultimately, the father argues, all serious counterproposals to capitalism fail to pass the test of economic feasibility. Then the story takes an unexpected turn. Father and daughter jointly come up with a proposal to gradually transform the current economic system so as to share prosperity and foster democratic participation."
to:NB  books:noted  progressive_forces  capitalism  socialism  economics 
17 days ago
Rhododendron, Milne
"Has ever a plant inspired such love and such hatred as the rhododendron? Its beauty is inarguable; it can clothe whole hillsides and gardens with a blanket of vibrant color. The rhododendron has a propensity towards sexual infidelity, making it very popular with horticultural breeding programs. And it can also be used as an herbal remedy for an astonishing range of ailments.
"But there is a darker side to these gorgeous flowers. Daphne du Maurier used the red rhododendron as a symbol of blood in her best-selling novel Rebecca, and numerous Chinese folktales link the plant with tragedy and death. It can poison livestock and intoxicate humans, and its narcotic honey has been used as a weapon of war. Rhododendron ponticum has run riot across the British countryside, but the full story of this implacable invader contains many fascinating surprises.
"In this beautifully illustrated volume, Richard Milne explores the many ways in which the rhododendron has influenced human societies, relating this to the extraordinary story of the plant’s evolution. Over one thousand species of the plant exist, ranging from rugged trees on Himalayan slopes to rock-hugging alpines, and delicate plants perched on rainforest branches. Milne relays tales of mythical figures, intrepid collectors, and eccentric plant breeders. However much you may think you know about the rhododendron, this charming book will offer something new."
books:noted  plants 
17 days ago
A History of the Silk Road, Clements
"The Silk Road is not a place, but a journey, a route from the edges of the Mediterranean to the central plains of China, through high mountains and inhospitable deserts. For thousands of years its history has been a traveler’s history, of brief encounters in desert towns, snowbound passes and nameless forts. It was the conduit that first brought Buddhism, Christianity and Islam into China, and the site of much of the “Great Game” between 19th-century empires. Today, its central section encompasses several former Soviet republics, and the Chinese Autonomous Region of Xinjiang. The ancient trade route controversially crosses the sites of several forgotten kingdoms, buried in sand and only now revealing their secrets.
"A History of the Silk Road not only offers the reader a chronological outline of the region’s development, but also provides an invaluable introduction to its languages, literature, and arts. It takes a comprehensive and illuminating look at the rich history of this dynamic and little known region, and provides an easy-to-use reference source. Jonathan Clements pays particular attention to the fascinating historical sites which feature on any visitor’s itinerary and also gives special emphasis to the writings and reactions of travelers through the centuries."
to:NB  books:noted  ancient_history  eurasian_history  silk_road 
17 days ago
Technosystem — Andrew Feenberg | Harvard University Press
"We live in a world of technical systems designed in accordance with technical disciplines and operated by personnel trained in those disciplines. This is a unique form of social organization that largely determines our way of life, but the actions of individuals and social protest still play a role in developing and purposing these rational systems. In Technosystem, Andrew Feenberg builds a theory of both the threats of technocratic modernity and the potential for democratic change.
"Feenberg draws on the tradition of radical social criticism represented by Herbert Marcuse and the Frankfurt School, which recognized the social effects of instrumental rationality but did not advance a convincing alternative to the new forms of domination imposed by rational systems. That is where the fine-grained analyses of Science, Technology, and Society (STS) studies can contribute. Feenberg uses these approaches to reconcile the claims of rationality with the agency of a public increasingly mobilized to intervene in technically based decisions. The resulting social theory recognizes emerging forms of resistance, such as protests and hacking, as essential expressions of public life in the “rational society.”
"Combining the most salient insights from critical theory with the empirical findings of STS, Technosystem advances the philosophical debate over the nature and practice of reason in modern society."
to:NB  books:noted  philosophy  critical_theory  technology 
17 days ago
Test Score Measurement and the Black-White Test Score Gap | The Review of Economics and Statistics | MIT Press Journals
"Research as to the size of the black-white test score gap often comes to contradictory conclusions. Recent literature has affirmed that the source of these contradictions and other controversies in education economics may be due to the fact that test scores contain only ordinal information. In this paper, I propose a normalization of test scores that is invariant to monotonic transformations. Under fairly weak assumptions, this metric has interval properties and thus solves the ordinality problem. The measure can serve as a valuable robustness check to ensure that any results are not simply statistical artifacts from the choice of scale."
mental_testing  standardized_testing  re:g_paper  to:NB 
17 days ago
Optimistic realism about scientific progress | SpringerLink
"Scientific realists use the “no miracle argument” to show that the empirical and pragmatic success of science is an indicator of the ability of scientific theories to give true or truthlike representations of unobservable reality. While antirealists define scientific progress in terms of empirical success or practical problem-solving, realists characterize progress by using some truth-related criteria. This paper defends the definition of scientific progress as increasing truthlikeness or verisimilitude. Antirealists have tried to rebut realism with the “pessimistic metainduction”, but critical realists turn this argument into an optimistic view about progressive science."
to:NB  philosophy_of_science  epistemology 
22 days ago
On the Frequentist Properties of Bayesian Nonparametric Methods | Annual Review of Statistics and Its Application
"In this paper, I review the main results on the asymptotic properties of the posterior distribution in nonparametric or high-dimensional models. In particular, I explain how posterior concentration rates can be derived and what we learn from such analysis in terms of the impact of the prior distribution on high-dimensional models. These results concern fully Bayes and empirical Bayes procedures. I also describe some of the results that have been obtained recently in semiparametric models, focusing mainly on the Bernstein–von Mises property. Although these results are theoretical in nature, they shed light on some subtle behaviors of the prior models and sharpen our understanding of the family of functionals that can be well estimated for a given prior model."
to:NB  bayesian_consistency  bayesianism  statistics  re:bayes_as_evol 
25 days ago
Structure Learning in Graphical Modeling | Annual Review of Statistics and Its Application
"A graphical model is a statistical model that is associated with a graph whose nodes correspond to variables of interest. The edges of the graph reflect allowed conditional dependencies among the variables. Graphical models have computationally convenient factorization properties and have long been a valuable tool for tractable modeling of multivariate distributions. More recently, applications such as reconstructing gene regulatory networks from gene expression data have driven major advances in structure learning, that is, estimating the graph underlying a model. We review some of these advances and discuss methods such as the graphical lasso and neighborhood selection for undirected graphical models (or Markov random fields) and the PC algorithm and score-based search methods for directed graphical models (or Bayesian networks). We further review extensions that account for effects of latent variables and heterogeneous data sources."
to:NB  graphical_models  causal_discovery  statistics  maathuis.marloes 
25 days ago
Modeling Through Latent Variables | Annual Review of Statistics and Its Application
"In this review, we give a general overview of latent variable models. We introduce the general model and discuss various inferential approaches. Afterward, we present several commonly applied special cases, including mixture or latent class models, as well as mixed models. We apply many of these models to a single data set with simple structure, allowing for easy comparison of the results. This allows us to discuss advantages and disadvantages of the various approaches, but also to illustrate several problems inherently linked to models incorporating latent structures. Finally, we touch on model extensions and applications and highlight several issues often ignored when applying latent variable models."
to:NB  statistics  inference_to_latent_objects  re:g_paper 
25 days ago
Is Most Published Research Really False? | Annual Review of Statistics and Its Application
"There has been an increasing concern in both the scientific and lay communities that most published medical findings are false. But what does it mean to be false? Here we describe the range of definitions of false discoveries in the scientific literature. We summarize the philosophical, statistical, and experimental evidence for each type of false discovery. We discuss common underpinning problems with the scientific and data analytic practices and point to tools and behaviors that can be implemented to reduce the problems with published scientific results."
to:NB  re:neutral_model_of_inquiry  statistics  meta-analysis  science_as_a_social_process 
25 days ago
p-Values: The Insight to Modern Statistical Inference | Annual Review of Statistics and Its Application
"I introduce a p-value function that derives from the continuity inherent in a wide range of regular statistical models. This provides confidence bounds and confidence sets, tests, and estimates that all reflect model continuity. The development starts with the scalar-variable scalar-parameter exponential model and extends to the vector-parameter model with scalar interest parameter, then to general regular models, and then references for testing vector interest parameters are available. The procedure does not use sufficiency but applies directly to general models, although it reproduces sufficiency-based results when sufficiency is present. The emphasis is on the coherence of the full procedure, and technical details are not emphasized."
to:NB  p-values  hypothesis_testing  confidence_sets  statistics  fraser.d.a.s. 
25 days ago
Risk and Uncertainty Communication | Annual Review of Statistics and Its Application
"This review briefly examines the vast range of techniques used to communicate risk assessments arising from statistical analysis. After discussing essential psychological and sociological issues, I focus on individual health risks and relevant research on communicating numbers, verbal expressions, graphics, and conveying deeper uncertainty. I then consider practice in a selection of diverse case studies, including gambling, the benefits and risks of pharmaceuticals, weather forecasting, natural hazards, climate change, environmental exposures, security and intelligence, industrial reliability, and catastrophic national and global risks. There are some tentative final conclusions, but the primary message is to acknowledge expert guidance, be clear about objectives, and work closely with intended audiences."
to:NB  risk_vs_uncertainty  risk_assessment  statistics 
25 days ago
An Introduction to Transfer Entropy | SpringerLink
"This book considers a relatively new metric in complex systems, transfer entropy, derived from a series of measurements, usually a time series. After a qualitative introduction and a chapter that explains the key ideas from statistics required to understand the text, the authors then present information theory and transfer entropy in depth. A key feature of the approach is the authors' work to show the relationship between information flow and complexity. The later chapters demonstrate information transfer in canonical systems, and applications, for example in neuroscience and in finance."
to:NB  books:noted  information_theory 
29 days ago
Designing Beauty: The Art of Cellular Automata | Andrew Adamatzky | Springer
"This fascinating, colourful  book offers in-depth insights and first-hand working experiences in the production of art works, using simple computational models with rich morphological behaviour, at the edge of mathematics, computer science, physics and biology. It organically combines ground breaking scientific discoveries in the theory of computation and complex systems with artistic representations of the research results. In this appealing book mathematicians, computer scientists, physicists, and engineers brought together marvelous and esoteric patterns generated by cellular automata, which are arrays of simple machines with complex behavior. Configurations produced by cellular automata  uncover  mechanics of dynamic patterns formation, their propagation and interaction in natural systems: heart pacemaker, bacterial membrane proteins, chemical rectors, water permeation in soil, compressed gas, cell division, population dynamics, reaction-diffusion media and self-organisation."
to:NB  books:noted  cellular_automata  pretty_pictures  kith_and_kin 
29 days ago
Mathematics of Epidemics on Networks - From Exact to Approximate Models | István Z. Kiss | Springer
"This textbook provides an exciting new addition to the area of network science featuring a stronger and more methodical link of models to their mathematical origin and explains how these relate to each other with special focus on epidemic spread on networks. The content of the book is at the interface of graph theory, stochastic processes and dynamical systems. The authors set out to make a significant contribution to closing the gap between model development and the supporting mathematics. This is done by:
"Summarising and presenting the state-of-the-art in modeling epidemics on networks with results and readily usable models signposted throughout the book;
"Presenting different mathematical approaches to formulate exact and solvable models;
"Identifying the concrete links between approximate models and their rigorous mathematical representation;
"Presenting a model hierarchy and clearly highlighting the links between model assumptions and model complexity;
"Providing a reference source for advanced undergraduate students, as well as doctoral students, postdoctoral researchers and academic experts who are engaged in modeling stochastic processes on networks;
"Providing software that can solve differential equation models or directly simulate epidemics on networks.
"Replete with numerous diagrams, examples, instructive exercises, and online access to simulation algorithms and readily usable code, this book will appeal to a wide spectrum of readers from different backgrounds and academic levels. Appropriate for students with or without a strong background in mathematics, this textbook can form the basis of an advanced undergraduate or graduate course in both mathematics and other departments alike."
in_NB  books:noted  epidemic_models  networks  re:do-institutions-evolve  to_teach:baby-nets  in_library  downloaded 
29 days ago
Cellular Automata: Analysis and Applications | Karl-Peter Hadeler | Springer
"This book provides an overview of the main approaches used to analyze the dynamics of cellular automata. Cellular automata are an indispensable tool in mathematical modeling. In contrast to classical modeling approaches like partial differential equations, cellular automata are relatively easy to simulate but difficult to analyze. In this book we present a review of approaches and theories that allow the reader to understand the behavior of cellular automata beyond simulations. The first part consists of an introduction to cellular automata on Cayley graphs, and their characterization via the fundamental Cutis-Hedlund-Lyndon theorems in the context of various topological concepts (Cantor, Besicovitch and Weyl topology). The second part focuses on classification results: What classification follows from topological concepts (Hurley classification), Lyapunov stability (Gilman classification), and the theory of formal languages and grammars (Kůrka classification)? These classifications suggest that cellular automata be clustered, similar to the classification of partial differential equations into hyperbolic, parabolic and elliptic equations. This part of the book culminates in the question of whether the properties of cellular automata are decidable. Surjectivity and injectivity are examined, and the seminal Garden of Eden theorems are discussed. In turn, the third part focuses on the analysis of cellular automata that inherit distinct properties, often based on mathematical modeling of biological, physical or chemical systems. Linearity is a concept that allows us to define self-similar limit sets. Models for particle motion show how to bridge the gap between cellular automata and partial differential equations (HPP model and ultradiscrete limit). Pattern formation is related to linear cellular automata, to the Bar-Yam model for the Turing pattern, and Greenberg-Hastings automata for excitable media. In addition, models for sand piles, the dynamics of infectious diseases, and evolution of predator-prey systems are discussed. Mathematicians will find an essential overview of theory and tools used for the analysis of cellular automata. The book also features an appendix introducing the reader to basic mathematical techniques and notations, so that physicists, chemists and biologists interested in cellular automata beyond pure simulations will also benefit from the book."
to:NB  books:noted  cellular_automata 
29 days ago
McAdams, A.J.: Vanguard of the Revolution: The Global Idea of the Communist Party. (eBook and Hardcover)
"Vanguard of the Revolution is a sweeping history of one of the most significant political institutions of the modern world. The communist party was a revolutionary idea long before its supporters came to power. In this book, A. James McAdams argues that the rise and fall of communism can be understood only by taking into account the origins and evolution of this compelling idea. He shows how the leaders of parties in countries as diverse as the Soviet Union, China, Germany, Yugoslavia, Cuba, and North Korea adapted the original ideas of revolutionaries like Karl Marx and Vladimir Lenin to profoundly different social and cultural settings.
"Taking readers from the drafting of The Communist Manifesto in the 1840s to the dissolution of the Soviet Union in the early 1990s, McAdams describes the decisive role played by individual rulers in the success of their respective parties—men like Joseph Stalin, Mao Zedong, and Fidel Castro. He demonstrates how these personalities drew on vying conceptions of the party’s functions to mesmerize their followers, mobilize their populations, and transform their societies. He also shows how many of these figures abused these ideas to justify incomprehensible acts of inhumanity. McAdams explains why communist parties lasted as long as they did, and why they either disappeared or ceased to be meaningful institutions by the close of the twentieth century."
to:NB  books:noted  20th_century_history  communism  politics  history_of_ideas 
5 weeks ago
Random Measures, Theory and Applications | Olav Kallenberg | Springer
"Offering the first comprehensive treatment of the theory of random measures, this book has a very broad scope, ranging from basic properties of Poisson and related processes to the modern theories of convergence, stationarity, Palm measures, conditioning, and compensation. The three large final chapters focus on applications within the areas of stochastic geometry, excursion theory, and branching processes. Although this theory plays a fundamental role in most areas of modern probability, much of it, including the most basic material, has previously been available only in scores of journal articles. The book is primarily directed towards researchers and advanced graduate students in stochastic processes and related areas."
to:NB  books:noted  probability  stochastic_processes  ergodic_theory  kallenberg.olav  coveted 
5 weeks ago
Asymptotic Theory of Weakly Dependent Random Processes | Emmanuel Rio | Springer
"Presenting tools to aid understanding of asymptotic theory and weakly dependent processes, this book is devoted to inequalities and limit theorems for sequences of random variables that are strongly mixing in the sense of Rosenblatt, or absolutely regular.
"The first chapter introduces covariance inequalities under strong mixing or absolute regularity. These covariance inequalities are applied in Chapters 2, 3 and 4 to moment inequalities, rates of convergence in the strong law, and central limit theorems. Chapter 5 concerns coupling. In Chapter 6 new deviation inequalities and new moment inequalities for partial sums via the coupling lemmas of Chapter 5 are derived and applied to the bounded law of the iterated logarithm. Chapters 7 and 8 deal with the theory of empirical processes under weak dependence. Lastly, Chapter 9 describes links between ergodicity, return times and rates of mixing in the case of irreducible Markov chains. Each chapter ends with a set of exercises.
"The book is an updated and extended translation of the French edition entitled "Théorie asymptotique des processus aléatoires faiblement dépendants" (Springer, 2000). It will be useful for students and researchers in mathematical statistics, econometrics, probability theory and dynamical systems who are interested in weakly dependent processes."
to:NB  books:noted  stochastic_processes  mixing  ergodic_theory  markov_models 
5 weeks ago
[FoR&AI] The Seven Deadly Sins of Predicting the Future of AI – Rodney Brooks
It's a bit depressing that _Rodney Brooks_ feels he needs to take the time to say these obvious things.
ai  robots_and_robotics  utter_stupidity  brooks.rodney  via:whimsley  have_read 
5 weeks ago
Tetlock, P.E.: Expert Political Judgment: How Good Is It? How Can We Know?. (New Edition) (eBook, Paperback and Hardcover)
"Tetlock first discusses arguments about whether the world is too complex for people to find the tools to understand political phenomena, let alone predict the future. He evaluates predictions from experts in different fields, comparing them to predictions by well-informed laity or those based on simple extrapolation from current trends. He goes on to analyze which styles of thinking are more successful in forecasting. Classifying thinking styles using Isaiah Berlin's prototypes of the fox and the hedgehog, Tetlock contends that the fox--the thinker who knows many little things, draws from an eclectic array of traditions, and is better able to improvise in response to changing events--is more successful in predicting the future than the hedgehog, who knows one big thing, toils devotedly within one tradition, and imposes formulaic solutions on ill-defined problems. He notes a perversely inverse relationship between the best scientific indicators of good judgement and the qualities that the media most prizes in pundits--the single-minded determination required to prevail in ideological combat.
"Clearly written and impeccably researched, the book fills a huge void in the literature on evaluating expert opinion. It will appeal across many academic disciplines as well as to corporations seeking to develop standards for judging expert decision-making. Now with a new preface in which Tetlock discusses the latest research in the field, the book explores what constitutes good judgment in predicting future events and looks at why experts are often wrong in their forecasts."
in_NB  books:noted  prediction  expertise  cognitive_science 
5 weeks ago
Bok, D.: The Struggle to Reform Our Colleges (eBook and Hardcover).
"During the first decade of this century, many commentators predicted that American higher education was about to undergo major changes that would be brought about under the stimulus of online learning and other technological advances. Toward the end of the decade, the president of the United States declared that America would regain its historic lead in the education of its workforce within the next ten years through a huge increase in the number of students earning “quality” college degrees.
"Several years have elapsed since these pronouncements were made, yet the rate of progress has increased very little, if at all, in the number of college graduates or the nature and quality of the education they receive. In The Struggle to Reform Our Colleges, Derek Bok seeks to explain why so little change has occurred by analyzing the response of America’s colleges; the influence of students, employers, foundations, accrediting organizations, and government officials; and the impact of market forces and technological innovation. In the last part of the book, Bok identifies a number of initiatives that could improve the performance of colleges and universities. The final chapter examines the process of change itself and describes the strategy best calculated to quicken the pace of reform and enable colleges to meet the challenges that confront them."
to:NB  books:noted  education  academia  bok.derek 
5 weeks ago
[1709.02012v1] On Fairness and Calibration
"The machine learning community has become increasingly concerned with the potential for bias and discrimination in predictive models, and this has motivated a growing line of work on what it means for a classification procedure to be "fair." In particular, we investigate the tension between minimizing error disparity across different population groups while maintaining calibrated probability estimates. We show that calibration is compatible only with a single error constraint (i.e. equal false-negatives rates across groups), and show that any algorithm that satisfies this relaxation is no better than randomizing a percentage of predictions for an existing classifier. These unsettling findings, which extend and generalize existing results, are empirically confirmed on several datasets."
to:NB  to_read  calibration  prediction  classifiers  kleinberg.jon  via:arsyed 
5 weeks ago
A new paradigm for the introductory course in economics | VOX, CEPR’s Policy Portal
"Our intro courses fail to reflect the dramatic advances in economics – concerning information problems and strategic interactions, for example – since Samuelson’s paradigm-setting 1948 textbook. Missing, too, is any sustained engagement with new problems we now confront and on which economics has important insights for public policy – climate change, innovation, instability and growing inequality amongst them. This column introduces a free online interactive text – now used as the standard intro at UCL, Sciences Po, and Toulouse School of Economics – which responds."
economics  market_failures_in_everything  markets_as_collective_calculating_devices  kith_and_kin  bowles.samuel  have_read  via:?  game_theory  behavioral_economics  inequality 
6 weeks ago
Book Review: A Coming of (Information) Age Story | Issues in Science and Technology
This is mostly a very nice review of what sounds like a very interesting book. I would quibble somewhat, though, with the distinction drawn between Wiener's approach and that of someone like Simon or Minsky. Wiener was _very_ interested in how one might reproduce the functional, input/output behavior of a black box by approximating it with a sufficiently powerful system of white boxes. (This is the whole point of his book _Nonlinear Problems in Random Theory_.) Given his mathematical background, he tended to think of the white box as expanding a function in a power series (or really a series of integrals against memory kernels; see NPiRT), but the important point was universal approximation, which a general-purpose digital computer certainly has in spades. I don't see how this is notably more "embodied" than what Simon or Minsky talked about; it's very much in line with the functionalism of "good old-fashioned AI".
(And I really have no idea what D.A. could be thinking of to say that "deep learning" is more embodied/enactive than classical AI...)

ObSelfLinkage: http://bactra.org/notebooks/cybernetics.html
book_reviews  have_read  cybernetics  information_theory  history_of_science  auerbach.david 
6 weeks ago
As If — Kwame Anthony Appiah | Harvard University Press
"Idealization is a fundamental feature of human thought. We build simplified models in our scientific research and utopias in our political imaginations. Concepts like belief, desire, reason, and justice are bound up with idealizations and ideals. Life is a constant adjustment between the models we make and the realities we encounter. In idealizing, we proceed “as if” our representations were true, while knowing they are not. This is not a dangerous or distracting occupation, Kwame Anthony Appiah shows. Our best chance of understanding nature, society, and ourselves is to open our minds to a plurality of imperfect depictions that together allow us to manage and interpret our world.
"The philosopher Hans Vaihinger first delineated the “as if” impulse at the turn of the twentieth century, drawing on Kant, who argued that rational agency required us to act as if we were free. Appiah extends this strategy to examples across philosophy and the human and natural sciences. In a broad range of activities, we have some notion of the truth yet continue with theories that we recognize are, strictly speaking, false. From this vantage point, Appiah demonstrates that a picture one knows to be unreal can be a vehicle for accessing reality.
"As If explores how strategic untruth plays a critical role in far-flung areas of inquiry: decision theory, psychology, natural science, and political philosophy. A polymath who writes with mainstream clarity, Appiah defends the centrality of the imagination not just in the arts but in science, morality, and everyday life."
to:NB  books:noted  philosophy  epistemology  philosophy_of_science  approximation  modeling 
6 weeks ago
The Testing Charade: Pretending to Make Schools Better, Koretz
"For decades we’ve been studying, experimenting with, and wrangling over different approaches to improving public education, and there’s still little consensus on what works, and what to do. The one thing people seem to agree on, however, is that schools need to be held accountable—we need to know whether what they’re doing is actually working. But what does that mean in practice?
"High-stakes tests. Lots of them. And that has become a major problem. Daniel Koretz, one of the nation’s foremost experts on educational testing, argues in The Testing Charade that the whole idea of test-based accountability has failed—it has increasingly become an end in itself, harming students and corrupting the very ideals of teaching. In this powerful polemic, built on unimpeachable evidence and rooted in decades of experience with educational testing, Koretz calls out high-stakes testing as a sham, a false idol that is ripe for manipulation and shows little evidence of leading to educational improvement. Rather than setting up incentives to divert instructional time to pointless test prep, he argues, we need to measure what matters, and measure it in multiple ways—not just via standardized tests.
"Right now, we’re lying to ourselves about whether our children are learning. And the longer we accept that lie, the more damage we do. It’s time to end our blind reliance on high-stakes tests. With The Testing Charade, Daniel Koretz insists that we face the facts and change course, and he gives us a blueprint for doing better. "
to:NB  books:noted  education  social_measurement  standardized_testing  mental_testing  re:g_paper 
6 weeks ago
Signs from Silence: Ur of the First Sumerians, Charvát
"The Royal Tombs of Ur, dating from approximately 3000–2700 BCE, are among the most famous and impressive archeological discoveries of the twentieth century. Excavated between 1922 and 1934 under the direction of Leonard Woolley, this site is one of the richest sources of information we have about ancient Sumer—however, many mysteries about the society that produced these tombs remain. Based on primary research with the Ur materials at the University of Pennsylvania Museum of Archeology and Anthropology, and paying particular attention to the iconography found in what Woolley referred to as the “Seal Impression Strata of Ur,” this book works to reconstruct the early history of Sumer. What was this society like? What social structures did this society build? What were its institutions of authority? The answers Petr Charvát proposes are of interest not only to archeologists, but to anyone fascinated by early human history."
to:NB  books:noted  ancient_history  mesopotamia 
6 weeks ago
Foldamer hypothesis for the growth and sequence differentiation of prebiotic polymers
"It is not known how life originated. It is thought that prebiotic processes were able to synthesize short random polymers. However, then, how do short-chain molecules spontaneously grow longer? Also, how would random chains grow more informational and become autocatalytic (i.e., increasing their own concentrations)? We study the folding and binding of random sequences of hydrophobic (HH) and polar (PP) monomers in a computational model. We find that even short hydrophobic polar (HP) chains can collapse into relatively compact structures, exposing hydrophobic surfaces. In this way, they act as primitive versions of today’s protein catalysts, elongating other such HP polymers as ribosomes would now do. Such foldamer catalysts are shown to form an autocatalytic set, through which short chains grow into longer chains that have particular sequences. An attractive feature of this model is that it does not overconverge to a single solution; it gives ensembles that could further evolve under selection. This mechanism describes how specific sequences and conformations could contribute to the chemistry-to-biology (CTB) transition."
to:NB  biophysics  polymers  origin_of_life  self-organization 
6 weeks ago
On transient climate change at the Cretaceous−Paleogene boundary due to atmospheric soot injections
"Climate simulations that consider injection into the atmosphere of 15,000 Tg of soot, the amount estimated to be present at the Cretaceous−Paleogene boundary, produce what might have been one of the largest episodes of transient climate change in Earth history. The observed soot is believed to originate from global wildfires ignited after the impact of a 10-km-diameter asteroid on the Yucatán Peninsula 66 million y ago. Following injection into the atmosphere, the soot is heated by sunlight and lofted to great heights, resulting in a worldwide soot aerosol layer that lasts several years. As a result, little or no sunlight reaches the surface for over a year, such that photosynthesis is impossible and continents and oceans cool by as much as 28 °C and 11 °C, respectively. The absorption of light by the soot heats the upper atmosphere by hundreds of degrees. These high temperatures, together with a massive injection of water, which is a source of odd-hydrogen radicals, destroy the stratospheric ozone layer, such that Earth’s surface receives high doses of UV radiation for about a year once the soot clears, five years after the impact. Temperatures remain above freezing in the oceans, coastal areas, and parts of the Tropics, but photosynthesis is severely inhibited for the first 1 y to 2 y, and freezing temperatures persist at middle latitudes for 3 y to 4 y. Refugia from these effects would have been very limited. The transient climate perturbation ends abruptly as the stratosphere cools and becomes supersaturated, causing rapid dehydration that removes all remaining soot via wet deposition."
to:NB  paleontology  climatology 
6 weeks ago
Social network fragmentation and community health
"Community health interventions often seek to intentionally destroy paths between individuals to prevent the spread of infectious diseases. Immunizing individuals through direct vaccination or the provision of health education prevents pathogen transmission and the propagation of misinformation concerning medical treatments. However, it remains an open question whether network-based strategies should be used in place of conventional field approaches to target individuals for medical treatment in low-income countries. We collected complete friendship and health advice networks in 17 rural villages of Mayuge District, Uganda. Here we show that acquaintance algorithms, i.e., selecting neighbors of randomly selected nodes, were systematically more efficient in fragmenting all networks than targeting well-established community roles, i.e., health workers, village government members, and schoolteachers. Additionally, community roles were not good proxy indicators of physical proximity to other households or connections to many sick people. We also show that acquaintance algorithms were effective in offsetting potential noncompliance with deworming treatments for 16,357 individuals during mass drug administration (MDA). Health advice networks were destroyed more easily than friendship networks. Only an average of 32% of nodes were removed from health advice networks to reduce the percentage of nodes at risk for refusing treatment in MDA to below 25%. Treatment compliance of at least 75% is needed in MDA to control human morbidity attributable to parasitic worms and progress toward elimination. Our findings point toward the potential use of network-based approaches as an alternative to role-based strategies for targeting individuals in rural health interventions."
to:NB  social_networks  epidemiology_of_representations  social_influence  networks  re:do-institutions-evolve 
6 weeks ago
Individuals with greater science literacy and education have more polarized beliefs on controversial science topics
"Although Americans generally hold science in high regard and respect its findings, for some contested issues, such as the existence of anthropogenic climate change, public opinion is polarized along religious and political lines. We ask whether individuals with more general education and greater science knowledge, measured in terms of science education and science literacy, display more (or less) polarized beliefs on several such issues. We report secondary analyses of a nationally representative dataset (the General Social Survey), examining the predictors of beliefs regarding six potentially controversial issues. We find that beliefs are correlated with both political and religious identity for stem cell research, the Big Bang, and human evolution, and with political identity alone on climate change. Individuals with greater education, science education, and science literacy display more polarized beliefs on these issues. We find little evidence of political or religious polarization regarding nanotechnology and genetically modified foods. On all six topics, people who trust the scientific enterprise more are also more likely to accept its findings. We discuss the causal mechanisms that might underlie the correlation between education and identity-based polarization."
to:NB  us_politics  re:democratic_cognition  by_people_i_know 
6 weeks ago
Architectural Intelligence | The MIT Press
"In Architectural Intelligence, Molly Wright Steenson explores the work of four architects in the 1960s and 1970s who incorporated elements of interactivity into their work. Christopher Alexander, Richard Saul Wurman, Cedric Price, and Nicholas Negroponte and the MIT Architecture Machine Group all incorporated technologies—including cybernetics and artificial intelligence—into their work and influenced digital design practices from the late 1980s to the present day.
"Alexander, long before his famous 1977 book A Pattern Language, used computation and structure to visualize design problems; Wurman popularized the notion of “information architecture”; Price designed some of the first intelligent buildings; and Negroponte experimented with the ways people experience artificial intelligence, even at architectural scale. Steenson investigates how these architects pushed the boundaries of architecture—and how their technological experiments pushed the boundaries of technology. What did computational, cybernetic, and artificial intelligence researchers have to gain by engaging with architects and architectural problems? And what was this new space that emerged within these collaborations? At times, Steenson writes, the architects in this book characterized themselves as anti-architects and their work as anti-architecture. The projects Steenson examines mostly did not result in constructed buildings, but rather in design processes and tools, computer programs, interfaces, digital environments. Alexander, Wurman, Price, and Negroponte laid the foundation for many of our contemporary interactive practices, from information architecture to interaction design, from machine learning to smart cities."

--- Christopher Alexander's early work is very interesting; _Notes on the Synthesis of Form_, in particular, is a brilliant-if-slightly-mad book, which anticipates, in some ways, the work since the 2000s on community discovery in network theory.
to:NB  books:noted  history_of_technology  computers  architecture  alexander.christopher 
6 weeks ago
The Grid | The MIT Press
"The North American power grid has been called the world’s largest machine. The grid connects nearly every living soul on the continent; Americans rely utterly on the miracle of electrification. In this book, Julie Cohn tells the history of the grid, from early linkages in the 1890s through the grid’s maturity as a networked infrastructure in the 1980s. She focuses on the strategies and technologies used to control power on the grid—in fact made up of four major networks of interconnected power systems—paying particular attention to the work of engineers and system operators who handled the everyday operations. To do so, she consulted sources that range from the pages of historical trade journals to corporate archives to the papers of her father, Nathan Cohn, who worked in the industry from 1927 to 1989—roughly the period of key power control innovations across North America.
"Cohn investigates major challenges and major breakthroughs but also the hidden aspects of our electricity infrastructure, both technical and human. She describes the origins of the grid and the growth of interconnection; emerging control issues, including difficulties in matching generation and demand on linked systems; collaboration and competition against the backdrop of economic depression and government infrastructure investment; the effects of World War II on electrification; postwar plans for a coast-to-coast grid; the northeast blackout of 1965 and the East-West closure of 1967; and renewed efforts at achieving stability and reliability after those two events."
in_NB  books:noted  american_history  history_of_technology  infrastructure  the_present_before_it_was_widely_distributed  the_electrification_of_the_whole_country  in_wishlist 
6 weeks ago
Machine Learners | The MIT Press
"Machine learning—programming computers to learn from data—has spread across scientific disciplines, media, entertainment, and government. Medical research, autonomous vehicles, credit transaction processing, computer gaming, recommendation systems, finance, surveillance, and robotics use machine learning. Machine learning devices (sometimes understood as scientific models, sometimes as operational algorithms) anchor the field of data science. They have also become mundane mechanisms deeply embedded in a variety of systems and gadgets. In contexts from the everyday to the esoteric, machine learning is said to transform the nature of knowledge. In this book, Adrian Mackenzie investigates whether machine learning also transforms the practice of critical thinking.
"Mackenzie focuses on machine learners—either humans and machines or human-machine relations—situated among settings, data, and devices. The settings range from fMRI to Facebook; the data anything from cat images to DNA sequences; the devices include neural networks, support vector machines, and decision trees. He examines specific learning algorithms—writing code and writing about code—and develops an archaeology of operations that, following Foucault, views machine learning as a form of knowledge production and a strategy of power. Exploring layers of abstraction, data infrastructures, coding practices, diagrams, mathematical formalisms, and the social organization of machine learning, Mackenzie traces the mostly invisible architecture of one of the central zones of contemporary technological cultures.
"Mackenzie’s account of machine learning locates places in which a sense of agency can take root. His archaeology of the operational formation of machine learning does not unearth the footprint of a strategic monolith but reveals the local tributaries of force that feed into the generalization and plurality of the field."

--- We really need good histories, and critical studies, of ML, but the style of rhetoric here makes me doubt whether the author is up to the job. (Which is a prejudice...) Last tag applies.
to:NB  books:noted  machine_learning  philosophy_of_science  in_wishlist  to_be_shot_after_a_fair_trial 
6 weeks ago
True Enough | The MIT Press
"Philosophy valorizes truth, holding that there can never be epistemically good reasons to accept a known falsehood, or to accept modes of justification that are not truth conducive. How can this stance account for the epistemic standing of science, which unabashedly relies on models, idealizations, and thought experiments that are known not to be true? In True Enough, Catherine Elgin argues that we should not assume that the inaccuracy of models and idealizations constitutes an inadequacy. To the contrary, their divergence from truth or representational accuracy fosters their epistemic functioning. When effective, models and idealizations are, Elgin contends, felicitous falsehoods that exemplify features of the phenomena they bear on. Because works of art deploy the same sorts of felicitous falsehoods, she argues, they also advance understanding.
"Elgin develops a holistic epistemology that focuses on the understanding of broad ranges of phenomena rather than knowledge of individual facts. Epistemic acceptability, she maintains, is a matter not of truth-conduciveness, but of what would be reflectively endorsed by the members of an idealized epistemic community—a quasi-Kantian realm of epistemic ends."

--- Well, the first part sounds interesting...
to:NB  books:noted  modeling  epistemology  philosophy_of_science 
6 weeks ago
Communism for Kids | The MIT Press
"Once upon a time, people yearned to be free of the misery of capitalism. How could their dreams come true? This little book proposes a different kind of communism, one that is true to its ideals and free from authoritarianism. Offering relief for many who have been numbed by Marxist exegesis and given headaches by the earnest pompousness of socialist politics, it presents political theory in the simple terms of a children’s story, accompanied by illustrations of lovable little revolutionaries experiencing their political awakening.
"It all unfolds like a story, with jealous princesses, fancy swords, displaced peasants, mean bosses, and tired workers–not to mention a Ouija board, a talking chair, and a big pot called “the state.” Before they know it, readers are learning about the economic history of feudalism, class struggles in capitalism, different ideas of communism, and more. Finally, competition between two factories leads to a crisis that the workers attempt to solve in six different ways (most of them borrowed from historic models of communist or socialist change). Each attempt fails, since true communism is not so easy after all. But it’s also not that hard. At last, the people take everything into their own hands and decide for themselves how to continue. Happy ending? Only the future will tell. With an epilogue that goes deeper into the theoretical issues behind the story, this book is perfect for all ages and all who desire a better world."
to:NB  books:noted  socialism  to_be_shot_after_a_fair_trial 
6 weeks ago
A Mark of the Mental | The MIT Press
"In A Mark of the Mental, Karen Neander considers the representational power of mental states—described by the cognitive scientist Zenon Pylyshyn as the “second hardest puzzle” of philosophy of mind (the first being consciousness). The puzzle at the heart of the book is sometimes called “the problem of mental content,” “Brentano’s problem,” or “the problem of intentionality.” Its motivating mystery is how neurobiological states can have semantic properties such as meaning or reference. Neander proposes a naturalistic account for sensory-perceptual (nonconceptual) representations.
"Neander draws on insights from state-space semantics (which appeals to relations of second-order similarity between representing and represented domains), causal theories of reference (which claim the reference relation is a causal one), and teleosemantic theories (which claim that semantic norms, at their simplest, depend on functional norms). She proposes and defends an intuitive, theoretically well-motivated but highly controversial thesis: sensory-perceptual systems have the function to produce inner state changes that are the analogs of as well as caused by their referents. Neander shows that the three main elements—functions, causal-information relations, and relations of second-order similarity—complement rather than conflict with each other. After developing an argument for teleosemantics by examining the nature of explanation in the mind and brain sciences, she develops a theory of mental content and defends it against six main content-determinacy challenges to a naturalized semantics."
to:NB  books:noted  philosophy_of_mind  cognitive_science 
6 weeks ago
Minitel | The MIT Press
"A decade before the Internet became a medium for the masses in the United States, tens of millions of users in France had access to a network for e-mail, e-commerce, chat, research, game playing, blogging, and even an early form of online porn. In 1983, the French government rolled out Minitel, a computer network that achieved widespread adoption in just a few years as the government distributed free terminals to every French telephone subscriber. With this volume, Julien Mailland and Kevin Driscoll offer the first scholarly book in English on Minitel, examining it as both a technical system and a cultural phenomenon.
"Mailland and Driscoll argue that Minitel was a technical marvel, a commercial success, and an ambitious social experiment. Other early networks may have introduced protocols and software standards that continue to be used today, but Minitel foretold the social effects of widespread telecomputing. They examine the unique balance of forces that enabled the growth of Minitel: public and private, open and closed, centralized and decentralized. Mailland and Driscoll describe Minitel’s key technological components, novel online services, and thriving virtual communities. Despite the seemingly tight grip of the state, however, a lively Minitel culture emerged, characterized by spontaneity, imagination, and creativity. After three decades of continuous service, Minitel was shut down in 2012, but the history of Minitel should continue to inform our thinking about Internet policy, today and into the future."
to:NB  books:noted  networked_life  the_present_before_it_was_widely_distributed 
6 weeks ago
Minding the Weather | The MIT Press
"This book argues that the human cognition system is the least understood, yet probably most important, component of forecasting accuracy. Minding the Weather investigates how people acquire massive and highly organized knowledge and develop the reasoning skills and strategies that enable them to achieve the highest levels of performance.
"The authors consider such topics as the forecasting workplace; atmospheric scientists’ descriptions of their reasoning strategies; the nature of expertise; forecaster knowledge, perceptual skills, and reasoning; and expert systems designed to imitate forecaster reasoning. Drawing on research in cognitive science, meteorology, and computer science, the authors argue that forecasting involves an interdependence of humans and technologies. Human expertise will always be necessary."
to:NB  prediction  meteorology  cognitive_science  books:noted 
6 weeks ago
Real-World Algorithms | The MIT Press
"Algorithms are what we do in order not to have to do something. Algorithms consist of instructions to carry out tasks—usually dull, repetitive ones. Starting from simple building blocks, computer algorithms enable machines to recognize and produce speech, translate texts, categorize and summarize documents, describe images, and predict the weather. A task that would take hours can be completed in virtually no time by using a few lines of code in a modern scripting program. This book offers an introduction to algorithms through the real-world problems they solve. The algorithms are presented in pseudocode and can readily be implemented in a computer language.
"The book presents algorithms simply and accessibly, without overwhelming readers or insulting their intelligence. Readers should be comfortable with mathematical fundamentals and have a basic understanding of how computers work; all other necessary concepts are explained in the text. After presenting background in pseudocode conventions, basic terminology, and data structures, chapters cover compression, cryptography, graphs, searching and sorting, hashing, classification, strings, and chance. Each chapter describes real problems and then presents algorithms to solve them. Examples illustrate the wide range of applications, including shortest paths as a solution to paragraph line breaks, strongest paths in elections systems, hashes for song recognition, voting power Monte Carlo methods, and entropy for machine learning. Real-World Algorithms can be used by students in disciplines from economics to applied sciences. Computer science majors can read it before using a more technical text."
to:NB  books:noted  algorithms  computation  programming  in_wishlist 
6 weeks ago
Energy and Civilization | The MIT Press
"Energy is the only universal currency; it is necessary for getting anything done. The conversion of energy on Earth ranges from terra-forming forces of plate tectonics to cumulative erosive effects of raindrops. Life on Earth depends on the photosynthetic conversion of solar energy into plant biomass. Humans have come to rely on many more energy flows—ranging from fossil fuels to photovoltaic generation of electricity—for their civilized existence. In this monumental history, Vaclav Smil provides a comprehensive account of how energy has shaped society, from pre-agricultural foraging societies through today’s fossil fuel–driven civilization.
"Humans are the only species that can systematically harness energies outside their bodies, using the power of their intellect and an enormous variety of artifacts—from the simplest tools to internal combustion engines and nuclear reactors. The epochal transition to fossil fuels affected everything: agriculture, industry, transportation, weapons, communication, economics, urbanization, quality of life, politics, and the environment. Smil describes humanity’s energy eras in panoramic and interdisciplinary fashion, offering readers a magisterial overview. This book is an extensively updated and expanded version of Smil’s Energy in World History (1994). Smil has incorporated an enormous amount of new material, reflecting the dramatic developments in energy studies over the last two decades and his own research over that time."
to:NB  books:noted  energy  history  human_ecology  smil.vaclav 
6 weeks ago
Israel, J.: The Expanding Blaze: How the American Revolution Ignited the World, 1775-1848. (eBook and Hardcover)
"The Expanding Blaze is a sweeping history of how the American Revolution inspired revolutions throughout Europe and the Atlantic world in the eighteenth and nineteenth centuries. Jonathan Israel, one of the world’s leading historians of the Enlightenment, shows how the radical ideas of American founders such as Paine, Jefferson, Franklin, Madison, and Monroe set the pattern for democratic revolutions, movements, and constitutions in France, Britain, Ireland, the Netherlands, Belgium, Poland, Greece, Canada, Haiti, Brazil, and Spanish America.
"The Expanding Blaze reminds us that the American Revolution was an astonishingly radical event—and that it didn’t end with the transformation and independence of America. Rather, the Revolution continued to reverberate in Europe and the Americas for the next three-quarters of a century. This comprehensive history of the Revolution’s international influence traces how American efforts to implement Radical Enlightenment ideas—including the destruction of the old regime and the promotion of democratic republicanism, self-government, and liberty—helped drive revolutions abroad, as foreign leaders explicitly followed the American example and espoused American democratic values."
to:NB  books:noted  history_of_ideas  american_history  american_revolution  israel.jonathan  in_wishlist 
6 weeks ago
Don't Fall for Babylonian Trigonometry Hype - Scientific American Blog Network
"Plimpton 322 is a remarkable artifact, and we have much to learn from it. When I taught math history, I loved opening the semester by having my students read a few papers about it to show how much scholarship has gone into understanding such a small document and how accomplished scholars can disagree about what it means. It demonstrates differences in the way different cultures have done mathematics and outstanding computational facility. It has raised questions about how ancient Mesopotamians approached calculation and geometry. But using it to sell a questionable pet theory won’t get us any closer to the answers."
history_of_mathematics  mathematics  ancient_history 
6 weeks ago
Becker, L.C.: A New Stoicism (Revised Edition) (eBook and Paperback).
"What would stoic ethics be like today if stoicism had survived as a systematic approach to ethical theory, if it had coped successfully with the challenges of modern philosophy and experimental science? A New Stoicism proposes an answer to that question, offered from within the stoic tradition but without the metaphysical and psychological assumptions that modern philosophy and science have abandoned. Lawrence Becker argues that a secular version of the stoic ethical project, based on contemporary cosmology and developmental psychology, provides the basis for a sophisticated form of ethical naturalism, in which virtually all the hard doctrines of the ancient Stoics can be clearly restated and defended.
"Becker argues, in keeping with the ancients, that virtue is one thing, not many; that it, and not happiness, is the proper end of all activity; that it alone is good, all other things being merely rank-ordered relative to each other for the sake of the good; and that virtue is sufficient for happiness. Moreover, he rejects the popular caricature of the stoic as a grave figure, emotionally detached and capable mainly of endurance, resignation, and coping with pain. To the contrary, he holds that while stoic sages are able to endure the extremes of human suffering, they do not have to sacrifice joy to have that ability, and he seeks to turn our attention from the familiar, therapeutic part of stoic moral training to a reconsideration of its theoretical foundations."
to:NB  books:noted  ethics  stoicism  the_good_man_is_happy_on_the_rack 
6 weeks ago
The Wind of Change: Maritime Technology, Trade, and Economic Development
"The 1870-1913 period marked the birth of the first era of trade globalization. How did this tremendous increase in trade affect economic development? This work isolates a causality channel by exploiting the fact that the introduction of the steamship in the shipping industry produced an asymmetric change in trade distances among countries. Before this invention, trade routes depended on wind patterns. The steamship reduced shipping costs and time in a disproportionate manner across countries and trade routes. Using this source of variation and novel data on shipping, trade, and development, I find that (i) the adoption of the steamship had a major impact on patterns of trade worldwide; (ii) only a small number of countries, characterized by more inclusive institutions, benefited from trade integration; and (iii) globalization was the major driver of the economic divergence between the rich and the poor portions of the world in the years 1850-1900."

--- For "characterized by more inclusive institutions", read "characterized by imperial power"?
to:NB  economics  economic_history  globalization  imperialism  19th_century_history 
7 weeks ago
Virtual Classrooms: How Online College Courses Affect Student Success
"Online college courses are a rapidly expanding feature of higher education, yet little research identifies their effects relative to traditional in-person classes. Using an instrumental variables approach, we find that taking a course online, instead of in-person, reduces student success and progress in college. Grades are lower both for the course taken online and in future courses. Students are less likely to remain enrolled at the university. These estimates are local average treatment effects for students with access to both online and in-person options; for other students, online classes may be the only option for accessing college-level courses."

--- I will be very curious about their instrument, and whether it's at all plausible.
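--- For reference, with a binary instrument Z for taking the course online (D) and outcome Y, the simplest version of what they would be estimating is the Wald ratio (my notation, not the paper's):

```latex
\widehat{\mathrm{LATE}}
  = \frac{\mathbb{E}[Y \mid Z = 1] - \mathbb{E}[Y \mid Z = 0]}
         {\mathbb{E}[D \mid Z = 1] - \mathbb{E}[D \mid Z = 0]}
```

Everything rides on the exclusion restriction and monotonicity, which is exactly where the plausibility question about the instrument comes in.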
to:NB  education  instrumental_variables  causal_inference  statistics  re:ADAfaEPoV 
7 weeks ago