Combining natural language processing and network analysis to examine how advocacy organizations stimulate conversation on social media
"Social media sites are rapidly becoming one of the most important forums for public deliberation about advocacy issues. However, social scientists have not explained why some advocacy organizations produce social media messages that inspire far-ranging conversation among social media users, whereas the vast majority of them receive little or no attention. I argue that advocacy organizations are more likely to inspire comments from new social media audiences if they create “cultural bridges,” or produce messages that combine conversational themes within an advocacy field that are seldom discussed together. I use natural language processing, network analysis, and a social media application to analyze how cultural bridges shaped public discourse about autism spectrum disorders on Facebook over the course of 1.5 years, controlling for various characteristics of advocacy organizations, their social media audiences, and the broader social context in which they interact. I show that organizations that create substantial cultural bridges provoke 2.52 times more comments about their messages from new social media users than those that do not, controlling for these factors. This study thus offers a theory of cultural messaging and public deliberation and computational techniques for text analysis and application-based survey research."
to:NB  text_mining  network_data_analysis  social_networks  social_movements  social_media  networked_life 
3 days ago
Predictability and hierarchy in Drosophila behavior
"Even the simplest of animals exhibit behavioral sequences with complex temporal dynamics. Prominent among the proposed organizing principles for these dynamics has been the idea of a hierarchy, wherein the movements an animal makes can be understood as a set of nested subclusters. Although this type of organization holds potential advantages in terms of motion control and neural circuitry, measurements demonstrating this for an animal’s entire behavioral repertoire have been limited in scope and temporal complexity. Here, we use a recently developed unsupervised technique to discover and track the occurrence of all stereotyped behaviors performed by fruit flies moving in a shallow arena. Calculating the optimally predictive representation of the fly’s future behaviors, we show that fly behavior exhibits multiple time scales and is organized into a hierarchical structure that is indicative of its underlying behavioral programs and its changing internal states."
to:NB  clustering  data_mining  biology  information_theory  statistics  bialek.william 
3 days ago
Optimal Rate of Convergence for Empirical Quantiles and Distribution Functions for Time Series - Jirak - 2016 - Journal of Time Series Analysis - Wiley Online Library
"Given a stationary sequence, we are interested in the rate of convergence in the central limit theorem of the empirical quantiles and the empirical distribution function. Under a general notion of weak dependence, we show a Berry–Esseen result with optimal rate $n^{-1/2}$. The setup includes many prominent time series models, such as functions of ARMA or (augmented) GARCH processes. In this context, optimal Berry–Esseen rates for empirical quantiles appear to be novel."
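--- Not the Berry–Esseen bound itself, but the $\sqrt{n}$ scaling it refines is easy to see numerically: the empirical median of a weakly dependent series has RMSE shrinking like $n^{-1/2}$, i.e., roughly halving as $n$ quadruples. A toy Monte Carlo with an AR(1) series (the model and parameters are my choices, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(6)

def ar1_median_rmse(n, reps=2000, phi=0.5):
    """RMSE of the empirical median of an AR(1) series (true median = 0)."""
    x = np.zeros((reps, n))
    eps = rng.normal(size=(reps, n)) * np.sqrt(1 - phi**2)  # unit stationary variance
    for t in range(1, n):
        x[:, t] = phi * x[:, t - 1] + eps[:, t]
    return np.sqrt(np.mean(np.median(x, axis=1) ** 2))

for n in (100, 400, 1600):
    print(n, ar1_median_rmse(n))   # RMSE shrinks roughly by half as n quadruples
```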
to:NB  empirical_processes  nonparametrics  statistics  time_series  central_limit_theorem 
3 days ago
Econometrics as evidence? Examining the ‘causal’ connections between financial speculation and commodities prices
"One of the lasting legacies of the financial crisis of 2008, and the legislative energies that followed from it, is the growing reliance on econometrics as part of the rulemaking process. Financial regulators are increasingly expected to rationalize proposed rules using available econometric techniques, and the courts have vacated several key rules emanating from Dodd-Frank on the grounds of alleged deficiencies in this evidentiary effort. The turn toward such econometric tools is seen as a significant constraint on and challenge to regulators as they endeavor to engage with such essential policy questions as the impact of financial speculation on food security. Yet, outside of the specialized practitioner community, very little is known about these techniques. This article examines one such econometric test, Granger causality, and its role in a pivotal Dodd-Frank rulemaking. Through an examination of the test for Granger causality and its attempts to distill the causal connections between financial speculation and commodities prices, the article argues that econometrics is a blunt but useful tool, limited in its ability to provide decisive insights into commodities markets and yet yielding useful returns for those who are able to wield it."
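--- For readers outside the practitioner community the article describes: a Granger causality test asks whether past values of one series improve prediction of another beyond the latter's own past, operationalized as an F-test comparing restricted and unrestricted autoregressions. A minimal numpy sketch on simulated data (lag order and data-generating process are illustrative assumptions of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()  # x drives y with one lag

def granger_f(y, x, p=1):
    """F-statistic for H0: lags of x do not help predict y, given lags of y."""
    T = len(y)
    Y = y[p:]
    # restricted design: intercept + p lags of y
    Xr = np.column_stack([np.ones(T - p)] + [y[p-k-1:T-k-1] for k in range(p)])
    # unrestricted design: additionally p lags of x
    Xu = np.column_stack([Xr] + [x[p-k-1:T-k-1] for k in range(p)])
    ssr = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    ssr_r, ssr_u = ssr(Xr), ssr(Xu)
    df1, df2 = p, len(Y) - Xu.shape[1]
    return ((ssr_r - ssr_u) / df1) / (ssr_u / df2)

print(granger_f(y, x))  # large F: reject "x does not Granger-cause y"
print(granger_f(x, y))  # small F: y carries no extra information about future x
```

The asymmetry of the two F-statistics is the whole content of the test; note it speaks only to incremental predictability, which is part of why the article calls it a blunt tool.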
to:NB  sociology_of_science  econometrics  financial_speculation  regulation  causal_inference  time_series  statistics 
3 days ago
[1610.03592] On statistical learning via the lens of compression
"This work continues the study of the relationship between sample compression schemes and statistical learning, which has been mostly investigated within the framework of binary classification. The central theme of this work is establishing equivalences between learnability and compressibility, and utilizing these equivalences in the study of statistical learning theory.
"We begin with the setting of multiclass categorization (zero/one loss). We prove that in this case learnability is equivalent to compression of logarithmic sample size, and that uniform convergence implies compression of constant size.
"We then consider Vapnik's general learning setting: we show that in order to extend the compressibility-learnability equivalence to this case, it is necessary to consider an approximate variant of compression.
"Finally, we provide some applications of the compressibility-learnability equivalences:
"(i) Agnostic-case learnability and realizable-case learnability are equivalent in multiclass categorization problems (in terms of sample complexity).
"(ii) This equivalence between agnostic-case learnability and realizable-case learnability does not hold for general learning problems: There exists a learning problem whose loss function takes just three values, under which agnostic-case and realizable-case learnability are not equivalent.
"(iii) Uniform convergence implies compression of constant size in multiclass categorization problems. Part of the argument includes an analysis of the uniform convergence rate in terms of the graph dimension, in which we improve upon previous bounds.
"(iv) A dichotomy for sample compression in multiclass categorization problems: If a non-trivial compression exists then a compression of logarithmic size exists.
"(v) A compactness theorem for multiclass categorization problems."
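--- The flavor of "compression" at issue: a learner that can always reconstruct a consistent hypothesis from a small subsample of its training data. A classic toy instance (mine, not the paper's) is the size-one scheme for 1-D threshold classifiers on realizably labeled samples:

```python
# Size-one sample compression scheme for 1-D thresholds h_t(x) = 1[x >= t].

def compress(sample):
    """Keep only the leftmost positive example (or nothing if all-negative)."""
    positives = [x for x, y in sample if y == 1]
    return [(min(positives), 1)] if positives else []

def reconstruct(compressed):
    """Rebuild a hypothesis from the compressed subsample alone."""
    t = compressed[0][0] if compressed else float("inf")
    return lambda x: 1 if x >= t else 0

sample = [(0.2, 0), (1.5, 1), (0.9, 0), (3.0, 1)]   # consistent with t in (0.9, 1.5]
h = reconstruct(compress(sample))
print(all(h(x) == y for x, y in sample))  # True: the scheme is sample-consistent
```

The paper's results concern when such schemes of logarithmic (or constant) size must exist for general learnable classes, far beyond this one-dimensional case.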
to:NB  learning_theory  information_theory  statistics 
4 days ago
Dark Ghettos — Tommie Shelby | Harvard University Press
"Why do American ghettos persist? Decades after Moynihan’s report on the black family and the Kerner Commission’s investigations of urban disorders, deeply disadvantaged black communities remain a disturbing reality. Scholars and commentators today often identify some factor—such as single motherhood, joblessness, or violent street crime—as the key to solving the problem and recommend policies accordingly. But, Tommie Shelby argues, these attempts to “fix” ghettos or “help” their poor inhabitants ignore fundamental questions of justice and fail to see the urban poor as moral agents responding to injustice.
"Drawing on liberal-egalitarian philosophy and informed by leading social science research, Dark Ghettos examines the thorny questions of political morality raised by ghettos. Should government foster integrated neighborhoods? If a “culture of poverty” exists, what interventions are justified? Should single parenthood be avoided or deterred? Is voluntary nonwork or crime an acceptable mode of dissent? How should a criminal justice system treat the oppressed? Shelby offers practical answers, framed in terms of what justice requires of both a government and its citizens, and he views the oppressed as allies in the fight for a society that warrants everyone’s allegiance.
"“The ghetto is not ‘their’ problem but ours, privileged and disadvantaged alike,” Shelby writes. The existence of ghettos is evidence that our society is marred by structural injustices that demand immediate rectification. Dark Ghettos advances a social vision and political ethics that calls for putting the abolition of ghettos at the center of reform."
to:NB  books:noted  political_philosophy  racism  the_american_dilemma  shelby.tommie 
5 days ago
“Socialist Accounting” by Karl Polanyi: with preface “Socialism and the embedded economy” | SpringerLink
"Ariane Fischer, David Woodruff, and Johanna Bockman have translated Karl Polanyi’s “Sozialistische Rechnungslegung” [“Socialist Accounting”] from 1922. In this article, Polanyi laid out his model of a future socialism, a world in which the economy is subordinated to society. Polanyi described the nature of this society and a kind of socialism that he would remain committed to his entire life. Accompanying the translation is the preface titled “Socialism and the embedded economy.” In the preface, Bockman explains the historical context of the article and its significance to the socialist calculation debate, the social sciences, and socialism more broadly. Based on her reading of the accounting and society that Polanyi offers here, Bockman argues that scholars have too narrowly used Polanyi’s work to support the Keynesian welfare state to the exclusion of other institutions, have too broadly used his work to study social institutions indiscriminately, and have not recognized that his work shares fundamental commonalities with and often unacknowledged distinctions from neoclassical economics."
to:NB  to_read  polanyi.karl  socialism  economics  socialist_calculation_debate 
5 days ago
Data-driven agent-based modeling, with application to rooftop solar adoption | SpringerLink
"Agent-based modeling is commonly used for studying complex system properties emergent from interactions among agents. However, agent-based models are often not developed explicitly for prediction, and are generally not validated as such. We therefore present a novel data-driven agent-based modeling framework, in which individual behavior model is learned by machine learning techniques, deployed in multi-agent systems and validated using a holdout sequence of collective adoption decisions. We apply the framework to forecasting individual and aggregate residential rooftop solar adoption in San Diego county and demonstrate that the resulting agent-based model successfully forecasts solar adoption trends and provides a meaningful quantification of uncertainty about its predictions. Meanwhile, we construct a second agent-based model, with its parameters calibrated based on mean square error of its fitted aggregate adoption to the ground truth. Our result suggests that our data-driven agent-based approach based on maximum likelihood estimation substantially outperforms the calibrated agent-based model. Seeing advantage over the state-of-the-art modeling methodology, we utilize our agent-based model to aid search for potentially better incentive structures aimed at spurring more solar adoption. Although the impact of solar subsidies is rather limited in our case, our study still reveals that a simple heuristic search algorithm can lead to more effective incentive plans than the current solar subsidies in San Diego County and a previously explored structure. Finally, we examine an exclusive class of policies that gives away free systems to low-income households, which are shown significantly more efficacious than any incentive-based policies we have analyzed to date."
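--- The two-stage recipe (fit an individual-level behavior model by maximum likelihood, then deploy copies of it inside a multi-agent simulation) can be caricatured in a few lines. A toy version with fabricated household data, a hand-rolled logistic fit, and made-up peer effects; none of the specific features, coefficients, or network structure come from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

# Stage 1: learn an individual adoption model from (fabricated) household data.
# Features: intercept, income, number of adopters among peers.
n, beta_true = 5000, np.array([-4.0, 0.8, 1.2])
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.poisson(2, size=n)])
p = 1 / (1 + np.exp(-X @ beta_true))
adopt = rng.uniform(size=n) < p

# Plain Newton/IRLS maximum-likelihood logistic fit
beta = np.zeros(3)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))
    W = mu * (1 - mu)
    beta += np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (adopt - mu))

# Stage 2: deploy the learned model in a multi-agent simulation.
agents = np.column_stack([np.ones(200), rng.normal(size=200), np.zeros(200)])
adopted = np.zeros(200, dtype=bool)
peers = rng.integers(0, 200, size=(200, 5))          # random peer groups
for step in range(20):
    agents[:, 2] = adopted[peers].sum(axis=1)        # update peer-adoption feature
    prob = 1 / (1 + np.exp(-agents @ beta))
    adopted |= rng.uniform(size=200) < prob
print(adopted.mean())    # forecast aggregate adoption level
```

The paper's point is that validating the stage-1 model on held-out individual decisions, rather than calibrating aggregate fit, is what makes the simulation predictive.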
to:NB  agent-based_models  statistics  re:stacs 
15 days ago
Putting the agent in agent-based modeling | SpringerLink
"One of the perquisites of a talk like this is that I get to expound on broad themes. AAMAS is a conference about agents and multiples of agents, so I probably ought to say something about agents. Of course, my position on agents is that I am all for them. Today I’d like to make a case for actually putting agents in agent-based models. I hope that by the end of the talk you have some idea about what I mean by this."
to:NB  to_read  agent-based_models  ai  wellman.michael_p. 
15 days ago
[1411.2127] Causal Inference with a Graphical Hierarchy of Interventions
"Identifying causal parameters from observational data is fraught with subtleties due to the issues of selection bias and confounding. In addition, more complex questions of interest, such as effects of treatment on the treated and mediated effects may not always be identified even in data where treatment assignment is known and under investigator control, or may be identified under one causal model but not another.
"Increasingly complex effects of interest, coupled with a diversity of causal models in use resulted in a fragmented view of identification. This fragmentation makes it unnecessarily difficult to determine if a given parameter is identified (and in what model), and what assumptions must hold for this to be the case. This, in turn, complicates the development of estimation theory and sensitivity analysis procedures.
"In this paper, we give a unifying view of a large class of causal effects of interest in terms of a hierarchy of interventions, and show that identification theory for this large class reduces to an identification theory of random variables under interventions from this hierarchy. Moreover, we show that one type of intervention in the hierarchy is naturally associated with queries identified under the Finest Fully Randomized Causally Interpretable Structure Tree Graph (FFRCISTG) model of Robins (via the extended g-formula), and another is naturally associated with queries identified under the Non-Parametric Structural Equation Model with Independent Errors (NPSEM-IE) of Pearl, via a more general functional we call the edge g-formula.
"Our results motivate the study of estimation theory for the edge g-formula, since we show it arises both in mediation analysis, and in settings where treatment assignment has unobserved causes, such as models associated with Pearl's front-door criterion."
to:NB  to_read  causal_inference  graphical_models  statistics  identifiability 
15 days ago
Rate of convergence in the central limit theorem and in the strong law of large numbers for von mises statistics | SpringerLink
"This paper provides the rate of convergence in the central limit theorem and in the strong law of large numbers for von Mises statistics $V_N = N^{-m} \sum_{i_1=1}^{N} \cdots \sum_{i_m=1}^{N} h(X_{i_1}, \ldots, X_{i_m})$, $N \geq m$, based on i.i.d. random variables $X_1, \ldots, X_N$.
"The proofs rely on a decomposition of von Mises statistics into a linear combination of U-statistics and then use (generalized) results on the convergence rates for U-statistics obtained by Grams/Serfling [1973] and Callaert/Janssen [1978]."
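--- The V- versus U-statistic distinction in miniature: the von Mises statistic averages the kernel over all index tuples (diagonal terms included), the U-statistic only over distinct ones, and for a fixed kernel order the two differ by O(1/N). A quick numerical check with the variance kernel (my example, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=2000)
N = len(X)

h = lambda a, b: 0.5 * (a - b) ** 2   # symmetric kernel with E h(X1, X2) = Var(X)
pairs = h(X[:, None], X[None, :])     # kernel over all index pairs

# von Mises (V-) statistic: average over ALL pairs, diagonal included
V = pairs.sum() / N**2
# U-statistic: average over distinct pairs only
U = (pairs.sum() - np.trace(pairs)) / (N * (N - 1))

print(V, U)  # both near Var(X) = 1; they differ only at O(1/N)
```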
to:NB  asymptotics  central_limit_theorem  u-statistics  re:network_bootstraps 
16 days ago
Cybernetics: A mathematician of mind : Nature : Nature Research
A nice review of a promising-sounding book. (And I had no idea that our Manuel Blum was a pupil of McCulloch's!)
in_NB  book_reviews  mcculloch.warren  cybernetics  neuroscience 
16 days ago
Lopez, D.S., Jr.: The Lotus Sūtra: A Biography. (eBook and Hardcover)
"The Lotus Sutra is arguably the most famous of all Buddhist scriptures. Composed in India in the first centuries of the Common Era, it is renowned for its inspiring message that all beings are destined for supreme enlightenment. Here, Donald Lopez provides an engaging and accessible biography of this enduring classic.
"Lopez traces the many roles the Lotus Sutra has played in its travels through Asia, Europe, and across the seas to America. The story begins in India, where it was one of the early Mahayana sutras, which sought to redefine the Buddhist path. In the centuries that followed, the text would have a profound influence in China and Japan, and would go on to play a central role in the European discovery of Buddhism. It was the first Buddhist sutra to be translated from Sanskrit into a Western language—into French in 1844 by the eminent scholar Eugène Burnouf. That same year, portions of the Lotus Sutra appeared in English in The Dial, the journal of New England’s Transcendentalists. Lopez provides a balanced account of the many controversies surrounding the text and its teachings, and describes how the book has helped to shape the popular image of the Buddha today. He explores how it was read by major literary figures such as Henry David Thoreau and Gustave Flaubert, and how it was used to justify self-immolation in China and political extremism in Japan."
to:NB  books:noted  buddhism  history_of_religion  history_of_ideas  india  cultural_exchange 
19 days ago
Contagion in Financial Networks
"The recent financial crisis has prompted much new research on the interconnectedness of the modern financial system and the extent to which it contributes to systemic fragility. Network connections diversify firms' risk exposures, but they also create channels through which shocks can spread by contagion. We review the extensive literature on this issue, with the focus on how network structure interacts with other key variables such as leverage, size, common exposures, and short-term funding. We discuss various metrics that have been proposed for evaluating the susceptibility of the system to contagion and suggest directions for future research."
to:NB  financial_markets  contagion  social_networks  economics  young.h._peyton 
19 days ago
Autocatalytic, bistable, oscillatory networks of biologically relevant organic reactions : Nature : Nature Research
"Networks of organic chemical reactions are important in life and probably played a central part in its origin. Network dynamics regulate cell division, circadian rhythms, nerve impulses and chemotaxis, and guide the development of organisms. Although out-of-equilibrium networks of chemical reactions have the potential to display emergent network dynamics such as spontaneous pattern formation, bistability and periodic oscillations, the principles that enable networks of organic reactions to develop complex behaviours are incompletely understood. Here we describe a network of biologically relevant organic reactions (amide formation, thiolate–thioester exchange, thiolate–disulfide interchange and conjugate addition) that displays bistability and oscillations in the concentrations of organic thiols and amides. Oscillations arise from the interaction between three subcomponents of the network: an autocatalytic cycle that generates thiols and amides from thioesters and dialkyl disulfides; a trigger that controls autocatalytic growth; and inhibitory processes that remove activating thiol species that are produced during the autocatalytic cycle. In contrast to previous studies that have demonstrated oscillations and bistability using highly evolved biomolecules (enzymes and DNA) or inorganic molecules of questionable biochemical relevance (for example, those used in Belousov–Zhabotinskii-type reactions), the organic molecules we use are relevant to metabolism and similar to those that might have existed on the early Earth. By using small organic molecules to build a network of organic reactions with autocatalytic, bistable and oscillatory behaviour, we identify principles that explain the ways in which dynamic networks relevant to life could have developed. Modifications of this network will clarify the influence of molecular structure on the dynamics of reaction networks, and may enable the design of biomimetic networks and of synthetic self-regulating and evolving chemical systems."
to:NB  biochemical_networks  pattern_formation  non-equilibrium  chemistry  physics 
19 days ago
Foundations of Ergodic Theory | Abstract Analysis | Cambridge University Press
"Rich with examples and applications, this textbook provides a coherent and self-contained introduction to ergodic theory, suitable for a variety of one- or two-semester courses. The authors' clear and fluent exposition helps the reader to grasp quickly the most important ideas of the theory, and their use of concrete examples illustrates these ideas and puts the results into perspective. The book requires few prerequisites, with background material supplied in the appendix. The first four chapters cover elementary material suitable for undergraduate students – invariance, recurrence and ergodicity – as well as some of the main examples. The authors then gradually build up to more sophisticated topics, including correlations, equivalent systems, entropy, the variational principle and thermodynamical formalism. The 400 exercises increase in difficulty through the text and test the reader's understanding of the whole theory. Hints and solutions are provided at the end of the book."
to:NB  ergodic_theory  mathematics  stochastic_processes  probability  books:noted  re:almost_none 
22 days ago
Humanitarian Invasion: Global Development in Cold War Afghanistan | Global History | Cambridge University Press
"Humanitarian Invasion is the first book of its kind: a ground-level inside account of what development and humanitarianism meant for Afghanistan, a country touched by international aid like no other. Relying on Soviet, Western, and NGO archives, interviews with Soviet advisers and NGO workers, and Afghan sources, Timothy Nunan forges a vivid account of the impact of development on a country on the front lines of the Cold War. Nunan argues that Afghanistan functioned as a laboratory for the future of the Third World nation-state. If, in the 1960s, Soviets, Americans, and Germans sought to make a territorial national economy for Afghanistan, later, under military occupation, Soviet nation-builders, French and Swedish humanitarians, and Pakistani-supported guerrillas fought a transnational civil war over Afghan statehood. Covering the entire period from the Cold War to Taliban rule, Humanitarian Invasion signals the beginning of a new stage in the writing of international history."
to:NB  books:noted  development_economics  afghanistan  cold_war  20th_century_history 
4 weeks ago
Empires and Bureaucracy in World History | Global History | Cambridge University Press
"How did empires rule different peoples across vast expanses of space and time? And how did small numbers of imperial bureaucrats govern large numbers of subordinated peoples? Empires and Bureaucracy in World History seeks answers to these fundamental problems in imperial studies by exploring the power and limits of bureaucracy. The book is pioneering in bringing together historians of antiquity and the Middle Ages with scholars of post-medieval European empires, while a genuinely world-historical perspective is provided by chapters on China, the Incas and the Ottomans. The editors identify a paradox in how bureaucracy operated on the scale of empires and so help explain why some empires endured for centuries while, in the contemporary world, empires fail almost before they begin. By adopting a cross-chronological and world-historical approach, the book challenges the abiding association of bureaucratic rationality with 'modernity' and the so-called 'Rise of the West'."
to:NB  books:noted  bureaucracy  imperialism  world_history  comparative_history 
4 weeks ago
Of Limits and Growth | Global History | Cambridge University Press
"Of Limits and Growth connects three of the most important aspects of the twentieth century: decolonization, the rise of environmentalism, and the United States' support for economic development and modernization in the Third World. It links these trends by revealing how environmental NGOs challenged and reformed development approaches of the U.S. government, World Bank, and United Nations from the 1960s through the 1990s. The book shows how NGOs promoted the use of “appropriate” technologies, environmental reviews in the lending process, development plans based on ecological principles, and international cooperation on global issues such as climate change. It also reveals that the “sustainable development” concept emerged from transnational negotiations in which environmentalists accommodated the developmental aspirations of Third World intellectuals and leaders. In sum, Of Limits and Growth offers a new history of sustainability by elucidating the global origins of environmental activism, the ways in which environmental activists challenged development approaches worldwide, and how environmental non-state actors reshaped the United States' and World Bank's development policies."
to:NB  books:noted  20th_century_history  environmentalism  economic_policy  development_economics 
4 weeks ago
Vellend, M.: The Theory of Ecological Communities (MPB-57) (eBook and Hardcover).
"A plethora of different theories, models, and concepts make up the field of community ecology. Amid this vast body of work, is it possible to build one general theory of ecological communities? What other scientific areas might serve as a guiding framework? As it turns out, the core focus of community ecology—understanding patterns of diversity and composition of biological variants across space and time—is shared by evolutionary biology and its very coherent conceptual framework, population genetics theory. The Theory of Ecological Communities takes this as a starting point to pull together community ecology’s various perspectives into a more unified whole.
"Mark Vellend builds a theory of ecological communities based on four overarching processes: selection among species, drift, dispersal, and speciation. These are analogues of the four central processes in population genetics theory—selection within species, drift, gene flow, and mutation—and together they subsume almost all of the many dozens of more specific models built to describe the dynamics of communities of interacting species. The result is a theory that allows the effects of many low-level processes, such as competition, facilitation, predation, disturbance, stress, succession, colonization, and local extinction to be understood as the underpinnings of high-level processes with widely applicable consequences for ecological communities."
to:NB  books:noted  ecology  evolutionary_biology 
4 weeks ago
Graphical Modeling for Multivariate Hawkes Processes with Nonparametric Link Functions - Eichler - 2016 - Journal of Time Series Analysis - Wiley Online Library
"Hawkes (1971a) introduced a powerful multivariate point process model of mutually exciting processes to explain causal structure in data. In this article, it is shown that the Granger causality structure of such processes is fully encoded in the corresponding link functions of the model. A new nonparametric estimator of the link functions based on a time-discretized version of the point process is introduced by using an infinite order autoregression. Consistency of the new estimator is derived. The estimator is applied to simulated data and to neural spike train data from the spinal dorsal horn of a rat."
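--- For the unfamiliar: a Hawkes process is a point process whose intensity jumps after each event and then decays, so events excite further events; the "link functions" estimated here generalize the excitation kernels. A minimal univariate simulation by Ogata's thinning algorithm with an exponential kernel (parameter values are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(2)
mu, alpha, beta, T = 0.5, 0.8, 1.5, 200.0   # baseline, jump size, decay, horizon

def intensity(t, events):
    """Conditional intensity lambda(t) with exponential excitation kernel."""
    past = np.asarray([s for s in events if s < t])
    return mu + alpha * np.sum(np.exp(-beta * (t - past)))

# Ogata's thinning: propose from a dominating rate, accept with ratio lambda/lam_bar.
events, t = [], 0.0
while t < T:
    lam_bar = intensity(t, events) + alpha   # bound valid until the next event
    t += rng.exponential(1.0 / lam_bar)
    if t < T and rng.uniform() < intensity(t, events) / lam_bar:
        events.append(t)

# Stationary mean rate of a subcritical Hawkes process is mu / (1 - alpha/beta)
print(len(events) / T, mu / (1 - alpha / beta))
```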
to:NB  time_series  graphical_models  nonparametrics  statistics  point_processes  neural_data_analysis 
5 weeks ago
Turing learning: a metric-free approach to inferring behavior and its application to swarms | SpringerLink
"We propose Turing Learning, a novel system identification method for inferring the behavior of natural or artificial systems. Turing Learning simultaneously optimizes two populations of computer programs, one representing models of the behavior of the system under investigation, and the other representing classifiers. By observing the behavior of the system as well as the behaviors produced by the models, two sets of data samples are obtained. The classifiers are rewarded for discriminating between these two sets, that is, for correctly categorizing data samples as either genuine or counterfeit. Conversely, the models are rewarded for ‘tricking’ the classifiers into categorizing their data samples as genuine. Unlike other methods for system identification, Turing Learning does not require predefined metrics to quantify the difference between the system and its models. We present two case studies with swarms of simulated robots and prove that the underlying behaviors cannot be inferred by a metric-based system identification method. By contrast, Turing Learning infers the behaviors with high accuracy. It also produces a useful by-product—the classifiers—that can be used to detect abnormal behavior in the swarm. Moreover, we show that Turing Learning also successfully infers the behavior of physical robot swarms. The results show that collective behaviors can be directly inferred from motion trajectories of individuals in the swarm, which may have significant implications for the study of animal collectives. Furthermore, Turing Learning could prove useful whenever a behavior is not easily characterizable using metrics, making it suitable for a wide range of applications."

--- Oh FFS. Co-evolutionary learning of classifiers and hard instances was an old idea when I encountered it in graduate school 20+ years ago. (See, e.g., the discussion of Hillis's work in the 1980s in ch. 1 of Mitchell's _Introduction to Genetic Algorithms_ [1996].) I suppose it's possible that the paper acknowledges this is a new implementation of an ancient idea, while the abstract (and the publicity: http://www.defenseone.com/technology/2016/09/new-ai-learns-through-observation-alone-what-means-drone-surveillance/131322/ ) is breathless. It's _possible_.

(Also: anyone who thinks that using classification accuracy means they're doing "metric-free systems identification" fully deserves what will happen to them.)
machine_learning  reinventing_the_wheel_and_putting_out_a_press_release  to_be_shot_after_a_fair_trial  why_oh_why_cant_we_have_a_better_academic_publishing_system 
6 weeks ago
Belief without credence | SpringerLink
"One of the deepest ideological divides in contemporary epistemology concerns the relative importance of belief versus credence. A prominent consideration in favor of credence-based epistemology is the ease with which it appears to account for rational action. In contrast, cases with risky payoff structures threaten to break the link between rational belief and rational action. This threat poses a challenge to traditional epistemology, which maintains the theoretical prominence of belief. The core problem, we suggest, is that belief may not be enough to register all aspects of a subject’s epistemic position with respect to any given proposition. We claim this problem can be solved by introducing other doxastic attitudes—genuine representations—that differ in strength from belief. The resulting alternative picture, a kind of doxastic states pluralism, retains the central features of traditional epistemology—most saliently, an emphasis on truth as a kind of objective accuracy—while adequately accounting for rational action."
to:NB  epistemology  decision_theory  rationality 
6 weeks ago
IEEE Xplore Document - Excess-Risk of Distributed Stochastic Learners
"This work studies the learning ability of consensus and diffusion distributed learners from continuous streams of data arising from different but related statistical distributions. Four distinctive features for diffusion learners are revealed in relation to other decentralized schemes even under left-stochastic combination policies. First, closed-form expressions for the evolution of their excess-risk are derived for strongly-convex risk functions under a diminishing step-size rule. Second, using these results, it is shown that the diffusion strategy improves the asymptotic convergence rate of the excess-risk relative to non-cooperative schemes. Third, it is shown that when the in-network cooperation rules are designed optimally, the performance of the diffusion implementation can outperform that of naive centralized processing. Finally, the arguments further show that diffusion outperforms consensus strategies asymptotically, and that the asymptotic excess-risk expression is invariant to the particular network topology. The framework adopted in this work studies convergence in the stronger mean-square-error sense, rather than in distribution, and develops tools that enable a close examination of the differences between distributed strategies in terms of asymptotic behavior, as well as in terms of convergence rates."
to:NB  learning_theory  distributed_systems  statistics  collective_cognition  via:ded-maxim 
6 weeks ago
[1609.00037] Good Enough Practices in Scientific Computing
"We present a set of computing tools and techniques that every researcher can and should adopt. These recommendations synthesize inspiration from our own work, from the experiences of the thousands of people who have taken part in Software Carpentry and Data Carpentry workshops over the past six years, and from a variety of other guides. Unlike some other guides, our recommendations are aimed specifically at people who are new to research computing."
to:NB  to_teach:statcomp  to_teach  scientific_computing  have_read  to:blog 
6 weeks ago
[1606.08650] Approximate Smoothing and Parameter Estimation in High-Dimensional State-Space Models
"We present approximate algorithms for performing smoothing in a class of high-dimensional state-space models via sequential Monte Carlo methods ("particle filters"). In high dimensions, a prohibitively large number of Monte Carlo samples ("particles") -- growing exponentially in the dimension of the state space -- is usually required to obtain a useful smoother. Using blocking strategies as in Rebeschini and Van Handel (2015) (and earlier pioneering work on blocking), we exploit the spatial ergodicity properties of the model to circumvent this curse of dimensionality. We thus obtain approximate smoothers that can be computed recursively in time and in parallel in space. First, we show that the bias of our blocked smoother is bounded uniformly in the time horizon and in the model dimension. We then approximate the blocked smoother with particles and derive the asymptotic variance of idealised versions of our blocked particle smoother to show that variance is no longer adversely affected by the dimension of the model. Finally, we employ our method to successfully perform maximum-likelihood estimation via stochastic gradient-ascent and stochastic expectation–maximisation algorithms in a 100-dimensional state-space model."
to:NB  particle_filters  time_series  statistical_inference_for_stochastic_processes  filtering  stochastic_processes  state-space_models  high-dimensional_statistics  singh.sumeetpal_s.  statistics  re:fitness_sampling 
6 weeks ago
[1508.05906] Chaining, Interpolation, and Convexity
"We show that classical chaining bounds on the suprema of random processes in terms of entropy numbers can be systematically improved when the underlying set is convex: the entropy numbers need not be computed for the entire set, but only for certain "thin" subsets. This phenomenon arises from the observation that real interpolation can be used as a natural chaining mechanism. Unlike the general form of Talagrand's generic chaining method, which is sharp but often difficult to use, the resulting bounds involve only entropy numbers but are nonetheless sharp in many situations in which classical entropy bounds are suboptimal. Such bounds are readily amenable to explicit computations in specific examples, and we discover some old and new geometric principles for the control of chaining functionals as special cases."
to:NB  empirical_processes  learning_theory  approximation  convexity  functional_analysis  van_handel.ramon 
6 weeks ago
[1301.6585] Can local particle filters beat the curse of dimensionality?
"The discovery of particle filtering methods has enabled the use of nonlinear filtering in a wide array of applications. Unfortunately, the approximation error of particle filters typically grows exponentially in the dimension of the underlying model. This phenomenon has rendered particle filters of limited use in complex data assimilation problems. In this paper, we argue that it is often possible, at least in principle, to develop local particle filtering algorithms whose approximation error is dimension-free. The key to such developments is the decay of correlations property, which is a spatial counterpart of the much better understood stability property of nonlinear filters. For the simplest possible algorithm of this type, our results provide under suitable assumptions an approximation error bound that is uniform both in time and in the model dimension. More broadly, our results provide a framework for the investigation of filtering problems and algorithms in high dimension."
to:NB  filtering  van_handel.ramon  particle_filters  stochastic_processes  high-dimensional_statistics 
6 weeks ago
[1308.4117] Comparison Theorems for Gibbs Measures
"The Dobrushin comparison theorem is a powerful tool to bound the difference between the marginals of high-dimensional probability distributions in terms of their local specifications. Originally introduced to prove uniqueness and decay of correlations of Gibbs measures, it has been widely used in statistical mechanics as well as in the analysis of algorithms on random fields and interacting Markov chains. However, the classical comparison theorem requires validity of the Dobrushin uniqueness criterion, essentially restricting its applicability in most models to a small subset of the natural parameter space. In this paper we develop generalized Dobrushin comparison theorems in terms of influences between blocks of sites, in the spirit of Dobrushin-Shlosman and Weitz, that substantially extend the range of applicability of the classical comparison theorem. Our proofs are based on the analysis of an associated family of Markov chains. We develop in detail an application of our main results to the analysis of sequential Monte Carlo algorithms for filtering in high dimension."
to:NB  statistical_mechanics  stochastic_processes  ergodic_theory  mixing  van_handel.ramon 
6 weeks ago
The price of complexity in financial networks
"Financial institutions form multilayer networks by engaging in contracts with each other and by holding exposures to common assets. As a result, the default probability of one institution depends on the default probability of all of the other institutions in the network. Here, we show how small errors on the knowledge of the network of contracts can lead to large errors in the probability of systemic defaults. From the point of view of financial regulators, our findings show that the complexity of financial networks may decrease the ability to mitigate systemic risk, and thus it may increase the social cost of financial crises."
to:NB  networks  economics  financial_markets  risk_assessment 
6 weeks ago
How chimpanzees cooperate in a competitive world
"Our species is routinely depicted as unique in its ability to achieve cooperation, whereas our closest relative, the chimpanzee (Pan troglodytes), is often characterized as overly competitive. Human cooperation is assisted by the cost attached to competitive tendencies through enforcement mechanisms, such as punishment and partner choice. To examine if chimpanzees possess the same ability to mitigate competition, we set up a cooperative task in the presence of the entire group of 11 adults, which required two or three individuals to pull jointly to receive rewards. This open-group set-up provided ample opportunity for competition (e.g., freeloading, displacements) and aggression. Despite this unique set-up and initial competitiveness, cooperation prevailed in the end, being at least five times as common as competition. The chimpanzees performed 3,565 cooperative acts while using a variety of enforcement mechanisms to overcome competition and freeloading, as measured by (attempted) thefts of rewards. These mechanisms included direct protest by the target, third-party punishment in which dominant individuals intervened against freeloaders, and partner choice. There was a marked difference between freeloading and displacement; freeloading tended to elicit withdrawal and third-party interventions, whereas displacements were met with a higher rate of direct retaliation. Humans have shown similar responses in controlled experiments, suggesting shared mechanisms across the primates to mitigate competition for the sake of cooperation."
to:NB  evolution_of_cooperation  primates 
6 weeks ago
Semantic representations in the temporal pole predict false memories
"Recent advances in neuroscience have given us unprecedented insight into the neural mechanisms of false memory, showing that artificial memories can be inserted into the memory cells of the hippocampus in a way that is indistinguishable from true memories. However, this alone is not enough to explain how false memories can arise naturally in the course of our daily lives. Cognitive psychology has demonstrated that many instances of false memory, both in the laboratory and the real world, can be attributed to semantic interference. Whereas previous studies have found that a diverse set of regions show some involvement in semantic false memory, none have revealed the nature of the semantic representations underpinning the phenomenon. Here we use fMRI with representational similarity analysis to search for a neural code consistent with semantic false memory. We find clear evidence that false memories emerge from a similarity-based neural code in the temporal pole, a region that has been called the “semantic hub” of the brain. We further show that each individual has a partially unique semantic code within the temporal pole, and this unique code can predict idiosyncratic patterns of memory errors. Finally, we show that the same neural code can also predict variation in true-memory performance, consistent with an adaptive perspective on false memory. Taken together, our findings reveal the underlying structure of neural representations of semantic knowledge, and how this semantic structure can both enhance and distort our memories."
to:NB  neuroscience  psychology  memory  fmri 
6 weeks ago
Intellectual Pursuits of Nicolas Rashevsky - The Queer Duck | Maya Shmailov | Springer
"Who was Nicolas Rashevsky? To answer that question, this book draws on Rashevsky’s unexplored personal archival papers and shares interviews with his family, students and friends, as well as discussions with biologists and mathematical biologists, to flesh out and complete the picture.
"“Most modern-day biologists have never heard of Rashevsky. Why?” In what constitutes the first detailed biography of theoretical physicist Nicolas Rashevsky (1899-1972), spanning key aspects of his long scientific career, the book captures Rashevsky’s ways of thinking about the place mathematical biology should have in biology and his personal struggle for the acceptance of his views. It brings to light the tension between mathematicians, theoretical physicists and biologists when it comes to the introduction of physico-mathematical tools into biology. Rashevsky’s successes and failures in his efforts to establish mathematical biology as a subfield of biology provide an important test case for understanding the role of theory (in particular mathematics) in understanding the natural world.
"With the biological sciences moving towards new vistas of inter- and multi-disciplinary collaborations and research programs, the book will appeal to a wide readership ranging from historians, sociologists, and ethnographers of American science and culture to students and general readers with an interest in the history of the life sciences, mathematical biology and the social construction of science."

--- Rashevsky has long seemed to me to be a key player in the secret intellectual history of the 20th century, someone who influenced and encouraged all sorts of people who made more famous (and perhaps more lasting) contributions...
to:NB  books:noted  history_of_science  lives_of_the_scientists  rashevsky.nicolas 
6 weeks ago
[1510.04740] Semiparametric theory and empirical processes in causal inference
"In this paper we review important aspects of semiparametric theory and empirical processes that arise in causal inference problems. We begin with a brief introduction to the general problem of causal inference, and go on to discuss estimation and inference for causal effects under semiparametric models, which allow parts of the data-generating process to be unrestricted if they are not of particular interest (i.e., nuisance functions). These models are very useful in causal problems because the outcome process is often complex and difficult to model, and there may only be information available about the treatment process (at best). Semiparametric theory gives a framework for benchmarking efficiency and constructing estimators in such settings. In the second part of the paper we discuss empirical process theory, which provides powerful tools for understanding the asymptotic behavior of semiparametric estimators that depend on flexible nonparametric estimators of nuisance functions. These tools are crucial for incorporating machine learning and other modern methods into causal inference analyses. We conclude by examining related extensions and future directions for work in semiparametric causal inference."
to:NB  statistics  causal_inference  nonparametrics  empirical_processes  to_read  kith_and_kin  kennedy.edward_h. 
7 weeks ago
Measuring Paradigmaticness of Disciplines Using Text | Sociological Science
"In this paper, we describe new methods that use the text of publications to measure the paradigmaticness of disciplines. Drawing on the text of published articles in the Web of Science, we build samples of disciplinary discourse. Using these language samples, we measure the two core concepts of paradigmaticness—consensus and rapid discovery (Collins 1994)—and show the relative positioning of eight example disciplines on each of these measures. Our measures show consistent differences between the “hard” sciences and “soft” social sciences. Deviations in the expected ranking of disciplines within the sciences and social sciences suggest new interpretations of the hierarchy of disciplines, directions for future research, and further insight into the developments in disciplinary structure and discourse that shape paradigmaticness."
to:NB  sociology_of_science  text_mining  via:kjhealy 
7 weeks ago
Asymmetric Information and Intermediation Chains
"We propose a parsimonious model of bilateral trade under asymmetric information to shed light on the prevalence of intermediation chains that stand between buyers and sellers in many decentralized markets. Our model features a classic problem in economics where an agent uses his market power to inefficiently screen a privately informed counterparty. Paradoxically, involving moderately informed intermediaries also endowed with market power can improve trade efficiency. Long intermediation chains in which each trader's information set is similar to those of his direct counterparties limit traders' incentives to post prices that reduce trade volume and jeopardize gains to trade."
to:NB  economics  market_failures_in_everything  economics_of_imperfect_information 
7 weeks ago
Robust Social Decisions
"We propose and operationalize normative principles to guide social decisions when individuals potentially have imprecise and heterogeneous beliefs, in addition to conflicting tastes or interests. To do so, we adapt the standard Pareto principle to those preference comparisons that are robust to belief imprecision and characterize social preferences that respect this robust principle. We also characterize a suitable restriction of this principle. The former principle provides stronger guidance when it can be satisfied; when it cannot, the latter always provides minimal guidance."
to:NB  decision_theory  social_choice  risk_vs_uncertainty  re:knightian_uncertainty 
7 weeks ago
Neurocomputational mechanisms of prosocial learning and links to empathy
"Reinforcement learning theory powerfully characterizes how we learn to benefit ourselves. In this theory, prediction errors—the difference between a predicted and actual outcome of a choice—drive learning. However, we do not operate in a social vacuum. To behave prosocially we must learn the consequences of our actions for other people. Empathy, the ability to vicariously experience and understand the affect of others, is hypothesized to be a critical facilitator of prosocial behaviors, but the link between empathy and prosocial behavior is still unclear. During functional magnetic resonance imaging (fMRI) participants chose between different stimuli that were probabilistically associated with rewards for themselves (self), another person (prosocial), or no one (control). Using computational modeling, we show that people can learn to obtain rewards for others but do so more slowly than when learning to obtain rewards for themselves. fMRI revealed that activity in a posterior portion of the subgenual anterior cingulate cortex/basal forebrain (sgACC) drives learning only when we are acting in a prosocial context and signals a prosocial prediction error conforming to classical principles of reinforcement learning theory. However, there is also substantial variability in the neural and behavioral efficiency of prosocial learning, which is predicted by trait empathy. More empathic people learn more quickly when benefitting others, and their sgACC response is the most selective for prosocial learning. We thus reveal a computational mechanism driving prosocial learning in humans. This framework could provide insights into atypical prosocial behavior in those with disorders of social cognition."
to:NB  psychology  reinforcement_learning  learning_in_games  evolution_of_cooperation  neuroscience  fmri 
7 weeks ago
European Neolithic societies showed early warning signals of population collapse
"Ecosystems on the verge of major reorganization—regime shift—may exhibit declining resilience, which can be detected using a collection of generic statistical tests known as early warning signals (EWSs). This study explores whether EWSs anticipated human population collapse during the European Neolithic. It analyzes recent reconstructions of European Neolithic (8–4 kya) population trends that reveal regime shifts from a period of rapid growth following the introduction of agriculture to a period of instability and collapse. We find statistical support for EWSs in advance of population collapse. Seven of nine regional datasets exhibit increasing autocorrelation and variance leading up to collapse, suggesting that these societies began to recover from perturbation more slowly as resilience declined. We derive EWS statistics from a prehistoric population proxy based on summed archaeological radiocarbon date probability densities. We use simulation to validate our methods and show that sampling biases, atmospheric effects, radiocarbon calibration error, and taphonomic processes are unlikely to explain the observed EWS patterns. The implications of these results for understanding the dynamics of Neolithic ecosystems are discussed, and we present a general framework for analyzing societal regime shifts using EWS at large spatial and temporal scales. We suggest that our findings are consistent with an adaptive cycling model that highlights both the vulnerability and resilience of early European populations. We close by discussing the implications of the detection of EWS in human systems for archaeology and sustainability science."
to:NB  archaeology  ecology  statistics  time_series 
7 weeks ago
Interactive R On-Line
"IROL was developed by the team of Howard Seltman (email feedback), Rebecca Nugent, Sam Ventura, Ryan Tibshirani, and Chris Genovese at the Department of Statistics at Carnegie Mellon University."

--- I mark this as "to_teach:statcomp", but of course the point is to have people go through this _before_ that course, so the class can cover more interesting stuff.
R  kith_and_kin  seltman.howard  nugent.rebecca  genovese.christopher  ventura.samuel  tibshirani.ryan  to_teach:statcomp 
7 weeks ago
ShinyTex
"ShinyTex is a system for authoring interactive World Wide Web applications (apps) which includes the full capabilities of the R statistical language, particularly in the context of Technology Enhanced Learning (TEL). It uses a modified version of the LaTeX syntax that is standard for document creation among mathematicians and statisticians. It is built on the Shiny platform, an extension of R designed by RStudio to produce web apps. The goal is to provide an easy to use TEL authoring environment with excellent mathematical and statistical support using only free software. ShinyTex authoring can be performed on Windows, OS X, and Linux. Users may view the app on any system with a standard web browser."
R  latex  kith_and_kin  seltman.howard 
7 weeks ago
Organizing Enlightenment: Information Overload and the Invention of the Modern Research University
"Since its inception, the research university has been the central institution of knowledge in the West. Today its intellectual authority is being challenged on many fronts, above all by radical technological change. Organizing Enlightenment tells the story of how the university emerged in the early nineteenth century at a similarly fraught moment of cultural anxiety about revolutionary technologies and their disruptive effects on established institutions of knowledge.
"Drawing on the histories of science, the university, and print, as well as media theory and philosophy, Chad Wellmon explains how the research university and the ethic of disciplinarity it created emerged as the final and most lasting technology of the Enlightenment. Organizing Enlightenment reveals higher education’s story as one not only of the production of knowledge but also of the formation of a particular type of person: the disciplinary self. In order to survive, the university would have to institutionalize a new order of knowledge, one that was self-organizing, internally coherent, and embodied in the very character of the modern, critical scholar."
to:NB  books:noted  academia  history_of_ideas  social_life_of_the_mind  enlightenment  the_present_before_it_was_widely_distributed 
7 weeks ago
Age of System: Understanding the Development of Modern Social Science
"Before the Second World War, social scientists struggled to define and defend their disciplines. After the war, "high modern" social scientists harnessed new resources in a quest to create a unified understanding of human behavior—and to remake the world in the image of their new model man.
"In Age of System, Hunter Heyck explains why social scientists—shaped by encounters with the ongoing "organizational revolution" and its revolutionary technologies of communication and control—embraced a new and extremely influential perspective on science and nature, one that conceived of all things in terms of system, structure, function, organization, and process. He also explores how this emerging unified theory of human behavior implied a troubling similarity between humans and machines, with freighted implications for individual liberty and self-direction.
"These social scientists trained a generation of decision-makers in schools of business and public administration, wrote the basic textbooks from which millions learned how the economy, society, polity, culture, and even the mind worked, and drafted the position papers, books, and articles that helped set the terms of public discourse in a new era of mass media, think tanks, and issue networks. Drawing on close readings of key texts and a broad survey of more than 1,800 journal articles, Heyck follows the dollars—and the dreams—of a generation of scholars that believed in "the system." He maps the broad landscape of changes in the social sciences, focusing especially intently on the ideas and practices associated with modernization theory, rational choice theory, and modeling. A highly accomplished historian, Heyck relays this complicated story with unusual clarity."
to:NB  books:noted  history_of_science  american_history  science_as_a_social_process  social_science_methodology 
7 weeks ago
The Cybernetics Moment: Or Why We Call Our Age the Information Age
"Cybernetics—the science of communication and control as it applies to machines and to humans—originates from efforts during World War II to build automatic anti-aircraft systems. Following the war, this science extended beyond military needs to examine all systems that rely on information and feedback, from the level of the cell to that of society. In The Cybernetics Moment, Ronald R. Kline, a senior historian of technology, examines the intellectual and cultural history of cybernetics and information theory, whose language of "information," "feedback," and "control" transformed the idiom of the sciences, hastened the development of information technologies, and laid the conceptual foundation for what we now call the Information Age.
"Kline argues that, for about twenty years after 1950, the growth of cybernetics and information theory and ever-more-powerful computers produced a utopian information narrative—an enthusiasm for information science that influenced natural scientists, social scientists, engineers, humanists, policymakers, public intellectuals, and journalists, all of whom struggled to come to grips with new relationships between humans and intelligent machines.
"Kline traces the relationship between the invention of computers and communication systems and the rise, decline, and transformation of cybernetics by analyzing the lives and work of such notables as Norbert Wiener, Claude Shannon, Warren McCulloch, Margaret Mead, Gregory Bateson, and Herbert Simon. Ultimately, he reveals the crucial role played by the cybernetics moment—when cybernetics and information theory were seen as universal sciences—in setting the stage for our current preoccupation with information technologies."
to:NB  books:noted  history_of_science  history_of_ideas  cybernetics  information_theory  american_history  wiener.norbert  simon.herbert  the_present_before_it_was_widely_distributed 
7 weeks ago
Information at Sea: Shipboard Command and Control in the U.S. Navy, from Mobile Bay to Okinawa
"The brain of a modern warship is its combat information center (CIC). Data about friendly and enemy forces pour into this nerve center, contributing to command decisions about firing, maneuvering, and coordinating. Timothy S. Wolters has written the first book to investigate the history of the CIC and the many other command and control systems adopted by the U.S. Navy from the Civil War to World War II. What institutional ethos spurred such innovation? Information at Sea tells the fascinating stories of the naval and civilian personnel who developed an array of technologies for managing information at sea, from signal flares and radio to encryption machines and radar.
"Wolters uses previously untapped archival sources to explore how one of America's most technologically oriented institutions addressed information management before the advent of the digital computer. He argues that the human-machine systems used to coordinate forces were as critical to naval successes in World War II as the ships and commanders more familiar to historians."
to:NB  books:noted  history_of_technology  us_military  american_history  innovation  social_life_of_the_mind 
7 weeks ago
Space and the American Imagination
"People dreamed of cosmic exploration—winged spaceships and lunar voyages; space stations and robot astronauts—long before it actually happened. Space and the American Imagination traces the emergence of space travel in the popular mind, its expression in science fiction, and its influence on national space programs.
"Space exploration dramatically illustrates the power of imagination. Howard E. McCurdy shows how that power inspired people to attempt what they once deemed impossible. In a mere half-century since the launch of the first Earth-orbiting satellite in 1957, humans achieved much of what they had once only read about in the fiction of Jules Verne and H. G. Wells and the nonfiction of Willy Ley.
"Reaching these goals, however, required broad-based support, and McCurdy examines how advocates employed familiar metaphors to excite interest (promising, for example, that space exploration would recreate the American frontier experience) and prepare the public for daring missions into space. When unexpected realities and harsh obstacles threatened their progress, the space community intensified efforts to make their wildest dreams come true.
"This lively and important work remains relevant given contemporary questions about future plans at NASA. Fully revised and updated since its original publication in 1997, Space and the American Imagination includes a reworked introduction and conclusion and new chapters on robotics and space commerce."
to:NB  books:noted  american_history  history_of_ideas  space_exploration 
7 weeks ago
Competing with the Soviets: Science, Technology, and the State in Cold War America
"For most of the second half of the twentieth century, the United States and its allies competed with a hostile Soviet Union in almost every way imaginable except open military engagement. The Cold War placed two opposite conceptions of the good society before the uncommitted world and history itself, and science figured prominently in the picture. Competing with the Soviets offers a short, accessible introduction to the special role that science and technology played in maintaining state power during the Cold War, from the atomic bomb to the Human Genome Project.
"The high-tech machinery of nuclear physics and the space race are at the center of this story, but Audra J. Wolfe also examines the surrogate battlefield of scientific achievement in such diverse fields as urban planning, biology, and economics; explains how defense-driven federal investments created vast laboratories and research programs; and shows how unfamiliar worries about national security and corrosive questions of loyalty crept into the supposedly objective scholarly enterprise.
"Based on the assumption that scientists are participants in the culture in which they live, Competing with the Soviets looks beyond the debate about whether military influence distorted science in the Cold War. Scientists’ choices and opportunities have always been shaped by the ideological assumptions, political mandates, and social mores of their times. The idea that American science ever operated in a free zone outside of politics is, Wolfe argues, itself a legacy of the ideological Cold War that held up American science, and scientists, as beacons of freedom in contrast to their peers in the Soviet Union. Arranged chronologically and thematically, the book highlights how ideas about the appropriate relationships among science, scientists, and the state changed over time."
to:NB  books:noted  history_of_science  cold_war  american_history  science_as_a_social_process 
7 weeks ago
Reconfiguring the World
"Change in human understanding of the natural world during the early modern period marks one of the most important episodes in intellectual history. This era is often referred to as the scientific revolution, but recent scholarship has challenged traditional accounts. Here, in Reconfiguring the World, Margaret J. Osler treats the development of the sciences in Europe from the early sixteenth to the late seventeenth centuries as a complex and multifaceted process.
"The worldview embedded in modern science is a relatively recent development. Osler aims to convey a nuanced understanding of how the natural world looked to early modern thinkers such as Galileo, Descartes, Boyle, and Newton. She describes investigation and understanding of the natural world in terms that the thinkers themselves would have used. Tracing the views of the natural world to their biblical, Greek, and Arabic sources, Osler demonstrates the impact of the Renaissance recovery of ancient texts, printing, the Protestant Reformation, and the exploration of the New World. She shows how the traditional disciplinary boundaries established by Aristotle changed dramatically during this period and finds the tensions of science and religion expressed as differences between natural philosophy and theology.
"Far from a triumphalist account, Osler’s story includes false starts and dead ends. Ultimately, she shows how a few gifted students of nature changed the way we see ourselves and the universe."
to:NB  books:noted  history_of_science  early_modern_european_history  scientific_revolution 
7 weeks ago
Red to Blue | DCCC
For those of us in securely-Democratic districts...
7 weeks ago
Factor Modelling for High-Dimensional Time Series: Inference and Model Selection - Chan - 2016 - Journal of Time Series Analysis - Wiley Online Library
"Analysis of high-dimensional time series data is of increasing interest among different fields. This article studies high-dimensional time series from a dimension reduction perspective using factor modelling. Statistical inference is conducted using eigen-analysis of a certain non-negative definite matrix related to autocovariance matrices of the time series, which is applicable to fixed or increasing dimension. When the dimension goes to infinity, the rate of convergence and limiting distributions of estimated factors are established. Using the limiting distributions of estimated factors, a high-dimensional final prediction error criterion is proposed to select the number of factors. Asymptotic properties of the criterion are illustrated by simulation studies and real applications."
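--- A minimal numpy sketch of the eigen-analysis step the abstract describes: form a non-negative definite matrix from lagged sample autocovariances, and take its leading eigenvectors as factor loadings. The eigenvalue-ratio rule used here to pick the number of factors is a common stand-in, not the paper's final-prediction-error criterion, and the simulated series is made up for illustration.

```python
import numpy as np

def estimate_factors(X, k0=2, r=None):
    """Eigen-analysis factor estimation for a T-by-p time series X.

    Forms M = sum_{k=1..k0} Sigma(k) Sigma(k)', a non-negative definite
    matrix built from lag-k sample autocovariances, and takes its leading
    eigenvectors as the estimated factor loadings.
    """
    T, p = X.shape
    Xc = X - X.mean(axis=0)
    M = np.zeros((p, p))
    for k in range(1, k0 + 1):
        Sigma_k = Xc[k:].T @ Xc[:-k] / T      # lag-k sample autocovariance
        M += Sigma_k @ Sigma_k.T              # non-negative definite by construction
    eigvals, eigvecs = np.linalg.eigh(M)      # eigh returns ascending order
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
    if r is None:
        # Crude eigenvalue-ratio rule: look for the big drop in the spectrum.
        ratios = eigvals[1:p // 2] / eigvals[:p // 2 - 1]
        r = int(np.argmin(ratios)) + 1
    loadings = eigvecs[:, :r]                 # estimated loading matrix
    factors = Xc @ loadings                   # estimated factor series
    return loadings, factors, r

# Simulated check: a 10-dimensional series driven by two AR(1) factors.
rng = np.random.default_rng(0)
T, p = 500, 10
f = np.zeros((T, 2))
for t in range(1, T):
    f[t] = 0.8 * f[t - 1] + rng.standard_normal(2)
A_true = rng.standard_normal((p, 2))
X = f @ A_true.T + 0.1 * rng.standard_normal((T, p))
loadings, factors, r = estimate_factors(X)
```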
to:NB  time_series  factor_analysis  high-dimensional_statistics  statistics 
8 weeks ago
Conjuring Asia | East Asian History | Cambridge University Press
"The promise of magic has always commanded the human imagination, but the story of industrial modernity is usually seen as a process of disenchantment. Drawing on the writings and performances of the so-called 'Golden Age Magicians' from the turn of the twentieth century, Chris Goto-Jones unveils the ways in which European and North American encounters with (and representations of) Asia - the fabled Mystic East - worked to re-enchant experiences of the modern world. Beginning with a reconceptualization of the meaning of 'modern magic' itself - moving beyond conventional categories of 'real' and 'fake' magic - Goto-Jones' acclaimed book guides us on a magical mystery tour around India, China and Japan, showing us levitations and decapitations, magic duels and bullet catches, goldfish bowls and paper butterflies. In the end, this mesmerizing book reveals Orientalism as a kind of magic in itself, casting a spell over Western culture that leaves it transformed even today."
to:NB  books:noted  magic  orientalism  modernity  history 
8 weeks ago
Critically evaluating the theory and performance of Bayesian analysis of macroevolutionary mixtures
"Bayesian analysis of macroevolutionary mixtures (BAMM) has recently taken the study of lineage diversification by storm. BAMM estimates the diversification-rate parameters (speciation and extinction) for every branch of a study phylogeny and infers the number and location of diversification-rate shifts across branches of a tree. Our evaluation of BAMM reveals two major theoretical errors: (i) the likelihood function (which estimates the model parameters from the data) is incorrect, and (ii) the compound Poisson process prior model (which describes the prior distribution of diversification-rate shifts across branches) is incoherent. Using simulation, we demonstrate that these theoretical issues cause statistical pathologies; posterior estimates of the number of diversification-rate shifts are strongly influenced by the assumed prior, and estimates of diversification-rate parameters are unreliable. Moreover, the inability to correctly compute the likelihood or to correctly specify the prior for rate-variable trees precludes the use of Bayesian approaches for testing hypotheses regarding the number and location of diversification-rate shifts using BAMM."
to:NB  phylogenetics  statistics  re:phil-of-bayes_paper 
8 weeks ago
Constraint, natural selection, and the evolution of human body form
"Variation in body form among human groups is structured by a blend of natural selection driven by local climatic conditions and random genetic drift. However, attempts to test ecogeographic hypotheses have not distinguished between adaptive traits (i.e., those that evolved as a result of selection) and those that evolved as a correlated response to selection on other traits (i.e., nonadaptive traits), complicating our understanding of the relationship between climate and morphological distinctions among populations. Here, we use evolutionary quantitative methods to test if traits previously identified as supporting ecogeographic hypotheses were actually adaptive by estimating the force of selection on individual traits needed to drive among-group differentiation. Our results show that not all associations between trait means and latitude were caused by selection acting directly on each individual trait. Although radial and tibial length and biiliac and femoral head breadth show signs of responses to directional selection matching ecogeographic hypotheses, the femur was subject to little or no directional selection despite having shorter values by latitude. Additionally, in contradiction to ecogeographic hypotheses, the humerus was under directional selection for longer values by latitude. Responses to directional selection in the tibia and radius induced a nonadaptive correlated response in the humerus that overwhelmed its own trait-specific response to selection. This result emphasizes that mean differences between groups are not good indicators of which traits are adaptations in the absence of information about covariation among characteristics."

--- Not obvious to me how they can pick out the constraints here (maybe assuming equal within-group covariances, and assuming those reflect constraints?), but presumably that's addressed in the paper.
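--- The correlated-response logic can be sketched with the multivariate breeder's equation, Δz̄ = Gβ: direct selection on some traits drags along genetically correlated ones. The G matrix and selection gradients below are made-up numbers for illustration, not estimates from the paper.

```python
import numpy as np

traits = ["tibia", "radius", "humerus"]

# Hypothetical additive genetic variance-covariance matrix G; the positive
# tibia-humerus and radius-humerus covariances are assumptions for illustration.
G = np.array([
    [1.0, 0.3, 0.5],
    [0.3, 1.0, 0.4],
    [0.5, 0.4, 1.0],
])

# Directional selection acts only on tibia and radius; none on the humerus.
beta = np.array([0.2, 0.2, 0.0])

# Multivariate breeder's equation: per-generation response dz = G @ beta.
# The humerus responds (0.5*0.2 + 0.4*0.2 = 0.18) despite beta_humerus = 0:
# a purely correlated, nonadaptive response.
dz = G @ beta

# Inverting the logic, as the paper does, recovers the selection gradients
# needed to produce a given among-group difference: beta_hat = G^{-1} dz.
beta_hat = np.linalg.solve(G, dz)
```

The inversion step is why the analysis can separate direct selection from correlated response: a trait whose mean shifts but whose recovered gradient is near zero changed as a by-product of selection on other traits.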
to:NB  human_evolution  evolutionary_biology 
8 weeks ago
Unpaid, stressed, and confused: patients are the health care system's free labor - Vox
I'll just add that in my experience, this work often falls on the healthy spouse or children of a seriously ill person.
medicine  our_decrepit_institutions  via:?  have_read 
8 weeks ago