Common Property | Boston Review
Social insurance as rents from a share in (fairly literally) the commonwealth.
political_philosophy  welfare_state  hayek.f.a._von  paine.thomas  anderson.elizabeth  have_read
2 hours ago
Transformative Treatments
"Contemporary social-scientific research seeks to identify specific causal mechanisms for outcomes of theoretical interest. Experiments that randomize populations to treatment and control conditions are the “gold standard” for causal inference. We identify, describe, and analyze the problem posed by transformative treatments. Such treatments radically change treated individuals in a way that creates a mismatch in populations, but this mismatch is not empirically detectable at the level of counterfactual dependence. In such cases, the identification of causal pathways is underdetermined in a previously unrecognized way. Moreover, if the treatment is indeed transformative it breaks the inferential structure of the experimental design. Transformative treatments are not curiosities or “corner cases”, but are plausible mechanisms in a large class of events of theoretical interest, particularly ones where deliberate randomization is impractical and quasi-experimental designs are sought instead. They cast long-running debates about treatment and selection effects in a new light, and raise new methodological challenges."

--- After skimming, I'm left spluttering "but, but, _every_ intervention creates a new population!", so I am probably missing something fundamental, and should do more than just skim.
to:NB  causality  causal_inference  barely-comprehensible_metaphysics  healy.kieran  have_skimmed
yesterday
"Gunpowder Empire": Should We Generalize Mark Elvin's High-Level Equilibrium Trap?
Brad has saved me from writing a post (but not, perhaps, from promulgating a pet semi-crank notion).
yesterday
[1607.05506] Distribution-dependent concentration inequalities for tighter generalization bounds
"We prove several distribution-dependent extensions of Hoeffding and McDiarmid's inequalities with (difference-) unbounded and hierarchically (difference-) bounded functions. For this purpose, several assumptions about the probabilistic boundedness and bounded differences are introduced. Our approaches improve the previous concentration inequalities' bounds, and achieve tight bounds in some exceptional cases where the original inequalities cannot hold. Furthermore, we discuss the potential applications of our extensions in VC dimension and Rademacher complexity. Then we obtain generalization bounds for (difference-) unbounded loss functions and tighten the existing generalization bounds."
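--- For reference, the distribution-free baseline these results tighten, checked numerically (a sketch of mine, not from the paper):

```python
import math
import random

# Hoeffding for i.i.d. variables bounded in [0, 1]:
#   P(|Xbar - mu| >= t) <= 2 exp(-2 n t^2),
# with no distributional assumptions beyond boundedness.
random.seed(0)
n, t, trials = 100, 0.1, 20000
bound = 2 * math.exp(-2 * n * t ** 2)

exceed = 0
for _ in range(trials):
    xbar = sum(random.random() < 0.5 for _ in range(n)) / n
    if abs(xbar - 0.5) >= t:
        exceed += 1
freq = exceed / trials
# freq is the distribution-dependent truth, well below the worst-case bound;
# distribution-dependent inequalities aim to close that gap.
```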
to:NB  deviation_inequalities  probability  learning_theory
2 days ago
[1604.01575] Clustering implies geometry in networks
"Network models with latent geometry have been used successfully in many applications in network science and other disciplines, yet it is usually impossible to tell if a given real network is geometric, meaning if it is a typical element in an ensemble of random geometric graphs. Here we identify structural properties of networks that guarantee that random graphs having these properties are geometric. Specifically we show that random graphs in which expected degree and clustering of every node are fixed to some constants are equivalent to random geometric graphs on the real line, if clustering is sufficiently strong. Large numbers of triangles, homogeneously distributed across all nodes as in real networks, are thus a consequence of network geometricity. The methods we use to prove this are quite general and applicable to other network ensembles, geometric or not, and to certain problems in quantum gravity."
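--- A toy version of the phenomenon (mine, not the paper's construction): a random geometric graph on the line has the strong, homogeneously distributed clustering in question.

```python
import random

# Random geometric graph on [0, 1]: connect points closer than r.
random.seed(0)
n, r = 300, 0.05
pts = sorted(random.random() for _ in range(n))
adj = [set() for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        if pts[j] - pts[i] < r:
            adj[i].add(j)
            adj[j].add(i)
        else:
            break  # points are sorted, so no later j can be within r

def local_clustering(i):
    """Fraction of a node's neighbour pairs that are themselves linked."""
    nb = list(adj[i])
    k = len(nb)
    if k < 2:
        return 0.0
    links = sum(1 for a in range(k) for b in range(a + 1, k)
                if nb[b] in adj[nb[a]])
    return 2 * links / (k * (k - 1))

C = sum(map(local_clustering, range(n))) / n
# C comes out large (around 3/4 for a 1-D geometric graph), far above the
# clustering of an Erdős–Rényi graph of the same density.
```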
to:NB  network_data_analysis  network_formation  latent_space_network_models
2 days ago
[1501.06835] Emergence of Soft Communities from Geometric Preferential Attachment
"All real networks are different, but many have some structural properties in common. There seems to be no consensus on what the most common properties are, but scale-free degree distributions, strong clustering, and community structure are frequently mentioned without question. Surprisingly, there exists no simple generative mechanism explaining all the three properties at once in growing networks. Here we show how latent network geometry coupled with preferential attachment of nodes to this geometry fills this gap. We call this mechanism geometric preferential attachment (GPA), and validate it against the Internet. GPA gives rise to soft communities that provide a different perspective on the community structure in networks. The connections between GPA and cosmological models, including inflation, are also discussed."
to:NB  networks  network_formation  community_discovery  latent_space_network_models
2 days ago
Denny, M.: Ecological Mechanics: Principles of Life’s Physical Interactions. (eBook and Hardcover)
"Plants and animals interact with each other and their surroundings, and these interactions—with all their complexity and contingency—control where species can survive and reproduce. In this comprehensive and groundbreaking introduction to the emerging field of ecological mechanics, Mark Denny explains how the principles of physics and engineering can be used to understand the intricacies of these remarkable relationships.
"Denny opens with a brief review of basic physics before introducing the fundamentals of diffusion, fluid mechanics, solid mechanics, and heat transfer, taking care to explain each in the context of living organisms. Why are corals of different shapes on different parts of a reef? How can geckos climb sheer walls? Why can birds and fish migrate farther than mammals? How do desert plants stay cool? The answers to these and a host of similar questions illustrate the principles of heat, mass, and momentum transport and set the stage for the book’s central topic—the application of these principles in ecology. Denny shows how variations in the environment—in both space and time—affect the performance of plants and animals. He introduces spectral analysis, a mathematical tool for quantifying the patterns in which environments vary, and uses it to analyze such subjects as the spread of invasive species. Synthesizing the book’s materials, the final chapters use ecological mechanics to predict the occurrence and consequences of extreme ecological events, explain the emergence of patterns in the distribution and abundance of organisms, and empower readers to explore further."
to:NB  books:noted  physics  biology  ecology  biophysics
3 days ago
Del Vecchio, D. and Murray, R.M.: Biomolecular Feedback Systems (eBook and Hardcover).
"This book provides an accessible introduction to the principles and tools for modeling, analyzing, and synthesizing biomolecular systems. It begins with modeling tools such as reaction-rate equations, reduced-order models, stochastic models, and specific models of important core processes. It then describes in detail the control and dynamical systems tools used to analyze these models. These include tools for analyzing stability of equilibria, limit cycles, robustness, and parameter uncertainty. Modeling and analysis techniques are then applied to design examples from both natural systems and synthetic biomolecular circuits. In addition, this comprehensive book addresses the problem of modular composition of synthetic circuits, the tools for analyzing the extent of modularity, and the design techniques for ensuring modular behavior. It also looks at design trade-offs, focusing on perturbations due to noise and competition for shared cellular resources."
to:NB  books:noted  biochemical_networks  feedback  biology  networks
3 days ago
[1607.06494] Stochastic Control via Entropy Compression
"We consider an agent trying to bring a system to an acceptable state by repeated probabilistic action (stochastic control). Specifically, in each step the agent observes the flaws present in the current state, selects one of them, and addresses it by probabilistically moving to a new state, one where the addressed flaw is most likely absent, but where one or more new flaws may be present. Several recent works on algorithmizations of the Lovász Local Lemma have established sufficient conditions for such an agent to succeed. Motivated by the paradigm of Partially Observable Markov Decision Processes (POMDPs) we study whether such stochastic control is also possible in a noisy environment, where both the process of state-observation and the process of state-evolution are subject to adversarial perturbation (noise). The introduction of noise causes the tools developed for LLL algorithmization to break down since the key LLL ingredient, the sparsity of the causality (dependence) relationship, no longer holds. To overcome this challenge we develop a new analysis where entropy plays a central role, both to measure the rate at which progress towards an acceptable state is made and the rate at which the noise undoes this progress. The end result is a sufficient condition that allows a smooth tradeoff between the intensity of the noise and the amenability of the system, recovering an asymmetric LLL condition in the noiseless case. To our knowledge, this is the first tractability result for a nontrivial class of POMDPs under stochastic memoryless control."
to:NB  control_theory  state-space_models  information_theory  via:ded-maxim
3 days ago
[1607.06534] The Landscape of Empirical Risk for Non-convex Losses
"We revisit the problem of learning a noisy linear classifier by minimizing the empirical risk associated to the square loss. While the empirical risk is non-convex, we prove that its structure is remarkably simple. Namely, when the sample size is larger than Cdlogd (with d the dimension, and C a constant) the following happen with high probability: (a) The empirical risk has a unique local minimum (which is also the global minimum); (b) Gradient descent converges exponentially fast to the global minimizer, from any initialization; (c) The global minimizer approaches the true parameter at nearly optimal rate. The core of our argument is to establish a uniform convergence result for the gradients and Hessians of the empirical risk."
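--- Claims (a) and (b) are easy to watch in a toy instance (my sketch; the logistic link is one concrete choice of nonlinearity, not the paper's prescription):

```python
import numpy as np

# Noisy "linear classifier": y = sigmoid(<w*, x>) + noise, square loss.
rng = np.random.default_rng(0)
d, n = 5, 2000                        # n comfortably above C * d * log d
w_star = rng.standard_normal(d)
X = rng.standard_normal((n, d))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
y = sigmoid(X @ w_star) + 0.05 * rng.standard_normal(n)

# Plain gradient descent on the non-convex empirical risk, arbitrary start.
w = rng.standard_normal(d)
for _ in range(8000):
    p = sigmoid(X @ w)
    grad = (2.0 / n) * (X.T @ ((p - y) * p * (1.0 - p)))
    w -= 2.0 * grad

err = np.linalg.norm(w - w_star)
# Despite non-convexity, descent from a random initialization lands near w*.
```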
to:NB  learning_theory  empirical_processes  optimization  via:ded-maxim
3 days ago
Fast Patchwork Bootstrap for Quantifying Estimation Uncertainties in Sparse Random Networks
"We propose a new method of nonparametric bootstrap to quantify estimation uncertainties in large and possibly sparse random networks. The method is tailored for inference on functions of network degree distribution, under the assumption that both network degree distribution and network order are unknown. The key idea is based on adaptation of the “blocking” argument, developed for bootstrapping of time series and re-tiling of spatial data, to random networks. We sample network blocks (patches) and bootstrap the data within these patches. To select an optimal patch size, we develop a new computationally efficient and data-driven cross-validation algorithm. The proposed fast patchwork bootstrap (FPB) methodology further extends the ideas developed by [33] for a case of network mean degree, to inference on a degree distribution. In addition, the FPB is substantially less computationally expensive, requires less information on a graph, and is free from nuisance parameters. In our simulation study, we show that the new bootstrap method outperforms competing approaches by providing sharper and better calibrated confidence intervals for functions of a network degree distribution than other available approaches. We illustrate the FPB in application to a study of the Erdős collaboration network."
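--- The patch idea in miniature (a toy stand-in of mine, not the FPB algorithm; the data-driven patch-size selection is precisely what this omits):

```python
import random

# Erdős–Rényi test graph as adjacency sets.
random.seed(1)
n, p = 300, 0.05
adj = {i: set() for i in range(n)}
for i in range(n):
    for j in range(i + 1, n):
        if random.random() < p:
            adj[i].add(j)
            adj[j].add(i)

def patch(seed):
    """A crude 'patch': the seed node plus its neighbourhood."""
    return [seed] + list(adj[seed])

# Bootstrap a degree-distribution functional (here, mean degree) by
# resampling nodes within randomly chosen patches.
boot_means = []
for _ in range(500):
    nodes = patch(random.randrange(n))
    resample = [len(adj[random.choice(nodes)]) for _ in nodes]
    boot_means.append(sum(resample) / len(resample))
boot_means.sort()
ci_95 = (boot_means[12], boot_means[487])  # crude 95% percentile interval
```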
3 days ago
[1607.06565] Controlling for Latent Homophily in Social Networks through Inferring Latent Locations
"Social influence cannot be identified from purely observational data on social networks, because such influence is generically confounded with latent homophily, i.e., with a node's network partners being informative about the node's attributes and therefore its behavior. We show that {\em if} the network grows according to either a community (stochastic block) model, or a continuous latent space model, then latent homophilous attributes can be consistently estimated from the global pattern of social ties. Moreover, these estimates are informative enough that controlling for them allows for unbiased and consistent estimation of social-influence effects in additive models. For community models, we also provide bounds on the finite-sample bias. These are the first results on the consistent estimation of social-influence effects in the presence of latent homophily, and we discuss the prospects for generalizing them."
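--- A cartoon of the recipe (my toy, not the paper's estimator): a latent block assignment drives both ties and behavior, the naive "influence" regression is confounded, and controlling for a spectral estimate of the blocks deflates it.

```python
import numpy as np

# Stochastic block model where the latent community is the confounder.
rng = np.random.default_rng(0)
n = 200
z = rng.integers(0, 2, n)                      # latent homophilous attribute
P = np.where(z[:, None] == z[None, :], 0.15, 0.02)
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                                    # symmetric adjacency matrix

# Behaviour depends on the latent attribute only: no real social influence.
y = 1.0 * z + 0.5 * rng.standard_normal(n)
nbr_mean = A @ y / np.maximum(A.sum(1), 1.0)   # the usual "influence" regressor

# Estimate the community from the second-largest adjacency eigenvector.
vecs = np.linalg.eigh(A)[1]
z_hat = (vecs[:, -2] > 0).astype(float)

def slope(Z, y):
    """OLS coefficient on the second column of the design matrix Z."""
    return np.linalg.lstsq(Z, y, rcond=None)[0][1]

ones = np.ones(n)
naive = slope(np.column_stack([ones, nbr_mean]), y)        # confounded
ctrl = slope(np.column_stack([ones, nbr_mean, z_hat]), y)  # controlled
```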
self-promotion  social_networks  network_data_analysis  causal_inference  community_discovery  re:homophily_and_confounding  to:blog
3 days ago
Learning Fair Representations
"We propose a learning algorithm for fair classification that achieves both group fairness (the proportion of members in a protected group receiving positive classification is identical to the proportion in the population as a whole), and individual fairness (similar individuals should be treated similarly). We formulate fairness as an optimization problem of finding a good representation of the data with two competing goals: to encode the data as well as possible, while simultaneously obfuscating any information about membership in the protected group. We show positive results of our algorithm relative to other known techniques, on three datasets. Moreover, we demonstrate several advantages to our approach. First, our intermediate representation can be used for other classification tasks (i.e., transfer learning is possible); secondly, we take a step toward learning a distance metric which can find important dimensions of the data for classification."

--- This looks really similar to Simon DeDeo's stuff.
to:NB  to_read  classifiers  re:prediction-without-racism  data_mining  privacy
5 days ago
Epidemic spreading on complex networks with community structures: Scientific Reports
"Many real-world networks display a community structure. We study two random graph models that create a network with similar community structure as a given network. One model preserves the exact community structure of the original network, while the other model only preserves the set of communities and the vertex degrees. These models show that community structure is an important determinant of the behavior of percolation processes on networks, such as information diffusion or virus spreading: the community structure can both enforce as well as inhibit diffusion processes. Our models further show that it is the mesoscopic set of communities that matters. The exact internal structures of communities barely influence the behavior of percolation processes across networks. This insensitivity is likely due to the relative denseness of the communities."
to:NB  epidemic_models  social_networks  re:do-institutions-evolve
5 days ago
Conflict Kitchen » Marathon Reading of the Shahnameh
"Over the course of three days, from noon until 8 pm, July 20-22, the public is invited to read passages from the Shahnameh in conjunction with scheduled readings by members of Pittsburgh’s Iranian community."

--- Of course I would bookmark this too late to do any good...
pittsburgh  persianate_culture  poetry
5 days ago
Delay Embeddings for Forced Systems. II. Stochastic Forcing | SpringerLink
"Takens’ Embedding Theorem forms the basis of virtually all approaches to the analysis of time series generated by nonlinear deterministic dynamical systems. It typically allows us to reconstruct an unknown dynamical system which gave rise to a given observed scalar time series simply by constructing a new state space out of successive values of the time series. This provides the theoretical foundation for many popular techniques, including those for the measurement of fractal dimensions and Liapunov exponents, for the prediction of future behaviour, for noise reduction and signal separation, and most recently for control and targeting. Current versions of Takens’ Theorem assume that the underlying system is autonomous (and noise-free). Unfortunately this is not the case for many real systems. In a previous paper, one of us showed how to extend Takens’ Theorem to deterministically forced systems. Here, we use similar techniques to prove a number of delay embedding theorems for arbitrarily and stochastically forced systems. As a special case, we obtain embedding results for Iterated Functions Systems, and we also briefly consider noisy observations."
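--- The underlying construction is tiny; a sketch of mine, using the (deterministic, unforced) logistic map as the scalar observable:

```python
import numpy as np

def delay_embed(x, m, tau):
    """Delay-coordinate vectors (x_t, x_{t+tau}, ..., x_{t+(m-1)tau}),
    the reconstruction Takens-style theorems license."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(m)])

# Scalar observations of the logistic map at full chaos.
x = np.empty(1000)
x[0] = 0.3
for t in range(999):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

E = delay_embed(x, m=2, tau=1)  # shape (999, 2)
# The embedded points trace out the map's graph: the 1-D dynamics are
# recovered as a curve in the reconstructed 2-D state space.
```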
to:NB  to_read  time_series  state-space_reconstruction  state-space_models  statistics  dynamical_systems  have_skimmed
7 days ago
Stylized Facts in the Social Sciences | Sociological Science
"Stylized facts are empirical regularities in search of theoretical, causal explanations. Stylized facts are both positive claims (about what is in the world) and normative claims (about what merits scholarly attention). Much of canonical social science research can be usefully characterized as the production or contestation of stylized facts. Beyond their value as grist for the theoretical mill of social scientists, stylized facts also travel directly into the political arena. Drawing on three recent examples, I show how stylized facts can interact with existing folk causal theories to reconstitute political debates and how tensions in the operationalization of folk concepts drive contention around stylized fact claims."
to:NB  sociology  social_science_methodology
7 days ago
IASC: The Hedgehog Review - Volume 18, No. 2 (Summer 2016) - The New Ruling Class -
At last, _comparatively_ intelligent reactionaries.
(I say "comparatively" because: (1) Notice how much time they spend on what people _said_ was going to happen before the British civil service got reformed, compared to how much they are able to actually _show_ about the effect of those reforms. They're just _insinuating_ that the opponents of the reforms were right; (2) They're unable to imagine _egalitarian_ alternatives to "meritocracy"; (3) leaving details to mere mechanicals while preserving tone or higher intuition is of course a classic _aristocratic_ attitude.)
8 days ago
Guala, F.: Understanding Institutions: The Science and Philosophy of Living Together. (eBook and Hardcover)
"Understanding Institutions proposes a new unified theory of social institutions that combines the best insights of philosophers and social scientists who have written on this topic. Francesco Guala presents a theory that combines the features of three influential views of institutions: as equilibria of strategic games, as regulative rules, and as constitutive rules.
"Guala explains key institutions like money, private property, and marriage, and develops a much-needed unification of equilibrium- and rules-based approaches. Although he uses game theory concepts, the theory is presented in a simple, clear style that is accessible to a wide audience of scholars working in different fields. Outlining and discussing various implications of the unified theory, Guala addresses venerable issues such as reflexivity, realism, Verstehen, and fallibilism in the social sciences. He also critically analyses the theory of "looping effects" and "interactive kinds" defended by Ian Hacking, and asks whether it is possible to draw a demarcation between social and natural science using the criteria of causal and ontological dependence. Focusing on current debates about the definition of marriage, Guala shows how these abstract philosophical issues have important practical and political consequences.
"Moving beyond specific cases to general models and principles, Understanding Institutions offers new perspectives on what institutions are, how they work, and what they can do for us."
to:NB  books:noted  institutions  social_theory  philosophy  marriage  re:do-institutions-evolve
9 days ago
[1507.03652] Lasso adjustments of treatment effect estimates in randomized experiments
"We provide a principled way for investigators to analyze randomized experiments when the number of covariates is large. Investigators often use linear multivariate regression to analyze randomized experiments instead of simply reporting the difference of means between treatment and control groups. Their aim is to reduce the variance of the estimated treatment effect by adjusting for covariates. If there are a large number of covariates relative to the number of observations, regression may perform poorly because of overfitting. In such cases, the Lasso may be helpful. We study the resulting Lasso-based treatment effect estimator under the Neyman-Rubin model of randomized experiments. We present theoretical conditions that guarantee that the estimator is more efficient than the simple difference-of-means estimator, and we provide a conservative estimator of the asymptotic variance, which can yield tighter confidence intervals than the difference-of-means estimator. Simulation and data examples show that Lasso-based adjustment can be advantageous even when the number of covariates is less than the number of observations. Specifically, a variant using Lasso for selection and OLS for estimation performs particularly well, and it chooses a smoothing parameter based on combined performance of Lasso and OLS."
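--- The select-with-Lasso, refit-with-OLS variant, sketched with a hand-rolled coordinate-descent Lasso (the dimensions, penalty level, and data-generating process are all my illustrative choices):

```python
import numpy as np

# Randomized experiment: treatment T, covariates X, true effect 1.5.
rng = np.random.default_rng(0)
n, d = 200, 50
X = rng.standard_normal((n, d))
T = rng.integers(0, 2, n)
beta = np.zeros(d)
beta[:3] = [2.0, -1.0, 1.0]                    # only 3 covariates matter
y = 1.5 * T + X @ beta + rng.standard_normal(n)

def lasso(X, y, lam, iters=100):
    """Cyclic coordinate descent for (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, d = X.shape
    b = np.zeros(d)
    col_ss = (X ** 2).sum(axis=0) / n
    for _ in range(iters):
        for j in range(d):
            r = y - X @ b + X[:, j] * b[j]      # partial residual
            rho = X[:, j] @ r / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_ss[j]
    return b

# Lasso for selection on centered data, then OLS of y on (1, T, X_selected).
keep = np.flatnonzero(np.abs(lasso(X - X.mean(0), y - y.mean(), 0.1)) > 1e-8)
Z = np.column_stack([np.ones(n), T, X[:, keep]])
tau_hat = np.linalg.lstsq(Z, y, rcond=None)[0][1]  # adjusted effect estimate
```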
to:NB  heard_the_talk  have_skimmed  yu.bin  lasso  regression  causal_inference  statistics
9 days ago
Phys. Rev. Lett. 117, 038103 (2016) - How Far from Equilibrium Is Active Matter?
"Active matter systems are driven out of thermal equilibrium by a lack of generalized Stokes-Einstein relation between injection and dissipation of energy at the microscopic scale. We consider such a system of interacting particles, propelled by persistent noises, and show that, at small but finite persistence time, their dynamics still satisfy a time-reversal symmetry. To do so, we compute perturbatively their steady-state measure and show that, for short persistent times, the entropy production rate vanishes. This endows such systems with an effective fluctuation-dissipation theorem akin to that of thermal equilibrium systems. Last, we show how interacting particle systems with viscous drags and correlated noises can be seen as in equilibrium with a viscoelastic bath but driven out of equilibrium by nonconservative forces, hence providing energetic insight into the departure of active systems from equilibrium."
to:NB  to_read  thermodynamics  statistical_mechanics  non-equilibrium  fluctuation-response
9 days ago
DHQ: Digital Humanities Quarterly: Six Degrees of Francis Bacon: A Statistical Method for Reconstructing Large Historical Social Networks
"In this paper we present a statistical method for inferring historical social networks from biographical documents as well as the scholarly aims for doing so. Existing scholarship on historical social networks is scattered across an unmanageable number of disparate books and articles. A researcher interested in how persons were connected to one another in our field of study, early modern Britain (c. 1500-1700), has no global, unified resource to which to turn. Manually building such a network is infeasible, since it would need to represent thousands of nodes and tens of millions of potential edges just to include the relations among the most prominent persons of the period. Our Six Degrees of Francis Bacon project takes up recent statistical techniques and digital tools to reconstruct and visualize the early modern social network.
"We describe in this paper the natural language processing tools and statistical graph learning techniques that we used to extract names and infer relations from the Oxford Dictionary of National Biography. We then explain the steps taken to test inferred relations against the knowledge of experts in order to improve the accuracy of the learning techniques. Our argument here is twofold: first, that the results of this process, a global visualization of Britain’s early modern social network, will be useful to scholars and students of the period; second, that the pipeline we have developed can, with local modifications, be reused by other scholars to generate networks for other historical or contemporary societies from biographical documents."

--- I have helped perpetrate an act of digital humanities.
to:NB  self-promotion  social_networks  text_mining  lasso  statistics  early_modern_european_history  kith_and_kin
12 days ago
Noneuclidean Spring Embedders
"We present a method by which force-directed algorithms for graph layouts can be generalized to calculate the layout of a graph in an arbitrary Riemannian geometry. The method relies on extending the Euclidean notions of distance, angle, and force-interactions to smooth non-Euclidean geometries via projections to and from appropriately chosen tangent spaces. In particular, we formally describe the calculations needed to extend such algorithms to hyperbolic and spherical geometries."
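--- The tangent-space trick at its smallest (my sketch, in the hyperboloid model of the hyperbolic plane rather than an arbitrary Riemannian geometry): log-map a neighbor into the tangent space at a node, scale the "spring" displacement there, and exp-map back onto the surface.

```python
import numpy as np

def minkowski(u, v):
    """Minkowski form of signature (+, +, -); H^2 is <x, x> = -1, x[2] > 0."""
    return u[0] * v[0] + u[1] * v[1] - u[2] * v[2]

def exp_map(p, v):
    """Follow the geodesic from p with initial tangent velocity v."""
    t = np.sqrt(max(minkowski(v, v), 1e-15))
    return np.cosh(t) * p + np.sinh(t) * (v / t)

def log_map(p, q):
    """Tangent vector at p whose exp_map lands on q."""
    d = np.arccosh(max(-minkowski(p, q), 1.0))   # geodesic distance
    w = q + minkowski(p, q) * p                  # project q to the tangent space
    return d * w / np.sqrt(max(minkowski(w, w), 1e-15))

p = np.array([0.0, 0.0, 1.0])                    # basepoint of H^2
q = exp_map(p, np.array([1.5, 0.0, 0.0]))        # neighbour at distance 1.5
step = exp_map(p, 0.5 * log_map(p, q))           # move p halfway toward q
```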
to:NB  to_read  hyperbolic_geometry  re:hyperbolic_networks  network_data_analysis  network_visualization  via:cris_moore
14 days ago
Smith, J.E. H.: The Philosopher: A History in Six Types. (eBook and Hardcover)
"What would the global history of philosophy look like if it were told not as a story of ideas but as a series of job descriptions—ones that might have been used to fill the position of philosopher at different times and places over the past 2,500 years? The Philosopher does just that, providing a new way of looking at the history of philosophy by bringing to life six kinds of figures who have occupied the role of philosopher in a wide range of societies around the world over the millennia—the Natural Philosopher, the Sage, the Gadfly, the Ascetic, the Mandarin, and the Courtier. The result is at once an unconventional introduction to the global history of philosophy and an original exploration of what philosophy has been—and perhaps could be again.
"By uncovering forgotten or neglected philosophical job descriptions, the book reveals that philosophy is a universal activity, much broader—and more gender inclusive—than we normally think today. In doing so, The Philosopher challenges us to reconsider our idea of what philosophers can do and what counts as philosophy."
to:NB  books:noted  history_of_ideas  philosophy  world_history  to_be_shot_after_a_fair_trial
16 days ago
Hassan, M.: Longing for the Lost Caliphate: A Transregional History. (eBook and Hardcover)
"In the United States and Europe, the word “caliphate” has conjured historically romantic and increasingly pernicious associations. Yet the caliphate’s significance in Islamic history and Muslim culture remains poorly understood. This book explores the myriad meanings of the caliphate for Muslims around the world through the analytical lens of two key moments of loss in the thirteenth and twentieth centuries. Through extensive primary-source research, Mona Hassan explores the rich constellation of interpretations created by religious scholars, historians, musicians, statesmen, poets, and intellectuals.
"Hassan fills a scholarly gap regarding Muslim reactions to the destruction of the Abbasid caliphate in Baghdad in 1258 and challenges the notion that the Mongol onslaught signaled an end to the critical engagement of Muslim jurists and intellectuals with the idea of an Islamic caliphate. She also situates Muslim responses to the dramatic abolition of the Ottoman caliphate in 1924 as part of a longer trajectory of transregional cultural memory, revealing commonalities and differences in how modern Muslims have creatively interpreted and reinterpreted their heritage. Hassan examines how poignant memories of the lost caliphate have been evoked in Muslim culture, law, and politics, similar to the losses and repercussions experienced by other religious communities, including the destruction of the Second Temple for Jews and the fall of Rome for Christians."
to:NB  books:noted  islam  islamic_civilization  history_of_ideas  uses_of_the_past
16 days ago
[1605.09522] Kernel Mean Embedding of Distributions: A Review and Beyonds
"A Hilbert space embedding of distributions---in short, kernel mean embedding---has recently emerged as a powerful machinery for probabilistic modeling, statistical inference, machine learning, and causal discovery. The basic idea behind this framework is to map distributions into a reproducing kernel Hilbert space (RKHS) in which the whole arsenal of kernel methods can be extended to probability measures. It gave rise to a great deal of research and novel applications of positive definite kernels. The goal of this survey is to give a comprehensive review of existing works and recent advances in this research area, and to discuss some of the most challenging issues and open problems that could potentially lead to new research directions. The survey begins with a brief introduction to the RKHS and positive definite kernels which forms the backbone of this survey, followed by a thorough discussion of the Hilbert space embedding of marginal distributions, theoretical guarantees, and review of its applications. The embedding of distributions enables us to apply RKHS methods to probability measures which prompts a wide range of applications such as kernel two-sample testing, independent testing, group anomaly detection, and learning on distributional data. Next, we discuss the Hilbert space embedding for conditional distributions, give theoretical insights, and review some applications. The conditional mean embedding enables us to perform sum, product, and Bayes' rules---which are ubiquitous in graphical model, probabilistic inference, and reinforcement learning---in a non-parametric way using the new representation of distributions in RKHS. We then discuss relationships between this framework and other related areas. Lastly, we give some suggestions on future research directions."
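--- The workhorse special case in a few lines (my sketch): the biased empirical estimate of the squared MMD, i.e., the RKHS distance between two kernel mean embeddings, which separates samples from different distributions.

```python
import numpy as np

def rbf(A, B, gamma=0.5):
    """Gaussian kernel matrix k(a, b) = exp(-gamma * ||a - b||^2)."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def mmd2(X, Y):
    """Biased (V-statistic) estimate of ||mu_P - mu_Q||^2 in the RKHS."""
    m, n = len(X), len(Y)
    return (rbf(X, X).sum() / m ** 2 + rbf(Y, Y).sum() / n ** 2
            - 2.0 * rbf(X, Y).sum() / (m * n))

rng = np.random.default_rng(0)
same = mmd2(rng.standard_normal((200, 2)), rng.standard_normal((200, 2)))
diff = mmd2(rng.standard_normal((200, 2)), rng.standard_normal((200, 2)) + 1.0)
# same is near zero; diff, for the mean-shifted sample, is clearly positive.
```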
to:NB  statistics  probability  hilbert_space  kernel_methods
16 days ago
Tool-box or toy-box? Hard obscurantism in economic modeling - Springer
"“Hard obscurantism” is a species of the genus scholarly obscurantism. A rough intensional definition of hard obscurantism is that models and procedures become ends in themselves, dissociated from their explanatory functions. In the present article, I exemplify and criticize hard obscurantism by examining the writings of eminent economists and political scientists."
to:NB  economics  political_science  philosophy_of_science  bad_science  elster.jon
17 days ago
What is Shannon information? - Springer
"Despite of its formal precision and its great many applications, Shannon’s theory still offers an active terrain of debate when the interpretation of its main concepts is the task at issue. In this article we try to analyze certain points that still remain obscure or matter of discussion, and whose elucidation contribute to the assessment of the different interpretative proposals about the concept of information. In particular, we argue for a pluralist position, according to which the different views about information are no longer rival, but different interpretations of a single formal concept."
17 days ago
Greiner, A., Semmler, W., and Gong, G.: The Forces of Economic Growth: A Time Series Perspective. (eBook, Paperback and Hardcover)
"In economics, the emergence of New Growth Theory in recent decades has directed attention to an old and important problem: what are the forces of economic growth and how can public policy enhance them? This book examines major forces of growth--including spillover effects and externalities, education and formation of human capital, knowledge creation through deliberate research efforts, and public infrastructure investment. Unique in emphasizing the importance of different forces for particular stages of development, it offers wide-ranging policy implications in the process.
"The authors critically examine recently developed endogenous growth models, study the dynamic implications of modified models, and test the models empirically with modern time series methods that avoid the perils of heterogeneity in cross-country studies. Their empirical analyses, undertaken with newly constructed time series data for the United States and some core countries of the Euro zone, show that models containing scale effects, such as the R&D model and the human capital model, are compatible with time series evidence only after considerable modifications and nonlinearities are introduced. They also explore the relationship between growth and inequality, with particular focus on technological change and income disparity. The Forces of Economic Growth represents a comprehensive and up-to-date empirical time series perspective on the New Growth Theory."
to:NB  books:noted  economics  economic_growth  econometrics  time_series  statistics
19 days ago
Siebert, H.: Rules for the Global Economy (eBook, Paperback and Hardcover).
"Rules for the Global Economy is a timely examination of the conditions under which international rules of globalization come into existence, enabling world economic and financial systems to function and stabilize. Horst Siebert, a leading figure in international economics, explains that these institutional arrangements, such as the ones that govern banking, emerge when countries fail to solve economic problems on their own and cede part of their sovereignty to an international order. Siebert demonstrates that the rules result from a trial-and-error process--and usually after a crisis--in order to prevent pointless transaction costs and risks.
"Using an accessible and nonmathematical approach, Siebert links the rules to four areas: international trade relations, factor movements, financial flows, and the environment. He looks at the international division of labor in the trade of goods and services; flow of capital; diffusion of technology; migration of people, including labor and human capital; protection of the global environment; and stability of the monetary-financial system. He discusses the role of ethical norms and human rights in defining international regulations, and argues that the benefits of any rules system should be direct and visible. Comprehensively supporting rules-based interactions among international players, the book considers future issues of the global rules system."
to:NB  books:noted  economics  globalization  institutions  re:do-institutions-evolve
19 days ago
Approximation Methods in Probability Theory | Vydas Čekanavičius | Springer
"This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems.
"While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful."
to:NB  probability  approximation  mathematics  convergence_of_stochastic_processes
20 days ago
Geiger, Heckerman, King, Meek: Stratified exponential families: Graphical models and model selection
"We describe a hierarchy of exponential families which is useful for distinguishing types of graphical models. Undirected graphical models with no hidden variables are linear exponential families (LEFs). Directed acyclic graphical (DAG) models and chain graphs with no hidden variables, includ­ ing DAG models with several families of local distributions, are curved exponential families (CEFs). Graphical models with hidden variables are what we term stratified exponential families (SEFs). A SEF is a finite union of CEFs of various dimensions satisfying some regularity conditions. We also show that this hierarchy of exponential families is noncollapsing with respect to graphical models by providing a graphical model which is a CEF but not a LEF and a graphical model that is a SEF but not a CEF. Finally, we show how to compute the dimension of a stratified exponential family. These results are discussed in the context of model selection of graphical models."
to:NB  have_read  graphical_models  exponential_families  statistics  geometry  via:rvenkat
20 days ago
[1604.07125] Efficient Inference of Average Treatment Effects in High Dimensions via Approximate Residual Balancing
"There are many settings where researchers are interested in estimating average treatment effects and are willing to rely on the unconfoundedness assumption, which requires that the treatment assignment be as good as random conditional on pre-treatment variables. The unconfoundedness assumption is often more plausible if a large number of pre-treatment variables are included in the analysis, but this can worsen the finite sample properties of standard approaches to treatment effect estimation. There are some recent proposals on how to extend classical methods to the high dimensional setting; however, to our knowledge, all existing method rely on consistent estimability of the propensity score, i.e., the probability of receiving treatment given pre-treatment variables. In this paper, we propose a new method for estimating average treatment effects in high dimensional linear settings that attains dimension-free rates of convergence for estimating average treatment effects under substantially weaker assumptions than existing methods: Instead of requiring the propensity score to be estimable, we only require overlap, i.e., that the propensity score be uniformly bounded away from 0 and 1. Procedurally, out method combines balancing weights with a regularized regression adjustment."

--- For the causal-ML reading group.
--- Pros: (i) The paper is very slick. (ii) Essentially, they are doing a regression adjustment, but the weights are enhancing the influence of data points in the "treated" part of the predictor space over what OLS would do.
--- Cons: (i) Without linearity, everything falls apart. (ii) _Pace_ their claim that they're just using the lasso as a predictor, they really do need accurate estimation (in L1) of the true coefficient vector. (iii) Hence all the usual uncheckable conditions people impose on the lasso get imported. [Though I guess you could use some other regularization and hope.] (iv) It's not at all clear that their optimization isn't just a disguised form of estimating propensity scores.
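--- To fix ideas, here is a toy numpy sketch of the general recipe (not the authors' algorithm: ridge stands in for their lasso, and the balancing weights are ridge-penalized rather than coming from their constrained optimization): fit a regression on one treatment arm, predict at the full-sample covariate mean, and then correct the prediction with residuals re-weighted so that the weighted source covariates balance the target mean.

```python
import numpy as np

rng = np.random.default_rng(1)
n, tau = 400, 2.0                                   # sample size, true ATE
X = rng.normal(size=(n, 5))
W = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))     # confounded treatment
Y = X @ np.array([1.0, -1.0, 0.5, 0.0, 0.0]) + tau * W + rng.normal(size=n)

def add_intercept(X):
    return np.hstack([np.ones((len(X), 1)), X])

def balanced_mean(X_target, X_src, y_src, lam=1.0):
    """Regression adjustment plus re-weighted residuals: predict the mean
    outcome the source arm would have had at the target covariates."""
    A, B = add_intercept(X_src), add_intercept(X_target)
    # ridge fit (standing in for the paper's lasso)
    beta = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y_src)
    x_bar = B.mean(axis=0)
    # balancing weights: argmin_g ||A' g - x_bar||^2 + lam ||g||^2
    gamma = A @ np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), x_bar)
    return x_bar @ beta + gamma @ (y_src - A @ beta)

mu1 = balanced_mean(X, X[W == 1], Y[W == 1])   # mean outcome if all treated
mu0 = balanced_mean(X, X[W == 0], Y[W == 0])   # ... if all untreated
est = mu1 - mu0                                # should land near tau = 2
print(est)
```

Note how the linearity of the outcome model is doing all the work here, which is exactly con (i) above.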
to:NB  have_read  causal_inference  statistics  regression  lasso  imbens.guido_w.  athey.susan  wager.stefan
20 days ago
The Fissured Workplace — David Weil | Harvard University Press
"For much of the twentieth century, large companies employing many workers formed the bedrock of the U.S. economy. Today, on the list of big business’s priorities, sustaining the employer-worker relationship ranks far below building a devoted customer base and delivering value to investors. As David Weil’s groundbreaking analysis shows, large corporations have shed their role as direct employers of the people responsible for their products, in favor of outsourcing work to small companies that compete fiercely with one another. The result has been declining wages, eroding benefits, inadequate health and safety conditions, and ever-widening income inequality.
"From the perspectives of CEOs and investors, fissuring—splitting off functions that were once managed internally—has been a phenomenally successful business strategy, allowing companies to become more streamlined and drive down costs. Despite giving up direct control to subcontractors, vendors, and franchises, these large companies have figured out how to maintain quality standards and protect the reputation of the brand. They produce brand-name products and services without the cost of maintaining an expensive workforce. But from the perspective of workers, this lucrative strategy has meant stagnation in wages and benefits and a lower standard of living—if they are fortunate enough to have a job at all.
"Weil proposes ways to modernize regulatory policies and laws so that employers can meet their obligations to workers while allowing companies to keep the beneficial aspects of this innovative business strategy."
to:NB  books:noted  economics  labor  corporations  class_struggles_in_america
21 days ago
Modeling the Heavens: Sphairopoiia and Ptolemy’s Planetary Hypotheses
"This article investigates sphairopoiia, the art of making instruments that display the heavens, in Claudius Ptolemy’s Planetary Hypotheses. It takes up two questions: what kind of instrument does Ptolemy describe? And, could such an instrument have been constructed? I argue that Ptolemy did not propose one specific type of instrument, but instead he offered a range of possible designs, with the details to be worked out by the craftsman. Moreover, in addition to exhibiting his astronomical models and having the ability to estimate predictions, the instrument he proposed would have also shown the physical workings of the heavens. What emerges is both a clearer idea of what Ptolemy wanted the technician to build, and the purpose of such instruments."
to:NB  history_of_science  astronomy  ptolemy  modeling  prediction  history_of_technology
21 days ago
Estimation and Testing Under Sparsity | Sara van de Geer | Springer
"Taking the Lasso method as its starting point, this book describes the main ingredients needed to study general loss functions and sparsity-inducing regularizers. It also provides a semi-parametric approach to establishing confidence intervals and tests. Sparsity-inducing methods have proven to be very useful in the analysis of high-dimensional data. Examples include the Lasso and group Lasso methods, and the least squares method with other norm-penalties, such as the nuclear norm. The illustrations provided include generalized linear models, density estimation, matrix completion and sparse principal components. Each chapter ends with a problem section. The book can be used as a textbook for a graduate or PhD course."
to:NB  books:noted  statistics  sparsity  high-dimensional_statistics  lasso  hypothesis_testing  confidence_sets  van_de_geer.sara  to_read  empirical_processes
22 days ago
Statistics for Mathematicians - A Rigorous First Course | Victor M. Panaretos | Springer
"This textbook provides a coherent introduction to the main concepts and methods of one-parameter statistical inference. Intended for students of Mathematics taking their first course in Statistics, the focus is on Statistics for Mathematicians rather than on Mathematical Statistics. The goal is not to focus on the mathematical/theoretical aspects of the subject, but rather to provide an introduction to the subject tailored to the mindset and tastes of Mathematics students, who are sometimes turned off by the informal nature of Statistics courses. This book can be used as the basis for an elementary semester-long first course on Statistics with a firm sense of direction that does not sacrifice rigor. The deeper goal of the text is to attract the attention of promising Mathematics students."
to:NB  statistics
22 days ago
Seasonal Adjustment Methods and Real Time Trend-Cycle | Estela Bee Dagum | Springer
"This book explores widely used seasonal adjustment methods and recent developments in real time trend-cycle estimation. It discusses in detail the properties and limitations of X12ARIMA, TRAMO-SEATS and STAMP - the main seasonal adjustment methods used by statistical agencies.  Several real-world cases illustrate each method and real data examples can be followed throughout the text. The trend-cycle estimation is presented using nonparametric techniques based on moving averages, linear filters and reproducing kernel Hilbert spaces, taking recent advances into account. The book provides a systematical treatment of results that to date have been scattered throughout the literature.
"Seasonal adjustment and real time trend-cycle prediction play an essential part at all levels of activity in modern economies. They are used by governments to counteract cyclical recessions, by central banks to control inflation, by decision makers for better modeling and planning and by hospitals, manufacturers, builders, transportation, and consumers in general to decide on appropriate action.
"This book appeals to practitioners in government institutions, finance and business, macroeconomists, and other professionals who use economic data as well as academic researchers in time series analysis, seasonal adjustment methods, filtering and signal extraction. It is also useful for graduate and final-year undergraduate courses in econometrics and time series with a good understanding of linear regression and matrix algebra, as well as ARIMA modelling."
22 days ago
[1606.08813] EU regulations on algorithmic decision-making and a "right to explanation"
"We summarize the potential impact that the European Union's new General Data Protection Regulation will have on the routine use of machine learning algorithms. Slated to take effect as law across the EU in 2018, it will restrict automated individual decision-making (that is, algorithms that make decisions based on user-level predictors) which "significantly affect" users. The law will also create a "right to explanation," whereby a user can ask for an explanation of an algorithmic decision that was made about them. We argue that while this law will pose large challenges for industry, it highlights opportunities for machine learning researchers to take the lead in designing algorithms and evaluation frameworks which avoid discrimination."
to:NB  explanation  statistics  prediction  decision-making  flaxman.seth
22 days ago
Estimating peer effects in networks with peer encouragement designs
"Peer effects, in which the behavior of an individual is affected by the behavior of their peers, are central to social science. Because peer effects are often confounded with homophily and common external causes, recent work has used randomized experiments to estimate effects of specific peer behaviors. These experiments have often relied on the experimenter being able to randomly modulate mechanisms by which peer behavior is transmitted to a focal individual. We describe experimental designs that instead randomly assign individuals’ peers to encouragements to behaviors that directly affect those individuals. We illustrate this method with a large peer encouragement design on Facebook for estimating the effects of receiving feedback from peers on posts shared by focal individuals. We find evidence for substantial effects of receiving marginal feedback on multiple behaviors, including giving feedback to others and continued posting. These findings provide experimental evidence for the role of behaviors directed at specific individuals in the adoption and continued use of communication technologies. In comparison, observational estimates differ substantially, both underestimating and overestimating effects, suggesting that researchers and policy makers should be cautious in relying on them."
22 days ago
Mendelson, S.E.: Changing Course: Ideas, Politics, and the Soviet Withdrawal from Afghanistan. (eBook, Paperback and Hardcover) [1998]
"Soviet foreign policy changed dramatically in the 1980s. The shift, bitterly resisted by the country's foreign policy traditionalists, ultimately contributed to the collapse of the Soviet Union and the end of the Cold War. In Changing Course, Sarah Mendelson demonstrates that interpretations that stress the impact of the international system, and particularly of U.S. foreign policy, or that focus on the role of ideas or politics alone, fail to explain the contingent process of change. Mendelson tells a story of internal battles where "misfit" ideas--ones that severely challenged the status quo--were turned into policies. She draws on firsthand interviews with those who ran Soviet foreign policy and the war in Afghanistan and on recently declassified material from Soviet archives to show that both ideas and political strategies were needed to make reform happen.
"Focusing on the Soviet decision to withdraw from Afghanistan, Mendelson details the strategies used by the Gorbachev coalition to shift the internal balance of power in favor of constituencies pushing new ideas--mutual security, for example--while undermining the power of old constituencies resistant to change. The interactive dynamic between ideas and politics that she identifies in the case of the Soviet withdrawal from Afghanistan is fundamental to understanding other shifts in Soviet foreign policy and the end of the Cold War. Her exclusive interviews with the foreign policy elite also offer a unique glimpse of the inner workings of the former Soviet power structure."
to:NB  books:noted  soviet-afghan_war  afghanistan  ussr  political_science  20th_century_history
23 days ago
Marek, C.: In the Land of a Thousand Gods: A History of Asia Minor in the Ancient World. (Hardcover)
"This monumental book provides the first comprehensive history of Asia Minor from prehistory to the Roman imperial period. In this English-language edition of the critically acclaimed German book, Christian Marek masterfully employs ancient sources to illuminate civic institutions, urban and rural society, agriculture, trade and money, the influential Greek writers of the Second Sophistic, the notoriously bloody exhibitions of the gladiatorial arena, and more.
"In the Land of a Thousand Gods is truly panoramic in scope. Blending rich narrative with in-depth analyses of political, social, and economic history, the book traces Asia Minor’s shifting orientation between East and West and examines its role as both a melting pot of nations and a bridge for cultural transmission. Marek takes readers from the earliest known Stone Age settlements to the end of antiquity. He covers the emergence of early Greek poetry and science, the invention of coinage, Persian domination, the prosperity of cities under the Hellenistic kings, and the establishment of Roman provinces. Marek draws on the latest research—in fields ranging from demography and economics to architecture and religion—to describe how Asia Minor became a center of culture and wealth in the Roman Empire. He shows how the advancement of Hellenic culture and civic autonomy was the irreversible legacy of the Pax Romana."
to:NB  books:noted  history  ancient_history  roman_empire  imperialism  cultural_exchange
23 days ago
[1606.03490] The Mythos of Model Interpretability
"Supervised machine learning models boast remarkable predictive capabilities. But can you trust your model? Will it work in deployment? What else can it tell you about the world? We want models to be not only good, but interpretable. And yet the task of interpretation appears underspecified. Papers provide diverse and sometimes non-overlapping motivations for interpretability, and offer myriad notions of what attributes render models interpretable. Despite this ambiguity, many papers proclaim interpretability axiomatically, absent further explanation. In this paper, we seek to refine the discourse on interpretability. First, we examine the motivations underlying interest in interpretability, finding them to be diverse and occasionally discordant. Then, we address model properties and techniques thought to confer interpretability, identifying transparency to humans and post-hoc explanations as competing notions. Throughout, we discuss the feasibility and desirability of different notions, and question the oft-made assertions that linear models are interpretable and that deep neural networks are not."
to:NB  data_mining  statistics  modeling  via:vaguery  to_teach  to_be_shot_after_a_fair_trial
23 days ago
[1606.07772] The emotional arcs of stories are dominated by six basic shapes
"Advances in computing power, natural language processing, and digitization of text now make it possible to study our a culture's evolution through its texts using a "big data" lens. Our ability to communicate relies in part upon a shared emotional experience, with stories often following distinct emotional trajectories, forming patterns that are meaningful to us. Here, by classifying the emotional arcs for a filtered subset of 1,737 stories from Project Gutenberg's fiction collection, we find a set of six core trajectories which form the building blocks of complex narratives. We strengthen our findings by separately applying optimization, linear decomposition, supervised learning, and unsupervised learning. For each of these six core emotional arcs, we examine the closest characteristic stories in publication today and find that particular emotional arcs enjoy greater success, as measured by downloads."
to:NB  narrative  text_mining  dodds.peter_sheridan  literary_criticism  literary_history  to_be_shot_after_a_fair_trial  via:rvenkat
23 days ago
The Preference for Belief Consonance by Russell Golman, George Loewenstein, Karl O. Moene, Luca Zarri :: SSRN
"We consider the determinants and consequences of a source of utility that has received limited attention from economists: people’s desire for the beliefs of other people to align with their own. We relate this ‘preference for belief consonance’ to a variety of other constructs that have been explored by economists, including identity, ideology, homophily and fellow-feeling. We review different possible explanations for why people care about others’ beliefs and propose that the preference for belief consonance leads to a range of disparate phenomena, including motivated belief-formation, proselytizing, selective exposure to media, avoidance of conversational minefields, pluralistic ignorance, belief-driven clustering, intergroup belief polarization and conflict. We also discuss an explanation for why disputes are often so intense between groups whose beliefs are, by external observers’ standards, highly similar to one-another."
to:NB  social_psychology  collective_cognition  social_life_of_the_mind  re:democratic_cognition  golman.russell  lowenstein.george  via:?
23 days ago
Education and Equality, Allen
"American education as we know it today—guaranteed by the state to serve every child in the country—is still less than a hundred years old. It’s no wonder we haven’t agreed yet as to exactly what role education should play in our society. In these Tanner Lectures, Danielle Allen brings us much closer, examining the ideological impasse between vocational and humanistic approaches that has plagued educational discourse, offering a compelling proposal to finally resolve the dispute.
"Allen argues that education plays a crucial role in the cultivation of political and social equality and economic fairness, but that we have lost sight of exactly what that role is and should be. Drawing on thinkers such as John Rawls and Hannah Arendt, she sketches out a humanistic baseline that re-links education to equality, showing how doing so can help us reframe policy questions. From there, she turns to civic education, showing that we must reorient education’s trajectory toward readying students for lives as democratic citizens. Deepened by commentaries from leading thinkers Tommie Shelby, Marcelo Suárez-Orozco, Michael Rebell, and Quiara Alegría Hudes that touch on issues ranging from globalization to law to linguistic empowerment, this book offers a critical clarification of just how important education is to democratic life, as well as a stirring defense of the humanities."
to:NB  books:noted  education  equality  political_philosophy  allen.danielle_s.  kith_and_kin  coveted
26 days ago
Oil and Water: Being Han in Xinjiang, Cliff
"For decades, China’s Xinjiang region has been the site of clashes between long-residing Uyghur and Han settlers. Up until now, much scholarly attention has been paid to state actions and the Uyghur’s efforts to resist cultural and economic repression. This has left the other half of the puzzle—the motivations and ambitions of Han settlers themselves—sorely understudied.
"With Oil and Water, anthropologist Tom Cliff offers the first ethnographic study of Han in Xinjiang, using in-depth vignettes, oral histories, and more than fifty original photographs to explore how and why they became the people they are now. By shifting focus to the lived experience of ordinary Han settlers, Oil and Water provides an entirely new perspective on Chinese nation building in the twenty-first century and demonstrates the vital role that Xinjiang Han play in national politics—not simply as Beijing’s pawns, but as individuals pursuing their own survival and dreams on the frontier."
to:NB  books:noted  china  central_asia  ethnography
26 days ago
Understanding Police Intelligence Work, James
"Procedural and moral shortcomings in both child abuse cases and the long-term deployment of undercover police officers have raised questions about the effectiveness and efficacy of intelligence work, and yet intelligence work plays an ever growing role in policing. Part of a new series on evidence-based policing, this book is the first to offer a comprehensive, fully up-to-date account of how police can—and do—use intelligence, assessing the threats and opportunities presented by new digital technology, like the widespread use of social media and the emergence of “big data,” and applying both a practical and an ethical lens to police intelligence activities."
to:NB  books:noted  police_and_policing  intelligence_(spying)  surveillance  national_surveillance_state  data_mining
26 days ago
Knowledge Games: How Playing Games Can Solve Problems, Create Insight, and Make Change
"Imagine if new knowledge and insights came not just from research centers, think tanks, and universities but also from games, of all things. Video games have been viewed as causing social problems, but what if they actually helped solve them? This question drives Karen Schrier’s Knowledge Games, which seeks to uncover the potentials and pitfalls of using games to make discoveries, solve real-world problems, and better understand our world. For example, so-called knowledge games—such as Foldit, a protein-folding puzzle game, SchoolLife, which crowdsources bullying interventions, and Reverse the Odds, in which mobile game players analyze breast cancer data—are already being used by researchers to gain scientific, psychological, and humanistic insights.
"Schrier argues that knowledge games are potentially powerful because of their ability to motivate a crowd of problem solvers within a dynamic system while also tapping into the innovative data processing and computational abilities of games. In the near future, Schrier asserts, knowledge games may be created to understand and predict voting behavior, climate concerns, historical perspectives, online harassment, susceptibility to depression, or optimal advertising strategies, among other things.
"In addition to investigating the intersection of games, problem solving, and crowdsourcing, Schrier examines what happens when knowledge emerges from games and game players rather than scientists, professionals, and researchers. This accessible book also critiques the limits and implications of games and considers how they may redefine what it means to produce knowledge, to play, to educate, and to be a citizen."
to:NB  books:noted  computer_games  collective_cognition  social_life_of_the_mind  re:democratic_cognition
27 days ago
The Mediterranean World: From the Fall of Rome to the Rise of Napoleon
"Located at the intersection of Asia, Africa, and Europe, the Mediterranean has connected societies for millennia, creating a shared space of intense economic, cultural, and political interaction. Greek temples in Sicily, Roman ruins in North Africa, and Ottoman fortifications in Greece serve as reminders that the Mediterranean has no fixed national boundaries or stable ethnic and religious identities."

In The Mediterranean World, Monique O’Connell and Eric R Dursteler examine the history of this contested region from the medieval to the early modern era, beginning with the fall of Rome around 500 CE and closing with Napoleon’s attempted conquest of Egypt in 1798. Arguing convincingly that the Mediterranean should be studied as a singular unit, the authors explore the centuries when no lone power dominated the Mediterranean Sea and invaders brought their own unique languages and cultures to the region.

Structured around four interlocking themes—mobility, state development, commerce, and frontiers—this beautifully illustrated book brings new dimensions to the concepts of Mediterranean nationality and identity.
to:NB  books:noted  medieval_eurasian_history  world_history  history  mediterranean  state-building  cultural_exchange
27 days ago
Cluster failure: Why fMRI inferences for spatial extent have inflated false-positive rates
"Functional MRI (fMRI) is 25 years old, yet surprisingly its most common statistical methods have not been validated using real data. Here, we used resting-state fMRI data from 499 healthy controls to conduct 3 million task group analyses. Using this null data with different experimental designs, we estimate the incidence of significant results. In theory, we should find 5% false positives (for a significance threshold of 5%), but instead we found that the most common software packages for fMRI analysis (SPM, FSL, AFNI) can result in false-positive rates of up to 70%. These results question the validity of some 40,000 fMRI studies and may have a large impact on the interpretation of neuroimaging results."

--- Nichols is a serious guy (and co-author of one of the best fMRI textbooks I've seen).
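--- For contrast, a toy simulation (mine, not the paper's analysis) of the benchmark the abstract invokes: on pure null data, a correctly calibrated test at alpha = 0.05 rejects about 5% of the time; the paper's point is that cluster-extent fMRI inference can reject up to 70% of the time on comparable null data.

```python
import numpy as np
from math import sqrt

rng = np.random.default_rng(42)
n_tests, n = 20000, 30
crit = 2.045  # two-sided t critical value for alpha = 0.05, df = 29

rejections = 0
for _ in range(n_tests):
    x = rng.normal(size=n)                     # null data: the mean really is 0
    t = x.mean() / (x.std(ddof=1) / sqrt(n))   # one-sample t statistic
    rejections += abs(t) > crit

rate = rejections / n_tests
print(rate)  # close to the nominal 0.05
```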
to:NB  spatial_statistics  hypothesis_testing  fmri  neural_data_analysis  statistics  bad_data_analysis  nichols.thomas_e.
27 days ago
The Market for Data: The Changing Role of Social Sciences in Shaping the Law by Elizabeth Warren :: SSRN
"A vigorous market for scholarly data exists, as journalists, lobbyists and legislators search for facts to pepper their public statements and better influence public opinion. In the bankruptcy area, data providers, such as the Credit Research Center located at Georgetown University, have taken money from the consumer credit industry to produce studies supporting the credit industry's political positions. In the case of the CRC, the studies bear the University logo, but the Center describes the data as "proprietary," belonging exclusively to the industry funders who decide what data are released and what data are held private. This paper explores the implications of such funding arrangements on independent research and ultimately on the public debates."
to:NB  deceiving_us_has_become_an_industrial_process  natural_history_of_truthiness  corruption  academia  think_tanks  our_decrepit_institutions  social_science_methodology  social_life_of_the_mind  via:henry_farrell
27 days ago
[1606.09082] Formation of homophily in academic performance: students prefer to change their friends rather than performance
"Homophily, the tendency of individuals to associate with others who share similar traits, has been identified as a major driving force in the formation and evolution of social ties. In many cases, it is not clear if homophily is the result of a socialization process, where individuals change their traits according to the dominance of that trait in their local social networks, or if it results from a selection process, in which individuals reshape their social networks so that their traits match those in the new environment. Here we demonstrate the detailed temporal formation of strong homophily in academic achievements of high school and university students. We analyze a unique dataset that contains information about the detailed time evolution of a friendship network of 6,000 students across 42 months. Combining the evolving social network data with the time series of the academic performance (GPA) of individual students, we show that academic homophily is a result of selection: students prefer to gradually reorganize their social networks according to their performance levels, rather than adapting their performance to the level of their local group. We find no signs for a pull effect, where a social environment of good performers motivates bad students to improve their performance. We are able to understand the underlying dynamics of grades and networks with a simple model. The lack of a social pull effect in classical educational settings could have important implications for the understanding of the observed persistence of segregation, inequality and social immobility in societies."

--- On a quick skim, they do not actually address the confounding problem (and I wonder if they have actually read their reference [41]).
to:NB  to_read  social_networks  homophily  social_influence  education  re:homophily_and_confounding  to_be_shot_after_a_fair_trial
27 days ago
A Novel Nonparametric Approach for Neural Encoding and Decoding Models of Multimodal Receptive Fields
"Pyramidal neurons recorded from the rat hippocampus and entorhinal cortex, such as place and grid cells, have diverse receptive fields, which are either unimodal or multimodal. Spiking activity from these cells encodes information about the spatial position of a freely foraging rat. At fine timescales, a neuron’s spike activity also depends significantly on its own spike history. However, due to limitations of current parametric modeling approaches, it remains a challenge to estimate complex, multimodal neuronal receptive fields while incorporating spike history dependence. Furthermore, efforts to decode the rat’s trajectory in one- or two-dimensional space from hippocampal ensemble spiking activity have mainly focused on spike history–independent neuronal encoding models. In this letter, we address these two important issues by extending a recently introduced nonparametric neural encoding framework that allows modeling both complex spatial receptive fields and spike history dependencies. Using this extended nonparametric approach, we develop novel algorithms for decoding a rat’s trajectory based on recordings of hippocampal place cells and entorhinal grid cells. Results show that both encoding and decoding models derived from our new method performed significantly better than state-of-the-art encoding and decoding models on 6 minutes of test data. In addition, our model’s performance remains invariant to the apparent modality of the neuron’s receptive field."
to:NB  neuroscience  neural_coding_and_decoding  neural_data_analysis  statistics  nonparametrics
29 days ago
A Single Hidden Layer Feedforward Network with Only One Neuron in the Hidden Layer Can Approximate Any Univariate Function
"The possibility of approximating a continuous function on a compact subset of the real line by a feedforward single hidden layer neural network with a sigmoidal activation function has been studied in many papers. Such networks can approximate an arbitrary continuous function provided that an unlimited number of neurons in a hidden layer is permitted. In this note, we consider constructive approximation on any finite interval of $\mathbb{R}$ by neural networks with only one neuron in the hidden layer. We construct algorithmically a smooth, sigmoidal, almost monotone activation function $\sigma$ providing approximation to an arbitrary continuous function within any degree of accuracy. This algorithm is implemented in a computer program, which computes the value of $\sigma$ at any reasonable point of the real axis."
29 days ago
Mastering Feature Engineering - O'Reilly Media
"Feature engineering is essential to applied machine learning, but using domain knowledge to strengthen your predictive models can be difficult and expensive. To help fill the information gap on feature engineering, this complete hands-on guide teaches beginning-to-intermediate data scientists how to work with this widely practiced but little discussed topic.

"Author Alice Zheng explains common practices and mathematical principles to help engineer features for new data and tasks. If you understand basic machine learning concepts like supervised and unsupervised learning, you’re ready to get started. Not only will you learn how to implement feature engineering in a systematic and principled way, you’ll also learn how to practice better data science."
to:NB  books:noted  data_mining  statistics  data_analysis  to_teach:data-mining  kith_and_kin  zheng.alice
4 weeks ago
Have human societies evolved? Evidence from history and pre-history - Springer
"I ask whether social evolutionary theories found in sociology, archaeology, and anthropology are useful in explaining human development from the Stone Age to the present-day. My data are partly derived from the four volumes of The Sources of Social Power, but I add statistical data on the growth of complexity and power in human groups. I distinguish three levels of evolutionary theory. The first level offers a minimalist definition of evolution in terms of social groups responding and adapting to changes in their social and natural environment. This is acceptable but trivial. The hard part is to elucidate what kinds of response are drawn from what changes, and all sociology shares in this difficulty. This model also tends to over-emphasize external pressures and neglect human inventiveness. The second level of theory is “multilineal” evolution in which various paths of endogenous development, aided by diffusion of practices between societies, dominate the historical and pre-historical record. This is acceptable as a model applied to some times, places, and practices, but when applied more generally it slides into a multi-causal analysis that is also conventional in the social sciences. The third level is a theory of general evolution for the entire human experience. Here materialist theories are dominant but they are stymied by their neglect of ideological, military, and political power relations. There is no acceptable theory of general social evolution. Thus the contribution of social evolutionary theory to the social sciences has been limited."
to:NB  social_evolution  cultural_evolution  mann.michael  re:do-institutions-evolve
4 weeks ago
Expectations of brilliance underlie gender distributions across academic disciplines | Science
"The gender imbalance in STEM subjects dominates current debates about women’s underrepresentation in academia. However, women are well represented at the Ph.D. level in some sciences and poorly represented in some humanities (e.g., in 2011, 54% of U.S. Ph.D.’s in molecular biology were women versus only 31% in philosophy). We hypothesize that, across the academic spectrum, women are underrepresented in fields whose practitioners believe that raw, innate talent is the main requirement for success, because women are stereotyped as not possessing such talent. This hypothesis extends to African Americans’ underrepresentation as well, as this group is subject to similar stereotypes. Results from a nationwide survey of academics support our hypothesis (termed the field-specific ability beliefs hypothesis) over three competing hypotheses."
to:NB  education  academia  sexism  racism  to_be_shot_after_a_fair_trial
5 weeks ago
Lowes, J.L.: The Road to Xanadu: A Study in the Ways of the Imagination. (eBook, Paperback and Hardcover)
"John Livingston Lowes's classic work shows how various images from Coleridge's extensive reading, particularly in travel literature, coalesced to form the imagistic texture of his two most famous poems, "The Rime of the Ancient Mariner" and "Kubla Khan.""
books:recommended  books:owned  creativity  imagination  psychology  literary_history  literary_criticism  romanticism  poetry
5 weeks ago
Dupree, L.: Afghanistan (eBook, Paperback and Hardcover).
"The ancient land and the modern nation of Afghanistan are the subject of Louis Dupree's book. Both in the text and in over a hundred illustrations, he identifies the major patterns of Afghan history, society, and culture as they have developed from the Stone Age to the present.
"Originally published in 1973."

--- A classic, but now only historical.
books:recommended  in_NB  afghanistan  anthropology  20th_century_history  19th_century_history
5 weeks ago
Vartanian, A.: Diderot and Descartes (eBook, Paperback and Hardcover).
"A study of scientific naturalism in the Enlightenment. In tracing the materialism of Diderot, La Mettrie, Buffon, and D’Holbach to its sources, it offers a fresh appraisal of the total influence of Descartes on the Enlightenment."
in_NB  books:recommended  coveted  history_of_ideas  philosophy  enlightenment  vartanian.aram  de_la_mettrie.julien_offray  descartes.rene  diderot.denis
5 weeks ago
Vartanian, A.: La Mettrie's L'Homme Machine (eBook, Paperback and Hardcover).
"As a classic of the French Enlightenment, L’Homme Machine has in the past been of equal interest to students of philosophy, science, and literature. The present edition offers the first established text, with extensive notes. In his introduction, Dr. Vartanian discusses La Mettrie’s thesis, its sources, the place of the man-machine idea in the development of La Mettrie’s materialism, and its critical impact on the intellectual struggles of the eighteenth century."
books:recommended  in_NB  history_of_ideas  history_of_science  materialism  man_a_machine  philosophy  de_la_mettrie.julien_offray  vartanian.aram
5 weeks ago
