Factor Modelling for High-Dimensional Time Series: Inference and Model Selection - Chan - 2016 - Journal of Time Series Analysis
"Analysis of high-dimensional time series data is of increasing interest among different fields. This article studies high-dimensional time series from a dimension reduction perspective using factor modelling. Statistical inference is conducted using eigen-analysis of a certain non-negative definite matrix related to autocovariance matrices of the time series, which is applicable to fixed or increasing dimension. When the dimension goes to infinity, the rate of convergence and limiting distributions of estimated factors are established. Using the limiting distributions of estimated factors, a high-dimensional final prediction error criterion is proposed to select the number of factors. Asymptotic properties of the criterion are illustrated by simulation studies and real applications."
to:NB  time_series  factor_analysis  high-dimensional_statistics  statistics
16 hours ago
Conjuring Asia | East Asian History | Cambridge University Press
"The promise of magic has always commanded the human imagination, but the story of industrial modernity is usually seen as a process of disenchantment. Drawing on the writings and performances of the so-called 'Golden Age Magicians' from the turn of the twentieth century, Chris Goto-Jones unveils the ways in which European and North American encounters with (and representations of) Asia - the fabled Mystic East - worked to re-enchant experiences of the modern world. Beginning with a reconceptualization of the meaning of 'modern magic' itself - moving beyond conventional categories of 'real' and 'fake' magic - Goto-Jones' acclaimed book guides us on a magical mystery tour around India, China and Japan, showing us levitations and decapitations, magic duels and bullet catches, goldfish bowls and paper butterflies. In the end, this mesmerizing book reveals Orientalism as a kind of magic in itself, casting a spell over Western culture that leaves it transformed even today."
to:NB  books:noted  magic  orientalism  modernity  history
yesterday
Critically evaluating the theory and performance of Bayesian analysis of macroevolutionary mixtures
"Bayesian analysis of macroevolutionary mixtures (BAMM) has recently taken the study of lineage diversification by storm. BAMM estimates the diversification-rate parameters (speciation and extinction) for every branch of a study phylogeny and infers the number and location of diversification-rate shifts across branches of a tree. Our evaluation of BAMM reveals two major theoretical errors: (i) the likelihood function (which estimates the model parameters from the data) is incorrect, and (ii) the compound Poisson process prior model (which describes the prior distribution of diversification-rate shifts across branches) is incoherent. Using simulation, we demonstrate that these theoretical issues cause statistical pathologies; posterior estimates of the number of diversification-rate shifts are strongly influenced by the assumed prior, and estimates of diversification-rate parameters are unreliable. Moreover, the inability to correctly compute the likelihood or to correctly specify the prior for rate-variable trees precludes the use of Bayesian approaches for testing hypotheses regarding the number and location of diversification-rate shifts using BAMM."
to:NB  phylogenetics  statistics  re:phil-of-bayes_paper
yesterday
Constraint, natural selection, and the evolution of human body form

--- Not obvious to me how they can pick out the constraints here (maybe assuming equal within-group covariances, and assuming they reflect constraints?), but presumably that's addressed in the paper.
to:NB  human_evolution  evolutionary_biology
yesterday
Unpaid, stressed, and confused: patients are the health care system's free labor - Vox
I'll just add that in my experience, this work often falls on the healthy spouse or children of a seriously ill person.
yesterday
Spiritual Despots: Modern Hinduism and the Genealogies of Self-Rule, Scott
"Historians of religion have examined at length the Protestant Reformation and the liberal idea of the self-governing individual that arose from it. In Spiritual Despots, J. Barton Scott reveals an unexamined piece of this story: how Protestant technologies of asceticism became entangled with Hindu spiritual practices to create an ideal of the “self-ruling subject” crucial to both nineteenth-century reform culture and early twentieth-century anticolonialism in India. Scott uses the quaint term “priestcraft” to track anticlerical polemics that vilified religious hierarchy, celebrated the individual, and endeavored to reform human subjects by freeing them from external religious influence. By drawing on English, Hindi, and Gujarati reformist writings, Scott provides a panoramic view of precisely how the specter of the crafty priest transformed religion and politics in India.
"Through this alternative genealogy of the self-ruling subject, Spiritual Despots demonstrates that Hindu reform movements cannot be understood solely within the precolonial tradition, but rather need to be read alongside other movements of their period. The book’s focus moves fluidly between Britain and India—engaging thinkers such as James Mill, Keshub Chunder Sen, Max Weber, Karsandas Mulji, Helena Blavatsky, M. K. Gandhi, and others—to show how colonial Hinduism shaped major modern discourses about the self. Throughout, Scott sheds much-needed light how the rhetoric of priestcraft and practices of worldly asceticism played a crucial role in creating a new moral and political order for twentieth-century India and demonstrates the importance of viewing the emergence of secularism through the colonial encounter."
to:NB  books:noted  india  cultural_exchange  rhetorical_self-fashioning  religion  history_of_religion  asceticism
3 days ago
After the Map: Cartography, Navigation, and the Transformation of Territory in the Twentieth Century, Rankin
"For most of the twentieth century, maps were indispensable. They were how governments understood, managed, and defended their territory, and during the two world wars they were produced by the hundreds of millions. Cartographers and journalists predicted the dawning of a “map-minded age,” where increasingly state-of-the-art maps would become everyday tools. By the century’s end, however, there had been decisive shift in mapping practices, as the dominant methods of land surveying and print publication were increasingly displaced by electronic navigation systems.
"In After the Map, William Rankin argues that although this shift did not render traditional maps obsolete, it did radically change our experience of geographic knowledge, from the God’s-eye view of the map to the embedded subjectivity of GPS. Likewise, older concerns with geographic truth and objectivity have been upstaged by a new emphasis on simplicity, reliability, and convenience. After the Map shows how this change in geographic perspective is ultimately a transformation of the nature of territory, both social and political."

--- Some of these claims seem just bizarre, especially that last sentence.
books:noted  to:NB  maps  to_be_shot_after_a_fair_trial
3 days ago
Segregation: A Global History of Divided Cities, Nightingale
"When we think of segregation, what often comes to mind is apartheid South Africa, or the American South in the age of Jim Crow—two societies fundamentally premised on the concept of the separation of the races. But as Carl H. Nightingale shows us in this magisterial history, segregation is everywhere, deforming cities and societies worldwide.
"Starting with segregation’s ancient roots, and what the archaeological evidence reveals about humanity’s long-standing use of urban divisions to reinforce political and economic inequality, Nightingale then moves to the world of European colonialism. It was there, he shows, segregation based on color—and eventually on race—took hold; the British East India Company, for example, split Calcutta into “White Town” and “Black Town.” As we follow Nightingale’s story around the globe, we see that division replicated from Hong Kong to Nairobi, Baltimore to San Francisco, and more. The turn of the twentieth century saw the most aggressive segregation movements yet, as white communities almost everywhere set to rearranging whole cities along racial lines. Nightingale focuses closely on two striking examples: Johannesburg, with its state-sponsored separation, and Chicago, in which the goal of segregation was advanced by the more subtle methods of real estate markets and housing policy.
"For the first time ever, the majority of humans live in cities, and nearly all those cities bear the scars of segregation. This unprecedented, ambitious history lays bare our troubled past, and sets us on the path to imagining the better, more equal cities of the future."
to:NB  books:noted  cities  racism  world_history
6 days ago
Radium and the Secret of Life, Campos
"Before the hydrogen bomb indelibly associated radioactivity with death, many chemists, physicians, botanists, and geneticists believed that radium might hold the secret to life. Physicists and chemists early on described the wondrous new element in lifelike terms such as “decay” and “half-life,” and made frequent references to the “natural selection” and “evolution” of the elements. Meanwhile, biologists of the period used radium in experiments aimed at elucidating some of the most basic phenomena of life, including metabolism and mutation.
"From the creation of half-living microbes in the test tube to charting the earliest histories of genetic engineering, Radium and the Secret of Life highlights previously unknown interconnections between the history of the early radioactive sciences and the sciences of heredity. Equating the transmutation of radium with the biological transmutation of living species, biologists saw in metabolism and mutation properties that reminded them of the new element. These initially provocative metaphoric links between radium and life proved remarkably productive and ultimately led to key biological insights into the origin of life, the nature of heredity, and the structure of the gene. Radium and the Secret of Life recovers a forgotten history of the connections between radioactivity and the life sciences that existed long before the dawn of molecular biology."
to:NB  books:noted  radioactivity  molecular_biology  genetics  physics  biology  history_of_science  history_of_physics
7 days ago
Dogon Restudied: A Field Evaluation of the Work of Marcel Griaule [and Comments and Replies] on JSTOR
"This restudy of the Dogon of Mali asks whether the texts produced by Marcel Griaule depict a society that is recognizable to the researcher and to the Dogon today and answers the question more or less in the negative. The picture of Dogon religion presented in _Dieu d'eau_ and _Le renard pale_ proved impossible to replicate in the field, even as the shadowy remnant of a largely forgotten past. The reasons for this, it is suggested, lie in the particular field situation of Griaule's research, including features of the ethnographer's approach, the political setting, the experience and predilections of the informants, and the values of Dogon culture."

--- This is an extraordinary story of a cluster of (mostly) good intentions producing horribly skewed results, which then took on a bizarre life of their own.
to:NB  ethnography  epidemiology_of_representations  scholarly_misconstruction_of_reality  to:blog  natural_history_of_truthiness
7 days ago
How political idealism leads us astray - Vox
The book sounds like warmed-over Popper, but then I suppose he does need that every so often. (I say this as someone who imprinted _very_ thoroughly on _The Open Society and Its Enemies_.)
7 days ago
Learning Minimal Latent Directed Information Polytrees
"We propose an approach for learning latent directed polytrees as long as there exists an appropriately defined discrepancy measure between the observed nodes. Specifically, we use our approach for learning directed information polytrees where samples are available from only a subset of processes. Directed information trees are a new type of probabilistic graphical models that represent the causal dynamics among a set of random processes in a stochastic system. We prove that the approach is consistent for learning minimal latent directed trees. We analyze the sample complexity of the learning task when the empirical estimator of mutual information is used as the discrepancy measure."
to:NB  to_read  information_theory  causal_inference  causal_discovery  statistics  chow-liu_trees  coleman.todd
7 days ago
Ours to Hack and to Own, ed. Scholz and Schneider - OR Books
"Here, for the first time in one volume, are some of the most cogent thinkers and doers on the subject of the cooptation of the Internet, and how we can resist and reverse the process. The activists who have put together Ours to Hack and to Own argue for a new kind of online economy: platform cooperativism, which combines the rich heritage of cooperatives with the promise of 21st-century technologies, free from monopoly, exploitation, and surveillance.
"The on-demand economy is reversing the rights and protections workers fought for centuries to win. Ordinary Internet users, meanwhile, retain little control over their personal data. While promising to be the great equalizers, online platforms have often exacerbated social inequalities. Can the Internet be owned and governed differently? What if Uber drivers set up their own platform, or if a city’s residents controlled their own version of Airbnb? This book shows that another kind of Internet is possible—and that, in a new generation of online platforms, it is already taking shape."
to:NB  books:noted  workers_cooperatives  networked_life
8 days ago
Bail, C.A.: Terrified: How Anti-Muslim Fringe Organizations Became Mainstream. (eBook, Paperback and Hardcover)
"In July 2010, Terry Jones, the pastor of a small fundamentalist church in Florida, announced plans to burn two hundred Qur’ans on the anniversary of the September 11 attacks. Though he ended up canceling the stunt in the face of widespread public backlash, his threat sparked violent protests across the Muslim world that left at least twenty people dead. In Terrified, Christopher Bail demonstrates how the beliefs of fanatics like Jones are inspired by a rapidly expanding network of anti-Muslim organizations that exert profound influence on American understanding of Islam.
"Bail traces how the anti-Muslim narrative of the political fringe has captivated large segments of the American media, government, and general public, validating the views of extremists who argue that the United States is at war with Islam and marginalizing mainstream Muslim-Americans who are uniquely positioned to discredit such claims. Drawing on cultural sociology, social network theory, and social psychology, he shows how anti-Muslim organizations gained visibility in the public sphere, commandeered a sense of legitimacy, and redefined the contours of contemporary debate, shifting it ever outward toward the fringe. Bail illustrates his pioneering theoretical argument through a big-data analysis of more than one hundred organizations struggling to shape public discourse about Islam, tracing their impact on hundreds of thousands of newspaper articles, television transcripts, legislative debates, and social media messages produced since the September 11 attacks. The book also features in-depth interviews with the leaders of these organizations, providing a rare look at how anti-Muslim organizations entered the American mainstream."
to:NB  books:noted  islamophobia  running_dogs_of_reaction  social_movements  whats_gone_wrong_with_america  the_continuing_crises
8 days ago
How Multiple Imputation Makes a Difference
"Political scientists increasingly recognize that multiple imputation represents a superior strategy for analyzing missing data to the widely used method of list- wise deletion. However, there has been little systematic investigation of how mul- tiple imputation affects existing empirical knowledge in the discipline. This article presents the first large-scale examination of the empirical effects of substituting mul- tiple imputation for listwise deletion in political science. The examination focuses on research in the major subfield of comparative and international political economy (CIPE) as an illustrative example. Specifically, I use multiple imputation to reana- lyze the results of almost every quantitative CIPE study published during a recent five-year period in International Organization and World Politics, two of the leading subfield journals in CIPE. The outcome is striking: in almost half of the studies, key results “disappear” (by conventional statistical standards) when reanalyzed."
to:NB  have_skimmed  re:ADAfaEPoV  missing_data  statistics  political_science  via:henry_farrell
8 days ago
teachers are laborers, not merchants – Fredrik deBoer
"Here’s the model that the constant “online education will replace physical colleges” types advance: education is about gaining knowledge; knowledge is stored in the heads of teachers; schooling is the transfer of that knowledge from the teacher’s head to the student’s head; physical facilities are expensive, but online equivalents are cheap; therefore someone will build an Amazon that cuts out the overhead of the physical campus and connects students to teachers in the online space or, alternatively, cuts teachers out altogether and just transfers the information straight into the brains of the student.
"The basic failure here is the basic model of transfer of information, like teachers are merchants who sell discrete products known as knowledge or skills. In fact education is far more a matter of labor, of teachers working to push that information into the heads of students, or more accurately, to compel students to push it into their own heads. And this work is fundamentally social, and requires human accountability, particularly for those who lack prerequisite skills.
"I’ve said this before: if education was really about access to information, then anyone with a library card could have skipped college well before the internet. The idea that the internet suddenly made education obsolete because it freed information from being hidden away presumes that information was kept under lock and key. But, you know, books exist and are pretty cheap and they contain information. Yet if you have a class of undergraduates sit in a room for an hour twice a week with some chemistry textbooks, I can tell you that most of them aren’t going to learn a lot of chemistry. The printing press did not make teachers obsolete, and neither has the internet."
10 days ago
Temporal Evolution of Social Innovation: What Matters? : SIAM Journal on Applied Dynamical Systems: Vol. 15, No. 3 (Society for Industrial and Applied Mathematics)
"Variations in patterns of innovation propagation found across complex networks are governed by the preference for an innovation and the topology (or connectivity pattern) of a network. This paper incorporates the interplay of these two features, which has received scant attention so far, in a simple model to study the temporal evolution of innovation in a social network. An individual upon interaction with an acceptor in the neighborhood progresses from an uninformed state to being informed before accepting the innovation, with a probability $\lambda$ that specifies the preference for innovation. Using only one intermediate information acquisition stage, the model concisely brings out a variety of patterns. Time taken to attain maximum velocity in a class of connectivity $k$ and population $N_k$ depends on $\lambda^{-2}k^{-2}N_{k}^{1/2}$. More importantly, we establish the lower bound that the average connectivity of a random network having minimum connectivity as low as 2 can attain and still be able to overtake the corresponding innovation emergence in a scale-free network. We show computationally and analytically the conditions in which the propagation in random networks may lead or lag behind that in scale-free networks. Hierarchical propagation is evident across connectivity classes within scale-free networks, as well as across random networks with distinct values of $k$, population. For highly preferred innovations, however, the hierarchy within scale-free networks tends to be insignificant. We verify using stochastic dominance, an uncertainty in class contributions in the upper range of connectivity. This makes innovation hard to administer in finite size networks."

--- This is almost what's needed, except there should be variation in susceptibility, correlated with degree. (Perhaps that would work out as amplifying or reducing effective degree, with constant susceptibility?)
to:NB  to_read  diffusion_of_innovations  epidemic_models  social_networks  re:do-institutions-evolve
11 days ago
Ancestors, Territoriality, and Gods - A Natural History of Religion | Ina Wunn | Springer
"This books sets out to explain how and why religion came into being. Today this question is as fascinating as ever, especially since religion has moved to the centre of socio-political relationships. In contrast to the current, but incomplete approaches from disciplines such as cognitive science and psychology, the present authors adopt a new approach, equally manifest and constructive, that explains the origins of religion based strictly on behavioural biology. They employ accepted research results that remove all need for speculation. Decisive factors for the earliest demonstrations of religion are thus territorial behaviour and ranking, coping with existential fears, and conflict solution with the help of rituals. These in turn, in a process of cultural evolution, are shown to be the roots of the historical and contemporary religions."

--- Because "existential fears" is clearly a non-psychological concept.
to:NB  books:noted  religion  psychoceramica
15 days ago
A Delayed Review of This Changes Everything: Capitalism vs the Climate by Naomi Klein
"The view that capitalism is a style of thinking, progress is a myth, and political contestation is irrelevant to “true” social change belongs not just to this one book but to all the commentators who found nothing to criticize. That’s the real problem."
book_reviews  climate_change  progressive_forces  dorman.peter  have_read  via:?
15 days ago
Does Peer Review Work? An Experiment of Experimentalism by Daniel E. Ho :: SSRN
"Ensuring the accuracy and consistency of highly decentralized and discretionary decision making is a core challenge for the administrative state. The widely influential school of “democratic experimentalism” posits that peer review provides a way forward, but systematic evidence remains limited. This Article provides the first empirical study of the feasibility and effects of peer review as a governance mechanism, based on a unique randomized controlled trial conducted with the largest health department in Washington State (Public Health-Seattle and King County). We randomly assigned half of the food safety inspection staff to engage in an intensive peer review process for a 4-month period. Pairs of inspectors jointly visited establishments, separately assessed health code violations, and deliberated about divergences on health code implementation. Our findings are threefold. First, observing identical conditions, inspectors disagreed 60% of the time. These joint inspection results in turn helped to pinpoint challenging code items and to efficiently develop training and guidance documents during weekly sessions. Second, analyzing over 28,000 independently conducted inspections across the peer review and control groups, we find that the intervention caused an increase in violations detected and scored by 17-19%. Third, peer review appeared to decrease variability across inspectors, thereby improving the consistency of inspections. As a result of this trial, King County has now instituted peer review as a standard practice. Our study has rich implications for the feasibility, promise, practice, and pitfalls of peer review, democratic experimentalism, and the administrative state."
to:NB  to_read  peer_review  public_policy  regulation  re:democratic_cognition  experimental_sociology
17 days ago
Campbell, J.: Polarized: Making Sense of a Divided America. (eBook and Hardcover)
"Many continue to believe that the United States is a nation of political moderates. In fact, it is a nation divided. It has been so for some time and has grown more so. This book provides a new and historically grounded perspective on the polarization of America, systematically documenting how and why it happened.
"Polarized presents commonsense benchmarks to measure polarization, draws data from a wide range of historical sources, and carefully assesses the quality of the evidence. Through an innovative and insightful use of circumstantial evidence, it provides a much-needed reality check to claims about polarization. This rigorous yet engaging and accessible book examines how polarization displaced pluralism and how this affected American democracy and civil society.
"Polarized challenges the widely held belief that polarization is the product of party and media elites, revealing instead how the American public in the 1960s set in motion the increase of polarization. American politics became highly polarized from the bottom up, not the top down, and this began much earlier than often thought. The Democrats and the Republicans are now ideologically distant from each other and about equally distant from the political center. Polarized also explains why the parties are polarized at all, despite their battle for the decisive median voter. No subject is more central to understanding American politics than political polarization, and no other book offers a more in-depth and comprehensive analysis of the subject than this one."
to:NB  books:noted  us_politics  political_science  whats_gone_wrong_with_america
17 days ago
[1608.00607] Configuring Random Graph Models with Fixed Degree Sequences
"Random graph null models have found widespread application in diverse research communities analyzing network datasets. The most popular family of random graph null models, called configuration models, are defined as uniform distributions over a space of graphs with a fixed degree sequence. Commonly, properties of an empirical network are compared to properties of an ensemble of graphs from a configuration model in order to quantify whether empirical network properties are meaningful or whether they are instead a common consequence of the particular degree sequence. In this work we study the subtle but important decisions underlying the specification of a configuration model, and investigate the role these choices play in graph sampling procedures and a suite of applications. We place particular emphasis on the importance of specifying the appropriate graph labeling---stub-labeled or vertex-labeled---under which to consider a null model, a choice that closely connects the study of random graphs to the study of random contingency tables. We show that the choice of graph labeling is inconsequential for studies of simple graphs, but can have a significant impact on analyses of multigraphs or graphs with self-loops. The importance of these choices is demonstrated through a series of three in-depth vignettes, analyzing three different network datasets under many different configuration models and observing substantial differences in study conclusions under different models. We argue that in each case, only one of the possible configuration models is appropriate. While our work focuses on undirected static networks, it aims to guide the study of directed networks, dynamic networks, and all other network contexts that are suitably studied through the lens of random graph null models."
to:NB  network_data_analysis  statistics  null_models  to_teach:baby-nets
17 days ago
Headley, J.M.: The Europeanization of the World: On the Origins of Human Rights and Democracy. (eBook, Paperback and Hardcover)
"The Europeanization of the World puts forward a defense of Western civilization and the unique gifts it has bequeathed to the world-in particular, human rights and constitutional democracy-at a time when many around the globe equate the West with hubris and thinly veiled imperialism. John Headley argues that the Renaissance and the Reformation provided the effective currents for the development of two distinctive political ideas. The first is the idea of a common humanity, derived from antiquity, developed through natural law, and worked out in the new emerging global context to provide the basis for today's concept of universal human rights. The second is the idea of political dissent, first posited in the course of the Protestant Reformation and later maturing in the politics of the British monarchy.
"Headley traces the development and implications of this first idea from antiquity to the present. He examines the English revolution of 1688 and party government in Britain and America into the early nineteenth century. And he challenges the now--common stance in historical studies of moral posturing against the West. Headley contends that these unique ideas are Western civilization's most precious export, however presently distorted. Certainly European culture has its dark side--Auschwitz is but one example. Yet as Headley shows, no other civilization in history has bequeathed so sustained a tradition of universalizing aspirations as the West. The Europeanization of the World makes an argument that is controversial but long overdue. Written by one of our preeminent scholars of the Renaissance and Reformation, this elegantly reasoned book is certain to spark a much-needed reappraisal of the Western tradition."
to:NB  books:noted  democracy  human_rights  modernity  to_be_shot_after_a_fair_trial
17 days ago
Governed by a Spirit of Opposition: The Origins of American Political Practice in Colonial Philadelphia
"During the colonial era, ordinary Philadelphians played an unusually active role in political life. Because the city lacked a strong central government, private individuals working in civic associations of their own making shouldered broad responsibility for education, poverty relief, church governance, fire protection, and even taxation and military defense. These organizations dramatically expanded the opportunities for white men—rich and poor alike—to shape policies that immediately affected their communities and their own lives.
"In Governed by a Spirit of Opposition, Jessica Choppin Roney explains how allowing people from all walks of life to participate in political activities amplified citizen access and democratic governance. Merchants, shopkeepers, carpenters, brewers, shoemakers, and silversmiths served as churchwardens, street commissioners, constables, and Overseers of the Poor. They volunteered to fight fires, organized relief for the needy, contributed money toward the care of the sick, took up arms in defense of the community, raised capital for local lending, and even interjected themselves in Indian diplomacy. Ultimately, Roney suggests, popular participation in charity, schools, the militia, and informal banks empowered people in this critically important colonial city to overthrow the existing government in 1776 and re-envision the parameters of democratic participation.
"Governed by a Spirit of Opposition argues that the American Revolution did not occasion the birth of commonplace political activity or of an American culture of voluntary association. Rather, the Revolution built upon a long history of civic engagement and a complicated relationship between the practice of majority-rule and exclusionary policy-making on the part of appointed and self-selected constituencies."
to:NB  books:noted  american_history  civil_society  self-organization  heard_the_talk  where_by_"heard_the_talk"_i_mean_"heard_it_explained_over_drinks"  institutions  re:democratic_cognition  democracy
20 days ago
The Rise of Modern Science Explained | History Science and Technology | Cambridge University Press
"For centuries, laymen and priests, lone thinkers and philosophical schools in Greece, China, the Islamic world and Europe reflected with wisdom and perseverance on how the natural world fits together. As a rule, their methods and conclusions, while often ingenious, were misdirected when viewed from the perspective of modern science. In the 1600s thinkers such as Galileo, Kepler, Descartes, Bacon and many others gave revolutionary new twists to traditional ideas and practices, culminating in the work of Isaac Newton half a century later. It was as if the world was being created anew. But why did this recreation begin in Europe rather than elsewhere? This book caps H. Floris Cohen's career-long effort to find answers to this classic question. Here he sets forth a rich but highly accessible account of what, against many odds, made it happen and why."
to:NB  books:noted  history_of_science  comparative_history  scientific_revolution
27 days ago
Struck, P.T.: Divination and Human Nature: A Cognitive History of Intuition in Classical Antiquity. (eBook and Hardcover)
"Divination and Human Nature casts a new perspective on the rich tradition of ancient divination—the reading of divine signs in oracles, omens, and dreams. Popular attitudes during classical antiquity saw these readings as signs from the gods while modern scholars have treated such beliefs as primitive superstitions. In this book, Peter Struck reveals instead that such phenomena provoked an entirely different accounting from the ancient philosophers. These philosophers produced subtle studies into what was an odd but observable fact—that humans could sometimes have uncanny insights—and their work signifies an early chapter in the cognitive history of intuition.
"Examining the writings of Plato, Aristotle, the Stoics, and the Neoplatonists, Struck demonstrates that they all observed how, setting aside the charlatans and swindlers, some people had premonitions defying the typical bounds of rationality. Given the wide differences among these ancient thinkers, Struck notes that they converged on seeing this surplus insight as an artifact of human nature, projections produced under specific conditions by our physiology. For the philosophers, such unexplained insights invited a speculative search for an alternative and more naturalistic system of cognition.
"Recovering a lost piece of an ancient tradition, Divination and Human Nature illustrates how philosophers of the classical era interpreted the phenomena of divination as a practice closer to intuition and instinct than magic."
to:NB  books:noted  divination  history_of_ideas  philosophy  ancient_history  re:evidence-based_haruspicy
28 days ago
Common Property | Boston Review
Social insurance as rents from a share in (fairly literally) the commonwealth.
political_philosophy  welfare_state  hayek.f.a._von  paine.thomas  anderson.elizabeth  have_read
28 days ago
Politics and Institutionalism: Explaining Durability and Change on JSTOR
"From the complex literatures on "institutionalisms" in political science and sociology, various components of institutional change are identified: mutability, contradiction, multiplicity, containment and diffusion, learning and innovation, and mediation. This exercise results in a number of clear prescriptions for the analysis of politics and institutional change: disaggregate institutions into schemas and resources; decompose institutional durability into processes of reproduction, disruption, and response to disruption; and, above all, appreciate the multiplicity and heterogeneity of the institutions that make up the social world. Recent empirical work on identities, interests, alternatives, and political innovation illustrates how political scientists and sociologists have begun to document the consequences of institutional contradiction and multiplicity and to trace the workings of institutional containment, diffusion, and mediation."
to:NB  to_read  institutions  social_theory  diffusion_of_innovations  re:do-institutions-evolve  via:henry_farrell
29 days ago
Transformative Treatments
"Contemporary social-scientific research seeks to identify specific causal mechanisms for outcomes of theoretical interest. Experiments that randomize populations to treatment and control conditions are the “gold standard” for causal inference. We identify, describe, and analyze the problem posed by transformative treatments. Such treatments radically change treated individuals in a way that creates a mismatch in populations, but this mismatch is not empirically detectable at the level of counterfactual dependence. In such cases, the identification of causal pathways is underdetermined in a previously unrecognized way. Moreover, if the treatment is indeed transformative it breaks the inferential structure of the experimental design. Transformative treatments are not curiosities or “corner cases”, but are plausible mechanisms in a large class of events of theoretical interest, particularly ones where deliberate randomization is impractical and quasi-experimental designs are sought instead. They cast long-running debates about treatment and selection effects in a new light, and raise new methodological challenges."

--- After skimming, I'm left spluttering "but, but, _every_ intervention creates a new population!", so I am probably missing something fundamental, and should do more than just skim.
to:NB  causality  causal_inference  barely-comprehensible_metaphysics  healy.kieran  have_skimmed
29 days ago
[1607.05506] Distribution-dependent concentration inequalities for tighter generalization bounds
"We prove several distribution-dependent extensions of Hoeffding and McDiarmid's inequalities with (difference-) unbounded and hierarchically (difference-) bounded functions. For this purpose, several assumptions about the probabilistic boundedness and bounded differences are introduced. Our approaches improve the previous concentration inequalities' bounds, and achieve tight bounds in some exceptional cases where the original inequalities cannot hold. Furthermore, we discuss the potential applications of our extensions in VC dimension and Rademacher complexity. Then we obtain generalization bounds for (difference-) unbounded loss functions and tighten the existing generalization bounds."
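--- For reference, the baseline these results extend: Hoeffding's inequality bounds the deviation of a sample mean of [0,1]-valued variables, but the bound is distribution-free, so for concentrated distributions it is very loose. A quick numerical check (the distribution and constants are my choices, not the paper's):

```python
import numpy as np

# Standard Hoeffding: for n i.i.d. draws of X in [0, 1],
# P(|mean - E[X]| >= t) <= 2 * exp(-2 * n * t**2).
rng = np.random.default_rng(0)
n, t, trials = 200, 0.1, 10_000

# A distribution far from worst-case: Beta(8, 8) is concentrated near 1/2,
# so the observed deviation probability sits well below the bound.
samples = rng.beta(8, 8, size=(trials, n))
deviations = np.abs(samples.mean(axis=1) - 0.5)
empirical = (deviations >= t).mean()
hoeffding = 2 * np.exp(-2 * n * t**2)

print(f"empirical P(|mean - 1/2| >= {t}): {empirical:.4f}")
print(f"Hoeffding bound:                  {hoeffding:.4f}")
```

Distribution-dependent refinements aim to close exactly this kind of gap.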
to:NB  deviation_inequalities  probability  learning_theory
4 weeks ago
[1604.01575] Clustering implies geometry in networks
"Network models with latent geometry have been used successfully in many applications in network science and other disciplines, yet it is usually impossible to tell if a given real network is geometric, meaning if it is a typical element in an ensemble of random geometric graphs. Here we identify structural properties of networks that guarantee that random graphs having these properties are geometric. Specifically we show that random graphs in which expected degree and clustering of every node are fixed to some constants are equivalent to random geometric graphs on the real line, if clustering is sufficiently strong. Large numbers of triangles, homogeneously distributed across all nodes as in real networks, are thus a consequence of network geometricity. The methods we use to prove this are quite general and applicable to other network ensembles, geometric or not, and to certain problems in quantum gravity."
to:NB  network_data_analysis  network_formation  latent_space_network_models
4 weeks ago
[1501.06835] Emergence of Soft Communities from Geometric Preferential Attachment
"All real networks are different, but many have some structural properties in common. There seems to be no consensus on what the most common properties are, but scale-free degree distributions, strong clustering, and community structure are frequently mentioned without question. Surprisingly, there exists no simple generative mechanism explaining all the three properties at once in growing networks. Here we show how latent network geometry coupled with preferential attachment of nodes to this geometry fills this gap. We call this mechanism geometric preferential attachment (GPA), and validate it against the Internet. GPA gives rise to soft communities that provide a different perspective on the community structure in networks. The connections between GPA and cosmological models, including inflation, are also discussed."
to:NB  networks  network_formation  community_discovery  latent_space_network_models
4 weeks ago
Denny, M.: Ecological Mechanics: Principles of Life’s Physical Interactions. (eBook and Hardcover)
"Plants and animals interact with each other and their surroundings, and these interactions—with all their complexity and contingency—control where species can survive and reproduce. In this comprehensive and groundbreaking introduction to the emerging field of ecological mechanics, Mark Denny explains how the principles of physics and engineering can be used to understand the intricacies of these remarkable relationships.
"Denny opens with a brief review of basic physics before introducing the fundamentals of diffusion, fluid mechanics, solid mechanics, and heat transfer, taking care to explain each in the context of living organisms. Why are corals of different shapes on different parts of a reef? How can geckos climb sheer walls? Why can birds and fish migrate farther than mammals? How do desert plants stay cool? The answers to these and a host of similar questions illustrate the principles of heat, mass, and momentum transport and set the stage for the book’s central topic—the application of these principles in ecology. Denny shows how variations in the environment—in both space and time—affect the performance of plants and animals. He introduces spectral analysis, a mathematical tool for quantifying the patterns in which environments vary, and uses it to analyze such subjects as the spread of invasive species. Synthesizing the book’s materials, the final chapters use ecological mechanics to predict the occurrence and consequences of extreme ecological events, explain the emergence of patterns in the distribution and abundance of organisms, and empower readers to explore further."
to:NB  books:noted  physics  biology  ecology  biophysics
4 weeks ago
Del Vecchio, D. and Murray, R.M.: Biomolecular Feedback Systems (eBook and Hardcover).
"This book provides an accessible introduction to the principles and tools for modeling, analyzing, and synthesizing biomolecular systems. It begins with modeling tools such as reaction-rate equations, reduced-order models, stochastic models, and specific models of important core processes. It then describes in detail the control and dynamical systems tools used to analyze these models. These include tools for analyzing stability of equilibria, limit cycles, robustness, and parameter uncertainty. Modeling and analysis techniques are then applied to design examples from both natural systems and synthetic biomolecular circuits. In addition, this comprehensive book addresses the problem of modular composition of synthetic circuits, the tools for analyzing the extent of modularity, and the design techniques for ensuring modular behavior. It also looks at design trade-offs, focusing on perturbations due to noise and competition for shared cellular resources."
to:NB  books:noted  biochemical_networks  feedback  biology  networks
4 weeks ago
[1607.06494] Stochastic Control via Entropy Compression
"We consider an agent trying to bring a system to an acceptable state by repeated probabilistic action (stochastic control). Specifically, in each step the agent observes the flaws present in the current state, selects one of them, and addresses it by probabilistically moving to a new state, one where the addressed flaw is most likely absent, but where one or more new flaws may be present. Several recent works on algorithmizations of the Lovász Local Lemma have established sufficient conditions for such an agent to succeed. Motivated by the paradigm of Partially Observable Markov Decision Processes (POMDPs) we study whether such stochastic control is also possible in a noisy environment, where both the process of state-observation and the process of state-evolution are subject to adversarial perturbation (noise). The introduction of noise causes the tools developed for LLL algorithmization to break down since the key LLL ingredient, the sparsity of the causality (dependence) relationship, no longer holds. To overcome this challenge we develop a new analysis where entropy plays a central role, both to measure the rate at which progress towards an acceptable state is made and the rate at which the noise undoes this progress. The end result is a sufficient condition that allows a smooth tradeoff between the intensity of the noise and the amenability of the system, recovering an asymmetric LLL condition in the noiseless case. To our knowledge, this is the first tractability result for a nontrivial class of POMDPs under stochastic memoryless control."
to:NB  control_theory  state-space_models  information_theory  via:ded-maxim
4 weeks ago
[1607.06534] The Landscape of Empirical Risk for Non-convex Losses
"We revisit the problem of learning a noisy linear classifier by minimizing the empirical risk associated to the square loss. While the empirical risk is non-convex, we prove that its structure is remarkably simple. Namely, when the sample size is larger than C d log d (with d the dimension and C a constant) the following happen with high probability: (a) The empirical risk has a unique local minimum (which is also the global minimum); (b) Gradient descent converges exponentially fast to the global minimizer, from any initialization; (c) The global minimizer approaches the true parameter at nearly optimal rate. The core of our argument is to establish a uniform convergence result for the gradients and Hessians of the empirical risk."
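--- A toy version of the setting, not the paper's construction: labels generated through a sigmoid link, square-loss empirical risk, plain gradient descent from an arbitrary start. In the n much larger than d log d regime, descent lands near the true parameter despite the non-convexity:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 2000, 10  # sample size well above d*log(d)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

w_true = rng.normal(size=d) / np.sqrt(d)
X = rng.normal(size=(n, d))
# Noisy labels: y = 1 with probability sigmoid(<w_true, x>)
y = (rng.random(n) < sigmoid(X @ w_true)).astype(float)

def grad(w):
    p = sigmoid(X @ w)
    # Gradient of the empirical square loss (1/n) * sum_i (p_i - y_i)^2
    return (2.0 / n) * (X.T @ ((p - y) * p * (1 - p)))

w = rng.normal(size=d)  # arbitrary initialization
for _ in range(3000):
    w -= 1.0 * grad(w)

err = np.linalg.norm(w - w_true) / np.linalg.norm(w_true)
print(f"relative error vs true parameter: {err:.3f}")
```

The step size and iteration count are hand-picked; the paper's point is that no careful initialization is needed.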
to:NB  learning_theory  empirical_processes  optimization  via:ded-maxim
4 weeks ago
Fast Patchwork Bootstrap for Quantifying Estimation Uncertainties in Sparse Random Networks
"We propose a new method of nonparametric bootstrap to quantify estimation uncertainties in large and possibly sparse random networks. The method is tailored for inference on functions of network degree distribution, under the assumption that both network degree distribution and network order are unknown. The key idea is based on adaptation of the “blocking” argument, developed for bootstrapping of time series and re-tiling of spatial data, to random networks. We sample network blocks (patches) and bootstrap the data within these patches. To select an optimal patch size, we develop a new computationally efficient and data-driven cross-validation algorithm. The proposed fast patchwork bootstrap (FPB) methodology further extends the ideas developed by [33] for a case of network mean degree, to inference on a degree distribution. In addition, the FPB is substantially less computationally expensive, requires less information on a graph, and is free from nuisance parameters. In our simulation study, we show that the new bootstrap method outperforms competing approaches by providing sharper and better calibrated confidence intervals for functions of a network degree distribution than other available approaches. We illustrate the FPB in application to a study of the Erdős collaboration network."
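--- The idea is easy to mock up, though this sketch skips the paper's patch-size cross-validation and any bias corrections (patches over-sample high-degree nodes); all sizes and counts here are mine:

```python
import random

random.seed(0)

# Toy sparse graph: Erdos-Renyi adjacency sets, standing in for a real network
n, p = 500, 0.01
adj = {v: set() for v in range(n)}
for u in range(n):
    for v in range(u + 1, n):
        if random.random() < p:
            adj[u].add(v)
            adj[v].add(u)

def patch_degrees(seed):
    """Degrees observed inside the radius-1 snowball patch around `seed`."""
    return [len(adj[v]) for v in {seed} | adj[seed]]

# Resample patches, then resample degrees within each sampled patch,
# and read off a percentile interval for the mean degree.
boot_means = []
for _ in range(400):
    degrees = []
    for _ in range(30):  # 30 patches per bootstrap replicate
        pd = patch_degrees(random.randrange(n))
        degrees += random.choices(pd, k=len(pd))
    boot_means.append(sum(degrees) / len(degrees))

boot_means.sort()
lo, hi = boot_means[10], boot_means[-11]  # ~95% percentile interval
print(f"patchwork-style 95% interval for mean degree: ({lo:.2f}, {hi:.2f})")
```
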
4 weeks ago
[1607.06565] Controlling for Latent Homophily in Social Networks through Inferring Latent Locations
"Social influence cannot be identified from purely observational data on social networks, because such influence is generically confounded with latent homophily, i.e., with a node's network partners being informative about the node's attributes and therefore its behavior. We show that _if_ the network grows according to either a community (stochastic block) model, or a continuous latent space model, then latent homophilous attributes can be consistently estimated from the global pattern of social ties. Moreover, these estimates are informative enough that controlling for them allows for unbiased and consistent estimation of social-influence effects in additive models. For community models, we also provide bounds on the finite-sample bias. These are the first results on the consistent estimation of social-influence effects in the presence of latent homophily, and we discuss the prospects for generalizing them."
self-promotion  social_networks  network_data_analysis  causal_inference  community_discovery  re:homophily_and_confounding  to:blog
4 weeks ago
Learning Fair Representations
"We propose a learning algorithm for fair classification that achieves both group fairness (the proportion of members in a protected group receiving positive classification is identical to the proportion in the population as a whole), and individual fairness (similar individuals should be treated similarly). We formulate fairness as an optimization problem of finding a good representation of the data with two competing goals: to encode the data as well as possible, while simultaneously obfuscating any information about membership in the protected group. We show positive results of our algorithm relative to other known techniques, on three datasets. Moreover, we demonstrate several advantages to our approach. First, our intermediate representation can be used for other classification tasks (i.e., transfer learning is possible); secondly, we take a step toward learning a distance metric which can find important dimensions of the data for classification."
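--- The group-fairness criterion in the abstract is just a comparison of positive-classification rates across groups; a toy check, with invented data and threshold:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1000
protected = rng.integers(0, 2, size=n)       # protected-group membership
scores = rng.random(n) + 0.1 * protected     # classifier scores that leak the group
y_hat = (scores > 0.55).astype(int)          # positive classification

# Group fairness (demographic parity): positive rates should match across groups.
rate_0 = y_hat[protected == 0].mean()
rate_1 = y_hat[protected == 1].mean()
print(f"positive rate, group 0: {rate_0:.2f}")
print(f"positive rate, group 1: {rate_1:.2f}")
print(f"parity gap: {abs(rate_1 - rate_0):.2f}")
```

The paper's contribution is learning a representation that drives this gap to zero while keeping the data useful for prediction.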

--- This looks really similar to Simon DeDeo's stuff.
to:NB  to_read  classifiers  re:prediction-without-racism  data_mining  privacy
4 weeks ago
Epidemic spreading on complex networks with community structures : Scientific Reports
"Many real-world networks display a community structure. We study two random graph models that create a network with similar community structure as a given network. One model preserves the exact community structure of the original network, while the other model only preserves the set of communities and the vertex degrees. These models show that community structure is an important determinant of the behavior of percolation processes on networks, such as information diffusion or virus spreading: the community structure can both enforce as well as inhibit diffusion processes. Our models further show that it is the mesoscopic set of communities that matters. The exact internal structures of communities barely influence the behavior of percolation processes across networks. This insensitivity is likely due to the relative denseness of the communities."
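--- The qualitative claim is easy to reproduce with bond percolation: compare mean outbreak sizes on a two-community graph against an edge-matched graph without communities (all parameters here are mine, not the paper's):

```python
import random

random.seed(2)

def er_edges(nodes, p):
    """Erdos-Renyi edges among the given nodes."""
    return [(u, v) for i, u in enumerate(nodes)
            for v in nodes[i + 1:] if random.random() < p]

def outbreak_size(n, edges, transmit=0.3, trials=200):
    """Mean final size of bond percolation (an SIR outbreak) from a random seed."""
    sizes = []
    for _ in range(trials):
        open_adj = {v: [] for v in range(n)}
        for u, v in edges:
            if random.random() < transmit:  # each edge transmits w.p. `transmit`
                open_adj[u].append(v)
                open_adj[v].append(u)
        seed = random.randrange(n)
        seen, stack = {seed}, [seed]
        while stack:
            for w in open_adj[stack.pop()]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        sizes.append(len(seen))
    return sum(sizes) / trials

n = 200
left, right = list(range(n // 2)), list(range(n // 2, n))
# Two dense communities joined by a handful of bridge edges...
community_edges = er_edges(left, 0.08) + er_edges(right, 0.08)
bridges = [(random.choice(left), random.choice(right)) for _ in range(5)]
# ...versus a single graph with the same number of edges and no communities.
all_pairs = [(u, v) for u in range(n) for v in range(u + 1, n)]
flat_edges = random.sample(all_pairs, len(community_edges) + len(bridges))

comm_mean = outbreak_size(n, community_edges + bridges)
flat_mean = outbreak_size(n, flat_edges)
print(f"mean outbreak, two communities + bridges:   {comm_mean:.1f}")
print(f"mean outbreak, same #edges, no communities: {flat_mean:.1f}")
```

With few bridges the community structure inhibits spread, matching the abstract's point that the mesoscopic partition, not the communities' internal wiring, is what matters.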
to:NB  epidemic_models  social_networks  re:do-institutions-evolve
4 weeks ago
Conflict Kitchen » Marathon Reading of the Shahnameh
"Over the course of three days, from noon until 8 pm, July 20-22, the public is invited to read passages from the Shahnameh in conjunction with scheduled readings by members of Pittsburgh’s Iranian community."

--- Of course I would bookmark this too late to do any good...
pittsburgh  persianate_culture  poetry
4 weeks ago
Delay Embeddings for Forced Systems. II. Stochastic Forcing | SpringerLink
"Takens’ Embedding Theorem forms the basis of virtually all approaches to the analysis of time series generated by nonlinear deterministic dynamical systems. It typically allows us to reconstruct an unknown dynamical system which gave rise to a given observed scalar time series simply by constructing a new state space out of successive values of the time series. This provides the theoretical foundation for many popular techniques, including those for the measurement of fractal dimensions and Liapunov exponents, for the prediction of future behaviour, for noise reduction and signal separation, and most recently for control and targeting. Current versions of Takens’ Theorem assume that the underlying system is autonomous (and noise-free). Unfortunately this is not the case for many real systems. In a previous paper, one of us showed how to extend Takens’ Theorem to deterministically forced systems. Here, we use similar techniques to prove a number of delay embedding theorems for arbitrarily and stochastically forced systems. As a special case, we obtain embedding results for Iterated Functions Systems, and we also briefly consider noisy observations."
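--- The reconstruction step itself is a one-liner: stack lagged copies of the scalar series. For a noise-free deterministic example, the logistic map, a two-dimensional embedding with lag 1 recovers the dynamical rule exactly (the stochastic-forcing case treated in the paper is of course messier):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Map a scalar series into dim-dimensional delay vectors with lag tau."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Scalar observable of a deterministic system: the logistic map x -> 4x(1-x)
x = np.empty(2000)
x[0] = 0.2
for t in range(1999):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

emb = delay_embed(x, dim=2, tau=1)
print(emb.shape)  # (1999, 2)

# In the reconstructed space the dynamics are a function, x_{t+1} = f(x_t),
# so the embedded points lie exactly on the parabola y = 4x(1-x).
residual = np.max(np.abs(emb[:, 1] - 4.0 * emb[:, 0] * (1.0 - emb[:, 0])))
print(f"max deviation from the parabola: {residual:.2e}")
```
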
to:NB  to_read  time_series  state-space_reconstruction  state-space_models  statistics  dynamical_systems  have_skimmed
5 weeks ago
Stylized Facts in the Social Sciences | Sociological Science
"Stylized facts are empirical regularities in search of theoretical, causal explanations. Stylized facts are both positive claims (about what is in the world) and normative claims (about what merits scholarly attention). Much of canonical social science research can be usefully characterized as the production or contestation of stylized facts. Beyond their value as grist for the theoretical mill of social scientists, stylized facts also travel directly into the political arena. Drawing on three recent examples, I show how stylized facts can interact with existing folk causal theories to reconstitute political debates and how tensions in the operationalization of folk concepts drive contention around stylized fact claims."
to:NB  sociology  social_science_methodology
5 weeks ago
IASC: The Hedgehog Review - Volume 18, No. 2 (Summer 2016) - The New Ruling Class -
At last, _comparatively_ intelligent reactionaries.
(I say "comparatively" because: (1) Notice how much time they spend on what people _said_ was going to happen before the British civil service got reformed, compared to how much they are able to actually _show_ about the effect of those reforms. They're just _insinuating_ that the opponents of the reforms were right; (2) They're unable to imagine _egalitarian_ alternatives to "meritocracy"; (3) leaving details to mere mechanicals while preserving tone or higher intuition is of course a classic _aristocratic_ attitude.)
5 weeks ago
Guala, F.: Understanding Institutions: The Science and Philosophy of Living Together. (eBook and Hardcover)
"Understanding Institutions proposes a new unified theory of social institutions that combines the best insights of philosophers and social scientists who have written on this topic. Francesco Guala presents a theory that combines the features of three influential views of institutions: as equilibria of strategic games, as regulative rules, and as constitutive rules.
"Guala explains key institutions like money, private property, and marriage, and develops a much-needed unification of equilibrium- and rules-based approaches. Although he uses game theory concepts, the theory is presented in a simple, clear style that is accessible to a wide audience of scholars working in different fields. Outlining and discussing various implications of the unified theory, Guala addresses venerable issues such as reflexivity, realism, Verstehen, and fallibilism in the social sciences. He also critically analyses the theory of "looping effects" and "interactive kinds" defended by Ian Hacking, and asks whether it is possible to draw a demarcation between social and natural science using the criteria of causal and ontological dependence. Focusing on current debates about the definition of marriage, Guala shows how these abstract philosophical issues have important practical and political consequences.
"Moving beyond specific cases to general models and principles, Understanding Institutions offers new perspectives on what institutions are, how they work, and what they can do for us."
to:NB  books:noted  institutions  social_theory  philosophy  marriage  re:do-institutions-evolve
5 weeks ago
[1507.03652] Lasso adjustments of treatment effect estimates in randomized experiments
"We provide a principled way for investigators to analyze randomized experiments when the number of covariates is large. Investigators often use linear multivariate regression to analyze randomized experiments instead of simply reporting the difference of means between treatment and control groups. Their aim is to reduce the variance of the estimated treatment effect by adjusting for covariates. If there are a large number of covariates relative to the number of observations, regression may perform poorly because of overfitting. In such cases, the Lasso may be helpful. We study the resulting Lasso-based treatment effect estimator under the Neyman-Rubin model of randomized experiments. We present theoretical conditions that guarantee that the estimator is more efficient than the simple difference-of-means estimator, and we provide a conservative estimator of the asymptotic variance, which can yield tighter confidence intervals than the difference-of-means estimator. Simulation and data examples show that Lasso-based adjustment can be advantageous even when the number of covariates is less than the number of observations. Specifically, a variant using Lasso for selection and OLS for estimation performs particularly well, and it chooses a smoothing parameter based on combined performance of Lasso and OLS."
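--- A sketch of the "Lasso for selection, OLS for estimation" variant on simulated data; the Lasso here is a bare-bones coordinate descent, and the penalty level is hand-picked rather than chosen by the paper's combined criterion:

```python
import numpy as np

rng = np.random.default_rng(3)
n, d = 120, 60  # many covariates relative to the sample size
X = rng.normal(size=(n, d))
treat = rng.integers(0, 2, size=n).astype(float)  # randomized assignment
tau = 2.0                                         # true treatment effect
y = tau * treat + X[:, 0] + X[:, 1] - X[:, 2] + rng.normal(size=n)

def lasso(X, y, lam, iters=100):
    """Bare-bones coordinate-descent Lasso."""
    w = np.zeros(X.shape[1])
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(iters):
        for j in range(X.shape[1]):
            rho = X[:, j] @ (y - X @ w + X[:, j] * w[j])
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return w

# Step 1: Lasso on the centered outcome picks a sparse set of covariates.
w = lasso(X, y - y.mean(), lam=40.0)
selected = np.flatnonzero(np.abs(w) > 1e-8)

# Step 2: plain OLS of y on treatment plus only the selected covariates.
Z = np.column_stack([np.ones(n), treat, X[:, selected]])
beta, *_ = np.linalg.lstsq(Z, y, rcond=None)

diff_means = y[treat == 1].mean() - y[treat == 0].mean()
print(f"difference of means:          {diff_means:.2f}")
print(f"Lasso+OLS adjusted estimate:  {beta[1]:.2f}  (truth {tau})")
```

The adjusted estimator typically has visibly smaller error than the raw difference of means, which is the variance reduction the paper quantifies.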
to:NB  heard_the_talk  have_skimmed  yu.bin  lasso  regression  causal_inference  statistics
5 weeks ago
Phys. Rev. Lett. 117, 038103 (2016) - How Far from Equilibrium Is Active Matter?
"Active matter systems are driven out of thermal equilibrium by a lack of generalized Stokes-Einstein relation between injection and dissipation of energy at the microscopic scale. We consider such a system of interacting particles, propelled by persistent noises, and show that, at small but finite persistence time, their dynamics still satisfy a time-reversal symmetry. To do so, we compute perturbatively their steady-state measure and show that, for short persistent times, the entropy production rate vanishes. This endows such systems with an effective fluctuation-dissipation theorem akin to that of thermal equilibrium systems. Last, we show how interacting particle systems with viscous drags and correlated noises can be seen as in equilibrium with a viscoelastic bath but driven out of equilibrium by nonconservative forces, hence providing energetic insight into the departure of active systems from equilibrium."
to:NB  to_read  thermodynamics  statistical_mechanics  non-equilibrium  fluctuation-response
5 weeks ago
DHQ: Digital Humanities Quarterly: Six Degrees of Francis Bacon: A Statistical Method for Reconstructing Large Historical Social Networks
"In this paper we present a statistical method for inferring historical social networks from biographical documents as well as the scholarly aims for doing so. Existing scholarship on historical social networks is scattered across an unmanageable number of disparate books and articles. A researcher interested in how persons were connected to one another in our field of study, early modern Britain (c. 1500-1700), has no global, unified resource to which to turn. Manually building such a network is infeasible, since it would need to represent thousands of nodes and tens of millions of potential edges just to include the relations among the most prominent persons of the period. Our Six Degrees of Francis Bacon project takes up recent statistical techniques and digital tools to reconstruct and visualize the early modern social network.
"We describe in this paper the natural language processing tools and statistical graph learning techniques that we used to extract names and infer relations from the Oxford Dictionary of National Biography. We then explain the steps taken to test inferred relations against the knowledge of experts in order to improve the accuracy of the learning techniques. Our argument here is twofold: first, that the results of this process, a global visualization of Britain’s early modern social network, will be useful to scholars and students of the period; second, that the pipeline we have developed can, with local modifications, be reused by other scholars to generate networks for other historical or contemporary societies from biographical documents."

--- I have helped perpetrate an act of digital humanities.
to:NB  self-promotion  social_networks  text_mining  lasso  statistics  early_modern_european_history  kith_and_kin
5 weeks ago
Noneuclidean Spring Embedders
"We present a method by which force-directed algorithms for graph layouts can be generalized to calculate the layout of a graph in an arbitrary Riemannian geometry. The method relies on extending the Euclidean notions of distance, angle, and force-interactions to smooth non-Euclidean geometries via projections to and from appropriately chosen tangent spaces. In particular, we formally describe the calculations needed to extend such algorithms to hyperbolic and spherical geometries."
to:NB  to_read  hyperbolic_geometry  re:hyperbolic_networks  network_data_analysis  network_visualization  via:cris_moore
6 weeks ago
Smith, J.E. H.: The Philosopher: A History in Six Types. (eBook and Hardcover)
"What would the global history of philosophy look like if it were told not as a story of ideas but as a series of job descriptions—ones that might have been used to fill the position of philosopher at different times and places over the past 2,500 years? The Philosopher does just that, providing a new way of looking at the history of philosophy by bringing to life six kinds of figures who have occupied the role of philosopher in a wide range of societies around the world over the millennia—the Natural Philosopher, the Sage, the Gadfly, the Ascetic, the Mandarin, and the Courtier. The result is at once an unconventional introduction to the global history of philosophy and an original exploration of what philosophy has been—and perhaps could be again.
"By uncovering forgotten or neglected philosophical job descriptions, the book reveals that philosophy is a universal activity, much broader—and more gender inclusive—than we normally think today. In doing so, The Philosopher challenges us to reconsider our idea of what philosophers can do and what counts as philosophy."
to:NB  books:noted  history_of_ideas  philosophy  world_history  to_be_shot_after_a_fair_trial
6 weeks ago
Hassan, M.: Longing for the Lost Caliphate: A Transregional History. (eBook and Hardcover)
"In the United States and Europe, the word “caliphate” has conjured historically romantic and increasingly pernicious associations. Yet the caliphate’s significance in Islamic history and Muslim culture remains poorly understood. This book explores the myriad meanings of the caliphate for Muslims around the world through the analytical lens of two key moments of loss in the thirteenth and twentieth centuries. Through extensive primary-source research, Mona Hassan explores the rich constellation of interpretations created by religious scholars, historians, musicians, statesmen, poets, and intellectuals.
"Hassan fills a scholarly gap regarding Muslim reactions to the destruction of the Abbasid caliphate in Baghdad in 1258 and challenges the notion that the Mongol onslaught signaled an end to the critical engagement of Muslim jurists and intellectuals with the idea of an Islamic caliphate. She also situates Muslim responses to the dramatic abolition of the Ottoman caliphate in 1924 as part of a longer trajectory of transregional cultural memory, revealing commonalities and differences in how modern Muslims have creatively interpreted and reinterpreted their heritage. Hassan examines how poignant memories of the lost caliphate have been evoked in Muslim culture, law, and politics, similar to the losses and repercussions experienced by other religious communities, including the destruction of the Second Temple for Jews and the fall of Rome for Christians."
to:NB  books:noted  islam  islamic_civilization  history_of_ideas  uses_of_the_past
6 weeks ago
[1605.09522] Kernel Mean Embedding of Distributions: A Review and Beyond
"A Hilbert space embedding of distributions---in short, kernel mean embedding---has recently emerged as a powerful machinery for probabilistic modeling, statistical inference, machine learning, and causal discovery. The basic idea behind this framework is to map distributions into a reproducing kernel Hilbert space (RKHS) in which the whole arsenal of kernel methods can be extended to probability measures. It gave rise to a great deal of research and novel applications of positive definite kernels. The goal of this survey is to give a comprehensive review of existing works and recent advances in this research area, and to discuss some of the most challenging issues and open problems that could potentially lead to new research directions. The survey begins with a brief introduction to the RKHS and positive definite kernels which forms the backbone of this survey, followed by a thorough discussion of the Hilbert space embedding of marginal distributions, theoretical guarantees, and review of its applications. The embedding of distributions enables us to apply RKHS methods to probability measures which prompts a wide range of applications such as kernel two-sample testing, independent testing, group anomaly detection, and learning on distributional data. Next, we discuss the Hilbert space embedding for conditional distributions, give theoretical insights, and review some applications. The conditional mean embedding enables us to perform sum, product, and Bayes' rules---which are ubiquitous in graphical model, probabilistic inference, and reinforcement learning---in a non-parametric way using the new representation of distributions in RKHS. We then discuss relationships between this framework and other related areas. Lastly, we give some suggestions on future research directions."
to:NB  statistics  probability  hilbert_space  kernel_methods
6 weeks ago
Tool-box or toy-box? Hard obscurantism in economic modeling - Springer
"“Hard obscurantism” is a species of the genus scholarly obscurantism. A rough intensional definition of hard obscurantism is that models and procedures become ends in themselves, dissociated from their explanatory functions. In the present article, I exemplify and criticize hard obscurantism by examining the writings of eminent economists and political scientists."
to:NB  economics  political_science  philosophy_of_science  bad_science  elster.jon
6 weeks ago
What is Shannon information? - Springer
"Despite of its formal precision and its great many applications, Shannon’s theory still offers an active terrain of debate when the interpretation of its main concepts is the task at issue. In this article we try to analyze certain points that still remain obscure or matter of discussion, and whose elucidation contribute to the assessment of the different interpretative proposals about the concept of information. In particular, we argue for a pluralist position, according to which the different views about information are no longer rival, but different interpretations of a single formal concept."
6 weeks ago
Greiner, A., Semmler, W. and Gong, G.: The Forces of Economic Growth: A Time Series Perspective. (eBook, Paperback and Hardcover)
"In economics, the emergence of New Growth Theory in recent decades has directed attention to an old and important problem: what are the forces of economic growth and how can public policy enhance them? This book examines major forces of growth--including spillover effects and externalities, education and formation of human capital, knowledge creation through deliberate research efforts, and public infrastructure investment. Unique in emphasizing the importance of different forces for particular stages of development, it offers wide-ranging policy implications in the process.
"The authors critically examine recently developed endogenous growth models, study the dynamic implications of modified models, and test the models empirically with modern time series methods that avoid the perils of heterogeneity in cross-country studies. Their empirical analyses, undertaken with newly constructed time series data for the United States and some core countries of the Euro zone, show that models containing scale effects, such as the R&D model and the human capital model, are compatible with time series evidence only after considerable modifications and nonlinearities are introduced. They also explore the relationship between growth and inequality, with particular focus on technological change and income disparity. The Forces of Economic Growth represents a comprehensive and up-to-date empirical time series perspective on the New Growth Theory."
to:NB  books:noted  economics  economic_growth  econometrics  time_series  statistics
6 weeks ago
Siebert, H.: Rules for the Global Economy (eBook, Paperback and Hardcover).
"Rules for the Global Economy is a timely examination of the conditions under which international rules of globalization come into existence, enabling world economic and financial systems to function and stabilize. Horst Siebert, a leading figure in international economics, explains that these institutional arrangements, such as the ones that govern banking, emerge when countries fail to solve economic problems on their own and cede part of their sovereignty to an international order. Siebert demonstrates that the rules result from a trial-and-error process--and usually after a crisis--in order to prevent pointless transaction costs and risks.
"Using an accessible and nonmathematical approach, Siebert links the rules to four areas: international trade relations, factor movements, financial flows, and the environment. He looks at the international division of labor in the trade of goods and services; flow of capital; diffusion of technology; migration of people, including labor and human capital; protection of the global environment; and stability of the monetary-financial system. He discusses the role of ethical norms and human rights in defining international regulations, and argues that the benefits of any rules system should be direct and visible. Comprehensively supporting rules-based interactions among international players, the book considers future issues of the global rules system."
to:NB  books:noted  economics  globalization  institutions  re:do-institutions-evolve
6 weeks ago
Approximation Methods in Probability Theory | Vydas Čekanavičius | Springer
"This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems.
"While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful."
to:NB  probability  approximation  mathematics  convergence_of_stochastic_processes
6 weeks ago
Geiger, Heckerman, King, Meek: Stratified exponential families: Graphical models and model selection
"We describe a hierarchy of exponential families which is useful for distinguishing types of graphical models. Undirected graphical models with no hidden variables are linear exponential families (LEFs). Directed acyclic graphical (DAG) models and chain graphs with no hidden variables, includ­ ing DAG models with several families of local distributions, are curved exponential families (CEFs). Graphical models with hidden variables are what we term stratified exponential families (SEFs). A SEF is a finite union of CEFs of various dimensions satisfying some regularity conditions. We also show that this hierarchy of exponential families is noncollapsing with respect to graphical models by providing a graphical model which is a CEF but not a LEF and a graphical model that is a SEF but not a CEF. Finally, we show how to compute the dimension of a stratified exponential family. These results are discussed in the context of model selection of graphical models."
to:NB  have_read  graphical_models  exponential_families  statistics  geometry  via:rvenkat
6 weeks ago
[1604.07125] Efficient Inference of Average Treatment Effects in High Dimensions via Approximate Residual Balancing
"There are many settings where researchers are interested in estimating average treatment effects and are willing to rely on the unconfoundedness assumption, which requires that the treatment assignment be as good as random conditional on pre-treatment variables. The unconfoundedness assumption is often more plausible if a large number of pre-treatment variables are included in the analysis, but this can worsen the finite sample properties of standard approaches to treatment effect estimation. There are some recent proposals on how to extend classical methods to the high dimensional setting; however, to our knowledge, all existing method rely on consistent estimability of the propensity score, i.e., the probability of receiving treatment given pre-treatment variables. In this paper, we propose a new method for estimating average treatment effects in high dimensional linear settings that attains dimension-free rates of convergence for estimating average treatment effects under substantially weaker assumptions than existing methods: Instead of requiring the propensity score to be estimable, we only require overlap, i.e., that the propensity score be uniformly bounded away from 0 and 1. Procedurally, out method combines balancing weights with a regularized regression adjustment."

--- For the causal-ML reading group.
--- Pros: (i) The paper is very slick. (ii) Essentially, they are doing a regression adjustment, but the weights are enhancing the influence of data points in the "treated" part of the predictor space over what OLS would do.
--- Cons: (i) Without linearity, everything falls apart. (ii) _Pace_ their claim that they're just using the lasso as a predictor, they really do need accurate estimation (in L1) of the true coefficient vector. (iii) Hence all the usual uncheckable conditions people impose on the lasso get imported. [Though I guess you could use some other regularization and hope.] (iv) It's not at all clear that their optimization isn't just a disguised form of estimating propensity scores.
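--- A toy sketch of the "regularized regression adjustment plus balancing weights" flavor, not the paper's algorithm (ridge stands in for their lasso, and closed-form minimum-norm exactly-balancing weights stand in for their constrained quadratic program; all names and the simulated design are mine). The weights up-weight control points near the treated covariate means, which is the "enhanced influence" point above.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 500, 10
X = rng.normal(size=(n, p))
propensity = 1 / (1 + np.exp(-X[:, 0]))          # treatment depends on X
W = rng.random(n) < propensity
tau = 2.0                                        # true treatment effect
Y = X @ np.linspace(1.0, 0.1, p) + tau * W + rng.normal(size=n)

Xc, Yc = X[~W], Y[~W]                            # control sample
x_t = X[W].mean(0)                               # treated covariate means

# 1) Outcome model fit on controls (ridge in place of the paper's lasso).
beta = np.linalg.solve(Xc.T @ Xc + 1.0 * np.eye(p), Xc.T @ Yc)

# 2) Minimum-norm weights on controls that exactly balance (1, x_t),
#    i.e. sum to one and reproduce the treated covariate means.
A = np.vstack([np.ones(len(Xc)), Xc.T])          # constraints A @ gamma = b
b = np.concatenate([[1.0], x_t])
gamma = A.T @ np.linalg.solve(A @ A.T, b)

# 3) Augmented estimate: regression prediction plus weighted residuals.
mu0_hat = x_t @ beta + gamma @ (Yc - Xc @ beta)
att_hat = Y[W].mean() - mu0_hat                  # effect on the treated
```

--- In this linear simulation `att_hat` recovers tau = 2 up to noise; the interesting (and contested, per the cons above) part is what survives when the linear outcome model is only approximately right.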
to:NB  have_read  causal_inference  statistics  regression  lasso  imbens.guido_w.  athey.susan  wager.stefan
7 weeks ago