Beautiful Data: A History of Vision and Reason Since 1945 (Experimental Futures) by Orit Halpern - Powell's Books
"Beautiful Data is both a history of big data and interactivity, and a sophisticated meditation on ideas about vision and cognition in the second half of the twentieth century. Contending that our forms of attention, observation, and truth are contingent and contested, Orit Halpern historicizes the ways that we are trained, and train ourselves, to observe and analyze the world. Tracing the postwar impact of cybernetics and the communication sciences on the social and human sciences, design, arts, and urban planning, she finds a radical shift in attitudes toward recording and displaying information. These changed attitudes produced what she calls communicative objectivity: new forms of observation, rationality, and economy based on the management and analysis of data. Halpern complicates assumptions about the value of data and visualization, arguing that changes in how we manage and train perception, and define reason and intelligence, are also transformations in governmentality. She also challenges the paradoxical belief that we are experiencing a crisis of attention caused by digital media, a crisis that can be resolved only through intensified media consumption."
to:NB  books:noted  visual_display_of_quantitative_information  data_analysis  history_of_ideas 
18 hours ago
Probability: The Classical Limit Theorems
"Probability theory has been extraordinarily successful at describing a variety of phenomena, from the behaviour of gases to the transmission of messages, and is, besides, a powerful tool with applications throughout mathematics. At its heart are a number of concepts familiar in one guise or another to many: Gauss' bell-shaped curve, the law of averages, and so on, concepts that crop up in so many settings they are in some sense universal. This universality is predicted by probability theory to a remarkable degree. This book explains that theory and investigates its ramifications. Assuming a good working knowledge of basic analysis, real and complex, the author maps out a route from basic probability, via random walks, Brownian motion, the law of large numbers and the central limit theorem, to aspects of ergodic theorems, equilibrium and nonequilibrium statistical mechanics, communication over a noisy channel, and random matrices."
to:NB  probability  books:noted  ergodic_theory  central_limit_theorem  stochastic_processes 
yesterday
Optimization Models | Cambridge University Press
"Emphasizing practical understanding over the technicalities of specific algorithms, this elegant textbook is an accessible introduction to the field of optimization, focusing on powerful and reliable convex optimization techniques. Students and practitioners will learn how to recognize, simplify, model and solve optimization problems - and apply these principles to their own projects. A clear and self-contained introduction to linear algebra demonstrates core mathematical concepts in a way that is easy to follow, and helps students to understand their practical relevance. Requiring only a basic understanding of geometry, calculus, probability and statistics, and striking a careful balance between accessibility and rigor, it enables students to quickly understand the material, without being overwhelmed by complex mathematics. Accompanied by numerous end-of-chapter problems, an online solutions manual for instructors, and relevant examples from diverse fields including engineering, data science, economics, finance, and management, this is the perfect introduction to optimization for undergraduate and graduate students."
in_NB  optimization  convexity  books:noted  to_teach:statcomp  to_teach:freshman_seminar_on_optimization 
yesterday
Religious Networks in the Roman Empire The Spread of New Ideas | Ancient history | Cambridge University Press
"The first three centuries AD saw the spread of new religious ideas through the Roman Empire, crossing a vast and diverse geographical, social and cultural space. In this innovative study, Anna Collar explores both how this happened and why. Drawing on research in the sociology and anthropology of religion, physics and computer science, Collar explores the relationship between social networks and religious transmission to explore why some religious movements succeed, while others, seemingly equally successful at a certain time, ultimately fail. Using extensive epigraphic data, Collar provides new interpretations of the diffusion of ideas across the social networks of the Jewish Diaspora and the cults of Jupiter Dolichenus and Theos Hypsistos, and in turn offers important reappraisals of the spread of religious innovations in the Roman Empire. This study will be a valuable resource for students and scholars of ancient history, archaeology, ancient religion and network theory."
in_NB  books:noted  epidemiology_of_representations  history_of_religion  ancient_history  roman_empire  social_networks  social_life_of_the_mind  diffusion_of_innovations 
yesterday
Warlords, Strongman Governors, and the State in Afghanistan | Comparative law | Cambridge University Press
"Warlords have come to represent enemies of peace, security, and “good governance” in the collective intellectual imagination. In this book Dipali Mukhopadhyay asserts that, in fact, not all warlords are created equal. Under certain conditions, some of these much-maligned actors are both able and willing to become effective governors on behalf of the state. This provocative argument is based on extensive fieldwork in Afghanistan, where Mukhopadhyay examined warlord-governors who have served as valuable exponents of the Karzai regime in its struggle to assert control over key segments of the countryside. She explores the complex ecosystems that came to constitute provincial political life after 2001 and goes on to expose the rise of “strongman” governance in two important Afghan provinces. While this brand of governance falls far short of international expectations, its emergence reflects the reassertion of the Afghan state in material and symbolic terms that deserve our attention. This book pushes past canonical views of warlordism and state building to consider the logic of the weak state as it has arisen in challenging, conflict-ridden societies like Afghanistan."
in_NB  afghanistan  war  state-building 
yesterday
Big Brother’s Liberal Friends — Crooked Timber
A typically outstanding comment from Bruce Wilder is worth recording here in full:

"The apparatus of surveillance and the system of classification are both parts of a vast system of secrecy — aspects of the architecture of the secret state, the deep state.
"I’ve had a security clearance, and so have some personal acquaintance with the system of classification and what is classified, why it is classified and so on, as well as experience with the effect classification has on people, their behavior and administration. I see people sometimes elaborate the claim that, of course the state must have the capacity to keep some information confidential, which is undoubtedly true, but sidesteps the central issue, which is, what does the system of classification do? what does the secrecy of the deep state do? What is the function of the system of classification?
"From my personal acquaintance, I do not think it can be said that its function is to keep secrets. Real secrets are rarely classified. Information is classified so that it can be communicated, and in the present system operated by the U.S. military and intelligence establishment, broadcast. I suppose, without knowing as an historic fact, that the system of classification originated during WWII as a means to distribute information on a need-to-know basis, but that’s not what goes on now. The compartmentalization that the term, classification, implies, is largely absent. That Manning or Snowden could obtain and release the sheer volume of documents that they did — not the particular content of any of them — is the first and capital revelation concerning what the system is, and is not. The system is not keeping confidential information confidential, nor is it keeping secrets; it is broadcasting information.
"The very idea that a system that broadcasts information in a way that allows someone at the level of a Manning or Snowden to accumulate vast numbers of documents has kept any secrets from the secret services of China or Russia is, on its face, absurd. The system revealed by the simple fact of the nature of Snowden’s and Manning’s breaches is not capable of keeping secrets. Snowden was a contractor at a peripheral location, Manning a soldier of very low rank."
national_surveillance_state  to:blog 
2 days ago
Instagram and Art Theory
Hmmm. I bounced off _Ways of Seeing_ as a college freshman, but maybe it's time to revisit...
art  art_criticism  photography  social_media  rhetorical_self-fashioning  presentation_of_self 
2 days ago
Information Networks: Evidence from Illegal Insider Trading Tips by Kenneth R. Ahern :: SSRN
The description given by A. Z. Jacobs cannot be bettered: "Insider trading happening adorably between childhood BFFs, not professional contacts." (https://twitter.com/az_jacobs/status/525003389713653761)

--- After reading through: a truly heroic piece of data acquisition. Not wild about things like OLS and ordinal logistic regression (would it hurt you to use a spline once in a while?), but probably wouldn't affect the over-all trend, which is all this is really good for. One thought not explored here: might the tendency for those further down a tip chain to make more money be part of how these particular networks got caught? (He's sensitive on the, to put it delicately, sample-selection biases, but doesn't go into whether that might be one of them.)
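
To make the grumble about splines concrete, here is a toy contrast between a straight-line OLS fit and a smoothing spline. The data-generating process and every number below are made up; this is a sketch of the alternative, not a re-analysis of the paper.

```python
# Minimal sketch: OLS line vs. smoothing spline on simulated data whose true
# trend is nonlinear. Variable names (x = position along a tip chain,
# y = "profit") are purely hypothetical.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(42)
n = 500
x = rng.uniform(0, 10, n)
y = np.log1p(x) + rng.normal(scale=0.3, size=n)   # nonlinear trend plus noise

# OLS straight line
slope, intercept = np.polyfit(x, y, 1)

# Smoothing spline; s is the roughness budget (roughly n * noise variance)
order = np.argsort(x)
spline = UnivariateSpline(x[order], y[order], s=n * 0.3 ** 2)

print("OLS fit at x = 1 and x = 9:   ", intercept + slope * np.array([1.0, 9.0]))
print("Spline fit at x = 1 and x = 9:", spline(np.array([1.0, 9.0])))
```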
to:NB  social_networks  finance  corruption  economics  network_data_analysis  have_read 
2 days ago
AER (104,11) p. 3737 - The Power of Communication
"In this paper, I offer two ways in which firms can collude: secret monitoring and infrequent coordination. Such collusion is enforceable with intuitive communication protocols. I make my case in the context of a repeated Cournotoligopoly with flexible production, prices that follow a Brownian motion and no monetary side payments, an environment where it has previously been argued that any collusion is impossible. Trade associations can easily facilitate collusion by mediating communication amongst firms."
to:NB  economics  game_theory  imperfect_competition  seldom_meet_even_for_merriment_or_diversion_but_it_ends_in_some_conspiracy_against_the_public 
2 days ago
AER (104,11) p. 3635 - Structural Transformation, the Mismeasurement of Productivity Growth, and the Cost Disease of Services
"If workers self-select into industries based upon their relative productivity in different tasks, and comparative advantage is aligned with absolute advantage, then the average efficacy of a sector's workforce will be negatively correlated with its employment share. This might explain the difference in the reported productivity growth of contracting goods and expanding services. Instrumenting with defense expenditures, I find the elasticity of worker efficacy with respect to employment shares is substantially negative, albeit imprecisely estimated. The estimates suggest that the view that goods and services have similar productivity growth rates is a plausible alternative characterization of growth in developed economies."

--- How on Earth is that a valid instrument for this question???
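
Not that it answers the validity question, but for the record, here is what "instrumenting" amounts to mechanically: a bare-bones two-stage least squares sketch on simulated data, with hypothetical variable names and made-up coefficients. The whole thing hinges on the exclusion restriction flagged in the code comments, which is exactly the assumption being doubted above.

```python
# Two-stage least squares by hand (no special packages), on simulated data.
# z is the instrument, x the endogenous regressor, u an unobserved confounder.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
z = rng.normal(size=n)                         # instrument (e.g., a spending shock)
u = rng.normal(size=n)                         # unobserved confounder
x = 0.8 * z + 0.5 * u + rng.normal(size=n)     # endogenous regressor
y = -0.7 * x + 0.5 * u + rng.normal(size=n)    # outcome; true effect of x is -0.7

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

X = np.column_stack([np.ones(n), x])
Z = np.column_stack([np.ones(n), z])

beta_ols = ols(X, y)                           # confounded
x_hat = Z @ ols(Z, x)                          # first stage: project x on the instrument
beta_2sls = ols(np.column_stack([np.ones(n), x_hat]), y)   # second stage

# 2SLS recovers the causal slope only if z affects y solely through x
# (the exclusion restriction) --- precisely the assumption questioned above.
print("OLS slope:  %.3f" % beta_ols[1])
print("2SLS slope: %.3f" % beta_2sls[1])
```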
to:NB  to_read  economics  economic_growth  productivity  econometrics  instrumental_variables 
2 days ago
D3 Traveller Duffel – SDR Traveller
Explanation for tags: There are very, very few people who need a $1000 duffel made from a fabric "four times stronger than kevlar". I am glad such things exist for them. (In fact I actually know someone who could have used it in a former profession.) But those people would not need moody art photos of the bag amidst gorgeous desolation to sell them on it.
luggage  conspicuous_consumption  conspicuous_to_the_cognoscenti_anyway  functionalism_as_a_fetishism  design  via:warrenellis 
2 days ago
Unfogged: All Hail Duke
"Boxill is a senior lecturer in the philosophy department and was chair of the faculty from 2011 to earlier this year. She directs the university's Parr Center for Ethics."

--- Remarks:
1. Well, many of us study what we don't understand.
2. Eliminate organized college sports.
3. Nobody could get away with that in a novel.
funny:academic  funny:pointed  funny:tasteless  academia  our_decrepit_institutions  corruption  ethics  moral_philosophy 
5 days ago
Curried Butternut Soup Recipe | MyRecipes.com
An autumn and winter stand-by, though I leave off the cheese.
food  recipes  have_made 
5 days ago
Thomas Piketty's Capital in the 21st Century - Boing Boing
I should not be surprised that Boing Boing has one of the best general-audience reviews of Piketty.
book_reviews  piketty.thomas  political_economy  inequality 
6 days ago
Is Pittsburgh The New Austin? The Austin We Hoped And Dreamed Of, The Austin That Was Foretold?
"Could it be? With low rents, a thriving arts scene, and old-school American authenticity, one city may be becoming the unexpected new home for a generation: Pittsburgh, PA. But is it truly the new Austin—the new Austin that was foretold to us? Do we dare dream it so?
"Up-and-coming millennials are among those hit hardest by the sluggish economy, and they’re looking for the next affordable, undiscovered city to settle in. Could their long quest be at its end? The historically rich steel-manufacturing town is home to over 40 colleges and universities brimming with young people, all with one shared dream: to be the next Austin. Not the Austin of old, but the one to come, promised to each other in shared dreams and secret whispers.
"Could it be possible that we have found the worthy successor to Austin?
"“But what of Portland?” one might ask.
"“What of Portland,” indeed. For years, tales of old claimed the Pacific Northwest city was the promised land—a new Eden upon this earth, where men and women would be free to think, and live, and love. A blossoming Eden teeming with hope and food trucks.
"But the old tales are clear reminders that the devil himself could not craft a trap so beautiful in its deceitfulness. And now, we ask, could our salvation lie in the City of Steel? It can. It must.
"“The journey has been long, and the trials have been many,” says Pittsburgh mayor Bill Peduto, “But young people looking for working-class authenticity surrounded by gorgeous natural scenery need look no further. It is written.”
"From microbreweries, to vegan dining, to endless converted industrial living and studio spaces, Pittsburgh may finally give a lost generation of wanderers a city to call home.
"“Oh, yeah, Pittsburgh is totally the new Austin,” said 26-year-old recent transplant and reclaimed wood sculptor Melinda Rodriguez. “The Austin foretold to us by our fathers and their fathers. The circle is complete.”
"Shooing away her pug with her foot, she added, “Speak not to me of Brooklyn. That name is a curse.”"

--- _The Onion_ is America's finest source of urbanism.
funny:pointed  funny:because_its_true  pittsburgh  class_struggles_in_america  have_read  to:blog 
7 days ago
Neural and cognitive characteristics of extraordinary altruists
"Altruistic behavior improves the welfare of another individual while reducing the altruist’s welfare. Humans’ tendency to engage in altruistic behaviors is unevenly distributed across the population, and individual variation in altruistic tendencies may be genetically mediated. Although neural endophenotypes of heightened or extreme antisocial behavior tendencies have been identified in, for example, studies of psychopaths, little is known about the neural mechanisms that support heightened or extreme prosocial or altruistic tendencies. In this study, we used structural and functional magnetic resonance imaging to assess a population of extraordinary altruists: altruistic kidney donors who volunteered to donate a kidney to a stranger. Such donations meet the most stringent definitions of altruism in that they represent an intentional behavior that incurs significant costs to the donor to benefit an anonymous, nonkin other. Functional imaging and behavioral tasks included face-emotion processing paradigms that reliably distinguish psychopathic individuals from controls. Here we show that extraordinary altruists can be distinguished from controls by their enhanced volume in right amygdala and enhanced responsiveness of this structure to fearful facial expressions, an effect that predicts superior perceptual sensitivity to these expressions. These results mirror the reduced amygdala volume and reduced responsiveness to fearful facial expressions observed in psychopathic individuals. Our results support the possibility of a neural basis for extraordinary altruism. We anticipate that these findings will expand the scope of research on biological mechanisms that promote altruistic behaviors to include neural mechanisms that support affective and social responsiveness."

--- I will admit that it's hard to argue their subject pool isn't ecologically valid.
to:NB  neuropsychology  altruism  extraordinary_if_true 
7 days ago
Pre-Columbian mycobacterial genomes reveal seals as a source of New World human tuberculosis : Nature : Nature Publishing Group
"Modern strains of Mycobacterium tuberculosis from the Americas are closely related to those from Europe, supporting the assumption that human tuberculosis was introduced post-contact1. This notion, however, is incompatible with archaeological evidence of pre-contact tuberculosis in the New World2. Comparative genomics of modern isolates suggests that M. tuberculosis attained its worldwide distribution following human dispersals out of Africa during the Pleistocene epoch3, although this has yet to be confirmed with ancient calibration points. Here we present three 1,000-year-old mycobacterial genomes from Peruvian human skeletons, revealing that a member of the M. tuberculosis complex caused human disease before contact. The ancient strains are distinct from known human-adapted forms and are most closely related to those adapted to seals and sea lions. Two independent dating approaches suggest a most recent common ancestor for the M. tuberculosis complex less than 6,000 years ago, which supports a Holocene dispersal of the disease. Our results implicate sea mammals as having played a role in transmitting the disease to humans across the ocean."

--- Shorter: tuberculosis was spread around the world by _seals_.
to:NB  historical_genetics  tuberculosis  plagues_and_peoples  to_read 
7 days ago
The high heritability of educational achievement reflects many genetically influenced traits, not just intelligence
"Because educational achievement at the end of compulsory schooling represents a major tipping point in life, understanding its causes and correlates is important for individual children, their families, and society. Here we identify the general ingredients of educational achievement using a multivariate design that goes beyond intelligence to consider a wide range of predictors, such as self-efficacy, personality, and behavior problems, to assess their independent and joint contributions to educational achievement. We use a genetically sensitive design to address the question of why educational achievement is so highly heritable. We focus on the results of a United Kingdom-wide examination, the General Certificate of Secondary Education (GCSE), which is administered at the end of compulsory education at age 16. GCSE scores were obtained for 13,306 twins at age 16, whom we also assessed contemporaneously on 83 scales that were condensed to nine broad psychological domains, including intelligence, self-efficacy, personality, well-being, and behavior problems. The mean of GCSE core subjects (English, mathematics, science) is more heritable (62%) than the nine predictor domains (35–58%). Each of the domains correlates significantly with GCSE results, and these correlations are largely mediated genetically. The main finding is that, although intelligence accounts for more of the heritability of GCSE than any other single domain, the other domains collectively account for about as much GCSE heritability as intelligence. Together with intelligence, these domains account for 75% of the heritability of GCSE. We conclude that the high heritability of educational achievement reflects many genetically influenced traits, not just intelligence."

--- To be truly fair to the g-mongers, one would have to consider how much of measured "self-efficacy" (etc.) is either a partial measurement of general intelligence, or caused by general intelligence (and, of course, vice versa). Without having read more than the abstract, I will commit to donating $20 to (hmmm) Wikipedia if this possibility is adequately dealt with. (A totally separate issue is that we know that longer schooling increases IQ, so this would need to use a measure of intelligence that was taken _before_ school attainment was finished.)
to:NB  human_genetics  heritability  iq  to_be_shot_after_a_fair_trial  re:g_paper  psychometrics  inequality  education  transmission_of_inequality 
7 days ago
Shoshana Zuboff on “Big Data” as Surveillance Capitalism
In which "If you're not paying for the service, you're the product being sold" is elaborated on for seven pages, completely with gratuitous references to Arendt, Orwell, and, so help me, expositions of John Searle's philosophy of speech acts. There is a real issue here (how do we organize a useful Internet without having everyone subject to for-profit surveillance all the time?), and even some real thoughts, but really.
have_read  networked_life  via:? 
8 days ago
Community Detection via Random and Adaptive Sampling | COLT 2014 | JMLR W&CP
"In this paper, we consider networks consisting of a finite number of non-overlapping communities. To extract these communities, the interaction between pairs of nodes may be sampled from a large available data set, which allows a given node pair to be sampled several times. When a node pair is sampled, the observed outcome is a binary random variable, equal to 1 if nodes interact and to 0 otherwise. The outcome is more likely to be positive if nodes belong to the same communities. For a given budget of node pair samples or observations, we wish to jointly design a sampling strategy (the sequence of sampled node pairs) and a clustering algorithm that recover the hidden communities with the highest possible accuracy. We consider both non-adaptive and adaptive sampling strategies, and for both classes of strategies, we derive fundamental performance limits satisfied by any sampling and clustering algorithm. In particular, we provide necessary conditions for the existence of algorithms recovering the communities accurately as the network size grows large. We also devise simple algorithms that accurately reconstruct the communities when this is at all possible, hence proving that the proposed necessary conditions for accurate community detection are also sufficient. The classical problem of community detection in the stochastic block model can be seen as a particular instance of the problems consider here. But our framework covers more general scenarios where the sequence of sampled node pairs can be designed in an adaptive manner. The paper provides new results for the stochastic block model, and extends the analysis to the case of adaptive sampling."
to:NB  community_discovery  network_sampling  network_data_analysis  statistics 
9 days ago
Optimal learners for multiclass problems | COLT 2014 | JMLR W&CP
"The fundamental theorem of statistical learning states that for binary classification problems, any Empirical Risk Minimization (ERM) learning rule has close to optimal sample complexity. In this paper we seek for a generic optimal learner for multiclass prediction. We start by proving a surprising result: a generic optimal multiclass learner must be improper, namely, it must have the ability to output hypotheses which do not belong to the hypothesis class, even though it knows that all the labels are generated by some hypothesis from the class. In particular, no ERM learner is optimal. This brings back the fundamental question of “how to learn”? We give a complete answer to this question by giving a new analysis of the one-inclusion multiclass learner of Rubinstein et el (2006) showing that its sample complexity is essentially optimal. Then, we turn to study the popular hypothesis class of generalized linear classifiers. We derive optimal learners that, unlike the one-inclusion algorithm, are computationally efficient. Furthermore, we show that the sample complexity of these learners is better than the sample complexity of the ERM rule, thus settling in negative an open question due to Collins (2005)"

--- The announced result is so counter-intuitive my first impulse is to wonder if one of the definitions shouldn't be tweaked.
to:NB  learning_theory  classifiers 
9 days ago
BOOTSTRAP JOINT PREDICTION REGIONS - Wolf - 2014 - Journal of Time Series Analysis - Wiley Online Library
"Many statistical applications require the forecast of a random variable of interest over several periods into the future. The sequence of individual forecasts, one period at a time, is called a path forecast, where the term path refers to the sequence of individual future realizations of the random variable. The problem of constructing a corresponding joint prediction region has been rather neglected in the literature so far: such a region is supposed to contain the entire future path with a prespecified probability. We develop bootstrap methods to construct joint prediction regions. The resulting regions are proven to be asymptotically consistent under a mild high-level assumption. We compare the finite-sample performance of our joint prediction regions with some previous proposals via Monte Carlo simulations. An empirical application to a real data set is also provided."
to:NB  prediction  time_series  bootstrap  to_read 
9 days ago
Human Computation
"Human Computation is an international and interdisciplinary forum for the electronic publication and print archiving of high-quality scholarly articles in all areas of human computation, which concerns the design or analysis of information processing systems in which humans participate as computational elements."

--- No doubt unfairly, I regard both the logo, and the presence of an article by M. C. Bateson in the first issue, as warning signs. Still, the last tag applies.
via:?  collective_cognition  distributed_systems  social_life_of_the_mind  to_be_shot_after_a_fair_trial 
9 days ago
On Thermonuclear War — Herman Kahn
"I am not going to explain how much of my courtship doctrine was based on this book and ones like it aside from admitting it was not a small fraction."
books:noted  nukes  book_reviews  nicoll.james  futurology 
10 days ago
Robert Wiebe's Self-Rule and American Democracy | Waggish
"What emerged with industrialization in the United States was a three-class system, conflictual but not revolutionary: one class geared to national institutions and policies, one dominating local affairs, and one sunk beneath both of those in the least rewarding jobs and least stable environments–in the terminology of my account, a national class, a local middle class, and a lower class. New hierarchies ordered relations inside both the national and lower middle class; both of those classes, in turn, counted on hierarchies to control the lower class."
in_NB  books:noted  us_politics  american_history  class_struggles_in_america  democracy 
12 days ago
Artificial sweeteners induce glucose intolerance by altering the gut microbiota : Nature : Nature Publishing Group
"Non-caloric artificial sweeteners (NAS) are among the most widely used food additives worldwide, regularly consumed by lean and obese individuals alike. NAS consumption is considered safe and beneficial owing to their low caloric content, yet supporting scientific data remain sparse and controversial. Here we demonstrate that consumption of commonly used NAS formulations drives the development of glucose intolerance through induction of compositional and functional alterations to the intestinal microbiota. These NAS-mediated deleterious metabolic effects are abrogated by antibiotic treatment, and are fully transferrable to germ-free mice upon faecal transplantation of microbiota configurations from NAS-consuming mice, or of microbiota anaerobically incubated in the presence of NAS. We identify NAS-altered microbial metabolic pathways that are linked to host susceptibility to metabolic disease, and demonstrate similar NAS-induced dysbiosis and glucose intolerance in healthy human subjects. Collectively, our results link NAS consumption, dysbiosis and metabolic abnormalities, thereby calling for a reassessment of massive NAS usage."
to:NB  to_read  experimental_biology  food  oops 
12 days ago
Pleistocene cave art from Sulawesi, Indonesia : Nature : Nature Publishing Group
"Archaeologists have long been puzzled by the appearance in Europe ~40–35 thousand years (kyr) ago of a rich corpus of sophisticated artworks, including parietal art (that is, paintings, drawings and engravings on immobile rock surfaces)1, 2 and portable art (for example, carved figurines)3, 4, and the absence or scarcity of equivalent, well-dated evidence elsewhere, especially along early human migration routes in South Asia and the Far East, including Wallacea and Australia5, 6, 7, 8, where modern humans (Homo sapiens) were established by 50 kyr ago9, 10. Here, using uranium-series dating of coralloid speleothems directly associated with 12 human hand stencils and two figurative animal depictions from seven cave sites in the Maros karsts of Sulawesi, we show that rock art traditions on this Indonesian island are at least compatible in age with the oldest European art11. The earliest dated image from Maros, with a minimum age of 39.9 kyr, is now the oldest known hand stencil in the world. In addition, a painting of a babirusa (‘pig-deer’) made at least 35.4 kyr ago is among the earliest dated figurative depictions worldwide, if not the earliest one. Among the implications, it can now be demonstrated that humans were producing rock art by ~40 kyr ago at opposite ends of the Pleistocene Eurasian world."
to:NB  archaeology  human_evolution 
12 days ago
On inference of causality for discrete state models in a multiscale context
"Discrete state models are a common tool of modeling in many areas. E.g., Markov state models as a particular representative of this model family became one of the major instruments for analysis and understanding of processes in molecular dynamics (MD). Here we extend the scope of discrete state models to the case of systematically missing scales, resulting in a nonstationary and nonhomogeneous formulation of the inference problem. We demonstrate how the recently developed tools of nonstationary data analysis and information theory can be used to identify the simultaneously most optimal (in terms of describing the given data) and most simple (in terms of complexity and causality) discrete state models. We apply the resulting formalism to a problem from molecular dynamics and show how the results can be used to understand the spatial and temporal causality information beyond the usual assumptions. We demonstrate that the most optimal explanation for the appropriately discretized/coarse-grained MD torsion angles data in a polypeptide is given by the causality that is localized both in time and in space, opening new possibilities for deploying percolation theory and stochastic subgridscale modeling approaches in the area of MD."
to:NB  to_read  re:AoS_project  time_series  markov_models  statistics 
12 days ago
Pre-Industrial Inequality - Milanovic - 2010 - The Economic Journal - Wiley Online Library
"Is inequality largely the result of the Industrial Revolution? Or, were pre-industrial incomes as unequal as they are today? This article infers inequality across individuals within each of the 28 pre-industrial societies, for which data were available, using what are known as social tables. It applies two new concepts: the inequality possibility frontier and the inequality extraction ratio. They compare the observed income inequality to the maximum feasible inequality that, at a given level of income, might have been ‘extracted’ by those in power. The results give new insights into the connection between inequality and economic development in the very long run."
to:NB  to_read  economics  economic_history  inequality  great_transformation 
12 days ago
[0902.3837] Innovated higher criticism for detecting sparse signals in correlated noise
"Higher criticism is a method for detecting signals that are both sparse and weak. Although first proposed in cases where the noise variables are independent, higher criticism also has reasonable performance in settings where those variables are correlated. In this paper we show that, by exploiting the nature of the correlation, performance can be improved by using a modified approach which exploits the potential advantages that correlation has to offer. Indeed, it turns out that the case of independent noise is the most difficult of all, from a statistical viewpoint, and that more accurate signal detection (for a given level of signal sparsity and strength) can be obtained when correlation is present. We characterize the advantages of correlation by showing how to incorporate them into the definition of an optimal detection boundary. The boundary has particularly attractive properties when correlation decays at a polynomial rate or the correlation matrix is Toeplitz."
to:NB  multiple_testing  hypothesis_testing  time_series  have_skimmed  jin.jiashun  hall.peter 
12 days ago
Higher criticism in the context of unknown distribution, non-independence and classification
"Higher criticism has been proposed as a tool for highly multiple hypothesis testing or signal detection, initially in cases where the distribution of a test statistic (or the noise in a signal) is known and the component tests are statisti- cally independent. In this paper we explore the extent to which the assumptions of known distribution and independence can be relaxed, and we consider too the ap- plication of higher criticism to classification. It is shown that effective distribution approximations can be achieved by using a threshold approach; that is, by disre- garding data components unless their significance level exceeds a sufficiently high value. This method exploits the good relative accuracy of approximations to light- tailed distributions. In particular, it can be effective when the true distribution is founded on something like a Studentised mean, or on an average of related type, which is commonly the case in practice. The issue of dependence among vector components is also shown not to be a serious difficulty in many circumstances."
to:NB  have_skimmed  multiple_testing  statistics  hall.peter  hypothesis_testing  statistical_inference_for_stochastic_processes  re:network_differences 
12 days ago
[math/0410072] Higher criticism for detecting sparse heterogeneous mixtures
"Higher criticism, or second-level significance testing, is a multiple-comparisons concept mentioned in passing by Tukey. It concerns a situation where there are many independent tests of significance and one is interested in rejecting the joint null hypothesis. Tukey suggested comparing the fraction of observed significances at a given \alpha-level to the expected fraction under the joint null. In fact, he suggested standardizing the difference of the two quantities and forming a z-score; the resulting z-score tests the significance of the body of significance tests. We consider a generalization, where we maximize this z-score over a range of significance levels 0<\alpha\leq\alpha_0.
"We are able to show that the resulting higher criticism statistic is effective at resolving a very subtle testing problem: testing whether n normal means are all zero versus the alternative that a small fraction is nonzero. The subtlety of this ``sparse normal means'' testing problem can be seen from work of Ingster and Jin, who studied such problems in great detail. In their studies, they identified an interesting range of cases where the small fraction of nonzero means is so small that the alternative hypothesis exhibits little noticeable effect on the distribution of the p-values either for the bulk of the tests or for the few most highly significant tests.
"In this range, when the amplitude of nonzero means is calibrated with the fraction of nonzero means, the likelihood ratio test for a precisely specified alternative would still succeed in separating the two hypotheses."

--- It makes a lot more sense that the name would come from someone like Tukey.
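
For my own reference, the statistic as I read the definition above (a sketch, not code from the paper): sort the p-values, standardize the gap between the empirical fraction below each level and the level itself, and take the maximum over levels up to α_0.

```python
# Higher criticism statistic from a vector of p-values (my reading of the
# Donoho-Jin definition; the alpha_0 cutoff of 0.5 is the usual convention).
import numpy as np

def higher_criticism(pvals, alpha0=0.5):
    p = np.sort(np.asarray(pvals))
    n = p.size
    i = np.arange(1, n + 1)
    # z-score of (empirical fraction below p_(i)) minus p_(i), per the abstract
    hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1 - p))
    return np.max(hc[p <= alpha0])

# toy example: mostly null p-values plus a sparse, weak signal
rng = np.random.default_rng(7)
null_p = rng.uniform(size=990)
signal_p = rng.uniform(size=10) * 1e-3
print(higher_criticism(np.concatenate([null_p, signal_p])))
```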
to:NB  multiple_testing  hypothesis_testing  empirical_processes  statistics  donoho.david  jin.jiashun  tukey.john_w.  have_read  re:network_differences 
12 days ago
Human preferences for sexually dimorphic faces may be evolutionarily novel
"A large literature proposes that preferences for exaggerated sex typicality in human faces (masculinity/femininity) reflect a long evolutionary history of sexual and social selection. This proposal implies that dimorphism was important to judgments of attractiveness and personality in ancestral environments. It is difficult to evaluate, however, because most available data come from large-scale, industrialized, urban populations. Here, we report the results for 12 populations with very diverse levels of economic development. Surprisingly, preferences for exaggerated sex-specific traits are only found in the novel, highly developed environments. Similarly, perceptions that masculine males look aggressive increase strongly with development and, specifically, urbanization. These data challenge the hypothesis that facial dimorphism was an important ancestral signal of heritable mate value. One possibility is that highly developed environments provide novel opportunities to discern relationships between facial traits and behavior by exposing individuals to large numbers of unfamiliar faces, revealing patterns too subtle to detect with smaller samples."

--- Another possibility is that these are recent cultural conventions! (Maybe they have some subtle way of ruling that out...)
to:NB  psychology  evolutionary_psychology  cultural_differences  to_be_shot_after_a_fair_trial 
13 days ago
No evidence for genetic assortative mating beyond that due to population stratification
"Domingue et al. (1) use genome-wide SNPs to show in non-Hispanic US whites that spouses are genetically more similar than random pairs of individuals. We argue that, although this reported result is descriptively true, the spousal genetic similarity can be explained by assortment on shared ancestry (i.e., population stratification) and thus does not reflect genetic assortative mating as interpreted by Dominigue et al. This greatly affects the implications of the findings for understanding assortative mating in humans."
to:NB  human_genetics 
13 days ago
HCN ice in Titan's high-altitude southern polar cloud : Nature : Nature Publishing Group
"Titan’s middle atmosphere is currently experiencing a rapid change of season after northern spring arrived in 2009 (refs 1, 2). A large cloud was observed3 for the first time above Titan’s southern pole in May 2012, at an altitude of 300 kilometres. A temperature maximum was previously observed there, and condensation was not expected for any of Titan’s atmospheric gases. Here we report that this cloud is composed of micrometre-sized particles of frozen hydrogen cyanide (HCN ice). The presence of HCN particles at this altitude, together with temperature determinations from mid-infrared observations, indicate a dramatic cooling of Titan’s atmosphere inside the winter polar vortex in early 2012. Such cooling is in contrast to previously measured high-altitude warming in the polar vortex1, and temperatures are a hundred degrees colder than predicted by circulation models4. These results show that post-equinox cooling at the winter pole of Titan is much more efficient than previously thought."
to:NB  titan  astronomy 
13 days ago
[1001.0591] Comparing Distributions and Shapes using the Kernel Distance
"Starting with a similarity function between objects, it is possible to define a distance metric on pairs of objects, and more generally on probability distributions over them. These distance metrics have a deep basis in functional analysis, measure theory and geometric measure theory, and have a rich structure that includes an isometric embedding into a (possibly infinite dimensional) Hilbert space. They have recently been applied to numerous problems in machine learning and shape analysis.
"In this paper, we provide the first algorithmic analysis of these distance metrics. Our main contributions are as follows: (i) We present fast approximation algorithms for computing the kernel distance between two point sets P and Q that runs in near-linear time in the size of (P cup Q) (note that an explicit calculation would take quadratic time). (ii) We present polynomial-time algorithms for approximately minimizing the kernel distance under rigid transformation; they run in time O(n + poly(1/epsilon, log n)). (iii) We provide several general techniques for reducing complex objects to convenient sparse representations (specifically to point sets or sets of points sets) which approximately preserve the kernel distance. In particular, this allows us to reduce problems of computing the kernel distance between various types of objects such as curves, surfaces, and distributions to computing the kernel distance between point sets. These take advantage of the reproducing kernel Hilbert space and a new relation linking binary range spaces to continuous range spaces with bounded fat-shattering dimension."
to:NB  to_read  kernel_estimators  two-sample_tests  statistics  probability  re:network_differences 
14 days ago
[1307.7760] Geometric Inference on Kernel Density Estimates
"We show that geometric inference of a point cloud can be calculated by examining its kernel density estimate. This intermediate step results in the inference being statically robust to noise and allows for large computational gains and scalability (e.g. on 100 million points). In particular, by first creating a coreset for the kernel density estimate, the data representing the final geometric and topological structure has size depending only on the error tolerance, not on the size of the original point set or the complexity of the structure. To achieve this result, we study how to replace distance to a measure, as studied by Chazal, Cohen-Steiner, and Merigot, with the kernel distance. The kernel distance is monotonic with the kernel density estimate (sublevel sets of the kernel distance are superlevel sets of the kernel density estimate), thus allowing us to examine the kernel density estimate in this manner. We show it has several computational and stability advantages. Moreover, we provide an algorithm to estimate its topology using weighted Vietoris-Rips complexes."
to:NB  geometry  kernel_estimators  density_estimation  statistics  computational_statistics 
14 days ago
Quality and efficiency for kernel density estimates in large data
"Kernel density estimates are important for a broad variety of applications. Their construction has been well-studied, but existing techniques are expensive on massive datasets and/or only provide heuristic approximations without theoretical guarantees. We propose randomized and deterministic algorithms with quality guarantees which are orders of magnitude more efficient than previous algorithms. Our algorithms do not require knowledge of the kernel or its bandwidth parameter and are easily parallelizable. We demonstrate how to implement our ideas in a centralized setting and in MapReduce, although our algorithms are applicable to any large-scale data processing framework. Extensive experiments on large real datasets demonstrate the quality, efficiency, and scalability of our techniques."

--- Ungated version: http://www.cs.utah.edu/~lifeifei/papers/kernelsigmod13.pdf
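
The cheapest possible version of the idea, for comparison only: build the KDE on a uniform random subsample and check how far it sits from the full-data KDE. (Their constructions are much smarter than uniform sampling; this is just to fix ideas.)

```python
# Full-data Gaussian KDE vs. KDE built on a small random subsample ("coreset").
# Data set, bandwidth, and sample sizes are all made-up illustration values.
import numpy as np

def kde(query, data, h):
    # Gaussian KDE at the query points; O(|query| * |data|)
    d = (query[:, None] - data[None, :]) / h
    return np.exp(-0.5 * d ** 2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(5)
data = np.concatenate([rng.normal(-2, 1, 100_000), rng.normal(3, 0.5, 100_000)])
coreset = rng.choice(data, size=2_000, replace=False)

grid = np.linspace(-6, 6, 50)
full = kde(grid, data, h=0.3)
small = kde(grid, coreset, h=0.3)
print("max |full - subsample| on the grid:", np.abs(full - small).max())
```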
to:NB  to_read  kernel_estimators  computational_statistics  statistics  density_estimation  to_teach:statcomp  to_teach:undergrad-ADA 
14 days ago
Norbert Wiener, 1894-1964
Memorial issue of the Bulletin of the AMS, now open access...
in_NB  wiener.norbert  lives_of_the_scientists  mathematics  stochastic_processes 
15 days ago
Nonparametric Estimation of Küllback-Leibler Divergence
"In this letter, we introduce an estimator of Küllback-Leibler divergence based on two independent samples. We show that on any finite alphabet, this estimator has an exponentially decaying bias and that it is consistent and asymptotically normal. To explain the importance of this estimator, we provide a thorough analysis of the more standard plug-in estimator. We show that it is consistent and asymptotically normal, but with an infinite bias. Moreover, if we modify the plug-in estimator to remove the rare events that cause the bias to become infinite, the bias still decays at a rate no faster than . Further, we extend our results to estimating the symmetrized Küllback-Leibler divergence. We conclude by providing simulation results, which show that the asymptotic properties of these estimators hold even for relatively small sample sizes."

--- Trivial, but: _Kullback_ didn't spell his name with an umlaut --- why here?
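
Also for reference, the "standard plug-in estimator" they contrast themselves with, on a finite alphabet: estimate both distributions by empirical frequencies and plug into D(P||Q) = Σ_x p(x) log(p(x)/q(x)). The blow-up when a symbol with p(x) > 0 is unseen in the second sample is the bias problem they discuss. (My own sketch, with toy distributions.)

```python
# Plug-in estimator of KL divergence between two discrete distributions,
# each estimated from an i.i.d. sample over a known finite alphabet.
import numpy as np

def plugin_kl(x_sample, y_sample, alphabet):
    p = np.array([np.mean(x_sample == a) for a in alphabet])
    q = np.array([np.mean(y_sample == a) for a in alphabet])
    mask = p > 0
    with np.errstate(divide="ignore"):
        # infinite whenever q(x) = 0 but p(x) > 0 --- the source of the bad bias
        return np.sum(p[mask] * np.log(p[mask] / q[mask]))

rng = np.random.default_rng(11)
alphabet = np.arange(6)
x = rng.choice(alphabet, size=200, p=[0.3, 0.3, 0.2, 0.1, 0.05, 0.05])
y = rng.choice(alphabet, size=200, p=[0.2, 0.2, 0.2, 0.2, 0.1, 0.1])
print(plugin_kl(x, y, alphabet))
```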
to:NB  entropy_estimation  information_theory  statistics  nonparametrics 
16 days ago
Arguments, More than Confidence, Explain the Good Performance of Reasoning Groups by Emmanuel Trouche, Emmanuel Sander, Hugo Mercier :: SSRN
"In many intellective tasks groups consistently outperform individuals. One factor is that the individual(s) with the best answer is able to convince the other group members using sound argumentation. Another factor is that the most confident group member imposes her answer whether it is right or wrong. In Experiments 1 and 2, individual participants were given arguments against their answer in intellective tasks. Demonstrating sound argumentative competence, many participants changed their mind to adopt the correct answer even though the arguments had no confidence markers, and barely any participant changed their mind to adopt an incorrect answer. Confidence could not explain who changed their mind, as the least confident participants were as likely to change their mind as the most confident. In Experiments 3 (adults) and 4 (10-year-olds), participants solved intellective tasks individually and then in groups, before solving transfer problems individually. Demonstrating again sound argumentative competence, participants adopted the correct answer when it was present in the group, and many succeeded in transferring this understanding to novel problems. Moreover, the group member with the right answer nearly always managed to convince the group even when she was not the most confident. These results show that argument quality can overcome confidence among the factors influencing the discussion of intellective tasks. Explanations for apparent exceptions are discussed."
to:NB  to_read  cognitive_science  experimental_psychology  social_life_of_the_mind  collective_cognition  re:democratic_cognition  mercier.hugo 
17 days ago
The Virtues of Ingenuity: Reasoning and Arguing without Bias - Springer
"This paper describes and defends the “virtues of ingenuity”: detachment, lucidity, thoroughness. Philosophers traditionally praise these virtues for their role in the practice of using reasoning to solve problems and gather information. Yet, reasoning has other, no less important uses. Conviction is one of them. A recent revival of rhetoric and argumentative approaches to reasoning (in psychology, philosophy and science studies) has highlighted the virtues of persuasiveness and cast a new light on some of its apparent vices—bad faith, deluded confidence, confirmation and myside biases. Those traits, it is often argued, will no longer look so detrimental once we grasp their proper function: arguing in order to persuade, rather than thinking in order to solve problems. Some of these biases may even have a positive impact on intellectual life. Seen in this light, the virtues of ingenuity may well seem redundant. Defending them, I argue that the vices of conviction are not innocuous. If generalized, they would destabilize argumentative practices. Argumentation is a common good that is threatened when every arguer pursues conviction at the expense of ingenuity. Bad faith, myside biases and delusions of all sorts are neither called for nor explained by argumentative practices. To avoid a collapse of argumentation, mere civil virtues (respect, humility or honesty) do not suffice: we need virtues that specifically attach to the practice of making conscious inferences."
to:NB  to_read  rhetoric  epistemology  social_life_of_the_mind  re:democratic_cognition  via:? 
17 days ago
globalinequality: Ahistoricism in Acemoglu-Robinson
Hinting at the MacLeod thesis, that the historical role of Communism was to lay the groundwork for capitalism...
have_read  economic_growth  institutions  economic_policy  development_economics  communism  china  china:prc  ussr 
19 days ago
Revisiting the Impact of Teachers
"Chetty, Friedman, and Rockoff (hereafter CFR) use teacher switch- ing as a quasi-experiment to test for bias from student sorting in value added (VA) models of teacher effectiveness. They conclude that VA estimates are not meaningfully biased by student sorting (CFR 2014a). A companion paper finds that high-VA teachers have large effects on students’ later outcomes (CFR 2014b). I reproduce CFR’s analysis in data from North Carolina. Their key reported results are all success- fully replicated. Further investigation, however, reveals that the quasi- experiment is invalid: Teacher switching is correlated with changes in students’ prior grade scores that bias the key coefficient toward a find- ing of no bias. Estimates that adjust for changes in students’ prior achievement find evidence of moderate bias in VA scores, in the middle of the range suggested by Rothstein (2009). The association between VA and long-run outcomes is not robust and quite sensitive to controls."

--- Is the data available for either this or CFR? If not, perhaps, the last tag is a mistake.
to:NB  education  regression  causal_inference  hierarchical_statistical_models  to_teach:undergrad-ADA 
19 days ago
[1408.4102] Estimation of Monotone Treatment Effects in Network Experiments
"Randomized experiments on social networks are a trending research topic. Such experiments pose statistical challenges due to the possibility of interference between units. We propose a new method for estimating attributable treatment effects under interference. The method does not require partial interference, but instead uses an identifying assumption that is similar to requiring nonnegative treatment effects. Observed pre-treatment social network information can be used to customize the test statistic, so as to increase power without making assumptions on the data generating process. The inversion of the test statistic is a combinatorial optimization problem which has a tractable relaxation, yielding conservative estimates of the attributable effect."
statistics  network_data_analysis  causal_inference  experimental_design  re:experiments_on_networks  kith_and_kin  choi.david_s.  in_NB 
19 days ago
Evaluating link prediction methods - Online First - Springer
"Link prediction is a popular research area with important applications in a variety of disciplines, including biology, social science, security, and medicine. The fundamental requirement of link prediction is the accurate and effective prediction of new links in networks. While there are many different methods proposed for link prediction, we argue that the practical performance potential of these methods is often unknown because of challenges in the evaluation of link prediction, which impact the reliability and reproducibility of results. We describe these challenges, provide theoretical proofs and empirical examples demonstrating how current methods lead to questionable conclusions, show how the fallacy of these conclusions is illuminated by methods we propose, and develop recommendations for consistent, standard, and applicable evaluation metrics. We also recommend the use of precision-recall threshold curves and associated areas in lieu of receiver operating characteristic curves due to complications that arise from extreme imbalance in the link prediction classification problem."
to:NB  network_data_analysis  statistics  cross-validation  link_prediction  re:XV_for_networks 
21 days ago
Pathways to Exploration: Rationales and Approaches for a U.S. Program of Human Space Exploration | The National Academies Press
"The United States has publicly funded its human spaceflight program on a continuous basis for more than a half-century, through three wars and a half-dozen recessions, from the early Mercury and Gemini suborbital and Earth orbital missions, to the lunar landings, and thence to the first reusable winged crewed spaceplane that the United States operated for three decades. Today the United States is the major partner in a massive orbital facility - the International Space Station - that is becoming the focal point for the first tentative steps in commercial cargo and crewed orbital space flights. And yet, the long-term future of human spaceflight beyond this project is unclear. Pronouncements by multiple presidents of bold new ventures by Americans to the Moon, to Mars, and to an asteroid in its native orbit, have not been matched by the same commitment that accompanied President Kennedy's now fabled 1961 speech-namely, the substantial increase in NASA funding needed to make it happen. Are we still committed to advancing human spaceflight? What should a long-term goal be, and what does the United States need to do to achieve it?"
to:NB  books:noted  space_exploration 
22 days ago
Identifying the Culprit: Assessing Eyewitness Identification | The National Academies Press
"Eyewitnesses play an important role in criminal cases when they can identify culprits. Estimates suggest that tens of thousands of eyewitnesses make identifications in criminal investigations each year. Research on factors that affect the accuracy of eyewitness identification procedures has given us an increasingly clear picture of how identifications are made, and more importantly, an improved understanding of the principled limits on vision and memory that can lead to failure of identification. Factors such as viewing conditions, duress, elevated emotions, and biases influence the visual perception experience. Perceptual experiences are stored by a system of memory that is highly malleable and continuously evolving, neither retaining nor divulging content in an informational vacuum. As such, the fidelity of our memories to actual events may be compromised by many factors at all stages of processing, from encoding to storage and retrieval. Unknown to the individual, memories are forgotten, reconstructed, updated, and distorted. Complicating the process further, policies governing law enforcement procedures for conducting and recording identifications are not standard, and policies and practices to address the issue of misidentification vary widely. These limitations can produce mistaken identifications with significant consequences. What can we do to make certain that eyewitness identification convicts the guilty and exonerates the innocent?"
to:NB  books:noted  psychology  law 
22 days ago
Causal tracking reliabilism and the Gettier problem - Springer
"This paper argues that reliabilism can handle Gettier cases once it restricts knowledge producing reliable processes to those that involve a suitable causal link between the subject’s belief and the fact it references. Causal tracking reliabilism (as this version of reliabilism is called) also avoids the problems that refuted the causal theory of knowledge, along with problems besetting more contemporary theories (such as virtue reliabilism and the “safety” account of knowledge). Finally, causal tracking reliabilism allows for a response to Linda Zagzebski’s challenge that no theory of knowledge can both eliminate the possibility of Gettier cases while also allowing fully warranted but false beliefs."
to:NB  epistemology 
23 days ago
Hume’s definitions of ‘Cause’: Without idealizations, within the bounds of science - Springer
"Interpreters have found it exceedingly difficult to understand how Hume could be right in claiming that his two definitions of ‘cause’ are essentially the same. As J. A. Robinson points out, the definitions do not even seem to be extensionally equivalent. Don Garrett offers an influential solution to this interpretative problem, one that attributes to Hume the reliance on an ideal observer. I argue that the theoretical need for an ideal observer stems from an idealized concept of definition, which many interpreters, including Garrett, attribute to Hume. I argue that this idealized concept of definition indeed demands an unlimited or infinite ideal observer. But there is substantial textual evidence indicating that Hume disallows the employment of idealizations in general in the sciences. Thus Hume would reject the idealized concept of definition and its corresponding ideal observer. I then put forward an expert-relative reading of Hume’s definitions of ‘cause’, which also renders both definitions extensionally equivalent. On the expert-relative reading, the meaning of ‘cause’ changes with better observations and experiments, but it also allows Humean definitions to play important roles within our normative practices. Finally, I consider and reject Henry Allison’s argument that idealized definitions and their corresponding infinite minds are necessary for expert reflection on the limitations of current science."
to:NB  causality  hume.david  history_of_ideas  philosophy 
24 days ago
The New Spirit of Capitalism
"Why is the critique of capitalism so ineffective today? In this major work, the sociologists Eve Chiapello and Luc Boltanski suggest that we should be addressing the crisis of anticapitalist critique by exploring its very roots.
"Via an unprecedented analysis of management texts which influenced the thinking of employers and contributed to reorganization of companies over the last decades, the authors trace the contours of a new spirit of capitalism. From the middle of the 1970s onwards, capitalism abandoned the hierarchical Fordist work structure and developed a new network-based form of organization which was founded on employee initiative and relative work autonomy, but at the cost of material and psychological security.
"This new spirit of capitalism triumphed thanks to a remarkable recuperation of the “artistic critique”—that which, after May 1968, attacked the alienation of everyday life by capitalism and bureaucracy. At the same time, the “social critique” was disarmed by the appearance of neocapitalism and remained fixated on the old schemas of hierarchical production."
to:NB  books:noted  social_criticism  sociology  the_wired_ideology  capitalism  management 
27 days ago
Vidyasagar, M.: Hidden Markov Processes: Theory and Applications to Biology
"This book explores important aspects of Markov and hidden Markov processes and the applications of these ideas to various problems in computational biology. The book starts from first principles, so that no previous knowledge of probability is necessary. However, the work is rigorous and mathematical, making it useful to engineers and mathematicians, even those not interested in biological applications. A range of exercises is provided, including drills to familiarize the reader with concepts and more advanced problems that require deep thinking about the theory. Biological applications are taken from post-genomic biology, especially genomics and proteomics.
"The topics examined include standard material such as the Perron-Frobenius theorem, transient and recurrent states, hitting probabilities and hitting times, maximum likelihood estimation, the Viterbi algorithm, and the Baum-Welch algorithm. The book contains discussions of extremely useful topics not usually seen at the basic level, such as ergodicity of Markov processes, Markov Chain Monte Carlo (MCMC), information theory, and large deviation theory for both i.i.d and Markov processes. The book also presents state-of-the-art realization theory for hidden Markov models. Among biological applications, it offers an in-depth look at the BLAST (Basic Local Alignment Search Technique) algorithm, including a comprehensive explanation of the underlying theory. Other applications such as profile hidden Markov models are also explored."
to:NB  books:noted  markov_models  state-space_models  em_algorithm  large_deviations  stochastic_processes  statistical_inference_for_stochastic_processes  statistics  genomics  bioinformatics  vidyasagar.mathukumali 
27 days ago
AER (104,10) p. 3115 - Financial Networks and Contagion
"We study cascades of failures in a network of interdependent financial organizations: how discontinuous changes in asset values (e.g., defaults and shutdowns) trigger further failures, and how this depends on network structure. Integration (greater dependence on counterparties) and diversification (more counterparties per organization) have different, nonmonotonic effects on the extent of cascades. Diversification connects the network initially, permitting cascades to travel; but as it increases further, organizations are better insured against one another's failures. Integration also faces trade-offs: increased dependence on other organizations versus less sensitivity to own investments. Finally, we illustrate the model with data on European debt cross-holdings."
to:NB  social_networks  finance  economics  jackson.matthew_o. 
28 days ago
Reconstructing Macroeconomic Theory to Manage Economic Policy
"Macroeconomics has not done well in recent years: The standard models didn't predict the Great Recession; and even said it couldn't happen. After the bubble burst, the models did not predict the full consequences.
"The paper traces the failures to the attempts, beginning in the 1970s, to reconcile macro and microeconomics, by making the former adopt the standard competitive micro-models that were under attack even then, from theories of imperfect and asymmetric information, game theory, and behavioral economics.
"The paper argues that any theory of deep downturns has to answer these questions: What is the source of the disturbances? Why do seemingly small shocks have such large effects? Why do deep downturns last so long? Why is there such persistence, when we have the same human, physical, and natural resources today as we had before the crisis?
"The paper presents a variety of hypotheses which provide answers to these questions, and argues that models based on these alternative assumptions have markedly different policy implications, including large multipliers. It explains why the apparent liquidity trap today is markedly different from that envisioned by Keynes in the Great Depression, and why the Zero Lower Bound is not the central impediment to the effectiveness of monetary policy in restoring the economy to full employment."
to:NB  to_read  macroeconomics  economics  stiglitz.joseph  financial_crisis_of_2007-- 
28 days ago
Sociality, Hierarchy, Health: Comparative Biodemography: Papers from a Workshop | The National Academies Press
"Sociality, Hierarchy, Health: Comparative Biodemography is a collection of papers that examine cross-species comparisons of social environments with a focus on social behaviors along with social hierarchies and connections, to examine their effects on health, longevity, and life histories. This report covers a broad spectrum of nonhuman animals, exploring a variety of measures of position in social hierarchies and social networks, drawing links among these factors to health outcomes and trajectories, and comparing them to those in humans. Sociality, Hierarchy, Health revisits both the theoretical underpinnings of biodemography and the empirical findings that have emerged over the past two decades."
to:NB  books:noted  social_networks  medicine  inequality  sociology  natural_science_of_the_human_species  ethology  primates 
29 days ago
IEEE Xplore Abstract - Control theoretic smoothing splines
"Some of the relationships between optimal control and statistics are examined. We produce generalized, smoothing splines by solving an optimal control problem for linear control systems, minimizing the L2-norm of the control signal, while driving the scalar output of the control system close to given, prespecified interpolation points. We then prove a convergence result for the smoothing splines, using results from the theory of numerical quadrature. Finally, we show, in simulations, that our approach works in practice as well as in theory"
to:NB  splines  control_theory  smoothing  statistics  via:arsyed 
4 weeks ago
Knox, P., ed.: Atlas of Cities (Hardcover).
"More than half the world’s population lives in cities, and that proportion is expected to rise to three-quarters by 2050. Urbanization is a global phenomenon, but the way cities are developing, the experience of city life, and the prospects for the future of cities vary widely from region to region. The Atlas of Cities presents a unique taxonomy of cities that looks at different aspects of their physical, economic, social, and political structures; their interactions with each other and with their hinterlands; the challenges and opportunities they present; and where cities might be going in the future.
"Each chapter explores a particular type of city—from the foundational cities of Greece and Rome and the networked cities of the Hanseatic League, through the nineteenth-century modernization of Paris and the industrialization of Manchester, to the green and “smart” cities of today. Expert contributors explore how the development of these cities reflects one or more of the common themes of urban development: the mobilizing function (transport, communication, and infrastructure); the generative function (innovation and technology); the decision-making capacity (governance, economics, and institutions); and the transformative capacity (society, lifestyle, and culture)."
to:NB  books:noted  cities  visual_display_of_quantitative_information  geography 
4 weeks ago
Network-based statistical comparison of citation topology of bibliographic databases : Scientific Reports : Nature Publishing Group
"Modern bibliographic databases provide the basis for scientific research and its evaluation. While their content and structure differ substantially, there exist only informal notions on their reliability. Here we compare the topological consistency of citation networks extracted from six popular bibliographic databases including Web of Science, CiteSeer and arXiv.org. The networks are assessed through a rich set of local and global graph statistics. We first reveal statistically significant inconsistencies between some of the databases with respect to individual statistics. For example, the introduced field bow-tie decomposition of DBLP Computer Science Bibliography substantially differs from the rest due to the coverage of the database, while the citation information within arXiv.org is the most exhaustive. Finally, we compare the databases over multiple graph statistics using the critical difference diagram. The citation topology of DBLP Computer Science Bibliography is the least consistent with the rest, while, not surprisingly, Web of Science is significantly more reliable from the perspective of consistency. This work can serve either as a reference for scholars in bibliometrics and scientometrics or a scientific evaluation guideline for governments and research agencies."

--- The methods don't look very compelling, but the last tag applies.
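
--- For concreteness, a small networkx sketch of the kind of comparison involved, with synthetic graphs standing in for the actual citation databases and a handful of statistics standing in for the paper's much richer set; none of this reproduces their data or methods.

    import networkx as nx

    def topology_summary(G):
        """A few local and global statistics one might compare across citation networks."""
        und = G.to_undirected()
        return {
            "nodes": G.number_of_nodes(),
            "edges": G.number_of_edges(),
            "mean_out_degree": sum(d for _, d in G.out_degree()) / G.number_of_nodes(),
            "clustering": nx.average_clustering(und),
            "largest_scc_frac": max(len(c) for c in nx.strongly_connected_components(G))
                                / G.number_of_nodes(),
        }

    # two synthetic "databases" (an Erdos-Renyi digraph and a preferential-attachment digraph)
    db_a = nx.gnp_random_graph(500, 0.01, directed=True, seed=1)
    db_b = nx.DiGraph(nx.scale_free_graph(500, seed=1))   # collapse the multigraph
    db_b.remove_edges_from(list(nx.selfloop_edges(db_b)))

    for name, G in [("A", db_a), ("B", db_b)]:
        print(name, topology_summary(G))
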
to:NB  network_data_analysis  bibliometry  citation_networks  re:network_differences 
4 weeks ago