Mixed Graphical Models via Exponential Families | AISTATS 2014 | JMLR W&CP
"Markov Random Fields, or undirected graphical models, are widely used to model high-dimensional multivariate data. Classical instances of these models, such as Gaussian Graphical and Ising Models, as well as recent extensions to graphical models specified by univariate exponential families, assume all variables arise from the same distribution. Complex data from high-throughput genomics and social networking, for example, often contain discrete, count, and continuous variables measured on the same set of samples. To model such heterogeneous data, we develop a novel class of mixed graphical models by specifying that each node-conditional distribution is a member of a possibly different univariate exponential family. We study several instances of our model, and propose scalable M-estimators for recovering the underlying network structure. Simulations as well as an application to learning mixed genomic networks from next generation sequencing and mutation data demonstrate the versatility of our methods."

--- Um. Haven't people been doing this in practice since about forever?
to:NB  graphical_models  exponential_families  statistics  ravikumar.pradeep  allen.genevera  to_be_shot_after_a_fair_trial 
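--- The node-conditional idea is old in its Gaussian/Ising special cases; here is a minimal sketch of one such instance (an Ising-style node-conditional fit by l2-regularized logistic regression on synthetic chain data — my illustration, not the paper's M-estimator, which uses possibly different exponential families per node and l1 penalties):

```python
import numpy as np

def node_conditional_logistic(X, node, lam=0.1, lr=0.1, n_iter=2000):
    """Fit one node's conditional distribution given all other nodes via
    l2-regularized logistic regression (plain gradient descent); large
    coefficients indicate likely graph neighbors of `node`."""
    y = (X[:, node] + 1) / 2                  # map spins {-1,+1} -> {0,1}
    Z = np.delete(X, node, axis=1)            # all the other nodes
    w = np.zeros(Z.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Z @ w))      # predicted P(node = +1)
        grad = Z.T @ (p - y) / len(y) + lam * w
        w -= lr * grad
    return w

# Toy chain 0 - 1 - 2: each spin copies its predecessor with prob. 0.85
rng = np.random.default_rng(3)
n = 4000
x0 = rng.choice([-1, 1], n)
x1 = np.where(rng.random(n) < 0.85, x0, -x0)
x2 = np.where(rng.random(n) < 0.85, x1, -x1)
X = np.column_stack([x0, x1, x2])

w = node_conditional_logistic(X, node=1)      # regress x1 on (x0, x2)
# both coefficients (for x0 and for x2) should come out clearly positive,
# since both are true neighbors of node 1 in the chain
```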
A New Approach to Probabilistic Programming Inference | AISTATS 2014 | JMLR W&CP
"We introduce and demonstrate a new approach to inference in expressive probabilistic programming languages based on particle Markov chain Monte Carlo. Our approach is easy to implement and to parallelize, applies to Turing-complete probabilistic programming languages, and supports accurate inference in models that make use of complex control flow, including stochastic recursion, as well as primitives from nonparametric Bayesian statistics. Our experiments show that this approach can be more efficient than previously introduced single-site Metropolis-Hastings samplers."
to:NB  monte_carlo  theoretical_computer_science  to_read  wood.frank 
Linear-time training of nonlinear low-dimensional embeddings | AISTATS 2014 | JMLR W&CP
"Nonlinear embeddings such as stochastic neighbor embedding or the elastic embedding achieve better results than spectral methods but require an expensive, nonconvex optimization, where the objective function and gradient are quadratic on the sample size. We address this bottleneck by formulating the optimization as an N-body problem and using fast multipole methods (FMMs) to approximate the gradient in linear time. We study the effect, in theory and experiment, of approximating gradients in the optimization and show that the expected error is related to the mean curvature of the objective function, and that gradually increasing the accuracy level in the FMM over iterations leads to a faster training. When combined with standard optimizers, such as gradient descent or L-BFGS, the resulting algorithm beats the O(N log N) Barnes-Hut method and achieves reasonable embeddings for one million points in around three hours' runtime."
to:NB  optimization  multidimensional_scaling  computational_statistics  statistics 
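--- The quadratic bottleneck is easy to see in code: the exact gradient of an SNE-style objective makes every point interact with every other point. A minimal numpy sketch of that exact gradient (my illustration of the N-body structure, not the paper's FMM approximation):

```python
import numpy as np

def naive_embedding_gradient(Y, W):
    """Exact gradient of a generic pairwise embedding objective:
    grad[i] = sum_j W[i,j] * (Y[i] - Y[j]).  Every point interacts
    with every other point, so cost is O(N^2) in time and memory;
    Barnes-Hut and FMM schemes approximate this inner sum."""
    grad = np.zeros_like(Y)
    for i in range(Y.shape[0]):
        diff = Y[i] - Y                        # (N, d) pairwise differences
        grad[i] = (W[i][:, None] * diff).sum(axis=0)
    return grad

rng = np.random.default_rng(0)
N, d = 200, 2
Y = rng.standard_normal((N, d))                # current embedding coordinates
W = rng.random((N, N))                         # pairwise interaction weights
np.fill_diagonal(W, 0.0)
g = naive_embedding_gradient(Y, W)
print(g.shape)                                 # (200, 2)
```

Tree codes (Barnes-Hut) cut the inner sum to O(log N) per point; FMMs cut it to O(1), giving the linear-time training in the title.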
Active Learning for Undirected Graphical Model Selection | AISTATS 2014 | JMLR W&CP
"This paper studies graphical model selection, i.e., the problem of estimating a graph of statistical relationships among a collection of random variables. Conventional graphical model selection algorithms are passive, i.e., they require all the measurements to have been collected before processing begins. We propose an active learning algorithm that uses junction tree representations to adapt future measurements based on the information gathered from prior measurements. We prove that, under certain conditions, our active learning algorithm requires fewer scalar measurements than any passive algorithm to reliably estimate a graph. A range of numerical results validate our theory and demonstrate the benefits of active learning."
to:NB  graphical_models  model_discovery  active_learning  experimental_design  vats.divyanshu  nowak.robert  statistics  learning_theory 
Path Thresholding: Asymptotically Tuning-Free High-Dimensional Sparse Regression | AISTATS 2014 | JMLR W&CP
"In this paper, we address the challenging problem of selecting tuning parameters for high-dimensional sparse regression. We propose a simple and computationally efficient method, called path thresholding (PaTh), that transforms any tuning parameter-dependent sparse regression algorithm into an asymptotically tuning-free sparse regression algorithm. More specifically, we prove that, as the problem size becomes large (in the number of variables and in the number of observations), PaTh performs accurate sparse regression, under appropriate conditions, without specifying a tuning parameter. In finite-dimensional settings, we demonstrate that PaTh can alleviate the computational burden of model selection algorithms by significantly reducing the search space of tuning parameters."
to:NB  to_read  regression  sparsity  high-dimensional_statistics  statistics  vats.divyanshu 
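--- I haven't worked through the paper's actual thresholding rule; as a sketch of the raw material any such rule operates on — the sequence of supports along a lasso regularization path — here is a numpy-only version (ISTA solver, synthetic data; hypothetical illustration, not the paper's PaTh procedure):

```python
import numpy as np

def ista_lasso(X, y, lam, n_iter=500):
    """Plain ISTA (proximal gradient) for the lasso, step size 1/L
    with L the largest eigenvalue of X^T X."""
    L = np.linalg.eigvalsh(X.T @ X).max()
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        z = beta - X.T @ (X @ beta - y) / L
        beta = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return beta

rng = np.random.default_rng(1)
n, p, k = 100, 20, 3
X = rng.standard_normal((n, p))
beta_true = np.zeros(p); beta_true[:k] = 2.0
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Sweep the regularization path from lam_max (empty model) downward,
# recording the estimated support at each value of the tuning parameter.
lam_max = np.abs(X.T @ y).max()
supports = []
for lam in lam_max * np.logspace(0, -2, 15):
    beta = ista_lasso(X, y, lam)
    supports.append(frozenset(np.flatnonzero(np.abs(beta) > 1e-6)))
```

A tuning-free-flavored rule would then scan `supports` and stop at, say, the first support that persists across several consecutive grid points; whether that resembles PaTh's actual criterion I'd have to check against the paper.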
PAC-Bayesian Collective Stability | AISTATS 2014 | JMLR W&CP
"Recent results have shown that the generalization error of structured predictors decreases with both the number of examples and the size of each example, provided the data distribution has weak dependence and the predictor exhibits a smoothness property called collective stability. These results use an especially strong definition of collective stability that must hold uniformly over all inputs and all hypotheses in the class. We investigate whether weaker definitions of collective stability suffice. Using the PAC-Bayes framework, which is particularly amenable to our new definitions, we prove that generalization is indeed possible when uniform collective stability happens with high probability over draws of predictors (and inputs). We then derive a generalization bound for a class of structured predictors with variably convex inference, which suggests a novel learning objective that optimizes collective stability."
to:NB  to_read  stability_of_learning  learning_theory  relational_learning  getoor.lise 
Non-Asymptotic Analysis of Relational Learning with One Network | AISTATS 2014 | JMLR W&CP
"This theoretical paper is concerned with a rigorous non-asymptotic analysis of relational learning applied to a single network. Under suitable and intuitive conditions on features and clique dependencies over the network, we present the first probably approximately correct (PAC) bound for maximum likelihood estimation (MLE). To our best knowledge, this is the first sample complexity result of this problem. We propose a novel combinational approach to analyze complex dependencies of relational data, which is crucial to our non-asymptotic analysis. The consistency of MLE under our conditions is also proved as the consequence of our sample complexity bound. Finally, our combinational method for analyzing dependent data can be easily generalized to treat other generalized maximum likelihood estimators for relational learning."
to:NB  estimation  relational_learning  learning_theory  statistics  network_data_analysis  to_read  re:XV_for_networks 
On the Testability of Models with Missing Data | AISTATS 2014 | JMLR W&CP
"Graphical models that depict the process by which data are lost are helpful in recovering information from missing data. We address the question of whether any such model can be submitted to a statistical test given that the data available are corrupted by missingness. We present sufficient conditions for testability in missing data applications and note the impediments for testability when data are contaminated by missing entries. Our results strengthen the available tests for MCAR and MAR and further provide tests in the category of MNAR. Furthermore, we provide sufficient conditions to detect the existence of dependence between a variable and its missingness mechanism. We use our results to show that model sensitivity persists in almost all models typically categorized as MNAR."
to:NB  missing_data  graphical_models  hypothesis_testing  statistics  pearl.judea 
[1404.1355] Studying Social Networks at Scale: Macroscopic Anatomy of the Twitter Social Graph
"Twitter is one of the largest social networks using exclusively directed links among accounts. This makes the Twitter social graph much closer to the social graph supporting real life communications than, for instance, Facebook. Therefore, understanding the structure of the Twitter social graph is interesting not only for computer scientists, but also for researchers in other fields, such as sociologists. However, little is known about how the information propagation in Twitter is constrained by its inner structure. In this paper, we present an in-depth study of the macroscopic structure of the Twitter social graph unveiling the highways on which tweets propagate, the specific user activity associated with each component of this macroscopic structure, and the evolution of this macroscopic structure with time for the past 6 years. For this study, we crawled Twitter to retrieve all accounts and all social relationships (follow links) among accounts; the crawl completed in July 2012 with 505 million accounts interconnected by 23 billion links. Then, we present a methodology to unveil the macroscopic structure of the Twitter social graph. This macroscopic structure consists of 8 components defined by their connectivity characteristics. Each component groups users with a specific usage of Twitter. For instance, we identified components gathering together spammers, or celebrities. Finally, we present a method to approximate the macroscopic structure of the Twitter social graph in the past, validate this method using old datasets, and discuss the evolution of the macroscopic structure of the Twitter social graph during the past 6 years."
to:NB  network_data_analysis  networked_life  social_media  re:network_differences  to_read 
[1404.1295] Detecting criminal organizations in mobile phone networks
"The study of criminal networks using traces from heterogeneous communication media is acquiring increasing importance in today's society. The usage of communication media such as phone calls and online social networks leaves digital traces in the form of metadata that can be used for this type of analysis. The goal of this work is twofold: first we provide a theoretical framework for the problem of detecting and characterizing criminal organizations in networks reconstructed from phone call records. Then, we introduce an expert system to support law enforcement agencies in the task of unveiling the underlying structure of criminal networks hidden in communication data. This platform allows for statistical network analysis, community detection and visual exploration of mobile phone network data. It allows forensic investigators to deeply understand hierarchies within criminal organizations, discovering members who play a central role and provide connections among sub-groups. Our work concludes by illustrating the adoption of our computational framework for a real-world criminal investigation."
to:NB  network_data_analysis  community_discovery  social_networks  crime  surveillance  to_be_shot_after_a_fair_trial  statistics  when_you_walk_through_the_garden_you_gotta_watch_your_back 
[1404.0578] Mental Disorder Recovery Correlated with Centralities and Interactions on an Online Social Network
"Recent research has established both a theoretical basis and strong empirical evidence that effective social behavior plays a beneficial role in the maintenance of physical and psychological well-being of people. To verify this relationship on online communities, we studied the correlations between the recovery of patients with mental health problems and their social behaviors. As a source of the data related to the social behavior and progress of mental recovery, we used PatientsLikeMe (PLM), the world's first open-participation research platform for the development of patient-centered health outcome measures. We first constructed an online social network structure, based on patient-to-patient ties among 200 patients obtained from PLM. We found that some node properties, such as in-degrees and eigenvector centralities, were significantly correlated with the recovery of those patients. Meanwhile, we also collected another set of recovery data two months after the first recovery data collection, to investigate the effect of social behaviors over time. Linear regression analysis revealed an equally strong correlation between the patients' social behavior and the second recovery data as the first ones, implying the robustness of our finding over time. Our findings suggest that social interactions in online communities such as PLM could be useful as a good predictor for the recovery of patients with mental disorders."

--- The obvious explanation would be that patients with high degree tended to not be very mentally ill in the first place...
to:NB  social_networks  mental_illness  network_data_analysis  to_be_shot_after_a_fair_trial 
[1404.0667] ATLAS: A geometric approach to learning high-dimensional stochastic systems near manifolds
"When simulating multiscale stochastic differential equations (SDEs) in high-dimensions, separation of timescales, stochastic noise and high-dimensionality can make simulations prohibitively expensive. The computational cost is dictated by microscale properties and interactions of many variables, while interesting behavior often occurs at the macroscale level and at large time scales, often characterized by few important, but unknown, degrees of freedom. For many problems bridging the gap between the microscale and macroscale by direct simulation is computationally infeasible. In this work we propose a novel approach to automatically learn a reduced model with an associated fast macroscale simulator. Our unsupervised learning algorithm uses short parallelizable microscale simulations to learn provably accurate macroscale SDE models. The learning algorithm takes as input: the microscale simulator, a local distance function, and a homogenization spatial or temporal scale, which is the smallest time scale of interest in the reduced system. The learned macroscale model can then be used for fast computation and storage of long simulations. We discuss various examples, both low- and high-dimensional, as well as results about the accuracy of the fast simulators we construct, and its dependency on the number of short paths requested from the microscale simulator."
to:NB  stochastic_differential_equations  macro_from_micro  simulation  stochastic_processes 
[1404.1466] Coarse-graining and fluctuations: Two birds with one stone
"We show how the mathematical structure of large-deviation principles matches well with the concept of coarse-graining. For those systems with a large-deviation principle, this may lead to a general approach to coarse-graining through the variational form of the large-deviation functional."
to:NB  to_read  large_deviations  macro_from_micro  stochastic_processes 
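--- For orientation (standard background, not this paper's contribution): the simplest large-deviation principle is Cramér's theorem for i.i.d. sample means,

```latex
\mathbb{P}\!\left(\frac{1}{n}\sum_{i=1}^{n} X_i \approx x\right) \asymp e^{-n I(x)},
\qquad
I(x) = \sup_{\lambda \in \mathbb{R}} \left\{ \lambda x - \log \mathbb{E}\, e^{\lambda X_1} \right\},
```

and the coarse-graining connection runs through the contraction principle: if the coarse-grained variable is Y = f(X), its rate function is I_Y(y) = inf_{x : f(x) = y} I_X(x), itself a variational formula of the kind the abstract alludes to.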
[1404.0645] Moment bounds and concentration inequalities for slowly mixing dynamical systems
"We obtain optimal moment bounds for Birkhoff sums, and optimal concentration inequalities, for a large class of slowly mixing dynamical systems, including those that admit anomalous diffusion in the form of a stable law or a central limit theorem with nonstandard scaling (n log n)^{1/2}."
to:NB  mixing  concentration_of_measure  dynamical_systems  central_limit_theorem  stochastic_processes 
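--- Birkhoff sums are easy to play with numerically. A quick sketch with the fully chaotic logistic map — my example, and deliberately not one of the paper's slowly mixing systems (the logistic map at r = 4 mixes fast, which is exactly why its Birkhoff averages behave classically):

```python
def birkhoff_average(f, observable, x0, n):
    """Time average of an observable along the first n orbit points:
    (1/n) * sum_{k=0}^{n-1} observable(f^k(x0))."""
    x, total = x0, 0.0
    for _ in range(n):
        total += observable(x)
        x = f(x)
    return total / n

logistic = lambda x: 4.0 * x * (1.0 - x)   # chaotic logistic map, r = 4
avg = birkhoff_average(logistic, lambda x: x, x0=0.2, n=200_000)
# The invariant density is 1/(pi*sqrt(x*(1-x))), whose mean is 1/2, so
# the time average should settle near 0.5 (Birkhoff's ergodic theorem).
print(avg)
```

For the slowly mixing systems of the paper, the interesting point is precisely that such averages can fluctuate on the anomalous (n log n)^{1/2} scale rather than the usual n^{1/2} one.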
[1404.0295] Hitting time statistics for observations of dynamical systems
"In this paper we study the distribution of hitting and return times for observations of dynamical systems. We apply these results to get an exponential law for the distribution of hitting and return times for rapidly mixing random dynamical systems. In particular, it allows us to obtain an exponential law for random toral automorphisms, random circle maps expanding on average and randomly perturbed dynamical systems."
to:NB  mixing  recurrence_times  dynamical_systems  stochastic_processes 
[1404.0353] The size distribution, scaling properties and spatial organization of urban clusters: a global and regional perspective
"Human development has far-reaching impacts on the surface of the globe. The transformation of natural land cover occurs in different forms and urban growth is one of the most eminent transformative processes. We analyze global land cover data and extract cities as defined by maximally connected urban clusters. The analysis of the city size distribution for all cities on the globe confirms Zipf's law. Moreover, by investigating the percolation properties of the clustering of urban areas we assess the closeness to criticality for various countries. At the critical thresholds, the urban land cover of the countries undergoes a transition from separated clusters to a gigantic component on the country scale. We study the Zipf-exponents as a function of the closeness to percolation and find a systematic decrease with increasing scale, which could be the reason for deviating exponents reported in literature. Moreover, we investigate the average size of the clusters as a function of the proximity to percolation and find country specific behavior. By relating the standard deviation and the average of cluster sizes -- analogous to Taylor's law -- we suggest an alternative way to identify the percolation transition. We calculate spatial correlations of the urban land cover and find long-range correlations. Finally, by relating the areas of cities with population figures we address the global aspect of the allometry of cities, finding an exponent δ≈0.85, i.e. large cities have lower densities."
to:NB  cities  spatial_statistics  statistics 
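--- Checking Zipf's law on one's own data usually comes down to a rank-size regression; a minimal sketch with synthetic Pareto "city sizes" (assumed data, not the paper's land-cover clusters):

```python
import numpy as np

rng = np.random.default_rng(42)
alpha = 1.0                                  # Zipf/Pareto tail exponent
u = rng.random(10_000)
sizes = (1.0 / u) ** (1.0 / alpha)           # Pareto(alpha) draws, xmin = 1

# Rank-size relation: for Pareto data, rank(x) ~ n * x^(-alpha), so
# log(rank) vs log(size) is linear with slope -alpha.
ranked = np.sort(sizes)[::-1]
ranks = np.arange(1, len(ranked) + 1)
slope, _ = np.polyfit(np.log(ranked), np.log(ranks), 1)
print(-slope)                                # estimate of alpha
```

Rank-size OLS is known to be biased in small samples (Hill-type estimators are preferable), which is consistent with the paper's point that reported Zipf exponents deviate systematically.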
[1404.0333] Cross-checking different sources of mobility information
"The pervasive use of new mobile devices has allowed a better characterization in space and time of human concentrations and mobility in general. Besides its theoretical interest, describing mobility is of great importance for a number of practical applications ranging from the forecast of disease spreading to the design of new spaces in urban environments. While classical data sources, such as surveys or census, have a limited level of geographical resolution (e.g., districts, municipalities, counties are typically used) or are restricted to generic workdays or weekends, the data coming from mobile devices can be precisely located both in time and space. Most previous works have used a single data source to study human mobility patterns. Here we perform instead a cross-check analysis by comparing results obtained with data collected from three different sources: Twitter, census and cell phones. The analysis is focused on the urban areas of Barcelona and Madrid, for which data of the three types is available. We assess the correlation between the datasets on different aspects: the spatial distribution of people concentration, the temporal evolution of people density and the mobility patterns of individuals. Our results show that the three data sources are providing comparable information. Even though the representativeness of Twitter geolocated data is lower than that of mobile phone and census data, the correlations between the population density profiles and mobility patterns detected by the three datasets are close to one in a grid with cells of 2x2 and 1x1 square kilometers. This level of correlation supports the feasibility of interchanging the three data sources at the spatio-temporal scales considered."
to:NB  re:social_networks_as_sensor_networks  statistics  surveys 
[1404.0267] The diffusion dynamics of choice: From durable goods markets to fashion first names
"Goods, styles, ideologies are adopted by society through various mechanisms. In particular, adoption driven by innovation is extensively studied by marketing economics. Mathematical models are currently used to forecast the sales of innovative goods. Inspired by the theory of diffusion processes developed for marketing economics, we propose, for the first time, a predictive framework for the mechanism of fashion, which we apply to first names. Analyses of French, Dutch and US national databases validate our modelling approach for thousands of first names, covering, on average, more than 50% of the yearly incidence in each database. In these cases, it is thus possible to forecast how popular the first names will become and when they will run out of fashion. Furthermore, we uncover a clear distinction between popularity and fashion: less popular names, typically not included in studies of fashion, may be driven by fashion, as well."
to:NB  diffusion_of_innovations  names  statistics 
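--- The workhorse in this diffusion-of-innovations literature is the Bass model; a discrete-time sketch (the standard model, assumed here for illustration — the paper may well use a variant):

```python
def bass_adoptions(p, q, m, T):
    """Bass diffusion: new adopters per period combine 'innovation'
    (spontaneous adoption at rate p) and 'imitation' (rate q scaled by
    the already-adopted fraction), drawn from the remaining market m.
    Returns the list of per-period adoption counts."""
    cumulative, out = 0.0, []
    for _ in range(T):
        new = (p + q * cumulative / m) * (m - cumulative)
        cumulative += new
        out.append(new)
    return out

# p small (few spontaneous adopters), q larger (strong imitation):
# adoptions rise to a peak, then fall as the market saturates --
# the rise-and-fall shape that "running out of fashion" refers to.
curve = bass_adoptions(p=0.03, q=0.38, m=1.0, T=30)
peak = curve.index(max(curve))
```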
[1404.0994] Evolutionary game theory using agent-based methods
"Evolutionary game theory is a successful mathematical framework geared towards understanding the selective pressures that affect the evolution of the strategies of agents engaged in interactions with potential conflicts. While a mathematical treatment of the costs and benefits of decisions can predict the optimal strategy in simple settings, more realistic situations (finite populations, non-vanishing mutations rates, communication between agents, and spatial interactions) require agent-based methods where each agent is modeled as an individual, carries its own genes that determine its decisions, and where the evolutionary outcome can only be ascertained by evolving the population of agents forward in time. Here we discuss the use of agent-based methods in evolutionary game theory and contrast standard results to those obtainable by a mathematical treatment. We conclude that agent-based methods can predict evolutionary outcomes where purely mathematical treatments cannot tread, but that mathematics is crucial to validate the computational simulations."
to:NB  agent-based_models  evolutionary_game_theory 
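--- The "purely mathematical treatment" being contrasted with agent-based simulation is typically replicator dynamics; a minimal sketch for the Prisoner's Dilemma (standard textbook setup, assumed for illustration):

```python
def replicator_step(x, payoff, dt=0.01):
    """One Euler step of replicator dynamics for a 2-strategy game.
    x = fraction playing strategy 0; payoff[i][j] = payoff to i vs j."""
    f0 = payoff[0][0] * x + payoff[0][1] * (1 - x)   # fitness of strategy 0
    f1 = payoff[1][0] * x + payoff[1][1] * (1 - x)   # fitness of strategy 1
    fbar = x * f0 + (1 - x) * f1                     # mean fitness
    return x + dt * x * (f0 - fbar)

# Prisoner's Dilemma: strategy 0 = cooperate, 1 = defect (T > R > P > S)
pd = [[3, 0],   # cooperator vs (cooperator, defector): R=3, S=0
      [5, 1]]   # defector  vs (cooperator, defector): T=5, P=1
x = 0.9         # start with 90% cooperators
for _ in range(5000):
    x = replicator_step(x, pd)
# In this infinite-population limit cooperation dies out (x -> 0);
# the abstract's point is that finite populations, mutation, and
# spatial structure can push agent-based outcomes away from this.
```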
Effect of Beer Marinades on Formation of Polycyclic Aromatic Hydrocarbons in Charcoal-Grilled Pork - Journal of Agricultural and Food Chemistry (ACS Publications)
"The effect of marinating meat with Pilsner beer, nonalcoholic Pilsner beer, and Black beer (coded respectively PB, P0B, and BB) on the formation of polycyclic aromatic hydrocarbons (PAHs) in charcoal-grilled pork was evaluated and compared with the formation of these compounds in unmarinated meat. Antiradical activity of marinades (DPPH assay) was assayed. BB exhibited the strongest scavenging activity (68.0%), followed by P0B (36.5%) and PB (29.5%). Control and marinated meat samples contained the eight PAHs named PAH8 by the EFSA and classified as suitable indicators for carcinogenic potency of PAHs in food. BB showed the highest inhibitory effect in the formation of PAH8 (53%), followed by P0B (25%) and PB (13%). The inhibitory effect of beer marinades on PAH8 increased with the increase of their radical-scavenging activity. BB marinade was the most efficient on reduction of PAH formation, providing a proper mitigation strategy."
food  cooking  chemistry  funny:geeky  via:aks 
Testing Theories of American Politics: Elites, Interest Groups, and Average Citizens
"Each of four theoretical traditions in the study of American politics – which can be characterized as theories of Majoritarian Electoral Democracy, Economic Elite Domination, and two types of interest group pluralism, Majoritarian Pluralism and Biased Pluralism – offers different predictions about which sets of actors have how much influence over public policy: average citizens; economic elites; and organized interest groups, mass-based or business-oriented.
"A great deal of empirical research speaks to the policy influence of one or another set of actors, but until recently it has not been possible to test these contrasting theoretical predictions against each other within a single statistical model. This paper reports on an effort to do so, using a unique data set that includes measures of the key variables for 1,779 policy issues.
"Multivariate analysis indicates that economic elites and organized groups representing business interests have substantial independent impacts on U.S. government policy, while average citizens and mass-based interest groups have little or no independent influence. The results provide substantial support for theories of Economic Elite Domination and for theories of Biased Pluralism, but not for theories of Majoritarian Electoral Democracy or Majoritarian Pluralism."
to:NB  to_read  political_science  us_politics  congress  our_decrepit_institutions  via:monkeycage  re:democratic_cognition 
2 days ago
Think Tanks in America, Medvetz
"Over the past half-century, think tanks have become fixtures of American politics, supplying advice to presidents and policy makers, expert testimony on Capitol Hill, and convenient facts and figures to journalists and media specialists. But what are think tanks? Who funds them? What kind of “research” do they produce? Where does their authority come from? And how influential have they become?"
to:NB  books:noted  think_tanks  public_policy  political_science  us_politics  natural_history_of_truthiness 
5 days ago
Shiflet, A.B. and Shiflet, G.W.: Introduction to Computational Science: Modeling and Simulation for the Sciences (Second Edition). (Hardcover)
"Computational science is an exciting new field at the intersection of the sciences, computer science, and mathematics because much scientific investigation now involves computing as well as theory and experiment. This textbook provides students with a versatile and accessible introduction to the subject. It assumes only a background in high school algebra, enables instructors to follow tailored pathways through the material, and is the only textbook of its kind designed specifically for an introductory course in the computational science and engineering curriculum. While the text itself is generic, an accompanying website offers tutorials and files in a variety of software packages.
"This fully updated and expanded edition features two new chapters on agent-based simulations and modeling with matrices, ten new project modules, and an additional module on diffusion. Besides increased treatment of high-performance computing and its applications, the book also includes additional quick review questions with answers, exercises, and individual and team projects."
to:NB  books:noted  to_teach:complexity-and-inference  programming  mathematics  modeling 
5 days ago
Osterhammel, J.; Camiller, P.,: The Transformation of the World: A Global History of the Nineteenth Century. (eBook and Hardcover)
"A monumental history of the nineteenth century, The Transformation of the World offers a panoramic and multifaceted portrait of a world in transition. Jürgen Osterhammel, an eminent scholar who has been called the Braudel of the nineteenth century, moves beyond conventional Eurocentric and chronological accounts of the era, presenting instead a truly global history of breathtaking scope and towering erudition. He examines the powerful and complex forces that drove global change during the "long nineteenth century," taking readers from New York to New Delhi, from the Latin American revolutions to the Taiping Rebellion, from the perils and promise of Europe's transatlantic labor markets to the hardships endured by nomadic, tribal peoples across the planet. Osterhammel describes a world increasingly networked by the telegraph, the steamship, and the railways. He explores the changing relationship between human beings and nature, looks at the importance of cities, explains the role slavery and its abolition played in the emergence of new nations, challenges the widely held belief that the nineteenth century witnessed the triumph of the nation-state, and much more.
"This is the highly anticipated English edition of the spectacularly successful and critically acclaimed German book, which is also being translated into Chinese, Polish, Russian, and French. Indispensable for any historian, The Transformation of the World sheds important new light on this momentous epoch, showing how the nineteenth century paved the way for the global catastrophes of the twentieth century, yet how it also gave rise to pacifism, liberalism, the trade union, and a host of other crucial developments."
to:NB  books:noted  world_history  19th_century_history  industrial_revolution  the_singularity_has_happened 
5 days ago
Wagner, G.P.: Homology, Genes, and Evolutionary Innovation (eBook and Hardcover).
"Homology--a similar trait shared by different species and derived from common ancestry, such as a seal's fin and a bird's wing--is one of the most fundamental yet challenging concepts in evolutionary biology. This groundbreaking book provides the first mechanistically based theory of what homology is and how it arises in evolution.
"Günter Wagner, one of the preeminent researchers in the field, argues that homology, or character identity, can be explained through the historical continuity of character identity networks--that is, the gene regulatory networks that enable differential gene expression. He shows how character identity is independent of the form and function of the character itself because the same network can activate different effector genes and thus control the development of different shapes, sizes, and qualities of the character. Demonstrating how this theoretical model can provide a foundation for understanding the evolutionary origin of novel characters, Wagner applies it to the origin and evolution of specific systems, such as cell types; skin, hair, and feathers; limbs and digits; and flowers."
to:NB  books:noted  evolutionary_biology  evolution  genetics  re:do-institutions-evolve 
5 days ago
Campbell, J.L. and Pedersen, O.K.: The National Origins of Policy Ideas: Knowledge Regimes in the United States, France, Germany, and Denmark. (eBook, Paperback and Hardcover)
"In politics, ideas matter. They provide the foundation for economic policymaking, which in turn shapes what is possible in domestic and international politics. Yet until now, little attention has been paid to how these ideas are produced and disseminated, and how this process varies between countries. The National Origins of Policy Ideas provides the first comparative analysis of how "knowledge regimes"--communities of policy research organizations like think tanks, political party foundations, ad hoc commissions, and state research offices, and the institutions that govern them--generate ideas and communicate them to policymakers.
"John Campbell and Ove Pedersen examine how knowledge regimes are organized, operate, and have changed over the last thirty years in the United States, France, Germany, and Denmark. They show how there are persistent national differences in how policy ideas are produced. Some countries do so in contentious, politically partisan ways, while others are cooperative and consensus oriented. They find that while knowledge regimes have adopted some common practices since the 1970s, tendencies toward convergence have been limited and outcomes have been heavily shaped by national contexts.
"Drawing on extensive interviews with top officials at leading policy research organizations, this book demonstrates why knowledge regimes are as important to capitalism as the state and the firm, and sheds new light on debates about the effects of globalization, the rise of neoliberalism, and the orientation of comparative political economy in political science and sociology."
to:NB  books:noted  public_policy  social_life_of_the_mind  sociology  institutions  political_science  re:democratic_cognition  re:do-institutions-evolve 
5 days ago
Sugrue, T.J.: The Origins of the Urban Crisis: Race and Inequality in Postwar Detroit. (eBook and Paperback)
"Once America's "arsenal of democracy," Detroit is now the symbol of the American urban crisis. In this reappraisal of America's racial and economic inequalities, Thomas Sugrue asks why Detroit and other industrial cities have become the sites of persistent racialized poverty. He challenges the conventional wisdom that urban decline is the product of the social programs and racial fissures of the 1960s. Weaving together the history of workplaces, unions, civil rights groups, political organizations, and real estate agencies, Sugrue finds the roots of today's urban poverty in a hidden history of racial violence, discrimination, and deindustrialization that reshaped the American urban landscape after World War II.
"This Princeton Classics edition includes a new preface by Sugrue, discussing the lasting impact of the postwar transformation on urban America and the chronic issues leading to Detroit's bankruptcy."
to:NB  books:noted  books:partially_read  detroit  american_history  20th_century_history  the_american_dilemma  cities  whats_gone_wrong_with_america 
5 days ago
Gutmann, A. and Thompson, D.F.: The Spirit of Compromise: Why Governing Demands It and Campaigning Undermines It. (eBook and Paperback)
"To govern in a democracy, political leaders have to compromise. When they do not, the result is political paralysis--dramatically demonstrated by the gridlock in Congress in recent years. In The Spirit of Compromise, eminent political thinkers Amy Gutmann and Dennis Thompson show why compromise is so important, what stands in the way of achieving it, and how citizens can make defensible compromises more likely. They urge politicians to focus less on campaigning and more on governing. In a new preface, the authors reflect on the state of compromise in Congress since the book's initial publication.
"Calling for greater cooperation in contemporary politics, The Spirit of Compromise will interest everyone who cares about making government work better for the good of all."
to:NB  books:noted  congress  us_politics  democracy  re:democratic_cognition 
5 days ago
Ben-Shahar, O. and Schneider, C.E.: More Than You Wanted to Know: The Failure of Mandated Disclosure. (eBook and Hardcover)
"Perhaps no kind of regulation is more common or less useful than mandated disclosure--requiring one party to a transaction to give the other information. It is the iTunes terms you assent to, the doctor's consent form you sign, the pile of papers you get with your mortgage. Reading the terms, the form, and the papers is supposed to equip you to choose your purchase, your treatment, and your loan well. More Than You Wanted to Know surveys the evidence and finds that mandated disclosure rarely works. But how could it? Who reads these disclosures? Who understands them? Who uses them to make better choices?
"Omri Ben-Shahar and Carl Schneider put the regulatory problem in human terms. Most people find disclosures complex, obscure, and dull. Most people make choices by stripping information away, not layering it on. Most people find they can safely ignore most disclosures and that they lack the literacy to analyze them anyway. And so many disclosures are mandated that nobody could heed them all. Nor can all this be changed by simpler forms in plainer English, since complex things cannot be made simple by better writing. Furthermore, disclosure is a lawmakers' panacea, so they keep issuing new mandates and expanding old ones, often instead of taking on the hard work of writing regulations with bite."

--- On the one hand, I have long thought that the notion of "an informed consumer" goes against the whole point of having a market economy. On the other hand, endorsement by Richard Posner is often a sign of something nefarious.
to:NB  books:noted  regulation  law  institutions  decision-making  to_be_shot_after_a_fair_trial  re:democratic_cognition  our_decrepit_institutions 
5 days ago
Teitelbaum, M.S.: Falling Behind? Boom, Bust, and the Global Race for Scientific Talent. (eBook and Hardcover)
"Is the United States falling behind in the global race for scientific and engineering talent? Are U.S. employers facing shortages of the skilled workers that they need to compete in a globalized world? Such claims from some employers and educators have been widely embraced by mainstream media and political leaders, and have figured prominently in recent policy debates about education, federal expenditures, tax policy, and immigration. Falling Behind? offers careful examinations of the existing evidence and of its use by those involved in these debates.
"These concerns are by no means a recent phenomenon. Examining historical precedent, Michael Teitelbaum highlights five episodes of alarm about "falling behind" that go back nearly seventy years to the end of World War II. In each of these episodes the political system responded by rapidly expanding the supply of scientists and engineers, but only a few years later political enthusiasm or economic demand waned. Booms turned to busts, leaving many of those who had been encouraged to pursue science and engineering careers facing disheartening career prospects. Their experiences deterred younger and equally talented students from following in their footsteps--thereby sowing the seeds of the next cycle of alarm, boom, and bust."
to:NB  books:noted  science_policy  american_history  economic_history 
5 days ago
PowellsBooks.Blog – Jen Van Meter: The Powells.com Interview - Powell's Books
"When I started graduate school, I was thrown into the teaching pool pretty quickly. I wasn't much older than many of my students, I was very shy about public speaking, and I didn't feel a great claim to much authority up in front of the room at first — I was intimidated. At the time, the role-playing game we were spending the most time with was White Wolf's Vampire: The Masquerade; my character was all tough and mean and wore leather and got into bar fights and wasn't intimidated by anybody. So on teaching days, I dressed like her. I didn't tell anyone, but for about a year, I worked up the nerve to go teach by secretly cosplaying this fictional character."

--- _Hopeless Savages_ is terrific.
funny:geeky  nerdworld  comics 
6 days ago
Struck, P.T.: Birth of the Symbol: Ancient Readers at the Limits of Their Texts. (eBook, Paperback and Hardcover)
"Nearly all of us have studied poetry and been taught to look for the symbolic as well as literal meaning of the text. Is this the way the ancients saw poetry? In Birth of the Symbol, Peter Struck explores the ancient Greek literary critics and theorists who invented the idea of the poetic "symbol."
"The book notes that Aristotle and his followers did not discuss the use of poetic symbolism. Rather, a different group of Greek thinkers--the allegorists--were the first to develop the notion. Struck extensively revisits the work of the great allegorists, which has been underappreciated. He links their interest in symbolism to the importance of divination and magic in ancient times, and he demonstrates how important symbolism became when they thought about religion and philosophy. "They see the whole of great poetic language as deeply figurative," he writes, "with the potential always, even in the most mundane details, to be freighted with hidden messages.""
to:NB  books:noted  ancient_history  literary_history  literary_criticism  neo-platonism 
6 days ago
Israel, J.: Revolutionary Ideas: An Intellectual History of the French Revolution from The Rights of Man to Robespierre. (eBook and Hardcover)
"Historians of the French Revolution used to take for granted what was also obvious to its contemporary observers--that the Revolution was caused by the radical ideas of the Enlightenment. Yet in recent decades scholars have argued that the Revolution was brought about by social forces, politics, economics, or culture--almost anything but abstract notions like liberty or equality. In Revolutionary Ideas, one of the world's leading historians of the Enlightenment restores the Revolution's intellectual history to its rightful central role. Drawing widely on primary sources, Jonathan Israel shows how the Revolution was set in motion by radical eighteenth-century doctrines, how these ideas divided revolutionary leaders into vehemently opposed ideological blocs, and how these clashes drove the turning points of the Revolution.
"Revolutionary Ideas demonstrates that the Revolution was really three different revolutions vying for supremacy--a conflict between constitutional monarchists such as Lafayette who advocated moderate Enlightenment ideas; democratic republicans allied to Tom Paine who fought for Radical Enlightenment ideas; and authoritarian populists, such as Robespierre, who violently rejected key Enlightenment ideas and should ultimately be seen as Counter-Enlightenment figures. The book tells how the fierce rivalry between these groups shaped the course of the Revolution, from the Declaration of Rights, through liberal monarchism and democratic republicanism, to the Terror and the Post-Thermidor reaction.
"In this compelling account, the French Revolution stands once again as a culmination of the emancipatory and democratic ideals of the Enlightenment. That it ended in the Terror represented a betrayal of those ideas--not their fulfillment."

--- Turning _Robespierre_ into a _counter_-Enlightenment figure would seem to call for great dexterity.
to:NB  books:noted  history_of_ideas  enlightenment  french_revolution  israel.jonathan 
6 days ago
Normal acquisition of expertise with greebles in two cases of acquired prosopagnosia
"Face recognition is generally thought to rely on different neurocognitive mechanisms than most types of objects, but the specificity of these mechanisms is debated. One account suggests the mechanisms are specific to upright faces, whereas the expertise view proposes the mechanisms operate on objects of high within-class similarity with which an observer has become proficient at rapid individuation. Much of the evidence cited in support of the expertise view comes from laboratory-based training experiments involving computer-generated objects called greebles that are designed to place face-like demands on recognition mechanisms. A fundamental prediction of the expertise hypothesis is that recognition deficits with faces will be accompanied by deficits with objects of expertise. Here we present two cases of acquired prosopagnosia, Herschel and Florence, who violate this prediction: Both show normal performance in a standard greeble training procedure, along with severe deficits on a matched face training procedure. Herschel and Florence also meet several response time criteria that advocates of the expertise view suggest signal successful acquisition of greeble expertise. Furthermore, Herschel’s results show that greeble learning can occur without normal functioning of the right fusiform face area, an area proposed to mediate greeble expertise. The marked dissociation between face and greeble expertise undermines greeble-based claims challenging face-specificity and indicates face recognition mechanisms are not necessary for object recognition after laboratory-based training."
to:NB  perception  categorization  neuropsychology  to_read 
6 days ago
Knowledge discovery by accuracy maximization
"Here we describe KODAMA (knowledge discovery by accuracy maximization), an unsupervised and semisupervised learning algorithm that performs feature extraction from noisy and high-dimensional data. Unlike other data mining methods, the peculiarity of KODAMA is that it is driven by an integrated procedure of cross-validation of the results. The discovery of a local manifold’s topology is led by a classifier through a Monte Carlo procedure of maximization of cross-validated predictive accuracy. Briefly, our approach differs from previous methods in that it has an integrated procedure of validation of the results. In this way, the method ensures the highest robustness of the obtained solution. This robustness is demonstrated on experimental datasets of gene expression and metabolomics, where KODAMA compares favorably with other existing feature extraction methods. KODAMA is then applied to an astronomical dataset, revealing unexpected features. Interesting and not easily predictable features are also found in the analysis of the State of the Union speeches by American presidents: KODAMA reveals an abrupt linguistic transition sharply separating all post-Reagan from all pre-Reagan speeches. The transition occurs during Reagan’s presidency and not from its beginning."

--- I suspect that only in one of the glossies could one get away with claiming that cross-validation was a radical new departure for data mining.
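--- The core move — treat cross-validated predictive accuracy as the objective and climb it — is easy to illustrate. Here is a minimal sketch of that idea only, not the KODAMA algorithm itself: greedy label flips accepted only when they raise leave-one-out 1-NN accuracy. The data, the 1-NN classifier, and the flip proposals are all my assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy data: two well-separated Gaussian blobs in 5 dimensions
X = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(4, 1, (20, 5))])

def loo_1nn_accuracy(X, y):
    """Leave-one-out 1-nearest-neighbour accuracy of the labeling y."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)        # a point may not be its own neighbour
    return float(np.mean(y[D.argmin(axis=1)] == y))

# start from a random labeling, then greedily accept label flips that
# raise the cross-validated accuracy -- it can never decrease
y = rng.integers(0, 2, len(X))
acc_start = loo_1nn_accuracy(X, y)
for _ in range(5):
    for i in range(len(X)):
        proposal = y.copy()
        proposal[i] = 1 - proposal[i]
        if loo_1nn_accuracy(X, proposal) > loo_1nn_accuracy(X, y):
            y = proposal
acc_end = loo_1nn_accuracy(X, y)
```

The actual method wraps a Monte Carlo version of this search inside a feature-extraction step; the point of the sketch is just that "validation of the results" here means the CV score *is* the thing being maximized.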
to:NB  manifold_learning  data_mining  computational_statistics  statistics  to_be_shot_after_a_fair_trial 
7 days ago
Mathematical approaches to modeling development and reprogramming
"Induced pluripotent stem cells (iPSCs) are created by the reprogramming of somatic cells via overexpression of certain transcription factors, such as the originally described Yamanaka factors: Oct4, Sox2, Klf4, and c-Myc (OSKM). Here we discuss recent advancements in iPSC reprogramming and introduce mathematical approaches to help map the landscape between cell states during reprogramming. Our modelization indicates that OSKM expression diminishes and/or changes potential barriers between cell states and that epigenetic remodeling facilitate these transitions. From a practical perspective, the modeling approaches outlined here allow us to predict the time necessary to create a given number of iPSC colonies or the number of reprogrammed cells generated in a given time. Additional investigations will help to further refine modeling strategies, rendering them applicable toward the study of the development and stability of cancer cells or even other reprogramming processes such as lineage conversion. Ultimately, a quantitative understanding of cell state transitions might facilitate the establishment of regenerative medicine strategies and enhance the translation of reprogramming technologies into the clinic."
to:NB  molecular_biology  developmental_biology  modeling 
7 days ago
A Book That Needed To Be Written | The Baseline Scenario
How does it compare to _No One Makes You Shop At Walmart_?
economics  debunking  via:jbdelong 
7 days ago
Mike Konczal for Democracy Journal: The Voluntarism Fantasy
I suspect the idea of policing the morals of those receiving private charity is widely seen on the right as a feature, not a bug.
charity  welfare_state  political_economy  institutions  american_history  public_policy  konczal.mike  have_read 
7 days ago
[1404.0431] Learning Latent Block Structure in Weighted Networks
"Community detection is an important task in network analysis, in which we aim to learn a network partition that groups together vertices with similar community-level connectivity patterns. By finding such groups of vertices with similar structural roles, we extract a compact representation of the network's large-scale structure, which can facilitate its scientific interpretation and the prediction of unknown or future interactions. Popular approaches, including the stochastic block model, assume edges are unweighted, which limits their utility by throwing away potentially useful information. We introduce the `weighted stochastic block model' (WSBM), which generalizes the stochastic block model to networks with edge weights drawn from any exponential family distribution. This model learns from both the presence and weight of edges, allowing it to discover structure that would otherwise be hidden when weights are discarded or thresholded. We describe a Bayesian variational algorithm for efficiently approximating this model's posterior distribution over latent block structures. We then evaluate the WSBM's performance on both edge-existence and edge-weight prediction tasks for a set of real-world weighted networks. In all cases, the WSBM performs as well or better than the best alternatives on these tasks."
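--- A toy illustration of why keeping the weights matters: with exponential-family (here Gaussian, my choice — the paper allows any exponential family) edge weights, the weight likelihood alone distinguishes the planted partition from a wrong one, even on a fully connected graph where edge presence is uninformative. Names and numbers below are my assumptions, not the paper's notation.

```python
import numpy as np

rng = np.random.default_rng(1)
n, K = 12, 2
z = np.repeat([0, 1], n // 2)              # planted block labels
mu = np.array([[2.0, 0.0], [0.0, 2.0]])    # block-pair mean edge weights

# fully weighted "adjacency": unit-variance Gaussian weight on every pair
W = rng.normal(mu[z][:, z], 1.0)

def loglik(W, z, mu):
    """Gaussian-weight block-model log-likelihood (up to a constant)."""
    M = mu[z][:, z]                        # expected weight for every pair
    return -0.5 * np.sum((W - M) ** 2)

# the edge weights alone favour the planted partition over a mismatched one
z_wrong = np.tile([0, 1], n // 2)          # deterministic wrong labeling
assert loglik(W, z, mu) > loglik(W, z_wrong, mu)
```

Thresholding W to 0/1 edges would destroy exactly the signal this likelihood is picking up.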
to:NB  network_data_analysis  community_discovery  kith_and_kin  jacobs.abigail_z.  clauset.aaron 
8 days ago
[1404.1239] Towards a Multi-Subject Analysis of Neural Connectivity
"Directed acyclic graphs (DAGs) and associated probability models are widely used to model neural connectivity and communication channels. In many experiments, data are collected from multiple subjects whose DAGs may differ but are likely to share many features. The first exact algorithm for estimation of multiple related DAGs was recently proposed by Oates et al. (2014); in this letter we present examples and discuss implications of the methodology as applied to the analysis of fMRI data from a multi-subject experiment. Elicitation of hyperparameters requires care and we illustrate how this may proceed retrospectively based on technical replicate data. In addition to joint learning of subject-specific DAGs, we simultaneously estimate relationships between the subjects themselves. A special case of the methodology provides a novel analogue of k-means clustering of subjects based on their DAG structure. It is anticipated that the exact algorithms discussed here will be widely applicable within neuroscience."
to:NB  graphical_models  model_discovery  neuroscience  functional_connectivity  statistics  re:network_differences 
8 days ago
[1404.1361] Nonparametric Compressive Graphical Model Selection for Vector-Valued Stationary Random Processes: A Multitask Learning Approach
"We propose a method for inferring the conditional independence graph (CIG) of a high-dimensional Gaussian time series (discrete time process) from a finite-length observation. By contrast to existing approaches, we do not rely on a parametric process model (such as, e.g., an autoregressive model) for the observed random process. Instead, we only require certain smoothness properties (in the Fourier domain) of the process. The proposed inference scheme is compressive in that it works even for sample sizes much smaller than the number of scalar process components. A theoretical performance analysis provides conditions which guarantee that the probability of the proposed inference method delivering a wrong CIG is below a prescribed value. This analysis reveals conditions for the new method to be consistent asymptotically. Some numerical experiments validate our theoretical performance analysis and demonstrate superior performance of our scheme compared to existing approaches in the case of model mismatch."
to:NB  model_discovery  graphical_models  time_series  nonparametrics  statistics 
8 days ago
Regret bounded by gradual variation for online convex optimization - Machine Learning
"Recently, it has been shown that the regret of the Follow the Regularized Leader (FTRL) algorithm for online linear optimization can be bounded by the total variation of the cost vectors rather than the number of rounds. In this paper, we extend this result to general online convex optimization. In particular, this resolves an open problem that has been posed in a number of recent papers. We first analyze the limitations of the FTRL algorithm as proposed by Hazan and Kale (in Machine Learning 80(2–3), 165–188, 2010) when applied to online convex optimization, and extend the definition of variation to a gradual variation which is shown to be a lower bound of the total variation. We then present two novel algorithms that bound the regret by the gradual variation of cost functions. Unlike previous approaches that maintain a single sequence of solutions, the proposed algorithms maintain two sequences of solutions that make it possible to achieve a variation-based regret bound for online convex optimization.
"To establish the main results, we discuss a lower bound for FTRL that maintains only one sequence of solutions, and a necessary condition on smoothness of the cost functions for obtaining a gradual variation bound. We extend the main results three-fold: (i) we present a general method to obtain a gradual variation bound measured by general norm; (ii) we extend algorithms to a class of online non-smooth optimization with gradual variation bound; and (iii) we develop a deterministic algorithm for online bandit optimization in multipoint bandit setting."
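--- The qualitative point — regret tracks how much the cost sequence varies, not the horizon — shows up already for plain projected online gradient descent on quadratics. OGD here stands in for the paper's FTRL-style algorithms; the losses, step size, and domain are my assumptions for illustration.

```python
import numpy as np

def ogd_loss(costs, eta=0.1):
    """Total loss of projected online gradient descent on f_t(x) = (x - c_t)^2."""
    x, total = 0.0, 0.0
    for c in costs:
        total += (x - c) ** 2
        x = np.clip(x - eta * 2.0 * (x - c), -1.0, 1.0)  # gradient step + projection
    return total

def regret(costs):
    """Regret against the best fixed point in [-1, 1] (the clipped mean)."""
    best = np.clip(np.mean(costs), -1.0, 1.0)
    return ogd_loss(costs) - np.sum((best - costs) ** 2)

T = 200
still = np.full(T, 0.5)                  # zero-variation cost sequence
jumpy = 0.5 * (-1.0) ** np.arange(T)     # maximal-variation cost sequence

# regret stays O(1) when the costs do not vary, and grows when they do
assert regret(still) < 1.0 < regret(jumpy)
```

With constant costs the learner converges geometrically and accumulates only O(1) regret, while the alternating sequence forces regret proportional to its variation — the distinction the gradual-variation bounds formalize.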
to:NB  to_read  learning_theory  low-regret_learning  individual_sequence_prediction  optimization  re:growing_ensemble_project 
9 days ago
Five Red Flags that Preceded Neo & Bee's Collapse
"The story is so fantastic that if I made it up, editors would reject my novel because it'd be too unbelievable."
funny:malicious  bitcoin  fraud 
9 days ago
Scale-free power-laws as interaction between progress and diffusion - Hilbert - 2013 - Complexity - Wiley Online Library
"While scale-free power-laws are frequently found in social and technological systems, their authenticity, origin, and gained insights are often questioned, and rightfully so. The article presents a newly found rank-frequency power-law that aligns the top-500 supercomputers according to their performance. Pursuing a cautious approach in a systematic way, we check for authenticity, evaluate several potential generative mechanisms, and ask the “so what” question. We evaluate and finally reject the applicability of well-known potential generative mechanisms such as preferential attachment, self-organized criticality, optimization, and random observation. Instead, the microdata suggest that an inverse relationship between exponential technological progress and exponential technology diffusion through social networks results in the identified fat-tail distribution. This newly identified generative mechanism suggests that the supply and demand of technology (“technology push” and “demand pull”) align in exponential synchronicity, providing predictive insights into the evolution of highly uncertain technology markets."
to:NB  heavy_tails  computers  diffusion_of_innovations 
12 days ago
Message-Passing Algorithms for Sparse Network Alignment
"Network alignment generalizes and unifies several approaches for forming a matching or alignment between the vertices of two graphs. We study a mathematical programming framework for the network alignment problem and a sparse variation of it where only a small number of matches between the vertices of the two graphs are possible. We propose a new message passing algorithm that allows us to compute, very efficiently, approximate solutions to the sparse network alignment problems with graph sizes as large as hundreds of thousands of vertices. We also provide extensive simulations comparing our algorithms with two of the best solvers for network alignment problems on two synthetic matching problems, two bioinformatics problems, and three large ontology alignment problems including a multilingual problem with a known labeled alignment."

- Ungated version: https://www.cs.purdue.edu/homes/dgleich/publications/Bayati%202013%20-%20sparse%20belief%20propagation.pdf
in_NB  to_read  network_data_analysis  graph_theory  network_alignment  re:network_differences  entableted 
12 days ago
AER (104,4) p. 1120 - Vertical Integration and Input Flows
"We use broad-based yet detailed data from the economy's goods-producing sectors to investigate firms' ownership of production chains. It does not appear that vertical ownership is primarily used to facilitate transfers of goods along the production chain, as is often presumed: roughly one-half of upstream establishments report no shipments to downstream establishments within the same firm. We propose an alternative explanation for vertical ownership, namely that it promotes efficient intrafirm transfers of intangible inputs. We show evidence consistent with this hypothesis, including the fact that, after a change of ownership, an acquired establishment begins to resemble the acquiring firm along multiple dimensions."
to:NB  economics  industrial_organization 
12 days ago
AER (104,4) p. 1211 - Spatial Development
"We present a theory of spatial development. Manufacturing and services firms located in a continuous geographic area choose each period how much to innovate. Firms trade subject to transport costs and technology diffuses spatially. We apply the model to study the evolution of the US economy in the last half-century and find that it can generate the reduction in the manufacturing employment share, the increased spatial concentration of services, the growth in service productivity starting in the mid-1990s, the rise in the dispersion of land rents in the same period, as well as several other spatial and temporal patterns."
to:NB  economics  development_economics  economic_geography 
12 days ago
[1403.6585] Moment Conditions for Convergence of Particle Filters with Unbounded Importance Weights
"In this paper, we derive moment conditions for particle filter importance weights, which ensure the mean square and almost sure convergence of particle filter estimates even when the importance weights are unbounded. The result extends the previously derived conditions by not requiring the boundedness of weights, but only finite second or fourth order moments. We show that the boundedness of the second order moments of the weights implies the convergence of the estimates of bounded functions in the mean square sense, and the L4 convergence as well as the almost sure convergence are assured by the boundedness of the fourth order moments of the weights. We also present an example class of models and importance distributions where the moment conditions hold, but the boundedness does not. The unboundedness in these models is caused by isolated singularities in the weights which still leave the weight moments bounded. We show by using simulated data that the particle filter for this kind of model also performs well in practice."
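--- In generic importance-sampling notation (these symbols are mine, not necessarily the paper's), the flavor of the result is: for weight function $w = \mathrm{d}\pi/\mathrm{d}q$ and a bounded test function $\varphi$ with $N$-particle estimate $\hat{\varphi}_N$ of $\mathbb{E}_\pi[\varphi]$,

```latex
% second-moment condition => mean-square convergence at the standard rate
\mathbb{E}_q\!\left[ w(X)^2 \right] < \infty
\;\Longrightarrow\;
\mathbb{E}\!\left[ \bigl( \hat{\varphi}_N - \mathbb{E}_\pi[\varphi] \bigr)^2 \right]
\le \frac{C_\varphi}{N},
```

while a finite fourth moment $\mathbb{E}_q[w(X)^4] < \infty$ upgrades this to $L^4$ and, via a Borel-Cantelli argument, almost sure convergence — even when $w$ itself is unbounded, e.g. because of isolated singularities.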
to:NB  particle_filters  statistics  state_estimation  re:amplification_sampling 
12 days ago
[1403.6804] A simple modification for improving inference of non-linear dynamical systems
"Particle and ensemble filters are increasingly utilized for inference, optimization, and forecast; however, both filtering methods use discrete distributions to simulate continuous state space, a drawback that can lead to degraded performance for non-linear dynamical systems. Here we propose a simple modification, applicable to both particle and ensemble filters, that compensates for this problem. The method randomly replaces one or more model variables or parameters within a fraction of simulated trajectories at each filtering cycle. This modification, termed space re-probing, expands the state space covered by the filter through the introduction of outlying trajectories. We apply the space re-probing modification to three particle filters and three ensemble filters, and use these modified filters to model and forecast influenza epidemics. For both filter types, the space re-probing improves simulation of influenza epidemic curves and the prediction of influenza outbreak peak timing. Further, as fewer particles are needed for the particle filters, the proposed modification reduces the computational cost of these filters."
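--- The modification as described is simple enough to sketch on a generic bootstrap particle filter. The linear-Gaussian toy model, particle counts, and the "wide prior" used for the replacement draws below are all my stand-ins, not the authors' influenza model:

```python
import numpy as np

rng = np.random.default_rng(2)
T, N = 50, 500

# toy linear-Gaussian state-space model (stand-in for an epidemic model)
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + rng.normal()
y = x_true + rng.normal(0.0, 0.5, T)

def bootstrap_filter(reprobe_frac=0.05):
    """Bootstrap particle filter with the 'space re-probing' modification."""
    x = rng.normal(0.0, 1.0, N)
    means = np.empty(T)
    for t in range(T):
        x = 0.9 * x + rng.normal(0.0, 1.0, N)        # propagate
        if reprobe_frac > 0:
            k = int(reprobe_frac * N)                 # re-probing: overwrite a few
            idx = rng.choice(N, k, replace=False)     # particles with wide draws,
            x[idx] = rng.normal(0.0, 5.0, k)          # adding outlying trajectories
        w = np.exp(-0.5 * ((y[t] - x) / 0.5) ** 2)   # weight by the observation
        w /= w.sum()
        means[t] = np.sum(w * x)
        x = x[rng.choice(N, N, p=w)]                  # multinomial resampling
    return means

rmse = np.sqrt(np.mean((bootstrap_filter() - x_true) ** 2))
```

The re-probed particles mostly get negligible weight and cost little, but they keep the discrete particle cloud from collapsing onto a region the true state has left — the failure mode the paper targets.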
to:NB  particle_filters  state-space_models  state_estimation  statistical_inference_for_stochastic_processes  statistics  re:amplification_sampling 
12 days ago
[1403.7118] A Unified Framework of Constrained Regression
"Generalized additive models (GAMs) play an important role in modeling and understanding complex relationships in modern applied statistics. They allow for flexible, data-driven estimation of covariate effects. Yet researchers often have a priori knowledge of certain effects, which might be monotonic or periodic (cyclic) or should fulfill boundary conditions. We propose a unified framework to incorporate these constraints for both univariate and bivariate effect estimates and for varying coefficients. As the framework is based on (functional gradient descent) boosting methods, variables can be selected intrinsically, and effects can be estimated for a wide range of different distributional assumptions. We present three case studies from environmental sciences. The first on air pollution illustrates the use of monotonic and periodic effects in the context of an additive Poisson model. The second case study highlights the use of bivariate cyclic splines to model activity profiles of roe deer. The third case study demonstrates how to estimate the complete conditional distribution function of deer-vehicle collisions with the help of monotonicity constraints, and a cyclic constraint is considered for the seasonal variation of collision numbers. All discussed constrained effect estimates are implemented in the comprehensive R package mboost for model-based boosting."
to:NB  additive_models  regression  nonparametrics  statistics  to_read  boosting  splines 
12 days ago
[1403.7063] A Significance Test for Covariates in Nonparametric Regression
"We consider testing the significance of a subset of covariates in a nonparametric regression. These covariates can be continuous and/or discrete. We propose a new kernel-based test that smoothes only over the covariates appearing under the null hypothesis, so that the curse of dimensionality is mitigated. The test statistic is asymptotically pivotal and the rate of which the test detects local alternatives depends only on the dimension of the covariates under the null hypothesis. We show the validity of wild bootstrap for the test. In small samples, our test is competitive compared to existing procedures."
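--- For the flavor of a wild-bootstrap significance test, here is a deliberately simplified linear-model version (the paper's test is kernel-based and nonparametric; the linear fits, the RSS-drop statistic, and all variable names below are my assumptions). Residuals are refit under the null, sign-flipped with Rademacher weights, and the observed statistic is compared to the bootstrap distribution:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 2.0 * x2 + rng.normal(size=n)   # x2 genuinely matters

Z0 = np.column_stack([np.ones(n), x1])    # null model: x2 excluded
Z1 = np.column_stack([Z0, x2])            # full model

def fit(Z, y):
    return np.linalg.lstsq(Z, y, rcond=None)[0]

def stat(y):
    """Drop in residual sum of squares from adding x2."""
    r0 = y - Z0 @ fit(Z0, y)
    r1 = y - Z1 @ fit(Z1, y)
    return r0 @ r0 - r1 @ r1

t_obs = stat(y)
yhat0 = Z0 @ fit(Z0, y)                   # fitted values under the null
r0 = y - yhat0
# wild bootstrap: perturb null residuals with random Rademacher signs
t_boot = np.array([stat(yhat0 + r0 * rng.choice([-1.0, 1.0], n))
                   for _ in range(200)])
pval = np.mean(t_boot >= t_obs)           # small p-value: x2 is significant
```

The wild-bootstrap trick — multiplying each residual by its own random sign — preserves heteroskedasticity in the resamples, which is why it is the natural companion to nonparametric tests like this one.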
to:NB  variable_selection  hypothesis_testing  statistics  nonparametrics  regression  to_teach:undergrad-ADA 
12 days ago
[1403.7001] Spaghetti prediction: A robust method for forecasting short time series
"A novel method for predicting time series is described and demonstrated. This method inputs time series data points and outputs multiple "spaghetti" functions from which predictions can be made. Spaghetti prediction has desirable properties that are not realized by classic autoregression, moving average, spline, Gaussian process, and other methods. It is particularly appropriate for short time series because it allows asymmetric prediction distributions and produces prediction functions which are robust in that they use multiple independent models."
to:NB  to_read  prediction  time_series  statistics  to_be_shot_after_a_fair_trial 
12 days ago
[1403.5787] Scalable detection of statistically significant communities and hierarchies: message-passing for modularity
"Modularity is a popular measure of community structure. However, maximizing the modularity can lead to many competing partitions with almost the same modularity that are poorly correlated to each other; it can also overfit, producing illusory "communities" in random graphs where none exist. We address this problem by using the modularity as a Hamiltonian, and computing the marginals of the resulting Gibbs distribution. If we assign each node to its most-likely community under these marginals, we claim that, unlike the ground state, the resulting partition is a good measure of statistically-significant community structure.
"We propose an efficient Belief Propagation (BP) algorithm to compute these marginals. In random networks with no true communities, the system has two phases as we vary the temperature: a paramagnetic phase where all marginals are equal, and a spin glass phase where BP fails to converge. In networks with real community structure, there is an additional retrieval phase where BP converges, and where the marginals are strongly correlated with the underlying communities. We show analytically and numerically that the proposed algorithm works all the way down to the detectability transition in networks generated by the stochastic block model. We also show that our algorithm performs well on real-world networks, revealing large communities in some networks where previous work has claimed no communities exist. Finally we show that by applying our algorithm recursively, subdividing communities until no statistically-significant subcommunities can be found, we can detect hierarchical structure in real-world networks more efficiently than previous methods. Our algorithm is highly scalable, working in time nearly linear in the number of edges: for networks with 10^5 nodes and 10^6 edges, for instance, it takes 14 seconds to find community structure."
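--- The Hamiltonian in question is just (minus) Newman's modularity; for concreteness, here is that score on a toy graph, showing what the marginals are being computed over. The graph and labelings are my own small example:

```python
import numpy as np

def modularity(A, z):
    """Newman modularity of partition z for adjacency matrix A."""
    k = A.sum(axis=1)
    two_m = A.sum()
    B = A - np.outer(k, k) / two_m        # modularity matrix
    return (B * (z[:, None] == z[None, :])).sum() / two_m

# two 4-cliques joined by a single bridge edge
A = np.zeros((8, 8))
A[:4, :4] = 1.0
A[4:, 4:] = 1.0
np.fill_diagonal(A, 0.0)
A[3, 4] = A[4, 3] = 1.0

z_planted = np.repeat([0, 1], 4)          # one community per clique
z_scrambled = np.tile([0, 1], 4)          # parity labeling, ignores structure
assert modularity(A, z_planted) > modularity(A, z_scrambled)
```

The paper's point is that rather than reporting the single partition maximizing this score, one should Boltzmann-weight all partitions by it and read off per-node marginals — which is what distinguishes real structure from the high-modularity flukes found in random graphs.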
to:NB  to_read  heard_the_talk  community_discovery  kith_and_kin  moore.cristopher  network_data_analysis  statistics  computational_statistics 
12 days ago
[1404.0300] Followers Are Not Enough: Beyond Structural Communities in Online Social Networks
"Community detection in online social networks is typically based on the analysis of the explicit connections between users, such as "friends" on Facebook and "followers" on Twitter. But online users often have hundreds or even thousands of such connections, and many of these connections do not correspond to real friendships or more generally to accounts that users interact with. We claim that community detection in online social networks should be question-oriented and rely on additional information beyond the simple structure of the network. The concept of 'community' is very general, and different questions such as "who do we interact with?" and "with whom do we share similar interests?" can lead to the discovery of different social groups. In this paper we focus on three types of communities beyond structural communities: activity-based, topic-based, and interaction-based. We analyze a Twitter dataset using three different weightings of the structural network meant to highlight these three community types, and then infer the communities associated with these weightings. We show that the communities obtained in the three weighted cases are highly different from each other, and from the communities obtained by considering only the unweighted structural network. Our results confirm that asking a precise question is an unavoidable first step in community detection in online social networks, and that different questions can lead to different insights into the network under study."
to:NB  social_networks  social_media  community_discovery  network_data_analysis  to_read  entableted 
12 days ago
[1404.0067] Topics in social network analysis and network science
"This chapter introduces statistical methods used in the analysis of social networks and in the rapidly evolving parallel-field of network science. Although several instances of social network analysis in health services research have appeared recently, the majority involve only the most basic methods and thus scratch the surface of what might be accomplished. Cutting-edge methods using relevant examples and illustrations in health services research are provided."

--- From a quick skim, they rather over-value what can be accomplished by rejecting an internally-inconsistent model...
to:NB  to_read  have_skimmed  social_networks  social_influence  network_data_analysis  re:homophily_and_confounding  enta 
12 days ago
So You Think You're Smarter Than A CIA Agent : Parallels : NPR
Note however the presumably-expert background work of determining what questions to put to the panel...
collective_cognition  intelligence  re:democratic_cognition  expertise  via:henry_farrell  have_read 
12 days ago
Cline, E.H.: 1177 B.C.: The Year Civilization Collapsed. (eBook and Hardcover)
"In 1177 B.C., marauding groups known only as the "Sea Peoples" invaded Egypt. The pharaoh's army and navy managed to defeat them, but the victory so weakened Egypt that it soon slid into decline, as did most of the surrounding civilizations. After centuries of brilliance, the civilized world of the Bronze Age came to an abrupt and cataclysmic end. Kingdoms fell like dominoes over the course of just a few decades. No more Minoans or Mycenaeans. No more Trojans, Hittites, or Babylonians. The thriving economy and cultures of the late second millennium B.C., which had stretched from Greece to Egypt and Mesopotamia, suddenly ceased to exist, along with writing systems, technology, and monumental architecture. But the Sea Peoples alone could not have caused such widespread breakdown. How did it happen?
"In this major new account of the causes of this "First Dark Ages," Eric Cline tells the gripping story of how the end was brought about by multiple interconnected failures, ranging from invasion and revolt to earthquakes, drought, and the cutting of international trade routes. Bringing to life the vibrant multicultural world of these great civilizations, he draws a sweeping panorama of the empires and globalized peoples of the Late Bronze Age and shows that it was their very interdependence that hastened their dramatic collapse and ushered in a dark age that lasted centuries."
to:NB  books:noted  ancient_history 
13 days ago
Mazur, J.: Enlightening Symbols: A Short History of Mathematical Notation and Its Hidden Powers. (eBook and Hardcover)
"While all of us regularly use basic math symbols such as those for plus, minus, and equals, few of us know that many of these symbols weren't available before the sixteenth century. What did mathematicians rely on for their work before then? And how did mathematical notations evolve into what we know today? In Enlightening Symbols, popular math writer Joseph Mazur explains the fascinating history behind the development of our mathematical notation system. He shows how symbols were used initially, how one symbol replaced another over time, and how written math was conveyed before and after symbols became widely adopted.
"Traversing mathematical history and the foundations of numerals in different cultures, Mazur looks at how historians have disagreed over the origins of the numerical system for the past two centuries. He follows the transfigurations of algebra from a rhetorical style to a symbolic one, demonstrating that most algebra before the sixteenth century was written in prose or in verse employing the written names of numerals. Mazur also investigates the subconscious and psychological effects that mathematical symbols have had on mathematical thought, moods, meaning, communication, and comprehension. He considers how these symbols influence us (through similarity, association, identity, resemblance, and repeated imagery), how they lead to new ideas by subconscious associations, how they make connections between experience and the unknown, and how they contribute to the communication of basic mathematics."
to:NB  books:noted  history_of_mathematics  epidemiology_of_representations 
13 days ago
Cook, M.: Ancient Religions, Modern Politics: The Islamic Case in Comparative Perspective. (eBook and Hardcover)
"Why does Islam play a larger role in contemporary politics than other religions? Is there something about the Islamic heritage that makes Muslims more likely than adherents of other faiths to invoke it in their political life? If so, what is it? Ancient Religions, Modern Politics seeks to answer these questions by examining the roles of Islam, Hinduism, and Christianity in modern political life, placing special emphasis on the relevance--or irrelevance--of their heritages to today's social and political concerns.
"Michael Cook takes an in-depth, comparative look at political identity, social values, attitudes to warfare, views about the role of religion in various cultural domains, and conceptions of the polity. In all these fields he finds that the Islamic heritage offers richer resources for those engaged in current politics than either the Hindu or the Christian heritages. He uses this finding to explain the fact that, despite the existence of Hindu and Christian counterparts to some aspects of Islamism, the phenomenon as a whole is unique in the world today. The book also shows that fundamentalism--in the sense of a determination to return to the original sources of the religion--is politically more adaptive for Muslims than it is for Hindus or Christians."
to:NB  books:noted  islam  islamic_civilization  fundamentalism  comparative_history 
13 days ago
Soames, S.: The Analytic Tradition in Philosophy, Volume 1: The Founding Giants. (eBook and Hardcover)
"This is the first of five volumes of a definitive history of analytic philosophy from the invention of modern logic in 1879 to the end of the twentieth century. Scott Soames, a leading philosopher of language and historian of analytic philosophy, provides the fullest and most detailed account of the analytic tradition yet published, one that is unmatched in its chronological range, topics covered, and depth of treatment. Focusing on the major milestones and distinguishing them from the dead ends, Soames gives a seminal account of where the analytic tradition has been and where it appears to be heading.
"Volume 1 examines the initial phase of the analytic tradition through the major contributions of three of its four founding giants--Gottlob Frege, Bertrand Russell, and G. E. Moore. Soames describes and analyzes their work in logic, the philosophy of mathematics, epistemology, metaphysics, ethics, and the philosophy of language. He explains how by about 1920 their efforts had made logic, language, and mathematics central to philosophy in an unprecedented way. But although logic, language, and mathematics were now seen as powerful tools to attain traditional ends, they did not yet define philosophy. As volume 1 comes to a close, that was all about to change with the advent of the fourth founding giant, Ludwig Wittgenstein, and the 1922 English publication of his Tractatus, which ushered in a "linguistic turn" in philosophy that was to last for decades."
to:NB  books:noted  history_of_ideas  history_of_philosophy  history_of_mathematics  logic  russell.bertrand 
13 days ago
To Wash It All Away
"People think that Web browsers are elegant computation platforms, and Web pages are light, fluffy things that you can edit in Notepad as you trade ironic comments with your friends in the coffee shop. Nothing could be further from the truth. A modern Web page is a catastrophe. It’s like a scene from one of those apocalyptic medieval paintings that depicts what would happen if Galactus arrived: people are tumbling into fiery crevasses and lamenting various lamentable things and hanging from playground equipment that would not pass OSHA safety checks. This kind of stuff is exactly what you’ll see if you look at the HTML, CSS, and JavaScript in a modern Web page. Of course, no human can truly “look” at this content, because a Web page is now like V’Ger from the first “Star Trek” movie, a piece of technology that we once understood but can no longer fathom, a thrashing leviathan of code and markup written by people so untrustworthy that they’re not even third parties, they’re fifth parties who weren’t even INVITED to the party, but who showed up anyways because the hippies got it right and free love or whatever."
funny:geeky  programming  web  via:?  have_read 
13 days ago
A New History of the Humanities - Rens Bod - Oxford University Press
"Many histories of science have been written, but A New History of the Humanities offers the first overarching history of the humanities from Antiquity to the present. There are already historical studies of musicology, logic, art history, linguistics, and historiography, but this volume gathers these, and many other humanities disciplines, into a single coherent account.
"Its central theme is the way in which scholars throughout the ages and in virtually all civilizations have sought to identify patterns in texts, art, music, languages, literature, and the past. What rules can we apply if we wish to determine whether a tale about the past is trustworthy? By what criteria are we to distinguish consonant from dissonant musical intervals? What rules jointly describe all possible grammatical sentences in a language? How can modern digital methods enhance pattern-seeking in the humanities? Rens Bod contends that the hallowed opposition between the sciences (mathematical, experimental, dominated by universal laws) and the humanities (allegedly concerned with unique events and hermeneutic methods) is a mistake born of a myopic failure to appreciate the pattern-seeking that lies at the heart of this inquiry. A New History of the Humanities amounts to a persuasive plea to give Panini, Valla, Bopp, and countless other often overlooked intellectual giants their rightful place next to the likes of Galileo, Newton, and Einstein."
to:NB  history_of_ideas  humanities  books:noted 
14 days ago
Courage in the Democratic Polis - Ryan K. Balot - Oxford University Press
"In this careful and compelling study, Ryan K. Balot brings together political theory, classical history, and ancient philosophy in order to reinterpret courage as a specifically democratic virtue. Ranging from Thucydides and Aristophanes to the Greek tragedians and Plato, Balot shows that the ancient Athenians constructed a novel vision of courage that linked this virtue to fundamental democratic ideals such as freedom, equality, and practical rationality. The Athenian ideology of courage had practical implications for the conduct of war, for gender relations, and for the citizens' self-image as democrats. In revising traditional ideals, Balot argues, the Athenians reimagined the emotional and cognitive motivations for courage in ways that will unsettle and transform our contemporary discourses. Without losing sight of political tensions and practical conflicts, Balot illustrates the merits of the Athenian ideal, provocatively explaining its potential to enlarge our contemporary understandings of politics and ethics. The result is a remarkably interdisciplinary work that has significant implications for the theory and practice of democracy, both ancient and modern."
to:NB  books:noted  athens  ancient_history  history_of_morals  democracy  moral_psychology 
14 days ago
Chameleons: The Misuse of Theoretical Models
"In this essay I discuss how theoretical models in finance and economics are used in ways that make them “chameleons” and how chameleons devalue the intellectual currency and muddy policy debates. A model becomes a chameleon when it is built on assumptions with dubious connections to the real world but nevertheless has conclusions that are uncritically (or not critically enough) applied to understanding our economy. I discuss how chameleons are created and nurtured by the mistaken notion that one should not judge a model by its assumptions, by the unfounded argument that models should have equal standing until definitive empirical tests are conducted, and by misplaced appeals to “as-if” arguments, mathematical elegance, subtlety, references to assumptions that are “standard in the literature,” and the need for tractability."

--- Query: Are the fundamental theorems of welfare economics "chameleons"?
to:NB  social_science_methodology  economics  finance  natural_history_of_truthiness  scholarly_misconstruction_of_reality  ideology  have_read  to:blog 
14 days ago
How to judge a theoretical model | Cyrus Samii
This all ought to be obvious, but I have met many smart economists who disagree with some or all of it.
14 days ago
Taylor & Francis Online :: Principal Flows - Journal of the American Statistical Association - Volume 109, Issue 505
"We revisit the problem of extending the notion of principal component analysis (PCA) to multivariate datasets that satisfy nonlinear constraints, therefore lying on Riemannian manifolds. Our aim is to determine curves on the manifold that retain their canonical interpretability as principal components, while at the same time being flexible enough to capture nongeodesic forms of variation. We introduce the concept of a principal flow, a curve on the manifold passing through the mean of the data, and with the property that, at any point of the curve, the tangent velocity vector attempts to fit the first eigenvector of a tangent space PCA locally at that same point, subject to a smoothness constraint. That is, a particle flowing along the principal flow attempts to move along a path of maximal variation of the data, up to smoothness constraints. The rigorous definition of a principal flow is given by means of a Lagrangian variational problem, and its solution is reduced to an ODE problem via the Euler–Lagrange method. Conditions for existence and uniqueness are provided, and an algorithm is outlined for the numerical solution of the problem. Higher order principal flows are also defined. It is shown that global principal flows yield the usual principal components on a Euclidean space. By means of examples, it is illustrated that the principal flow is able to capture patterns of variation that can escape other manifold PCA methods."
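The defining property — a curve whose tangent at each point fits the first eigenvector of a local tangent-space PCA — is easy to mimic in the flat Euclidean case. The sketch below is my own toy Euler-integration version, not the paper's Lagrangian/ODE construction: start at the data mean and repeatedly step along the leading eigenvector of a Gaussian-kernel-weighted covariance. On data scattered around a line, the flow should track the ordinary first principal component, consistent with the abstract's remark that global principal flows reduce to PCA in Euclidean space:

```python
import math
import random

random.seed(1)
# Synthetic 2-D data scattered around the line y = x/2 (a trivially flat "manifold").
pts = [(x, 0.5 * x + random.gauss(0, 0.05))
       for x in [i / 50.0 - 1.0 for i in range(101)]]

def local_pc1(p, h):
    """Leading eigenvector of the Gaussian-kernel-weighted covariance at point p."""
    ws, wsum, mx, my = [], 0.0, 0.0, 0.0
    for x, y in pts:
        w = math.exp(-((x - p[0]) ** 2 + (y - p[1]) ** 2) / (2 * h * h))
        ws.append(w); wsum += w; mx += w * x; my += w * y
    mx, my = mx / wsum, my / wsum
    sxx = sxy = syy = 0.0
    for w, (x, y) in zip(ws, pts):
        dx, dy = x - mx, y - my
        sxx += w * dx * dx; sxy += w * dx * dy; syy += w * dy * dy
    # Principal-axis angle of the 2x2 weighted covariance [[sxx, sxy], [sxy, syy]].
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
    return math.cos(theta), math.sin(theta)

def principal_flow(start, step, n_steps, h=0.3):
    """Crude Euler integration: at each point, step along the local first PC."""
    path, p, prev = [start], start, None
    for _ in range(n_steps):
        v = local_pc1(p, h)
        if prev and v[0] * prev[0] + v[1] * prev[1] < 0:
            v = (-v[0], -v[1])    # keep the tangent field consistently oriented
        p = (p[0] + step * v[0], p[1] + step * v[1])
        path.append(p)
        prev = v
    return path

mean = (sum(x for x, _ in pts) / len(pts), sum(y for _, y in pts) / len(pts))
path = principal_flow(mean, step=0.1, n_steps=8)
```

The interesting regime in the paper is when the data live on a curved manifold and the local eigenvector field bends away from any geodesic; there the variational formulation (and the smoothness penalty) does real work that this naive integrator ignores.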
to:NB  dimension_reduction  principal_components  manifold_learning  geometry  statistics 
14 days ago
