**models-and-modes** (126)

Church vs Curry Types - LispCast

3 days ago by Vaguery

My ambitious hope is that this perspective will quiet a lot of the fighting as people recognize that they are just perpetuating a rift in the field of mathematics that happened a long time ago. The perspectives are irreconcilable now, but that could change. A paper called Church and Curry: Combining Intrinsic and Extrinsic Typing builds a language with both kinds of types. And Gradual Typing and Blame Calculus are investigating the intersection of static and dynamic typing. Let’s stop fighting, make some cool tools and use them well.
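To make the Curry-side ("extrinsic") view concrete: the program exists untyped, and a type discipline is layered on from outside, with blame assigned when a contract fails. The sketch below is a toy in that spirit, with invented names; it is not the calculus from the cited papers, just a minimal illustration of extrinsic typing plus blame.

```python
# A toy illustration of extrinsic (Curry-style) typing with blame:
# the function itself is untyped; a separate wrapper imposes a type
# contract and names the guilty party when it is violated.

def typed(arg_type, ret_type):
    """Wrap an untyped function with a runtime type contract."""
    def wrap(f):
        def checked(x):
            if not isinstance(x, arg_type):
                raise TypeError(f"blame caller: expected {arg_type.__name__}")
            result = f(x)
            if not isinstance(result, ret_type):
                raise TypeError(f"blame {f.__name__}: expected {ret_type.__name__}")
            return result
        return checked
    return wrap

@typed(int, int)
def double(x):
    return x + x

print(double(21))        # 42
try:
    double("oops")
except TypeError as e:
    print(e)             # blame falls on the caller
```

Unannotated code keeps running as-is; only calls that cross the contract boundary are checked, which is the gradual-typing bargain in miniature.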

type-theory
computer-science
models-and-modes
dichotomy-or-not?
to-write-about
3 days ago by Vaguery

Opinion | Corporate America Is Suppressing Wages for Many Workers - The New York Times

3 days ago by Vaguery

For a long time, economists believed that labor-market monopsony rarely existed, at least outside old-fashioned company towns where a single factory employs most of the residents. But in recent decades, several compelling studies have revealed that monopsony is omnipresent. Professionals like doctors and nurses, workers in factories and meat processing plants, and sandwich makers and other low-skill workers earn far less — thousands of dollars less — than they would if employers did not dominate labor markets.

The studies show that common features of the labor market give enormous bargaining advantages to employers. Because most people sink roots in their communities, they are reluctant to quit their job and move to a job that is far away. Because workplaces differ in terms of their location and conditions, people have trouble comparing them, which means that one cannot easily “comparison shop” for jobs. And thanks to a wave of consolidation, industries are increasingly dominated by a small number of huge companies, which means that workers have fewer choices among employers in their area.

When employers exercise monopsonistic power, wages are suppressed, jobs are left unfilled, and economic growth suffers. Unions used to offset employer monopsony power, but unions now represent only 7 percent of private sector workers, down from a peak of 35 percent in the 1950s. Combating the practices that employers use to monopsonize the labor market can lead to higher wages, more jobs and faster economic growth.

worklife
economics
models-and-modes
public-policy
power-relations
to-write-about
capitalism

3 days ago by Vaguery

[1802.02627] Going Deeper in Spiking Neural Networks: VGG and Residual Architectures

5 weeks ago by Vaguery

Over the past few years, Spiking Neural Networks (SNNs) have become popular as a possible pathway to enable low-power event-driven neuromorphic hardware. However, their application in machine learning has largely been limited to very shallow neural network architectures for simple problems. In this paper, we propose a novel algorithmic technique for generating an SNN with a deep architecture, and demonstrate its effectiveness on complex visual recognition problems such as CIFAR-10 and ImageNet. Our technique applies to both VGG and Residual network architectures, with significantly better accuracy than the state-of-the-art. Finally, we present an analysis of the sparse event-driven computations to demonstrate reduced hardware overhead when operating in the spiking domain.
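For readers who haven't met the basic unit: a spiking neuron integrates input over time and emits a discrete event when a threshold is crossed. This is a generic textbook leaky integrate-and-fire sketch, not the ANN-to-SNN conversion procedure the paper proposes.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the event-driven
# unit underlying SNNs. Membrane potential leaks, integrates input,
# and fires-and-resets at threshold.

def lif_run(inputs, leak=0.9, threshold=1.0):
    """Integrate a sequence of input currents; emit 1 on spike, else 0."""
    v = 0.0
    spikes = []
    for i in inputs:
        v = leak * v + i          # leaky integration
        if v >= threshold:        # fire and reset
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes

# Constant drive: the neuron fires periodically; stronger drive, higher rate.
print(sum(lif_run([0.4] * 20)))
print(sum(lif_run([0.8] * 20)))
```

The spike *rate* over a window carries the analog value, which is what makes rate-coded conversion from conventional nets plausible.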

via:?
to-write-about
to-read
neural-networks
representation
models-and-modes
machine-learning
simulation
5 weeks ago by Vaguery

[1703.10651] Reliable Decision Support using Counterfactual Models

7 weeks ago by Vaguery

Making a good decision involves considering the likely outcomes under each possible action. For example, would drug A or drug B lead to a better outcome for this patient? Ideally, we answer these questions using an experiment, but this is not always possible (e.g., it may be unethical). As an alternative, we can use non-experimental data to learn models that make counterfactual predictions of what we would observe had we run an experiment. To learn such models for decision-making problems, we propose the use of counterfactual objectives in lieu of classical supervised learning objectives. We implement this idea in a challenging and frequently occurring context, and propose the counterfactual GP (CGP), a counterfactual model of continuous-time trajectories (time series) under sequences of actions taken in continuous-time. We develop our model within the potential outcomes framework of Neyman and Rubin. The counterfactual GP is trained using a joint maximum likelihood objective that adjusts for dependencies between observed actions and outcomes in the training data. We report two sets of experimental results. First, we show that the CGP's predictions are reliable; they are stable to changes in certain characteristics of the training data that are not relevant to the decision-making problem. Predictive models trained using classical supervised learning objectives, however, are not stable to such perturbations. In the second experiment, we use data from a real intensive care unit (ICU) and qualitatively demonstrate how the CGP's ability to answer "What if?" questions offers medical decision-makers a powerful new tool for planning treatment.

machine-learning
models-and-modes
rather-interesting
to-write-about
consider:symbolic-regression
7 weeks ago by Vaguery

[1710.03453] The Sparse Multivariate Method of Simulated Quantiles

november 2017 by Vaguery

In this paper, the method of simulated quantiles (MSQ) of Dominicy and Veredas (2013) and Dominicy et al. (2013) is extended to a general multivariate framework (MMSQ) and to provide a sparse estimator of the scale matrix (sparse-MMSQ). The MSQ, like alternative likelihood-free procedures, is based on the minimisation of the distance between appropriate statistics evaluated on the true and synthetic data simulated from the postulated model. Those statistics are functions of the quantiles, providing an effective way to deal with distributions that do not admit moments of any order, like the α-Stable or the Tukey lambda distribution. The lack of a natural ordering represents the major challenge for the extension of the method to the multivariate framework. Here, we rely on the notion of projectional quantile recently introduced by Hallin et al. (2010) and Kong and Mizera (2012). We establish consistency and asymptotic normality of the proposed estimator. The smoothly clipped absolute deviation (SCAD) ℓ1-penalty of Fan and Li (2001) is then introduced into the MMSQ objective function in order to achieve sparse estimation of the scale matrix, which is largely responsible for the curse of dimensionality. We extend the asymptotic theory and show that the sparse-MMSQ estimator enjoys the oracle properties under mild regularity conditions. The method is illustrated, and its effectiveness is tested, using several synthetic datasets simulated from the Elliptical Stable distribution (ESD), for which alternative methods are recognised to perform poorly. The method is then applied to build a new network-based systemic risk measurement framework. The proposed methodology to build the network relies on a new systemic risk measure and on a parametric test of statistical dominance.
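The likelihood-free core of MSQ fits in a few lines: choose the parameter whose simulated quantiles sit closest to the observed ones. The sketch below is a deliberately minimal one-parameter version (a scale, a fixed base sample, a grid search), nowhere near the multivariate sparse estimator in the paper.

```python
import random

# Toy method of simulated quantiles: pick the scale whose simulated
# quantiles best match the observed quantiles.

def quantiles(xs, probs=(0.25, 0.5, 0.75)):
    s = sorted(xs)
    return [s[int(p * (len(s) - 1))] for p in probs]

def msq_scale(observed, base_draws, grid):
    """Choose the scale minimizing squared distance between quantile vectors."""
    qs_obs = quantiles(observed)
    return min(
        grid,
        key=lambda c: sum((c * qb - qo) ** 2
                          for qb, qo in zip(quantiles(base_draws), qs_obs)),
    )

rng = random.Random(0)
base = [rng.gauss(0, 1) for _ in range(5000)]   # unit-scale simulated draws
observed = [2.5 * x for x in base]              # "data" with true scale 2.5
grid = [i / 10 for i in range(1, 51)]
print(msq_scale(observed, base, grid))          # recovers 2.5
```

Because quantiles exist even when moments don't, the same recipe works unchanged for heavy-tailed targets like the α-Stable, which is the selling point of the method.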

statistics
reinventing-the-wheel
how-is-this-not-constrained-symbolic-regression?
algorithms
models-and-modes
to-understand
inference
november 2017 by Vaguery

[1703.04977] What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision?

may 2017 by Vaguery

There are two major types of uncertainty one can model. Aleatoric uncertainty captures noise inherent in the observations. On the other hand, epistemic uncertainty accounts for uncertainty in the model -- uncertainty which can be explained away given enough data. Traditionally it has been difficult to model epistemic uncertainty in computer vision, but with new Bayesian deep learning tools this is now possible. We study the benefits of modeling epistemic vs. aleatoric uncertainty in Bayesian deep learning models for vision tasks. For this we present a Bayesian deep learning framework combining input-dependent aleatoric uncertainty together with epistemic uncertainty. We study models under the framework with per-pixel semantic segmentation and depth regression tasks. Further, our explicit uncertainty formulation leads to new loss functions for these tasks, which can be interpreted as learned attenuation. This makes the loss more robust to noisy data, also giving new state-of-the-art results on segmentation and depth regression benchmarks.
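The "learned attenuation" the abstract mentions has a compact form: per sample, the loss is 0.5·exp(−s)·(y−μ)² + 0.5·s, where s = log σ² is predicted by the network alongside μ. The check below verifies numerically what a quick derivative shows, that the loss is minimized at s* = log(residual²), so noisy points get down-weighted automatically.

```python
import math

# Heteroscedastic (aleatoric) loss with learned attenuation:
# 0.5 * exp(-s) * (y - mu)^2 + 0.5 * s, with s = log(sigma^2).

def attenuated_loss(y, mu, s):
    return 0.5 * math.exp(-s) * (y - mu) ** 2 + 0.5 * s

# The optimum over s is s* = log((y - mu)^2); confirm on a grid.
residual = 3.0
s_star = math.log(residual ** 2)
grid = [s_star + d / 100 for d in range(-200, 201)]
best = min(grid, key=lambda s: attenuated_loss(residual, 0.0, s))
print(abs(best - s_star) < 1e-9)   # True: the grid minimum sits at s*
```

Large residuals push s up, which shrinks exp(−s) and caps their gradient contribution; the 0.5·s term stops the network from declaring infinite noise everywhere.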

computer-vision
machine-learning
models-and-modes
uncertainty
deep-learning
rather-interesting
define-your-terms
representation
nudge-targets
to-write-about
may 2017 by Vaguery

[1312.7604] Probabilistic Archetypal Analysis

february 2017 by Vaguery

Archetypal analysis represents a set of observations as convex combinations of pure patterns, or archetypes. The original geometric formulation of finding archetypes by approximating the convex hull of the observations assumes them to be real valued. This, unfortunately, is not compatible with many practical situations. In this paper we revisit archetypal analysis from basic principles, and propose a probabilistic framework that accommodates other observation types such as integers, binary values, and probability vectors. We corroborate the proposed methodology with convincing real-world applications on finding archetypal winter tourists based on binary survey data, archetypal disaster-affected countries based on disaster count data, and document archetypes based on term-frequency data. We also present an appropriate visualization tool to summarize archetypal analysis solutions better.
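"Convex combinations of pure patterns" is easiest to see in the plane: with three archetypes, a point's weights are its barycentric coordinates and can be solved for exactly. This is only the geometric warm-up behind the abstract, not the probabilistic framework the paper develops.

```python
# A point inside the triangle of three archetypes is a unique convex
# combination of them; the weights are its barycentric coordinates.

def barycentric(p, a, b, c):
    """Weights (wa, wb, wc) with p = wa*a + wb*b + wc*c and wa+wb+wc = 1."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    wa = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    wb = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    return wa, wb, 1.0 - wa - wb

archetypes = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
w = barycentric((1.0, 1.0), *archetypes)
print(w)   # weights in [0, 1] summing to 1: the point is a mix of archetypes
```

The probabilistic reformulation replaces this exact geometric decomposition with observation models (Poisson, Bernoulli, multinomial) so that counts and binary data fit the same picture.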

machine-learning
dimension-reduction
rather-interesting
models-and-modes
to-understand
to-write-about
february 2017 by Vaguery

The Archdruid Report: Perched on the Wheel of Time

february 2017 by Vaguery

In the final chapters of his second volume, for example, Spengler noted that civilizations in the stage ours was about to reach always end up racked by conflicts that pit established hierarchies against upstart demagogues who rally the disaffected and transform them into a power base. Looking at the trends visible in his own time, he sketched out the most likely form those conflicts would take in the Winter phase of our civilization. Modern representative democracy, he pointed out, has no effective defenses against corruption by wealth, and so could be expected to evolve into corporate-bureaucratic plutocracies that benefit the affluent at the expense of everyone else. Those left out in the cold by these transformations, in turn, end up backing what Spengler called Caesarism—the rise of charismatic demagogues who challenge and eventually overturn the corporate-bureaucratic order.

These demagogues needn’t come from within the excluded classes, by the way. Julius Caesar, the obvious example, came from an old upper-class Roman family and parlayed his family connections into a successful political career. Watchers of the current political scene may be interested to know that Caesar during his lifetime wasn’t the imposing figure he became in retrospect; he had a high shrill voice, his morals were remarkably flexible even by Roman standards—the scurrilous gossip of his time called him “every man’s wife and every woman’s husband”—and he spent much of his career piling up huge debts and then wriggling out from under them. Yet he became the political standard-bearer for the plebeian classes, and his assassination by a conspiracy of rich Senators launched the era of civil wars that ended the rule of the old elite once and for all.

history
political-economy
philosophy
models-and-modes
argumentation
february 2017 by Vaguery

[1612.02483] High Dimensional Consistent Digital Segments

january 2017 by Vaguery

We consider the problem of digitalizing Euclidean line segments from ℝ^d to ℤ^d. Christ et al. (DCG, 2012) showed how to construct a set of consistent digital segments (CDS) for d=2: a collection of segments connecting any two points in ℤ^2 that satisfies the natural extension of the Euclidean axioms to ℤ^d. In this paper we study the construction of CDSs in higher dimensions.

We show that any total order can be used to create a set of consistent digital rays (CDR) in ℤ^d (a set of rays emanating from a fixed point p that satisfies the extension of the Euclidean axioms). We fully characterize for which total orders the construction holds and study their Hausdorff distance, which in particular positively answers the question posed by Christ et al.

approximation
computational-geometry
performance-measure
rather-interesting
mathematics
consistency
models-and-modes
constructive-geometry
nudge-targets
consider:representation
consider:looking-to-see
january 2017 by Vaguery

[1508.05837] Hydroassets Portfolio Management for Intraday Electricity Trading in a Discrete Time Stochastic Optimization Perspective

january 2017 by Vaguery

Hydro storage system optimization is becoming one of the most challenging tasks in Energy Finance. Following the Blomvall and Lindberg (2002) interior point model, we set up a stochastic multiperiod optimization procedure by means of a "bushy" recombining tree that provides fast computational results. Inequality constraints are packed into the objective function by the logarithmic barrier approach, and the utility function is approximated by its second order Taylor polynomial. The optimal solution for the original problem is obtained as a diagonal sequence, where the first diagonal dimension is the parameter controlling the logarithmic penalty and the second is the parameter for the Newton step in the construction of the approximated solution. Optimal intraday electricity trading and water values for hydroassets as shadow prices are computed. The algorithm is implemented in Mathematica.
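The logarithmic-barrier move in the abstract is the standard interior-point trick, shown here in one dimension: minimize x² subject to x ≥ 1 by minimizing x² − μ·log(x − 1) for a shrinking barrier parameter μ. This is a generic illustration under invented numbers, not the paper's multiperiod hydro model.

```python
import math

# Interior-point barrier in 1-D: the constrained optimum of x^2 with
# x >= 1 is x = 1; the barrier minimizer approaches it as mu -> 0.

def golden_min(f, lo, hi, iters=200):
    """Golden-section search for the minimum of a unimodal f on [lo, hi]."""
    phi = (math.sqrt(5) - 1) / 2
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - phi * (b - a), a + phi * (b - a)
        if f(c) < f(d):
            b = d
        else:
            a = c
    return (a + b) / 2

x = None
for mu in [1.0, 0.1, 0.01, 0.001]:
    x = golden_min(lambda t: t * t - mu * math.log(t - 1.0), 1.0 + 1e-9, 5.0)
print(round(x, 3))   # close to the constrained optimum x = 1
```

The barrier keeps every iterate strictly feasible (the log blows up at the constraint boundary), which is why the abstract can fold the inequality constraints straight into the objective.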

portfolio-theory
operations-research
financial-engineering
time-series
prediction
models-and-modes
nudge-targets
consider:performance-measures
consider:metaheuristics
january 2017 by Vaguery

[1604.04647] Sheaf and duality methods for analyzing multi-model systems

january 2017 by Vaguery

There is an interplay between models, specified by variables and equations, and their connections to one another. This dichotomy should be reflected in the abstract as well. Without referring to the models directly -- only that a model consists of spaces and maps between them -- the most readily apparent feature of a multi-model system is its topology. We propose that this topology should be modeled first, and then the spaces and maps of the individual models be specified in accordance with the topology. Axiomatically, this construction leads to sheaves. Sheaf theory provides a toolbox for constructing predictive models described by systems of equations. Sheaves are mathematical objects that manage the combination of bits of local information into a consistent whole. The power of this approach is that complex models can be assembled from smaller, easier-to-construct models. The models discussed in this chapter span the study of continuous dynamical systems, partial differential equations, probabilistic graphical models, and discrete approximations of these models.

category-theory
to-understand
models-and-modes
rather-interesting
no-really-I-think-I-need-to-understand-this-thread
january 2017 by Vaguery

Genotypic complexity of Fisher's geometric model | bioRxiv

january 2017 by Vaguery

Fisher's geometric model was originally introduced to argue that complex adaptations must occur in small steps because of pleiotropic constraints. When supplemented with the assumption of additivity of mutational effects on phenotypic traits, it provides a simple mechanism for the emergence of genotypic epistasis from the nonlinear mapping of phenotypes to fitness. Of particular interest is the occurrence of sign epistasis, which is a necessary condition for multipeaked genotypic fitness landscapes. Here we compute the probability that a pair of randomly chosen mutations interacts sign-epistatically, which is found to decrease algebraically with increasing phenotypic dimension n, and varies non-monotonically with the distance from the phenotypic optimum. We then derive asymptotic expressions for the mean number of fitness maxima in genotypic landscapes composed of all combinations of L random mutations. This number increases exponentially with L, and the corresponding growth rate is used as a measure of the complexity of the genotypic landscape. The dependence of the complexity on the parameters of the model is found to be surprisingly rich, and three distinct phases characterized by different landscape structures are identified. The complexity generally decreases with increasing phenotypic dimension, but a non-monotonic dependence on n is found in certain regimes. Our results inform the interpretation of experiments where the parameters of Fisher's model have been inferred from data, and help to elucidate which features of empirical fitness landscapes can (or cannot) be described by this model.
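Sign epistasis is easy to exhibit in the 1-D version of the model: with log-fitness −z², optimum at 0, and additive mutations, a mutation that is beneficial on the wild type can overshoot the optimum on another background. The numbers below are invented for illustration; the paper works in n dimensions with random mutation vectors.

```python
# Sign epistasis in a 1-D Fisher-type model: log-fitness f(z) = -z^2,
# wild type at z = 1, two additive mutations each shifting z by -0.8.

def log_fitness(z):
    return -z * z

wt = 1.0
da = db = -0.8   # additive phenotypic effects of mutations a and b

effect_a_on_wt = log_fitness(wt + da) - log_fitness(wt)
effect_a_on_b  = log_fitness(wt + db + da) - log_fitness(wt + db)

print(effect_a_on_wt)  # beneficial alone: 1.0 -> 0.2 moves toward the optimum
print(effect_a_on_b)   # deleterious on the b background: 0.2 -> -0.6 overshoots
print(effect_a_on_wt > 0 > effect_a_on_b)   # True: sign epistasis
```

This is exactly the mechanism in the abstract: epistasis is generated purely by the nonlinear phenotype-to-fitness map, since the phenotypic effects themselves are additive.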

population-biology
theoretical-biology
theory-and-practice-sitting-in-a-tree
fitness-landscapes
models-and-modes
to-write-about
nudge-targets
consider:rediscovery
consider:robustness
consider:multiobjective-versions
january 2017 by Vaguery

[1612.02540] City traffic forecasting using taxi GPS data: A coarse-grained cellular automata model

december 2016 by Vaguery

City traffic is a dynamic system of enormous complexity. Modeling and predicting city traffic flow remains a challenging task, and the main difficulties are how to specify the supply and demands and how to parameterize the model. In this paper we attempt to solve these problems with the help of a large amount of floating-car data. We propose a coarse-grained cellular automata model that simulates vehicles moving on uniform grids whose cells are much larger than in the microscopic cellular automata model. The car-car interaction in the microscopic model is replaced by the coupling between vehicles and coarse-grained state variables in our model. To parameterize the model, flux-occupancy relations are fitted from the historical data at every grid cell, serving as the coarse-grained fundamental diagrams coupling occupancy and speed. To evaluate the model, we feed it with the historical travel demands and trajectories obtained from the floating-car data and use the model to predict road speed one hour into the future. Numerical results show that our model can capture the traffic flow pattern of the entire city and make reasonable predictions. The current work can be considered a prototype for a model-based forecasting system for city traffic.
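The skeleton of such a coarse-grained model: each cell holds an occupancy, a fundamental diagram turns occupancy into speed, and each step moves flux to the next cell. The sketch below uses the textbook linear diagram v(ρ) = vmax·(1 − ρ) on a ring; the paper instead fits its diagrams per grid cell from floating-car data.

```python
# Coarse-grained CA density update on a ring of cells. Outflow from each
# cell is rho * v(rho) * dt with v(rho) = vmax * (1 - rho); inflow comes
# from the upstream neighbor. Total occupancy is conserved.

def step(rho, vmax=1.0, dt=0.2):
    n = len(rho)
    flux = [r * vmax * (1.0 - r) * dt for r in rho]   # outflow per cell
    return [rho[i] - flux[i] + flux[i - 1] for i in range(n)]

rho = [0.8, 0.2, 0.2, 0.2, 0.2]   # a jam in cell 0 on a 5-cell ring
for _ in range(50):
    rho = step(rho)
print(round(sum(rho), 6))          # 1.6: total occupancy unchanged
```

The jam diffuses downstream while the total stays fixed, which is the conservation property any such update rule has to respect before being fitted to data.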

agent-based
cellular-automata
rather-interesting
traffic
models-and-modes
to-write-about
representation
december 2016 by Vaguery

On Computational Explanations/ Synthese (in press) DOI 10.1007/s11229-016-1101-5 (PDF Download Available)

december 2016 by Vaguery

Computational explanations focus on the information processing required in specific cognitive capacities, such as perception, reasoning or decision-making. These explanations specify the nature of the information processing task, what information needs to be represented, and why it should be operated on in a particular manner. In this article, the focus is on three questions concerning the nature of computational explanations: (1) what type of explanations they are, (2) in what sense computational explanations are explanatory and (3) to what extent they involve a special, “independent” or “autonomous” level of explanation. In this paper, we defend the view that computational explanations are genuine explanations, which track non-causal/formal dependencies. Specifically, we argue that they do not provide mere sketches for explanation, in contrast to what, for example, Piccinini and Craver (Synthese 183(3):283–311, 2011) suggest. This view of computational explanations implies some degree of “autonomy” for the computational level. However, as we will demonstrate, that does not make this view “computationally chauvinistic” in a way that Piccinini (Synthese 153:343–353, 2006b) or Kaplan (Synthese 183(3):339–373, 2011) have charged it to be.

via:cshalizi
philosophy-of-science
explanation
models-and-modes
to-read
to-write-about
december 2016 by Vaguery

[1608.05226] A tale of a Principal and many many Agents

august 2016 by Vaguery

In this paper, we investigate a moral hazard problem in finite time with lump-sum and continuous payments, involving infinitely many Agents with mean field type interactions, hired by one Principal. By reinterpreting the mean-field game faced by each Agent in terms of a mean field FBSDE, we are able to rewrite the Principal's problem as a control problem for McKean-Vlasov SDEs. We review two general approaches to tackle it: the first one, introduced recently in [2, 66, 67, 68, 69], using dynamic programming and Hamilton-Jacobi-Bellman equations; the second, based on the stochastic Pontryagin maximum principle, which follows [16]. We solve the problem completely and explicitly in special cases, going beyond the usual linear-quadratic framework. We finally show in our examples that the optimal contract in the N-player model converges to the mean-field optimal contract when the number of agents goes to +∞.

probability-theory
options
optimization
models-and-modes
rather-interesting
agent-based
nudge-targets
consider:looking-to-see
to-write-about
august 2016 by Vaguery

[1607.06274] Topological Data Analysis with Bregman Divergences

august 2016 by Vaguery

Given a finite set in a metric space, the topological analysis generalizes hierarchical clustering using a 1-parameter family of homology groups to quantify connectivity in all dimensions. The connectivity is compactly described by the persistence diagram. One limitation of the current framework is the reliance on metric distances, whereas in many practical applications objects are compared by non-metric dissimilarity measures. Examples are the Kullback-Leibler divergence, which is commonly used for comparing text and images, and the Itakura-Saito divergence, popular for speech and sound. These are two members of the broad family of dissimilarities called Bregman divergences.

We show that the framework of topological data analysis can be extended to general Bregman divergences, widening the scope of possible applications. In particular, we prove that appropriately generalized Cech and Delaunay (alpha) complexes capture the correct homotopy type, namely that of the corresponding union of Bregman balls. Consequently, their filtrations give the correct persistence diagram, namely the one generated by the uniformly growing Bregman balls. Moreover, we show that unlike the metric setting, the filtration of Vietoris-Rips complexes may fail to approximate the persistence diagram. We propose algorithms to compute the thus generalized Cech, Vietoris-Rips and Delaunay complexes and experimentally test their efficiency. Lastly, we explain their surprisingly good performance by making a connection with discrete Morse theory.
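The whole family rests on one formula: D_F(x, y) = F(x) − F(y) − ⟨∇F(y), x − y⟩ for a convex F. The sketch below implements it generically and checks the two special cases the abstract names as motivating examples, squared Euclidean distance and (generalized) KL divergence.

```python
import math

# A general Bregman divergence from a convex generator F and its gradient.

def bregman(F, gradF, x, y):
    return F(x) - F(y) - sum(g * (xi - yi)
                             for g, xi, yi in zip(gradF(y), x, y))

# F(x) = ||x||^2  ->  D_F is the squared Euclidean distance.
sq = lambda x: sum(v * v for v in x)
sq_grad = lambda x: [2 * v for v in x]

# F(x) = sum x log x - x  ->  D_F is the generalized KL divergence.
ent = lambda x: sum(v * math.log(v) - v for v in x)
ent_grad = lambda x: [math.log(v) for v in x]

p, q = [1.0, 2.0], [3.0, 5.0]
print(bregman(sq, sq_grad, p, q))       # (1-3)^2 + (2-5)^2 = 13
kl = sum(a * math.log(a / b) - a + b for a, b in zip(p, q))
print(abs(bregman(ent, ent_grad, p, q) - kl) < 1e-12)   # True
```

Note that D_F is generally asymmetric and violates the triangle inequality, which is exactly why the Vietoris-Rips guarantee fails while the Cech and Delaunay constructions survive.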

data-analysis
topology
metrics
to-understand
algorithms
representation
statistics
probability-theory
models-and-modes
august 2016 by Vaguery

[1606.03490] The Mythos of Model Interpretability

july 2016 by Vaguery

Supervised machine learning models boast remarkable predictive capabilities. But can you trust your model? Will it work in deployment? What else can it tell you about the world? We want models to be not only good, but interpretable. And yet the task of interpretation appears underspecified. Papers provide diverse and sometimes non-overlapping motivations for interpretability, and offer myriad notions of what attributes render models interpretable. Despite this ambiguity, many papers proclaim interpretability axiomatically, absent further explanation. In this paper, we seek to refine the discourse on interpretability. First, we examine the motivations underlying interest in interpretability, finding them to be diverse and occasionally discordant. Then, we address model properties and techniques thought to confer interpretability, identifying transparency to humans and post-hoc explanations as competing notions. Throughout, we discuss the feasibility and desirability of different notions, and question the oft-made assertions that linear models are interpretable and that deep neural networks are not.

models-and-modes
infighting
representation
philosophy-of-engineering
to-write-about
neural-networks
cultural-assumptions
july 2016 by Vaguery

[1604.01674] OFFl models: novel schema for dynamical modeling of biological systems

july 2016 by Vaguery

Flow diagrams are a common tool used to help build and interpret models of dynamical systems, often in biological contexts such as consumer-resource models and similar compartmental models. Typically, their usage is intuitive and informal. Here, we present a formalized version of flow diagrams as a kind of weighted directed graph which follow a strict grammar, which translate into a system of ordinary differential equations (ODEs) by a single unambiguous rule, and which have an equivalent representation as a relational database. (We abbreviate this schema of "ODEs and formalized flow diagrams" as OFFl.) Drawing a diagram within this strict grammar encourages a mental discipline on the part of the modeler in which all dynamical processes of a system are thought of as interactions between dynamical species that draw parcels from one or more source species and deposit them into target species according to a set of transformation rules. From these rules, the net rate of change for each species can be derived. The modeling schema can therefore be understood as both an epistemic and practical heuristic for modeling, serving both as an organizational framework for the model building process and as a mechanism for deriving ODEs. All steps of the schema beyond the initial scientific (intuitive, creative) abstraction of natural observations into model variables are algorithmic and easily carried out by a computer, thus enabling the future development of a dedicated software implementation. Such tools would empower the modeler to consider significantly more complex models than practical limitations might have otherwise proscribed, since the modeling framework itself manages that complexity on the modeler's behalf. In this report, we describe the chief motivations for OFFl, outline its implementation, and utilize a range of classic examples from ecology and epidemiology to showcase its features.
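The schema's "single unambiguous rule" is easy to prototype: a model is a list of flows (source, target, rate function), and each species' derivative is its inflows minus its outflows. The SIR example below is ours, with invented names; the abstract describes the schema, not a software package.

```python
# Derive an ODE right-hand side from a list of flows, each drawing
# parcels from a source species and depositing them into a target.

def rhs(flows, state):
    """d(state)/dt: for each flow, subtract the rate from the source
    and add it to the target."""
    d = {k: 0.0 for k in state}
    for source, target, rate in flows:
        r = rate(state)
        d[source] -= r
        d[target] += r
    return d

beta, gamma = 0.3, 0.1
sir_flows = [
    ("S", "I", lambda s: beta * s["S"] * s["I"]),   # infection
    ("I", "R", lambda s: gamma * s["I"]),           # recovery
]
state = {"S": 0.99, "I": 0.01, "R": 0.0}
d = rhs(sir_flows, state)
print(d)
print(abs(sum(d.values())) < 1e-12)   # True: parcels are conserved
```

Because every term appears once as an outflow and once as an inflow, conservation holds by construction, which is the "mental discipline" the diagram grammar is meant to enforce.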

models-and-modes
representation
visualization
systems-biology
formalization
amusing
theoretical-biology
systems-thinking
july 2016 by Vaguery

[1601.03243] Growth against entropy in bacterial metabolism: the phenotypic trade-off behind empirical growth rate distributions in E. coli

june 2016 by Vaguery

The solution space of genome-scale models of cellular metabolism provides a map between physically viable flux configurations and cellular metabolic phenotypes described, at the most basic level, by the corresponding growth rates. By sampling the solution space of E. coli's metabolic network, we show that empirical growth rate distributions recently obtained in experiments at single-cell resolution can be explained in terms of a trade-off between the higher fitness of fast-growing phenotypes and the higher entropy of slow-growing ones. Based on this, we propose a minimal model for the evolution of a large bacterial population that captures this trade-off. The scaling relationships observed in experiments encode, in such frameworks, for the same distance from the maximum achievable growth rate, the same degree of growth rate maximization, and/or the same rate of phenotypic change. Being grounded on genome-scale metabolic network reconstructions, these results allow for multiple implications and extensions in spite of the underlying conceptual simplicity.

cell-biology
theoretical-biology
rather-interesting
nonlinear-dynamics
models-and-modes
stochastic-systems
probability-theory
nudge-targets
consider:robustness
june 2016 by Vaguery

[1505.01396] A common brew - The relationships between varieties of percolation

may 2016 by Vaguery

In recent years, many variants of percolation have been used to study the structure of networks and the behavior of processes spreading along those networks. Among these are bond percolation, site percolation, k-core percolation, bootstrap percolation, the generalized epidemic process, and the Watts Threshold Model (WTM). We show that, except for bond percolation, each of these processes can be derived as a special case of the WTM. In fact, "heterogeneous k-core percolation", a corresponding "heterogeneous bootstrap percolation" model, and the generalized epidemic process are equivalent to one another and to the WTM. We further show that a natural generalization of the WTM, in which individuals "transmit" or "send a message" to their neighbors with some probability less than 1, can be reformulated in terms of the WTM, and so this apparent generalization is in fact not more general. Finally, we show that in bond percolation, finding the set of nodes in the component containing a given node is equivalent to finding the set of nodes activated if that node is initially activated and the node thresholds are chosen from the appropriate distribution. A consequence of these results is that mathematical techniques developed for the WTM apply to these other models as well, and techniques that were developed for some particular case may in fact apply much more generally.
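The unifying model itself is tiny: a node activates once the count of its active neighbors reaches its threshold. Per-node thresholds give the heterogeneous variants; setting every threshold to the same k recovers bootstrap/k-core-style percolation as a special case, which is the paper's point. The graph below is a made-up four-node example.

```python
# Watts Threshold Model: iterate to a fixed point of the activation rule
# "become active when >= threshold[node] neighbors are active".

def wtm(adj, thresholds, seeds):
    """Return the final active set reached from the seed set."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, nbrs in adj.items():
            if node not in active:
                if sum(n in active for n in nbrs) >= thresholds[node]:
                    active.add(node)
                    changed = True
    return active

# A path 0-1-2-3 plus a chord 0-2.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}

# Threshold 1 everywhere: simple spreading reaches the whole graph.
print(sorted(wtm(adj, {n: 1 for n in adj}, seeds={0})))      # [0, 1, 2, 3]

# Threshold 2 (bootstrap-percolation style): node 3 has only one
# neighbor, so the cascade stops at {0, 1, 2}.
print(sorted(wtm(adj, {n: 2 for n in adj}, seeds={0, 1})))   # [0, 1, 2]
```

Swapping the deterministic thresholds for thresholds drawn from a distribution is exactly the move that lets the paper absorb the probabilistic variants into the same framework.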

percolation
models-and-modes
horse-races
comparison
rather-interesting
simulation
performance-network
nudge-targets
consider:feature-discovery
may 2016 by Vaguery
