**dynamical_systems** (227)

[1810.05575] Joining and decomposing reaction networks

yesterday by cshalizi

"In systems and synthetic biology, much research has focused on the behavior and design of single pathways, while, more recently, experimental efforts have focused on how cross-talk (coupling two or more pathways) or inhibiting molecular function (isolating one part of the pathway) affects systems-level behavior. However, the theory for tackling these larger systems in general has lagged behind. Here, we analyze how joining networks (e.g., cross-talk) or decomposing networks (e.g., inhibition or knock-outs) affects three properties that reaction networks may possess---identifiability (recoverability of parameter values from data), steady-state invariants (relationships among species concentrations at steady state, used in model selection), and multistationarity (capacity for multiple steady states, which correspond to multiple cell decisions). Specifically, we prove results that clarify, for a network obtained by joining two smaller networks, how properties of the smaller networks can be inferred from or can imply similar properties of the original network. Our proofs use techniques from computational algebraic geometry, including elimination theory and differential algebra."

to:NB
biochemical_networks
algebra
dynamical_systems
yesterday by cshalizi

[1712.01572] Eigendecompositions of Transfer Operators in Reproducing Kernel Hilbert Spaces

2 days ago by cshalizi

"Transfer operators such as the Perron--Frobenius or Koopman operator play an important role in the global analysis of complex dynamical systems. The eigenfunctions of these operators can be used to detect metastable sets, to project the dynamics onto the dominant slow processes, or to separate superimposed signals. We extend transfer operator theory to reproducing kernel Hilbert spaces and show that these operators are related to Hilbert space representations of conditional distributions, known as conditional mean embeddings in the machine learning community. Moreover, numerical methods to compute empirical estimates of these embeddings are akin to data-driven methods for the approximation of transfer operators such as extended dynamic mode decomposition and its variants. One main benefit of the presented kernel-based approaches is that these methods can be applied to any domain where a similarity measure given by a kernel is available. We illustrate the results with the aid of guiding examples and highlight potential applications in molecular dynamics as well as video and text data analysis."

to:NB
kernel_estimators
dynamical_systems
statistical_inference_for_stochastic_processes
2 days ago by cshalizi

[1908.02843] Invariant predictions of epidemic patterns from radically different forms of seasonal forcing

9 days ago by cshalizi

"Seasonal variation in environmental variables, and in rates of contact among individuals, are fundamental drivers of infectious disease dynamics. Unlike most periodically-forced physical systems, for which the precise pattern of forcing is typically known, underlying patterns of seasonal variation in transmission rates can be estimated approximately at best, and only the period of forcing is accurately known. Yet solutions of epidemic models depend strongly on the forcing function, so dynamical predictions---such as changes in epidemic patterns that can be induced by demographic transitions or mass vaccination---are always subject to the objection that the underlying patterns of seasonality are poorly specified. Here, we demonstrate that the key bifurcations of the standard epidemic model are invariant to the shape of seasonal forcing if the amplitude of forcing is appropriately adjusted. Consequently, analyses applicable to real disease dynamics can be conducted with a smooth, idealized sinusoidal forcing function, and qualitative changes in epidemic patterns can be predicted without precise knowledge of the underlying forcing pattern. We find similar invariance in a seasonally forced predator-prey model, and conjecture that this phenomenon---and the associated robustness of predictions---might be a feature of many other periodically forced dynamical systems."

to:NB
dynamical_systems
epidemic_models
9 days ago by cshalizi

[1904.05172] Modeling a Hidden Dynamical System Using Energy Minimization and Kernel Density Estimates

9 days ago by rvenkat

In this paper we develop a kernel density estimation (KDE) approach to modeling and forecasting recurrent trajectories on a compact manifold. For the purposes of this paper, a trajectory is a sequence of coordinates in a phase space defined by an underlying hidden dynamical system. Our work is inspired by earlier work on the use of KDE to detect shipping anomalies using high-density, high-quality automated information system (AIS) data as well as our own earlier work in trajectory modeling. We focus specifically on the sparse, noisy trajectory reconstruction problem in which the data are (i) sparsely sampled and (ii) subject to an imperfect observer that introduces noise. Under certain regularity assumptions, we show that the constructed estimator minimizes a specific energy function defined over the trajectory as the number of samples obtained grows.
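
-- The basic ingredient is off-the-shelf: a KDE built from noisy samples of a recurrent trajectory concentrates on the underlying curve. A toy sketch (a noisy circle standing in for the compact manifold; this shows the KDE idea only, not the paper's energy-minimizing estimator):

```python
import numpy as np
from scipy.stats import gaussian_kde

# Sparse, unordered, noisy samples from a recurrent trajectory (a circle,
# i.e. a compact 1-manifold in the plane), as in the paper's setting.
rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, 400)
pts = np.vstack([np.cos(theta), np.sin(theta)])
pts += 0.05 * rng.standard_normal(pts.shape)   # imperfect-observer noise

kde = gaussian_kde(pts)                        # density estimate over phase space
on_curve = kde([[1.0], [0.0]])[0]              # point on the trajectory
off_curve = kde([[0.0], [0.0]])[0]             # point far from it
print(on_curve, off_curve)                     # density concentrates on the curve
```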

dynamical_systems
kernel
statistics
machine_learning
differential_geometry
time_series
via:cshalizi
9 days ago by rvenkat

[1904.05172] Modeling a Hidden Dynamical System Using Energy Minimization and Kernel Density Estimates

9 days ago by cshalizi

"In this paper we develop a kernel density estimation (KDE) approach to modeling and forecasting recurrent trajectories on a compact manifold. For the purposes of this paper, a trajectory is a sequence of coordinates in a phase space defined by an underlying hidden dynamical system. Our work is inspired by earlier work on the use of KDE to detect shipping anomalies using high-density, high-quality automated information system (AIS) data as well as our own earlier work in trajectory modeling. We focus specifically on the sparse, noisy trajectory reconstruction problem in which the data are (i) sparsely sampled and (ii) subject to an imperfect observer that introduces noise. Under certain regularity assumptions, we show that the constructed estimator minimizes a specific energy function defined over the trajectory as the number of samples obtained grows."

to:NB
dynamical_systems
time_series
inference_to_latent_objects
statistics
statistical_inference_for_stochastic_processes
9 days ago by cshalizi

[1907.12881] Response and Sensitivity Using Markov Chains

10 days ago by cshalizi

"Dynamical systems are often subject to forcing or changes in their governing parameters and it is of interest to study how this affects their statistical properties. A prominent real-life example of this class of problems is the investigation of climate response to perturbations. In this respect, it is crucial to determine what the linear response of a system is to small perturbations as a quantification of sensitivity. Alongside previous work, here we use the transfer operator formalism to study the response and sensitivity of a dynamical system undergoing perturbations. By projecting the transfer operator onto a suitable finite dimensional vector space, one is able to obtain matrix representations which determine finite Markov processes. Further, using perturbation theory for Markov matrices, it is possible to determine the linear and nonlinear response of the system given a prescribed forcing. Here, we suggest a methodology which puts the scope on the evolution law of densities (the Liouville/Fokker-Planck equation), allowing to effectively calculate the sensitivity and response of two representative dynamical systems."

to:NB
stochastic_processes
dynamical_systems
markov_models
fluctuation-response
macro_from_micro
10 days ago by cshalizi

[1908.02233] Koopman Representations of Dynamic Systems with Control

11 days ago by cshalizi

"The design and analysis of optimal control policies for dynamical systems can be complicated by nonlinear dependence in the state variables. Koopman operators have been used to simplify the analysis of dynamical systems by mapping the flow of the system onto a space of observables where the dynamics are linear (and possibly infinte). This paper focuses on the development of consistent Koopman representations for controlled dynamical system. We introduce the concept of dynamical consistency for Koopman representations and analyze several existing and proposed representations deriving necessary constraints on the dynamical system, observables, and Koopman operators. Our main result is a hybrid formulation which independently and jointly observes the state and control inputs. This formulation admits a relatively large space of dynamical systems compared to earlier formulations while keeping the Koopman operator independent of the state and control inputs. More generally, this work provides an analysis framework to evaluate and rank proposed simplifications to the general Koopman representation for controlled dynamical systems."

to:NB
dynamical_systems
control_theory_and_control_engineering
11 days ago by cshalizi

[1712.03785] A primer on noise-induced transitions in applied dynamical systems

11 days ago by cshalizi

"Noise plays a fundamental role in a wide variety of physical and biological dynamical systems. It can arise from an external forcing or due to random dynamics internal to the system. It is well established that even weak noise can result in large behavioral changes such as transitions between or escapes from quasi-stable states. These transitions can correspond to critical events such as failures or extinctions that make them essential phenomena to understand and quantify, despite the fact that their occurrence is rare. This article will provide an overview of the theory underlying the dynamics of rare events for stochastic models along with some example applications."

to:NB
large_deviations
meta-stability
dynamical_systems
re:do-institutions-evolve
11 days ago by cshalizi

[1908.01520] Long time dynamics for interacting oscillators on dense graphs

11 days ago by cshalizi

"The long time dynamics of the stochastic Kuramoto model defined on a graph is analyzed in the subcritical regime. The emphasis is posed on the relationship between the mean field behavior and the connectivity of the underlying graph: we give an explicit deterministic condition on the sequence of graphs such that, for any initial condition, even dependent on the network, the system approaches the unique stable stationary solution and it remains close to it, up to almost exponential times. The condition on the sequence of graphs is expressed through a concentration in ℓ∞→ℓ1 norm and it is shown to be satisfied by a large class of graphs, random and deterministic, provided that the number of neighbors per site diverges, as the size of the system tends to infinity."

to:NB
kuramoto_model
dynamical_systems
graph_limits
11 days ago by cshalizi

[1908.00845] Iterations of dependent random maps and exogeneity in nonlinear dynamics

12 days ago by cshalizi

"We discuss existence and uniqueness of stationary and ergodic nonlinear autoregressive processes when exogenous regressors are incorporated in the dynamic. To this end, we consider the convergence of the backward iterations of dependent random maps. In particular, we give a new result when the classical condition of contraction on average is replaced with a contraction in conditional expectation. Under some conditions, we also derive an explicit control of the functional dependence of Wu (2005) which guarantees a wide range of statistical applications. Our results are illustrated with CHARME models, GARCH processes, count time series, binary choice models and categorical time series for which we provide many extensions of existing results."

to:NB
stochastic_processes
ergodic_theory
mixing
time_series
dynamical_systems
statistics
statistical_inference_for_stochastic_processes
12 days ago by cshalizi

[1907.13490] Linear response for macroscopic observables in high-dimensional systems

16 days ago by cshalizi

"The long-term average response of observables of chaotic systems to dynamical perturbations can often be predicted using linear response theory, but not all chaotic systems possess a linear response. Macroscopic observables of complex dissipative chaotic systems, however, are widely assumed to have a linear response even if the microscopic variables do not, but the mechanism for this is not well-understood.

"We present a comprehensive picture for the linear response of macroscopic observables in high-dimensional weakly coupled deterministic dynamical systems, where the weak coupling is via a mean field and the microscopic subsystems may or may not obey linear response theory. We derive stochastic reductions of the dynamics of these observables from statistics of the microscopic system, and provide conditions for linear response theory to hold in finite dimensional systems and in the thermodynamic limit. In particular, we show that for large systems of finite size, linear response is induced via self-generated noise.

"We present examples in the thermodynamic limit where the macroscopic observable satisfies LRT, although the microscopic subsystems individually violate LRT, as well a converse example where the macroscopic observable does not satisfy LRT despite all microscopic subsystems satisfying LRT when uncoupled. This latter, maybe surprising, example is associated with emergent non-trivial dynamics of the macroscopic observable. We provide numerical evidence for our results on linear response as well as some analytical intuition"

to:NB
dynamical_systems
statistical_mechanics
macro_from_micro
"We present a comprehensive picture for the linear response of macroscopic observables in high-dimensional weakly coupled deterministic dynamical systems, where the weak coupling is via a mean field and the microscopic subsystems may or may not obey linear response theory. We derive stochastic reductions of the dynamics of these observables from statistics of the microscopic system, and provide conditions for linear response theory to hold in finite dimensional systems and in the thermodynamic limit. In particular, we show that for large systems of finite size, linear response is induced via self-generated noise.

"We present examples in the thermodynamic limit where the macroscopic observable satisfies LRT, although the microscopic subsystems individually violate LRT, as well a converse example where the macroscopic observable does not satisfy LRT despite all microscopic subsystems satisfying LRT when uncoupled. This latter, maybe surprising, example is associated with emergent non-trivial dynamics of the macroscopic observable. We provide numerical evidence for our results on linear response as well as some analytical intuition"

16 days ago by cshalizi

[1907.12998] Approximation Capabilities of Neural Ordinary Differential Equations

18 days ago by rvenkat

Neural Ordinary Differential Equations have been recently proposed as an infinite-depth generalization of residual networks. Neural ODEs provide out-of-the-box invertibility of the mapping realized by the neural network, and can lead to networks that are more efficient in terms of computational time and parameter space. Here, we show that a Neural ODE operating on a space with dimensionality increased by one compared to the input dimension is a universal approximator for the space of continuous functions, at the cost of losing invertibility. We then turn our focus to invertible mappings, and we prove that any homeomorphism on a p-dimensional Euclidean space can be approximated by a Neural ODE operating on a (2p+1)-dimensional Euclidean space.

-- Makes me want to go back and dust off my old differential topology books. Surely, some of the embedding theorems address these issues.

neural_networks
deep_learning
ode
dynamical_systems
via:raginsky
18 days ago by rvenkat

[1905.10844] A randomized discretization of the nonlocal diffusion equation

28 days ago by rvenkat

Using our recent results on convergence of interacting dynamical systems on graphs to the continuum limit, we propose a numerical scheme for the initial value problem for a nonlocal diffusion equation. Our method is based on the approximation of the nonlocal term by averaging it over a set of small subdomains chosen randomly. We prove convergence of the numerical scheme and estimate the rate of convergence. Our method applies to models with low regularity of the kernel defining the nonlocal diffusion term.

pde
graph_limit
limit_objects
dynamical_systems
28 days ago by rvenkat

[1907.01552] Forecasting high-dimensional dynamics exploiting suboptimal embeddings

4 weeks ago by cshalizi

"Delay embedding---a method for reconstructing dynamical systems by delay coordinates---is widely used to forecast nonlinear time series as a model-free approach. When multivariate time series are observed, several existing frameworks can be applied to yield a single forecast combining multiple forecasts derived from various embeddings. However, the performance of these frameworks is not always satisfactory because they randomly select embeddings or use brute force and do not consider the diversity of the embeddings to combine. Herein, we develop a forecasting framework that overcomes these existing problems. The framework exploits various "suboptimal embeddings" obtained by minimizing the in-sample error via combinatorial optimization. The framework achieves the best results among existing frameworks for sample toy datasets and a real-world flood dataset. We show that the framework is applicable to a wide range of data lengths and dimensions. Therefore, the framework can be applied to various fields such as neuroscience, ecology, finance, fluid dynamics, weather, and disaster prevention."

to:NB
dynamical_systems
time_series
prediction
geometry_from_a_time_series
4 weeks ago by cshalizi

[1907.01681] Gradient flow formulations of discrete and continuous evolutionary models: a unifying perspective

4 weeks ago by cshalizi

"We consider three classical models of biological evolution: (i) the Moran process, an example of a reducible Markov Chain; (ii) the Kimura Equation, a particular case of a degenerated Fokker-Planck Diffusion; (iii) the Replicator Equation, a paradigm in Evolutionary Game Theory. While these approaches are not completely equivalent, they are intimately connected, since (ii) is the diffusion approximation of (i), and (iii) is obtained from (ii) in an appropriate limit. It is well known that the Replicator Dynamics for two strategies is a gradient flow with respect to the celebrated Shahshahani distance. We reformulate the Moran process and the Kimura Equation as gradient flows and in the sequel we discuss conditions such that the associated gradient structures converge: (i) to (ii) and (ii) to (iii). This provides a geometric characterisation of these evolutionary processes and provides a reformulation of the above examples as time minimization of free energy functionals."

to:NB
replicator_dynamics
evolutionary_biology
dynamical_systems
4 weeks ago by cshalizi

Regulation of harvester ant foraging as a closed-loop excitable system

7 weeks ago by rvenkat

Ant colonies regulate activity in response to changing conditions without using centralized control. Desert harvester ant colonies forage for seeds, and regulate foraging to manage a tradeoff between spending and obtaining water. Foragers lose water while outside in the dry air, but ants obtain water by metabolizing the fats in the seeds they eat. Previous work shows that the rate at which an outgoing forager leaves the nest depends on its recent rate of brief antennal contacts with incoming foragers carrying food. We examine how this process can yield foraging rates that are robust to uncertainty and responsive to temperature and humidity across minute-to-hour timescales. To explore possible mechanisms, we develop a low-dimensional analytical model with a small number of parameters that captures observed foraging behavior. The model uses excitability dynamics to represent response to interactions inside the nest and a random delay distribution to represent foraging time outside the nest. We show how feedback from outgoing foragers returning to the nest stabilizes the incoming and outgoing foraging rates to a common value determined by the volatility of available foragers. The model exhibits a critical volatility above which there is sustained foraging at a constant rate and below which foraging stops. To explain how foraging rates adjust to temperature and humidity, we propose that foragers modify their volatility after they leave the nest and become exposed to the environment. Our study highlights the importance of feedback in the regulation of foraging activity and shows how modulation of volatility can explain how foraging activity responds to conditions and varies across colonies. Our model elucidates the role of feedback across many timescales in collective behavior, and may be generalized to other systems driven by excitable dynamics, such as neuronal networks.

animal_behavior
stochastic_processes
dynamical_systems
deborah.gordon
control_theory
markov_models
distributed_computing
collective_animal_behavior
7 weeks ago by rvenkat

[1006.1265] Symbolic dynamics

7 weeks ago by rvenkat

This chapter presents some of the links between automata theory and symbolic dynamics. The emphasis is on two particular points. The first one is the interplay between some particular classes of automata, such as local automata and results on embeddings of shifts of finite type. The second one is the connection between syntactic semigroups and the classification of sofic shifts up to conjugacy.
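
-- A concrete handle on shifts of finite type, in the spirit of the automata connection: represent the allowed transitions by a matrix. For the golden-mean shift (binary sequences with no "11" factor), word counts are Fibonacci numbers and the topological entropy is the log of the Perron eigenvalue:

```python
import numpy as np

# Golden-mean shift as a shift of finite type: transitions 0->0, 0->1,
# 1->0 are allowed, 1->1 is forbidden.
T = np.array([[1, 1],
              [1, 0]])

def count_words(n):
    # number of admissible length-n words = sum of entries of T^(n-1)
    return int(np.linalg.matrix_power(T, n - 1).sum())

counts = [count_words(n) for n in range(1, 8)]
print(counts)                                   # 2, 3, 5, 8, 13, 21, 34
entropy = np.log(np.linalg.eigvals(T).real.max())
print(entropy)                                  # log((1+sqrt(5))/2) ~ 0.4812
```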

handbook
review
dynamical_systems
automata
7 weeks ago by rvenkat

Nonlinear Time Series Analysis with R - Hardcover - Ray Huffaker; Marco Bittelli; Rodolfo Rosa - Oxford University Press

7 weeks ago by cshalizi

"Nonlinear Time Series Analysis with R provides a practical guide to emerging empirical techniques allowing practitioners to diagnose whether highly fluctuating and random appearing data are most likely driven by random or deterministic dynamic forces. It joins the chorus of voices recommending 'getting to know your data' as an essential preliminary evidentiary step in modelling. Time series are often highly fluctuating with a random appearance. Observed volatility is commonly attributed to exogenous random shocks to stable real-world systems. However, breakthroughs in nonlinear dynamics raise another possibility: highly complex dynamics can emerge endogenously from astoundingly parsimonious deterministic nonlinear models. Nonlinear Time Series Analysis (NLTS) is a collection of empirical tools designed to aid practitioners detect whether stochastic or deterministic dynamics most likely drive observed complexity. Practitioners become 'data detectives' accumulating hard empirical evidence supporting their modelling approach.

"This book is targeted to professionals and graduate students in engineering and the biophysical and social sciences. Its major objectives are to help non-mathematicians--with limited knowledge of nonlinear dynamics--to become operational in NLTS; and in this way to pave the way for NLTS to be adopted in the conventional empirical toolbox and core coursework of the targeted disciplines. Consistent with modern trends in university instruction, the book makes readers active learners with hands-on computer experiments in R code directing them through NLTS methods and helping them understand the underlying logic. The computer code is explained in detail so that readers can adjust it for use in their own work. The book also provides readers with an explicit framework--condensed from sound empirical practices recommended in the literature--that details a step-by-step procedure for applying NLTS in real-world data diagnostics"

to:NB
books:noted
dynamical_systems
time_series
statistics
to_teach:data_over_space_and_time
"This book is targeted to professionals and graduate students in engineering and the biophysical and social sciences. Its major objectives are to help non-mathematicians--with limited knowledge of nonlinear dynamics--to become operational in NLTS; and in this way to pave the way for NLTS to be adopted in the conventional empirical toolbox and core coursework of the targeted disciplines. Consistent with modern trends in university instruction, the book makes readers active learners with hands-on computer experiments in R code directing them through NLTS methods and helping them understand the underlying logic. The computer code is explained in detail so that readers can adjust it for use in their own work. The book also provides readers with an explicit framework--condensed from sound empirical practices recommended in the literature--that details a step-by-step procedure for applying NLTS in real-world data diagnostics"

7 weeks ago by cshalizi

[1906.09211] Universal Approximation of Input-Output Maps by Temporal Convolutional Nets

7 weeks ago by rvenkat

There has been a recent shift in sequence-to-sequence modeling from recurrent network architectures to convolutional network architectures due to computational advantages in training and operation while still achieving competitive performance. For systems having limited long-term temporal dependencies, the approximation capability of recurrent networks is essentially equivalent to that of temporal convolutional nets (TCNs). We prove that TCNs can approximate a large class of input-output maps having approximately finite memory to arbitrary error tolerance. Furthermore, we derive quantitative approximation rates for deep ReLU TCNs in terms of the width and depth of the network and modulus of continuity of the original input-output map, and apply these results to input-output maps of systems that admit finite-dimensional state-space realizations (i.e., recurrent models).

Also this

http://www.dct.tue.nl/New/Wouw/SCL2004.pdf

(I might have seen his textbooks floating around in the used book markets back in the day. From MIR Publishers)

neural_networks
approximation_theory
maxim.raginsky
dynamical_systems
7 weeks ago by rvenkat

[1906.09069] First Exit Time Analysis of Stochastic Gradient Descent Under Heavy-Tailed Gradient Noise

7 weeks ago by rvenkat

Stochastic gradient descent (SGD) has been widely used in machine learning due to its computational efficiency and favorable generalization properties. Recently, it has been empirically demonstrated that the gradient noise in several deep learning settings admits a non-Gaussian, heavy-tailed behavior. This suggests that the gradient noise can be modeled by using α-stable distributions, a family of heavy-tailed distributions that appear in the generalized central limit theorem. In this context, SGD can be viewed as a discretization of a stochastic differential equation (SDE) driven by a Lévy motion, and the metastability results for this SDE can then be used for illuminating the behavior of SGD, especially in terms of `preferring wide minima'. While this approach brings a new perspective for analyzing SGD, it is limited in the sense that, due to the time discretization, SGD might admit a significantly different behavior than its continuous-time limit. Intuitively, the behaviors of these two systems are expected to be similar to each other only when the discretization step is sufficiently small; however, to the best of our knowledge, there is no theoretical understanding on how small the step-size should be chosen in order to guarantee that the discretized system inherits the properties of the continuous-time system. In this study, we provide formal theoretical analysis where we derive explicit conditions for the step-size such that the metastability behavior of the discrete-time system is similar to its continuous-time limit. We show that the behaviors of the two systems are indeed similar for small step-sizes and we identify how the error depends on the algorithm and problem parameters. We illustrate our results with simulations on a synthetic model and neural networks.
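
-- The α-stable modeling step is easy to reproduce: the Chambers-Mallows-Stuck sampler gives symmetric α-stable increments whose tails dwarf Gaussian ones, which is what drives the Lévy-flight metastability picture. A sketch (the sampler is the standard CMS construction for the symmetric case; the α value and threshold are illustrative):

```python
import numpy as np

# Symmetric alpha-stable samples via the Chambers-Mallows-Stuck method;
# for alpha < 2 the tails are polynomial, so large "jump" increments are
# vastly more common than under Gaussian noise.
rng = np.random.default_rng(5)

def sas(alpha, size):
    U = rng.uniform(-np.pi / 2, np.pi / 2, size)
    W = rng.exponential(1.0, size)
    return (np.sin(alpha * U) / np.cos(U) ** (1.0 / alpha)
            * (np.cos(U - alpha * U) / W) ** ((1.0 - alpha) / alpha))

n = 200_000
p_gauss = np.mean(np.abs(rng.standard_normal(n)) > 4.0)   # ~6e-5
p_stable = np.mean(np.abs(sas(1.7, n)) > 4.0)             # ~2-3e-2
print(p_gauss, p_stable)  # stable tails are orders of magnitude heavier
```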

neural_networks
optimization
dynamical_systems
stochastic_processes
first-passage-time
via:raginsky
7 weeks ago by rvenkat
