cshalizi + algebra (24)

[1909.13754] Identifiability in Phylogenetics using Algebraic Matroids
"Identifiability is a crucial property for a statistical model since distributions in the model uniquely determine the parameters that produce them. In phylogenetics, the identifiability of the tree parameter is of particular interest since it means that phylogenetic models can be used to infer evolutionary histories from data. In this paper we introduce a new computational strategy for proving the identifiability of discrete parameters in algebraic statistical models that uses algebraic matroids naturally associated to the models. We then use this algorithm to prove that the tree parameters are generically identifiable for 2-tree CFN and K3P mixtures. We also show that the k-cycle phylogenetic network parameter is identifiable under the K2P and K3P models."
to:NB  identifiability  phylogenetics  algebra  statistics 
20 days ago by cshalizi
[1404.2035] Strongly continuous and locally equi-continuous semigroups on locally convex spaces
"We consider locally equi-continuous strongly continuous semigroups on locally convex spaces (X,tau). First, we show that if (X,tau) has the property that weak* compact sets of the dual are equi-continuous, then strong continuity of the semigroup is equivalent to weak continuity and local equi-continuity.
"Second, we consider locally convex spaces (X,tau) that are also equipped with a `suitable' auxiliary norm. We introduce the set N of tau continuous semi-norms that are bounded by the norm. If (X,tau) has the property that N is closed under countable convex combinations, then a number of Banach space results can be generalised in a straightforward way. Importantly, we extend the Hille-Yosida theorem.
"We apply the results to the study of transition semigroups of Markov processes on complete separable metric spaces."
to:NB  algebra  markov_models  stochastic_processes  re:almost_none 
5 weeks ago by cshalizi
[1811.01394] A method to construct exponential families by representation theory
"In this paper, we give a method to construct "good" exponential families systematically by representation theory. More precisely, we consider a homogeneous space G/H as a sample space and construct an exponential family invariant under the transformation group G by using a representation of G. The method generates widely used exponential families such as normal, gamma, Bernoulli, categorical, Wishart, von Mises, Fisher-Bingham and hyperboloid distributions."
to:NB  exponential_families  algebra  statistics  probability 
8 weeks ago by cshalizi
[1302.5484] Input-output equivalence and identifiability: some simple generalizations of the differential algebra approach
"In this paper, we give an overview of the differential algebra approach to identifiability, and then note a very simple observation about input-output equivalence and identifiability, that describes the identifiability equivalence between input-output equivalent models. We then give several simple consequences of this observation that can be useful in showing identifiability, including examining non-first order ODE models, nondimensionalization and rescaling, model reducibility, and a modular approach to evaluating identifiability. We also examine how input-output equivalence can allow us to generate input output equations in the differential algebra approach through a wider range of methods (e.g. substitution and differential or standard Groebner basis approaches)."
to:NB  state-space_models  identifiability  algebra  statistics 
8 weeks ago by cshalizi
[1908.05483] Algebraic Coarse-Graining of Biochemical Reaction Networks
"Biological systems exhibit processes on a wide range of time and length scales. This work demonstrates that models, wherein the interaction between system constituents is captured by algebraic operations, inherently allow for successive coarse-graining operations through quotients of the algebra. Thereby, the class of model is retained and all possible coarse-graining operations are encoded in the lattice of congruences of the model. We analyze a class of algebraic models generated by the subsequent and simultaneous catalytic functions of chemicals within a reaction network. Our ansatz yields coarse-graining operations that cover the network with local functional patches and delete the information about the environment, and complementary operations that resolve only the large-scale functional structure of the network. Finally, we present a geometric interpretation of the algebraic models through an analogy with classical models on vector fields. We then use the geometric framework to show how a coarse-graining of the algebraic model naturally leads to a coarse-graining of the state-space. The framework developed here is aimed at the study of the functional structure of cellular reaction networks spanning a wide range of scales."
to:NB  biochemical_networks  algebra 
9 weeks ago by cshalizi
[1908.04642] Semigroup Models for Biochemical Reaction Networks
"The CRS (chemical reaction system) formalism by Hordijk and Steel is a versatile method to model self-sustaining biochemical reaction networks. Its distinguishing feature is the explicit assignment of catalytic function to chemicals that are part of the network. In this work, we show the introduction of subsequent and simultaneous catalytic functions gives rise to an algebraic structure of a semigroup with additional compatible data of idempotent addition and a partial order. The aim of this paper is to demonstrate that such semigroup models are a natural setup to treat the emergence of autocatalytic biochemical reaction networks. We establish the basic algebraic properties of the algebraic models and show that it is natural to define the function of any subnetwork on the whole reaction network in a mathematically correct way. This leads to a natural discrete dynamics on the network, which results from iteratively considering the self-action on a subnetwork by its own function. Finally, we demonstrate that the identification of the maximal self-sustaining subnetwork of any reaction network is very straightforward in our setup. This leads to an algebraic characterization of the lattice of self-sustaining subsets for any CRS. Moreover, we show that algebraic models for reaction networks with a self-sustaining subnetwork cannot be nilpotent, thus establishing a link to the combinatorial theory of finite semigroups."
to:NB  biochemical_networks  algebra 
9 weeks ago by cshalizi
[1810.05575] Joining and decomposing reaction networks
"In systems and synthetic biology, much research has focused on the behavior and design of single pathways, while, more recently, experimental efforts have focused on how cross-talk (coupling two or more pathways) or inhibiting molecular function (isolating one part of the pathway) affects systems-level behavior. However, the theory for tackling these larger systems in general has lagged behind. Here, we analyze how joining networks (e.g., cross-talk) or decomposing networks (e.g., inhibition or knock-outs) affects three properties that reaction networks may possess---identifiability (recoverability of parameter values from data), steady-state invariants (relationships among species concentrations at steady state, used in model selection), and multistationarity (capacity for multiple steady states, which correspond to multiple cell decisions). Specifically, we prove results that clarify, for a network obtained by joining two smaller networks, how properties of the smaller networks can be inferred from or can imply similar properties of the original network. Our proofs use techniques from computational algebraic geometry, including elimination theory and differential algebra."
to:NB  biochemical_networks  algebra  dynamical_systems 
9 weeks ago by cshalizi
Codes and automata | Algebra | Cambridge University Press
"This major revision of Berstel and Perrin's classic Theory of Codes has been rewritten with a more modern focus and a much broader coverage of the subject. The concept of unambiguous automata, which is intimately linked with that of codes, now plays a significant role throughout the book, reflecting developments of the last 20 years. This is complemented by a discussion of the connection between codes and automata, and new material from the field of symbolic dynamics. The authors have also explored links with more practical applications, including data compression and cryptography. The treatment remains self-contained: there is background material on discrete mathematics, algebra and theoretical computer science. The wealth of exercises and examples make it ideal for self-study or courses. In summary, this is a comprehensive reference on the theory of variable-length codes and their relation to automata."
to:NB  books:noted  automata_theory  information_theory  algebra  mathematics 
january 2019 by cshalizi
Bhatia, R.: Positive Definite Matrices
"This book represents the first synthesis of the considerable body of new research into positive definite matrices. These matrices play the same role in noncommutative analysis as positive real numbers do in classical analysis. They have theoretical and computational uses across a broad spectrum of disciplines, including calculus, electrical engineering, statistics, physics, numerical analysis, quantum information theory, and geometry. Through detailed explanations and an authoritative and inspiring writing style, Rajendra Bhatia carefully develops general techniques that have wide applications in the study of such matrices.
"Bhatia introduces several key topics in functional analysis, operator theory, harmonic analysis, and differential geometry--all built around the central theme of positive definite matrices. He discusses positive and completely positive linear maps, and presents major theorems with simple and direct proofs. He examines matrix means and their applications, and shows how to use positive definite functions to derive operator inequalities that he and others proved in recent years. He guides the reader through the differential geometry of the manifold of positive definite matrices, and explains recent work on the geometric mean of several matrices."
to:NB  books:noted  mathematics  algebra  linear_regression  re:g_paper  statistics 
september 2015 by cshalizi
Recovery from selection bias using marginal structure in discrete models
"This paper considers the problem of inferring a discrete joint distribution from a sample subject to selection. Abstractly, we want to identify a distribution p(x, w) from its condi- tional p(x | w). We introduce new assump- tions on the marginal model for p(x), un- der which generic identification is possible. These assumptions are quite general and can easily be tested; they do not require pre- cise background knowledge of p(x) or p(w), such as proportions estimated from previous studies. We particularly consider conditional independence constraints, which often arise from graphical and causal models, although other constraints can also be used. We show that generic identifiability of causal effects is possible in a much wider class of causal mod- els than had previously been known."
in_NB  graphical_models  identifiability  statistics  categorical_data  partial_identification  algebra  heard_the_talk  didelez.vanessa 
july 2015 by cshalizi
Lions, Nisio: A uniqueness result for the semigroup associated with the Hamilton-Jacobi-Bellman operator
b/c the Feng and Kurtz book on large deviations for stochastic processes (e.g., evolutionary models) seems to presume the reader knows what a "Nisio semigroup" is, and I don't.

ETA: I should have known better than to expect reading pure mathematicians would be clarifying.
stochastic_processes  stochastic_differential_equations  control_theory_and_control_engineering  algebra  have_read 
january 2015 by cshalizi
[1412.6621] Why does Deep Learning work? - A perspective from Group Theory
"Why does Deep Learning work? What representations does it capture? How do higher-order representations emerge? We study these questions from the perspective of group theory, thereby opening a new approach towards a theory of Deep learning.
"One factor behind the recent resurgence of the subject is a key algorithmic step called pre-training: first search for a good generative model for the input samples, and repeat the process one layer at a time. We show deeper implications of this simple principle, by establishing a connection with the interplay of orbits and stabilizers of group actions. Although the neural networks themselves may not form groups, we show the existence of {\em shadow} groups whose elements serve as close approximations.
"Over the shadow groups, the pre-training step, originally introduced as a mechanism to better initialize a network, becomes equivalent to a search for features with minimal orbits. Intuitively, these features are in a way the {\em simplest}. Which explains why a deep learning network learns simple features first. Next, we show how the same principle, when repeated in the deeper layers, can capture higher order representations, and why representation complexity increases as the layers get deeper."

- Slightly dubious, but S.V. is always worth attending to.
to:NB  neural_networks  algebra  machine_learning 
january 2015 by cshalizi
An Algebraic Process for Visual Representation Design
"We present a model of visualization design based on algebraic considerations of the visualization process. The model helps characterize visual encodings, guide their design, evaluate their effectiveness, and highlight their shortcomings. The model has three components: the underlying mathematical structure of the data or object being visualized, the concrete representation of the data in a computer, and (to the extent possible) a mathematical description of how humans perceive the visualization. Because we believe the value of our model lies in its practical application, we propose three general principles for good visualization design. We work through a collection of examples where our model helps explain the known properties of existing visualizations methods, both good and not-so-good, as well as suggesting some novel methods. We describe how to use the model alongside experimental user studies, since it can help frame experiment outcomes in an actionable manner. Exploring the implications and applications of our model and its design principles should provide many directions for future visualization research."
to:NB  to_read  visual_display_of_quantitative_information  representation  to_teach:statcomp  to_teach:data-mining  algebra 
september 2014 by cshalizi
Modern Algebra and the Rise of Mathematical Structures
"The notion of a mathematical structure is among the most pervasive ones in twentieth-century mathematics. Modern Algebra and the Rise of Mathematical Structures describes two stages in the historical development of this notion: first, it traces its rise in the context of algebra from the mid-nineteenth century to its consolidation by 1930, and then it considers several attempts to formulate elaborate theories after 1930 aimed at elucidating, from a purely mathematical perspective, the precise meaning of this idea.
"Part one dicusses the process whereby the aims and scope of the discipline of algebra were deeply transformed, turning it into that branch of mathematics dealing with a new kind of mathematical entities: the "algebraic structures". The transition from the classical, nineteenth-century, image of the discipline to the thear of ideals, from Richard Dedekind to Emmy Noether, and culminating with the publication in 1930 of Bartel L. van der Waerden's Moderne Algebra. Following its enormous success in algebra, the structural approach has been widely adopted in other mathematical domains since 1930s. But what is a mathematical structure and what is the place of this notion within the whole fabric of mathematics? Part Two describes the historical roots, the early stages and the interconnections between three attempts to address these questions from a purely formal, mathematical perspective: Oystein Ore's lattice-theoretical theory of structures, Nicolas Bourbaki's theory of structures, and the theory of categories and functors."
in_NB  books:noted  history_of_mathematics  mathematics  algebra  category_theory 
may 2014 by cshalizi
[1304.6659] Dimensional analysis using toric ideals
"Classical dimensional analysis is one of the cornerstones of qualitative physics and is also used in the analysis of engineering systems, for example in engineering design. The basic power product relationship in dimensional analysis is identical to one way of defining toric ideals in algebraic geometry, a large and growing field. This paper exploits the toric representation to provide a method for automatic dimensional analysis for engineering systems. In particular all "primitive", invariants for a particular problem, in a well defined sense, can be found using such methods."
to:NB  dimensional_analysis  algebra  algebraic_geometry 
april 2013 by cshalizi
[1207.5910] Groups related to Gaussian graphical models
"Gaussian graphical models have become a well-recognized tool for the analysis of conditional independencies within a set of continuous random variables. We describe the maximal group acting on the space of covariance matrices, which leaves this model invariant for any fixed graph. This group furnishes a representation of Gaussian graphical models as composite transformation families and enables to analyze properties of parameter estimators. We use these results in the robustness analysis to compute upper bounds on finite sample breakdown points of equivariant estimators of the covariance matrix. In addition we provide conditions on the sample size so that an equivariant estimator exists with probability 1."
to:NB  algebra  graphical_models  probability  statistics  variance_estimation 
august 2012 by cshalizi
[1207.4814] Automorphism Groups of Graphical Models and Lifted Variational Inference
"Using the theory of group action, we first introduce the concept of the automorphism group of an exponential family or a graphical model, thus formalizing the general notion of symmetry of a probabilistic model. This automorphism group provides a precise mathematical framework for lifted inference in the general exponential family. Its group action partitions the set of random variables and feature functions into equivalent classes (called orbits) having identical marginals and expectations. Then the inference problem is effectively reduced to that of computing marginals or expectations for each class, thus avoiding the need to deal with each individual variable or feature. We demonstrate the usefulness of this general framework in lifting two classes of variational approximation for MAP inference: local LP relaxation and local LP relaxation with cycle constraints; the latter yields the first lifted inference that operate on a bound tighter than local constraints. Initial experimental results demonstrate that lifted MAP inference with cycle constraints achieved the state of the art performance, obtaining much better objective function values than local approximation while remaining relatively efficient."
to:NB  graphical_models  exponential_families  identifiability  statistics  algebra 
july 2012 by cshalizi
Bingham : Multivariate prediction and matrix Szegö theory
"Following the recent survey by the same author of Szegö’s theorem and orthogonal polynomials on the unit circle (OPUC) in the scalar case, we survey the corresponding multivariate prediction theory and matrix OPUC (MOPUC)."
to:NB  probability  statistics  algebra 
july 2012 by cshalizi
Bingham : Szegö’s theorem and its probabilistic descendants
"The theory of orthogonal polynomials on the unit circle (OPUC) dates back to Szegö’s work of 1915-21, and has been given a great impetus by the recent work of Simon, in particular his survey paper and three recent books; we allude to the title of the third of these, Szegö’s theorem and its descendants, in ours. Simon’s motivation comes from spectral theory and analysis. Another major area of application of OPUC comes from probability, statistics, time series and prediction theory; see for instance the classic book by Grenander and Szegö, Toeplitz forms and their applications. Coming to the subject from this background, our aim here is to complement this recent work by giving some probabilistically motivated results. We also advocate a new definition of long-range dependence."
to:NB  statistics  probability  algebra 
july 2012 by cshalizi
[1111.1352] Max-plus objects to study the complexity of graphs
"Given an undirected graph $G$, we define a new object $H_G$, called the mp-chart of $G$, in the max-plus algebra. We use it, together with the max-plus permanent, to describe the complexity of graphs. We show how to compute the mean and the variance of $H_G$ in terms of the adjacency matrix of $G$ and we give a central limit theorem for $H_G$. Finally, we show that the mp-chart is easily tractable also for the complement graph." --- I have no idea what this means either.
in_NB  network_data_analysis  graph_theory  algebra 
november 2011 by cshalizi
