**rather-interesting** (2078)

Figuring out when you can do a puzzle. – Occupy Math

4 days ago by Vaguery

This week’s Occupy Math looks at a type of puzzle where you want to fill a rectangle with a shape. Our example shape will be the L-shaped 3-square polyomino, shown below filling a 5×9 rectangle. The goal is to figure out every possible size of rectangle that can be filled with this shape. If you are constructing puzzles for other people — e.g., your students — knowing which problems can be solved gives you an edge. The post will not only solve the problem for our example shape, but also give you tools for doing the same for other shapes. The answers, and the tools, are at the bottom if you don’t feel like working through the reasoning.
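The feasibility question in the post can also be checked by brute force. Below is a minimal backtracking sketch (the function names are mine, not the post's) that tests whether a given rectangle can be filled with copies of the L-shaped 3-square piece:

```python
# The four orientations of the L-shaped tromino, as (row, col) offsets
# from the first cell encountered in reading order.
ORIENTATIONS = [
    [(0, 0), (1, 0), (1, 1)],
    [(0, 0), (0, 1), (1, 0)],
    [(0, 0), (0, 1), (1, 1)],
    [(0, 0), (1, 0), (1, -1)],
]

def can_tile(rows, cols):
    """Exhaustive backtracking: can a rows x cols rectangle be tiled
    by L-shaped trominoes?"""
    if (rows * cols) % 3 != 0:
        return False  # area must be a multiple of 3
    grid = [[False] * cols for _ in range(rows)]

    def first_empty():
        for r in range(rows):
            for c in range(cols):
                if not grid[r][c]:
                    return r, c
        return None

    def solve():
        cell = first_empty()
        if cell is None:
            return True  # every cell is covered
        r, c = cell
        for shape in ORIENTATIONS:
            cells = [(r + dr, c + dc) for dr, dc in shape]
            if all(0 <= rr < rows and 0 <= cc < cols and not grid[rr][cc]
                   for rr, cc in cells):
                for rr, cc in cells:
                    grid[rr][cc] = True
                if solve():
                    return True
                for rr, cc in cells:
                    grid[rr][cc] = False  # backtrack
        return False

    return solve()
```

For example, `can_tile(2, 3)` is `True` (two pieces form a 2×3 block), while `can_tile(3, 3)` is `False` even though 9 is divisible by 3, exactly the kind of case the post's tools help you reason about.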

mathematical-recreations
polyominoes
proof
rather-interesting
nudge-targets
consider:classification
consider:feature-discovery
4 days ago by Vaguery

A Note about Cognitive Effort and Misinfo (Oh, and also I’m a Rita Allen Misinformation Solutions Forum Finalist) | Hapgood

4 days ago by Vaguery

I’m not necessarily sold on the Pennycook and Rand version of this idea, but I’m interested in the broader insight. I know it doesn’t explain the worst offenders, but I’ve found with those I work with that cynicism (“Pick what you want, it’s all bullshit!”) is often driven by the cognitive exhaustion of sorting through conflicting information. This insight also aligns with Hannah Arendt’s work — totalitarianism wins the information war by deliberately overwhelming the capacity of a population to reconcile endless contradictions. The contradictions are a tool to increase the cost of pursuing truth relative to other options.

If this is the case, one approach might be to encourage people to be more effortful when looking at online media. (Meh.) But the approach I favor is to reduce both the real and perceived cost of sorting through the muck through finding cheap, good enough methods and popularizing them. Doing that — while fostering a culture that values accuracy — might cause a few more people to regard the cost of checking something to be worth it relative to other seemingly more economical options like partisan heuristics, conspiracy thinking, or cynical nihilism.

cultural-dynamics
politics
reality-criticism
affordances
user-experience
social-psychology
argumentation
rather-interesting
to-write-about
4 days ago by Vaguery

Bandit Algorithms

5 days ago by Vaguery

Nearly two years after starting the blog, we have at last completed a first draft of the book, which is to be published by Cambridge University Press.

The book is available for free as a PDF and will remain so after publication. We’re grateful to Cambridge for allowing this.

Without further ado, here is the link.

probability-theory
online-learning
book
rather-interesting
to-read
algorithms
to-write-about

5 days ago by Vaguery

[1806.07366] Neural Ordinary Differential Equations

5 days ago by Vaguery

We introduce a new family of deep neural network models. Instead of specifying a discrete sequence of hidden layers, we parameterize the derivative of the hidden state using a neural network. The output of the network is computed using a black-box differential equation solver. These continuous-depth models have constant memory cost, adapt their evaluation strategy to each input, and can explicitly trade numerical precision for speed. We demonstrate these properties in continuous-depth residual networks and continuous-time latent variable models. We also construct continuous normalizing flows, a generative model that can train by maximum likelihood, without partitioning or ordering the data dimensions. For training, we show how to scalably backpropagate through any ODE solver, without access to its internal operations. This allows end-to-end training of ODEs within larger models.
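As a toy illustration of the idea (not the authors' code: the weights below are random placeholders, and the solver is a fixed-step RK4 rather than the adaptive black-box solver the paper assumes), the "output layer" is produced by integrating a small network's output over time:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # hidden-state dimension

# A tiny MLP parameterizing dh/dt = f(h; theta); untrained random weights.
W1, b1 = rng.normal(size=(8, d)) / 8, np.zeros(8)
W2, b2 = rng.normal(size=(d, 8)) / 8, np.zeros(d)

def f(h):
    return W2 @ np.tanh(W1 @ h + b1) + b2

def odeint_rk4(h0, t0=0.0, t1=1.0, steps=100):
    """Fixed-step RK4 integration of dh/dt = f(h) from t0 to t1."""
    h, dt = h0, (t1 - t0) / steps
    for _ in range(steps):
        k1 = f(h)
        k2 = f(h + 0.5 * dt * k1)
        k3 = f(h + 0.5 * dt * k2)
        k4 = f(h + dt * k3)
        h = h + (dt / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
    return h

h0 = rng.normal(size=d)   # "input" = initial hidden state
h1 = odeint_rk4(h0)       # "output" = hidden state at t = 1
```

The number of solver steps plays the role that depth plays in an ordinary residual network, which is what lets an adaptive solver trade precision for speed.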

neural-networks
machine-learning
representation
rather-interesting
deep-learning
to-understand
consider:representation
to-write-about
5 days ago by Vaguery

A tale of three machines | The Math Less Traveled

7 days ago by Vaguery

The punchline is that this is not true!! That is,

It is possible to make working primality machines that truly do not know anything about any factors of the number being tested.

Not only that, but we know how to make primality machines that run much faster than the fastest known factor machines!

number-theory
algorithms
rather-interesting
nudge-targets
proof
to-write-about

7 days ago by Vaguery

The Fermat primality test | The Math Less Traveled

7 days ago by Vaguery

So this is better than nothing, but it’s not quite a primality machine, because it can’t tell us for sure that a number is prime. And it leaves a lot more questions: could we run enough trials to know for sure whether a number is prime? How many would that take? What about for composite numbers; how fast do we expect this to be? Are there other ways to build on this basic idea to get better (faster, more certain) primality tests? I’ll write about all this and more in future posts!
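The test in question fits in a few lines. A minimal sketch (my own naming, not the post's code): pick random bases a and check Fermat's little theorem, which says a^(n-1) ≡ 1 (mod n) whenever n is prime and a is not a multiple of n:

```python
import random

def fermat_test(n, trials=20):
    """Fermat primality test: a failed trial proves n composite, but
    passing every trial only makes n *probably* prime."""
    if n < 4:
        return n in (2, 3)
    for _ in range(trials):
        a = random.randrange(2, n - 1)
        if pow(a, n - 1, n) != 1:   # three-argument pow: modular exponentiation
            return False            # a is a Fermat witness: n is composite
    return True                     # probably prime
```

A `False` answer is a proof of compositeness; a `True` answer is only probable primality, since Carmichael numbers such as 561 can pass for every base coprime to them.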

number-theory
algorithms
rather-interesting
to-write-about
nudge-targets
7 days ago by Vaguery

About · Compositionality

15 days ago by Vaguery

Compositionality describes and quantifies how complex things can be assembled out of simpler parts. Compositionality, the journal, is a new open-access journal for research using compositional ideas, most notably of a category-theoretic origin, in any discipline. Topics may concern foundational structures, an organizing principle, or a powerful tool. Example areas include but are not limited to: computation, logic, physics, chemistry, engineering, linguistics, and cognition.

open-access
journals
publishing
nonprofit
rather-interesting
to-understand
15 days ago by Vaguery

Theorem of the Day

15 days ago by Vaguery

The list is presented here in reverse chronological order, so that new additions appear at the top. This is not the order in which the theorem of the day is picked; that ordering is designed to mix up the different areas of mathematics and the levels of abstraction or technicality involved. The way the list of theorems is indexed is described here.

mathematics
proof
lists
rather-interesting
nudge-targets
consider:looking-to-see
15 days ago by Vaguery

Color Problems: A Republished Tome Reveals the Color Wisdom and Poetics of 19th-Century Artist Emily Noyes Vanderpoel | Colossal

25 days ago by Vaguery

In 1901 artist and historian Emily Noyes Vanderpoel (1842–1939) published the painting manual Color Problems: A Practical Manual for the Lay Student of Color under the guise of flower painting and decorative arts, subjects that were appropriate for a woman of her time. The study provided an extensive look at color theory ideas of the early 20th century. Her research-based techniques were later used and circulated by men without mention of her name, and are now commonly used in art curricula. Many of the included studies predict design and art trends that wouldn’t occur for several decades, such as a concentric-square format that predates Josef Albers’s Homage to the Square by fifty years.

color
art-history
rather-interesting
to-read
25 days ago by Vaguery

[1801.10139] Analysis of the Continued Logarithm Algorithm

5 weeks ago by Vaguery

The Continued Logarithm Algorithm (CL for short), introduced by Gosper in 1978, computes the gcd of two integers; it seems very efficient, as it performs only shifts and subtractions. Shallit studied its worst-case complexity in 2016 and showed it to be linear. Here we perform the average-case analysis of the algorithm: we study its main parameters (number of iterations, total number of shifts) and obtain precise asymptotics for their mean values. Our 'dynamical' analysis involves the dynamical system underlying the algorithm, which produces continued fraction expansions whose quotients are powers of 2. Even though this CL system has already been studied by Chan (around 2005), the presence of powers of 2 in the quotients ingrains into the central parameters a dyadic flavour that cannot be grasped solely by studying the CL system. We thus introduce a dyadic component and deal with a two-component system. With this new mixed system at hand, we then provide a complete average-case analysis of the CL algorithm, with explicit constants.
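The CL recurrence itself is not reproduced here, but the shift-and-subtract flavor the abstract describes is easy to see in the closely related binary gcd (Stein's algorithm), a different algorithm that likewise avoids division entirely:

```python
def binary_gcd(a, b):
    """Stein's binary gcd: only shifts, subtractions, and parity tests."""
    if a == 0:
        return b
    if b == 0:
        return a
    shift = 0
    while (a | b) & 1 == 0:   # factor out common powers of 2
        a >>= 1
        b >>= 1
        shift += 1
    while a & 1 == 0:         # a need not keep its private factors of 2
        a >>= 1
    while b:
        while b & 1 == 0:
            b >>= 1
        if a > b:
            a, b = b, a
        b -= a                # both odd here, so b - a is even
    return a << shift         # restore the common powers of 2
```

The worst-case and average-case questions the paper asks about CL have exact analogues here: how many shifts and subtractions does a typical input cost?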

number-theory
numerical-methods
representation
computer-science
computational-complexity
rather-interesting
algorithms
continued-fractions
to-write-about
5 weeks ago by Vaguery

[1710.00217] A Framework for Inferring Combination Lock Codes using Smartwatches

5 weeks ago by Vaguery

Wrist-wearables such as smartwatches and fitness bands are equipped with a variety of high-precision sensors that enable collection of rich contextual information related to the wearer and his/her surroundings and support a variety of novel context- and activity-based applications. The presence of such a diverse set of on-board sensors, however, also exposes an additional attack surface which, if not adequately protected, could be potentially exploited to leak private user information. In this paper, we comprehensively investigate the feasibility of a new vulnerability that attempts to take advantage of a wrist-wearable's seemingly innocuous and poorly regulated motion sensors to infer a user's input on mechanical devices typically used to secure physical access, for example, combination locks. In this direction, we outline two motion-based inference frameworks: i) a deterministic attack framework that attempts to infer a lock's unlock combination from the wrist motion (specifically, angular displacement) data obtained from a wrist-wearable's gyroscope sensor, and ii) a probabilistic attack framework that extends the output of the deterministic framework to produce a ranked list of likely unlock combinations. Further, we conduct a thorough empirical evaluation of the proposed frameworks by employing unlocking-related motion data collected from human subject participants in a variety of controlled and realistic settings. Evaluation results from these experiments demonstrate that motion data from wrist-wearables can be effectively employed as an information side-channel to significantly reduce the unlock combination search-space of commonly-found combination locks, thus compromising the physical security provided by these locks.

security
inference
to-write-about
inverse-problems
rather-interesting
nudge-targets
feature-extraction
5 weeks ago by Vaguery

[1803.08530] A Coloring Book Approach to Finding Coordination Sequences

5 weeks ago by Vaguery

An elementary method is described for finding the coordination sequences for a tiling, based on coloring the underlying graph. We illustrate the method by first applying it to the two kinds of vertices (tetravalent and trivalent) in the Cairo (or dual-3^2.4.3.4) tiling. The coordination sequence for a tetravalent vertex turns out, surprisingly, to be 1, 4, 8, 12, 16, ..., the same as for a vertex in the familiar square (or 4^4) tiling. We thought that such a simple fact should have a simple proof, and this article is the result. We also apply the method to obtain coordination sequences for the 3^2.4.3.4, 3.4.6.4, 4.8^2, 3.12^2, and 3^4.6 uniform tilings, as well as the snub-632 and bew tilings. In several cases the results provide proofs for previously conjectured formulas.
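Coordination sequences are also easy to compute empirically by breadth-first search, counting the vertices in each successive shell around a start vertex. A sketch for the square (4^4) tiling mentioned in the abstract (the published sequences prepend a 1 for the starting vertex itself; function names are mine):

```python
def coordination_sequence(start, neighbors, terms):
    """Shell sizes around `start`: the k-th entry counts vertices at
    graph distance exactly k (for k = 1 .. terms)."""
    seen = {start}
    frontier = [start]
    seq = []
    for _ in range(terms):
        nxt = []
        for v in frontier:
            for w in neighbors(v):
                if w not in seen:
                    seen.add(w)
                    nxt.append(w)
        seq.append(len(nxt))
        frontier = nxt
    return seq

# Square (4^4) tiling: each vertex has 4 axis-aligned neighbors.
def square_nbrs(v):
    x, y = v
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

coordination_sequence((0, 0), square_nbrs, 5)  # [4, 8, 12, 16, 20]
```

Swapping in a neighbor function for the Cairo tiling's tetravalent vertices would let you check the paper's surprising coincidence numerically before reading the proof.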

combinatorics
feature-construction
representation
rather-interesting
enumeration
to-write-about
mathematical-recreations
consider:random-graphs
consider:non-tilings
5 weeks ago by Vaguery

[1806.00521] The lemniscate tree of a random polynomial

6 weeks ago by Vaguery

To each generic complex polynomial p(z) there is associated a labeled binary tree (here referred to as a "lemniscate tree") that encodes the topological type of the graph of |p(z)|. The branching structure of the lemniscate tree is determined by the configuration (i.e., arrangement in the plane) of the singular components of those level sets |p(z)|=t passing through a critical point.

In this paper, we address the question "How many branches appear in a typical lemniscate tree?" We answer this question first for a lemniscate tree sampled uniformly from the combinatorial class and second for the lemniscate tree arising from a random polynomial generated by i.i.d. zeros. From a more general perspective, these results take a first step toward a probabilistic treatment (within a specialized setting) of Arnold's program of enumerating algebraic Morse functions.

geometry
topology
rather-interesting
probability-theory
data-structures
feature-construction
to-understand
to-write-about

6 weeks ago by Vaguery

[1806.10909] ResNet with one-neuron hidden layers is a Universal Approximator

6 weeks ago by Vaguery

We demonstrate that a very deep ResNet with stacked modules with one neuron per hidden layer and ReLU activation functions can uniformly approximate any Lebesgue integrable function in d dimensions, i.e. ℓ¹(ℝᵈ). Because of the identity mapping inherent to ResNets, our network has alternating layers of dimension one and d. This stands in sharp contrast to fully connected networks, which are not universal approximators if their width is the input dimension d [Lu et al., 2017]. Hence, our result implies an increase in representational power for narrow deep networks by the ResNet architecture.

representation
neural-networks
approximation
rather-interesting
to-write-about
to-do
6 weeks ago by Vaguery

Journey of a Single Line – BLDGBLOG

6 weeks ago by Vaguery

“When he was eighty-five,” Sarah Cowan writes in a review of a show mounted by the Miguel Abreu Gallery in New York City, “Wacław Szpakowski wrote a treatise for a lifetime project that no one had known about. Titled ‘Rhythmical Lines,’ it describes a series of labyrinthine geometrical abstractions, each one produced from a single continuous line. He’d begun these drawings around 1900, when he was just seventeen—what started as sketches he then formalized, compiled, and made ever more intricate over the course of his life.”

abstraction
design
art
constraints
rather-interesting
to-write-about
consider:performance-measures
6 weeks ago by Vaguery

[1705.04353] On the records

7 weeks ago by Vaguery

World record setting has long attracted public interest and scientific investigation. Extremal records summarize the limits of the space explored by a process, and the historical progression of a record sheds light on the underlying dynamics of the process. Existing analyses of prediction, statistical properties, and ultimate limits of record progressions have focused on particular domains. However, a broad perspective on how record progressions vary across different spheres of activity needs further development. Here we employ cross-cutting metrics to compare records across a variety of domains, including sports, games, biological evolution, and technological development. We find that these domains exhibit characteristic statistical signatures in terms of rates of improvement, "burstiness" of record-breaking time series, and the acceleration of the record breaking process. Specifically, sports and games exhibit the slowest rate of improvement and a wide range of rates of "burstiness." Technology improves at a much faster rate and, unlike other domains, tends to show acceleration in records. Many biological and technological processes are characterized by constant rates of improvement, showing less burstiness than sports and games. It is important to understand how these statistical properties of record progression emerge from the underlying dynamics. Towards this end, we conduct a detailed analysis of a particular record-setting event: elite marathon running. In this domain, we find that studying record-setting data alone can obscure many of the structural properties of the underlying process. The marathon study also illustrates how some of the standard statistical assumptions underlying record progression models may be inappropriate or commonly violated in real-world datasets.
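The basic objects of such an analysis are simple to extract from any scalar series: the record progression (the points where a new best is set) and the waiting times between records, whose dispersion is what burstiness measures summarize. A minimal sketch (my naming, not the paper's code):

```python
def record_progression(series):
    """Return (index, value) pairs where a new strict maximum is set."""
    records = []
    best = float("-inf")
    for i, x in enumerate(series):
        if x > best:
            best = x
            records.append((i, x))
    return records

def inter_record_times(series):
    """Waiting times between successive record-setting events."""
    idx = [i for i, _ in record_progression(series)]
    return [b - a for a, b in zip(idx, idx[1:])]
```

For example, `record_progression([3, 1, 4, 1, 5, 9, 2, 6])` yields `[(0, 3), (2, 4), (4, 5), (5, 9)]`, and the waiting times `[2, 2, 1]` are the raw material for the burstiness and acceleration statistics the paper compares across domains.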

extreme-values
time-series
feature-extraction
rather-interesting
dynamical-systems
to-understand
7 weeks ago by Vaguery

[1805.07360] Prediction in Projection: A new paradigm in delay-coordinate reconstruction

7 weeks ago by Vaguery

Delay-coordinate embedding is a powerful, time-tested mathematical framework for reconstructing the dynamics of a system from a series of scalar observations. Most of the associated theory and heuristics are overly stringent for real-world data, however, and real-time use is out of the question due to the expert human intuition needed to use these heuristics correctly. The approach outlined in this thesis represents a paradigm shift away from that traditional approach. I argue that perfect reconstructions are not only unnecessary for the purposes of delay-coordinate based forecasting, but that they can often be less effective than reduced-order versions of those same models. I demonstrate this using a range of low- and high-dimensional dynamical systems, showing that forecast models that employ imperfect reconstructions of the dynamics---i.e., models that are not necessarily true embeddings---can produce surprisingly accurate predictions of the future state of these systems. I develop a theoretical framework for understanding why this is so. This framework, which combines information theory and computational topology, also allows one to quantify the amount of predictive structure in a given time series, and even to choose which forecast method will be the most effective for those data.
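The construction being relaxed here is simple to state: a scalar series is mapped to vectors of lagged copies of itself. A minimal sketch of the standard delay-coordinate map (the parameter names dim and tau are mine):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Map a scalar series x to dim-dimensional delay vectors
    (x[t], x[t + tau], ..., x[t + (dim - 1) * tau])."""
    n = len(x) - (dim - 1) * tau        # number of complete delay vectors
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
```

The thesis's point is that choosing dim and tau by the classical embedding heuristics is expensive and often unnecessary: low-dimensional, technically "imperfect" reconstructions can forecast just as well.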

nonlinear-dynamics
representation
complexology
rather-interesting
to-write-about
to-understand
7 weeks ago by Vaguery

[1806.01387] New And Surprising Ways to Be Mean. Adversarial NPCs with Coupled Empowerment Minimisation

7 weeks ago by Vaguery

Creating Non-Player Characters (NPCs) that can react robustly to unforeseen player behaviour or novel game content is difficult and time-consuming. This hinders the design of believable characters, and the inclusion of NPCs in games that rely heavily on procedural content generation. We have previously addressed this challenge by means of empowerment, a model of intrinsic motivation, and demonstrated how a coupled empowerment maximisation (CEM) policy can yield generic, companion-like behaviour. In this paper, we extend the CEM framework with a minimisation policy to give rise to adversarial behaviour. We conduct a qualitative, exploratory study in a dungeon-crawler game, demonstrating that CEM can exploit the affordances of different content facets in adaptive adversarial behaviour without modifications to the policy. Changes to the level design, underlying mechanics and our character's actions do not threaten our NPC's robustness, but yield new and surprising ways to be mean.

hey-I-know-this-guy
coevolution
evolutionary-algorithms
engineering-design
rather-interesting
to-write-about
7 weeks ago by Vaguery

Symmathesy: A Word in Progress | norabateson

7 weeks ago by Vaguery

I would like to propose a new word for “System” that refers specifically to living systems – that is, to systems which emerge from the communications and interactions of living vitae (another new term, one which will be defined later). The new word, and concept, for “system” that I propose is one which highlights the expression and communication of interdependency and, particularly, mutual learning. The existing word, “system”, while useful for discussion of many kinds of systems, does not communicate contextual fields of simultaneous learning as is necessary for life. The inclusion of mutual learning in the terminology is specifically meant to preclude the models of engineering and mechanism that are implicit in much systems theorizing today. We have learned that when dealing with living systems, the many variables of developing interaction become untenable to consider in such mechanistic parameters. This change in concept should spark a significant shift in our work, in the sciences, applied professions, communication, arts, that addresses or depends upon our understanding of life and evolution. The discourse with which we discuss and study the living world should be representative of the living world, and should cautiously avoid connotations that imply or are derived from engineering.

The notion of systems as being an arrangement of parts and wholes has become a distraction from the new systemic vision, which we are trying to encourage, that sees life as relational mutual learning contexts. As studies ranging from cognitive science to epigenetics, social science, ecology and evolutionary theory, are increasingly showing, evolution emerges in interrelationality, not in arrangement. Therefore the need is acute to create a differentiation between living systems and other systems.

complexology
representation
define-your-terms
rather-interesting
to-write-about
philosophy-of-science

7 weeks ago by Vaguery

Kumaraswamy distribution: a beta-like probability density

7 weeks ago by Vaguery

Maybe the algorithm I suggested for picking parameters is not very good, but I suspect the optimal parameters are not much better. Rather than saying that the Kumaraswamy distribution approximates the beta distribution, I’d say that the Kumaraswamy distribution is capable of assuming roughly the same shapes as the beta distribution. If the only reason you’re using a beta distribution is to get a certain density shape, the Kumaraswamy distribution would be a reasonable alternative. But if you need to approximate a beta distribution closely, it may not work well enough.
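For reference, the Kumaraswamy(a, b) density is f(x) = a b x^(a-1) (1 - x^a)^(b-1) on (0, 1), and its CDF, F(x) = 1 - (1 - x^a)^b, inverts in closed form, which is the usual argument for preferring it to the beta when only the density shape matters. A minimal sketch (my naming):

```python
import random

def kumaraswamy_pdf(x, a, b):
    """Kumaraswamy density f(x; a, b) = a*b*x^(a-1)*(1 - x^a)^(b-1)."""
    return a * b * x ** (a - 1) * (1 - x ** a) ** (b - 1)

def kumaraswamy_sample(a, b, rng=random):
    """Exact inverse-CDF sampling: F(x) = 1 - (1 - x^a)^b has a
    closed-form inverse, unlike the beta distribution's CDF."""
    u = rng.random()
    return (1 - (1 - u) ** (1 / b)) ** (1 / a)
```

With a = b = 1 the density reduces to the uniform, just as Beta(1, 1) does, and sweeping a and b reproduces the unimodal, U-shaped, and J-shaped profiles that motivate the comparison.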

probability-theory
representation
rather-interesting
to-write-about
consider:feature-discovery
consider:heuristics
consider:approximation
7 weeks ago by Vaguery
