mike + math   13

Beautiful differentiation
Automatic differentiation (AD) is a precise, efficient, and convenient method for computing derivatives of functions. Its forward-mode implementation can be quite simple even when extended to compute all of the higher-order derivatives as well. The higher-dimensional case has also been tackled, though with extra complexity. This paper develops an implementation of higher-dimensional, higher-order, forward-mode AD in the extremely general and elegant setting of calculus on manifolds and derives that implementation from a simple and precise specification.

In order to motivate and discover the implementation, the paper poses the question “What does AD mean, independently of implementation?” An answer arises in the form of naturality of sampling a function and its derivative. Automatic differentiation flows out of this naturality condition, together with the chain rule. Graduating from first-order to higher-order AD corresponds to sampling all derivatives instead of just one. Next, the setting is expanded to arbitrary vector spaces, in which derivative values are linear maps. The specification of AD adapts to this elegant and very general setting, which even simplifies the development.
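Not the paper's Haskell development, but a minimal Python sketch of the first-order idea it generalizes: pair every value with its derivative and push both through the chain rule at each operation. The Dual class and the sample function below are illustrative names, not taken from the paper.

    # Minimal first-order forward-mode AD: carry (value, derivative) pairs
    # and apply the chain rule at every arithmetic operation.
    import math

    class Dual:
        def __init__(self, value, deriv=0.0):
            self.value = value   # f(x)
            self.deriv = deriv   # f'(x)

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value + other.value, self.deriv + other.deriv)

        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # product rule: (uv)' = u'v + uv'
            return Dual(self.value * other.value,
                        self.deriv * other.value + self.value * other.deriv)

        __rmul__ = __mul__

    def sin(d):
        # chain rule: (sin u)' = cos(u) * u'
        return Dual(math.sin(d.value), math.cos(d.value) * d.deriv)

    # Sample a function and its derivative at x = 2 by seeding dx/dx = 1.
    x = Dual(2.0, 1.0)
    y = sin(x * x) + 3 * x              # y = sin(x^2) + 3x
    print(y.value, y.deriv)             # derivative equals 2x*cos(x^2) + 3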
haskell  math  programming 
5 weeks ago by mike
Floating Point Demystified, Part 1
To turn things around, think about time_t. time_t is a type defined to represent the number of seconds since the epoch of 1970-01-01 00:00 UTC. It has traditionally been defined as a 32-bit signed integer (which means that it will overflow in the year 2038). Imagine that a 32-bit single-precision float had been chosen instead.

With a float time_t, there would be no overflow until the year 5395141535403007094485264579465 AD, long after the Sun has swallowed up the Earth as a Red Giant and turned into a Black Dwarf. However! With this scheme the granularity of timekeeping would get worse and worse the farther we got from 1970. Unlike the int32, which gives one-second granularity all the way until 2038, a float time_t would, by 2014, already be down to a precision of 128 seconds — far too coarse to be useful.
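A quick way to check that 128-second figure is to look at the spacing between adjacent 32-bit floats near a 2014-era timestamp (a sketch using NumPy; 1.4e9 is roughly mid-2014 in seconds since the epoch):

    # Spacing (ULP) of a 32-bit float near representative time_t values.
    import numpy as np

    for seconds in (1.0, 2.0 ** 20, 1.4e9):
        gap = np.spacing(np.float32(seconds))   # distance to the next representable float32
        print(f"around {seconds:.0f} s after 1970: granularity = {gap} s")

    # Near 1.4e9 the spacing is 128 s: a float32 has a 23-bit fraction,
    # and 2**30 <= 1.4e9 < 2**31, so one ULP is 2**(30 - 23) = 128.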
math  programming 
5 weeks ago by mike
Hack the derivative
Finite differences via a complex-analysis trick: perturb the input by a tiny imaginary step instead of a real one.
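This is the complex-step derivative: evaluate f at x + ih and read f'(x) off the imaginary part, which avoids the subtractive cancellation of an ordinary difference quotient. The test function and step sizes below are my own illustration, not code from the linked post.

    # Complex-step derivative: f'(x) ≈ Im(f(x + i*h)) / h.
    # No subtraction of nearly equal numbers occurs, so h can be made
    # absurdly small without losing precision.
    import cmath

    def complex_step(f, x, h=1e-200):
        return f(complex(x, h)).imag / h

    def central_difference(f, x, h=1e-5):
        return (f(x + h) - f(x - h)) / (2 * h)

    f = lambda x: cmath.exp(x) / (cmath.cos(x) ** 3 + cmath.sin(x) ** 3)

    x0 = 1.5
    print("complex step:      ", complex_step(f, x0))
    print("central difference:", central_difference(f, x0).real)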
calculus  math  python 
5 weeks ago by mike
Math Has No God Particle | FiveThirtyEight
David Vogan, a mathematician at MIT who was involved with the atlas project, described academic mathematics as a garden. There are showy, flowery fields like number theory. Its beautiful problems and elegant results, such as the prime gap or Fermat's Last Theorem, are math's orchids. There are also the tomatoes — the things you can eat out of the garden, the practical yield. These disciplines, like Fourier analysis with its concrete applications to signal processing of audio, radio and light waves, are businesslike. And then there are the disciplines, often unheralded, that keep the rest of the garden growing — the hoes, the sprinklers. Lie groups, their representations and the atlas project are an example.

“Representation theory,” the field of the atlas group’s research, “is the fertilizer or the rose trellis, depending on the day of the week,” Vogan said.
math  research  journalism  scientism  ScienceWriting 
july 2017 by mike
algorithms - Prove Upper Bound (Big O) for Fibonacci's Sequence? - Mathematics Stack Exchange
In general, given any linear recurrence relation, the same trick works: guess that the general nth term is of the form c·xⁿ, write down an equation for x (which will be a polynomial), and if α is the largest root of that polynomial equation, then you'll have T(n) = O(αⁿ).
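Applied to the Fibonacci recurrence T(n) = T(n-1) + T(n-2), the guess gives the characteristic equation x² = x + 1, whose largest root is the golden ratio φ ≈ 1.618, so T(n) = O(φⁿ). A small sketch (the fib helper is illustrative):

    # Characteristic-root bound for T(n) = T(n-1) + T(n-2).
    # Guessing T(n) ~ c * x**n yields x**2 = x + 1; the largest root is phi.
    import math

    phi = (1 + math.sqrt(5)) / 2        # largest root of x**2 - x - 1 = 0

    def fib(n):
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a

    # The ratio fib(n) / phi**n stays bounded (it tends to 1/sqrt(5)),
    # confirming fib(n) = O(phi**n).
    for n in (10, 20, 40, 80):
        print(n, fib(n) / phi ** n)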
math  cs  algorithms 
december 2016 by mike
