nhaliday + invariance   32

Three best practices for building successful data pipelines - O'Reilly Media
Drawing from their experiences and my own, I’ve identified three key areas that are often overlooked in data pipelines, all of which come down to making your analysis:
1. Reproducible
2. Consistent
3. Productionizable

...

Science that cannot be reproduced by an external third party is just not science — and this does apply to data science. One of the benefits of working in data science is the ability to apply the existing tools from software engineering. These tools let you isolate all the dependencies of your analyses and make them reproducible.

Dependencies fall into three categories:
1. Analysis code ...
2. Data sources ...
3. Algorithmic randomness ...
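
For the third category, a minimal sketch of what pinning algorithmic randomness can look like (assuming a Python analysis; the seed value is arbitrary but must be recorded):

    import random

    import numpy as np

    # Fix every source of randomness the analysis touches so reruns are bit-identical.
    SEED = 42  # arbitrary; the point is that it is checked in alongside the code
    random.seed(SEED)
    np.random.seed(SEED)

    # Downstream sampling, shuffling, train/test splits, etc. are now reproducible:
    sample = np.random.normal(size=5)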

...

Establishing consistency in data
...

There are generally two ways of establishing the consistency of data sources. The first is to check all code and data into a single revision-control repository. The second is to reserve source control for code and build a pipeline that explicitly depends on external data being in a stable, consistent format and location.

Checking data into version control is generally considered verboten for production software engineers, but it has a place in data analysis. For one thing, it makes your analysis very portable by isolating all dependencies into source control. Here are some conditions under which it makes sense to have both code and data in source control:
Small data sets ...
Regular analytics ...
Fixed source ...
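
For the second method, where data stays outside source control, one cheap guard is to pin a content hash of each external input so the pipeline fails loudly when the data drifts (a sketch; the path and digest are hypothetical placeholders):

    import hashlib

    def load_checked(path: str, expected_sha256: str) -> bytes:
        """Read an external data file, verifying its contents have not changed."""
        with open(path, "rb") as f:
            data = f.read()
        digest = hashlib.sha256(data).hexdigest()
        if digest != expected_sha256:
            raise ValueError(f"{path}: expected {expected_sha256}, got {digest}")
        return data

    # Hypothetical input; record the real digest when the snapshot is frozen.
    raw = load_checked("data/events.csv", "<sha256 hex recorded at snapshot time>")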

Productionizability: Developing a common ETL
...

1. Common data format ...
2. Isolating library dependencies ...

https://blog.koresoftware.com/blog/etl-principles
Rigorously enforce the idempotency constraint
For efficiency, seek to load data incrementally
Always ensure that you can efficiently process historic data
Partition ingested data at the destination
Rest data between tasks
Pool resources for efficiency
Store all metadata together in one place
Manage login details in one place
Specify configuration details once
Parameterize sub flows and dynamically run tasks where possible
Execute conditionally
Develop your own workflow framework and reuse workflow components
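
A minimal sketch of the first two principles in practice (hypothetical table and schema; the pattern is delete-then-insert per ingestion partition inside one transaction, so any rerun converges to the same state):

    import sqlite3

    def load_day(conn: sqlite3.Connection, day: str, rows: list[tuple]) -> None:
        """Idempotent incremental load: rerunning a day cannot duplicate data."""
        with conn:  # one transaction: delete and insert succeed or fail together
            conn.execute("DELETE FROM events WHERE day = ?", (day,))
            conn.executemany("INSERT INTO events (day, value) VALUES (?, ?)", rows)

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (day TEXT, value REAL)")
    load_day(conn, "2019-08-01", [("2019-08-01", 3.5)])
    load_day(conn, "2019-08-01", [("2019-08-01", 3.5)])  # rerun: still one row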

more focused on details of specific technologies:
https://medium.com/@rchang/a-beginners-guide-to-data-engineering-part-i-4227c5c457d7

https://www.cloudera.com/documentation/director/cloud/topics/cloud_de_best_practices.html
techtariat  org:com  best-practices  engineering  code-organizing  machine-learning  data-science  yak-shaving  nitty-gritty  workflow  config  vcs  replication  homo-hetero  multi  org:med  design  system-design  links  shipping  minimalism  volo-avolo  causation  random  invariance  structure  arrows  protocol-metadata  interface-compatibility 
august 2019 by nhaliday
OCaml For the Masses | November 2011 | Communications of the ACM
Straight out of the box, OCaml is pretty good at catching bugs, but it can do even more if you design your types carefully. Consider as an example the following types for representing the state of a network connection as illustrated in Figure 4.

that one excellent example of using algebraic data types
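
The article's actual Figure 4 is OCaml; to keep this page's sketches in a single language, here is a rough Python analogue of the same design move (all names invented), where each state carries only the data that is valid in that state, so illegal combinations are unrepresentable:

    from dataclasses import dataclass
    from typing import Union

    @dataclass
    class Connecting:
        when_initiated: float   # only meaningful while the handshake is pending

    @dataclass
    class Connected:
        session_id: str         # only exists once the connection is up
        last_ping: float

    @dataclass
    class Disconnected:
        when_disconnected: float

    # A connection is exactly one of these; "connected but no session_id"
    # cannot even be constructed.
    ConnectionState = Union[Connecting, Connected, Disconnected]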
techtariat  rhetoric  programming  pls  engineering  pragmatic  carmack  quotes  aphorism  functional  ocaml-sml  types  formal-methods  correctness  finance  tip-of-tongue  examples  characterization  invariance  networking 
july 2019 by nhaliday
The Existential Risk of Math Errors - Gwern.net
How big is this upper bound? Mathematicians have often made errors in proofs. But it’s rarer for ideas to be accepted for a long time and then rejected. We can divide errors into 2 basic cases corresponding to type I and type II errors:

1. Mistakes where the theorem is still true, but the proof was incorrect (type I)
2. Mistakes where the theorem was false, and the proof was also necessarily incorrect (type II)

Before someone comes up with a final answer, a mathematician may have many levels of intuition in formulating & working on the problem, but we’ll consider the final end-product where the mathematician feels satisfied that he has solved it. Case 1 is perhaps the most common case, with innumerable examples; this is sometimes due to mistakes in the proof that anyone would accept as a mistake, but many of these cases are due to changing standards of proof. For example, when David Hilbert discovered errors in Euclid’s proofs which no one noticed before, the theorems were still true, and the gaps were due more to Hilbert being a modern mathematician thinking in terms of formal systems (which of course Euclid did not think in). (David Hilbert himself turns out to be a useful example of the other kind of error: his famous list of 23 problems was accompanied by definite opinions on the outcome of each problem and sometimes timings, several of which were wrong or questionable.) Similarly, early calculus used ‘infinitesimals’ which were sometimes treated as being 0 and sometimes treated as an indefinitely small non-zero number; this was incoherent and, strictly speaking, practically all of the calculus results were wrong because they relied on an incoherent concept - but of course the results were some of the greatest mathematical work ever conducted, and when later mathematicians put calculus on a more rigorous footing, they immediately re-derived those results (sometimes with important qualifications), and doubtless as modern math evolves other fields have sometimes needed to go back and clean up the foundations and will in the future.

...

Isaac Newton, incidentally, gave two proofs of the same solution to a problem in probability, one via enumeration and the other more abstract; the enumeration was correct, but the other proof was totally wrong, and this was not noticed for a long time, leading Stigler to remark:

...

TYPE I > TYPE II?
“Lefschetz was a purely intuitive mathematician. It was said of him that he had never given a completely correct proof, but had never made a wrong guess either.”
- Gian-Carlo Rota

Case 2 is disturbing, since it is a case in which we wind up with false beliefs and also false beliefs about our beliefs (we no longer know that we don’t know). Case 2 could lead to extinction.

...

Except, errors do not seem to be evenly & randomly distributed between case 1 and case 2. There seem to be far more case 1s than case 2s, as already mentioned in the early calculus example: far more than 50% of the early calculus results were correct when checked more rigorously. Richard Hamming attributes to Ralph Boas the comment that, while editing Mathematical Reviews, “of the new results in the papers reviewed most are true but the corresponding proofs are perhaps half the time plain wrong”.

...

Gian-Carlo Rota gives us an example with Hilbert:

...

Olga labored for three years; it turned out that all mistakes could be corrected without any major changes in the statement of the theorems. There was one exception, a paper Hilbert wrote in his old age, which could not be fixed; it was a purported proof of the continuum hypothesis, you will find it in a volume of the Mathematische Annalen of the early thirties.

...

Leslie Lamport advocates for machine-checked proofs and a more rigorous style of proofs similar to natural deduction, noting a mathematician acquaintance guesses at a broad error rate of 1/3 and that he routinely found mistakes in his own proofs and, worse, believed false conjectures.

[more on these "structured proofs":
https://academia.stackexchange.com/questions/52435/does-anyone-actually-publish-structured-proofs
https://mathoverflow.net/questions/35727/community-experiences-writing-lamports-structured-proofs
]

We can probably add software to that list: early software engineering work found that, dismayingly, bug rates seem to be simply a function of lines of code, and one would expect diseconomies of scale. So one would expect that in going from the ~4,000 lines of code of the Microsoft DOS operating system kernel to the ~50,000,000 lines of code in Windows Server 2003 (with full systems of applications and libraries being even larger: the comprehensive Debian repository in 2007 contained ~323,551,126 lines of code) that the number of active bugs at any time would be… fairly large. Mathematical software is hopefully better, but practitioners still run into issues (eg Durán et al 2014, Fonseca et al 2017) and I don’t know of any research pinning down how buggy key mathematical systems like Mathematica are or how much published mathematics may be erroneous due to bugs. This general problem led to predictions of doom and spurred much research into automated proof-checking, static analysis, and functional languages.

[related:
https://mathoverflow.net/questions/11517/computer-algebra-errors
I don't know any interesting bugs in symbolic algebra packages but I know a true, enlightening and entertaining story about something that looked like a bug but wasn't.

Define sinc(x) = (sin x)/x.

Someone found the following result in an algebra package: ∫_0^∞ sinc(x) dx = π/2
They then found the following results:

...

So of course when they got:

∫_0^∞ sinc(x) sinc(x/3) sinc(x/5) ⋯ sinc(x/15) dx = (467807924713440738696537864469/935615849440640907310521750000) π
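
For the record, this is the Borwein integral phenomenon, and Fourier analysis explains exactly where the pattern breaks:

    \int_0^\infty \prod_{k=0}^{n} \operatorname{sinc}\!\left(\frac{x}{2k+1}\right) dx = \frac{\pi}{2}
    \quad \text{precisely while} \quad \sum_{k=1}^{n} \frac{1}{2k+1} \le 1;

here 1/3 + 1/5 + ⋯ + 1/13 ≈ 0.955, but adding 1/15 pushes the sum to ≈ 1.022, so the first deviation from π/2 appears exactly at the sinc(x/15) factor, which is why the package’s odd-looking answer was in fact correct.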

hmm:
Which means that nobody knows Fourier analysis nowadays. Very sad and discouraging story... – fedja Jan 29 '10 at 18:47

--

Because the most popular systems are all commercial, they tend to guard their bug database rather closely -- making them public would seriously cut their sales. For example, for the open source project Sage (which is quite young), you can get a list of all the known bugs from this page. 1582 known issues on Feb. 16th, 2010 (which includes feature requests, problems with documentation, etc).

That is an order of magnitude less than the commercial systems. And it's not because it is better, it is because it is younger and smaller. It might be better, but until SAGE does a lot of analysis (about 40% of CAS bugs are there) and a fancy user interface (another 40%), it is too hard to compare.

I once ran a graduate course whose core topic was studying the fundamental disconnect between the algebraic nature of CAS and the analytic nature of what it is mostly used for. There are issues of logic -- CASes work more or less in an intensional logic, while most of analysis is stated in a purely extensional fashion. There is no well-defined 'denotational semantics' for expressions-as-functions, which strongly contributes to the deeper bugs in CASes.]

...

Should such widely-believed conjectures as P≠NP or the Riemann hypothesis turn out to be false, then because they are assumed by so many existing proofs, a far larger math holocaust would ensue - and our previous estimates of error rates will turn out to have been substantial underestimates. But it may be a cloud with a silver lining, if it doesn’t come at a time of danger.

https://mathoverflow.net/questions/338607/why-doesnt-mathematics-collapse-down-even-though-humans-quite-often-make-mista

more on formal methods in programming:
https://www.quantamagazine.org/formal-verification-creates-hacker-proof-code-20160920/
https://intelligence.org/2014/03/02/bob-constable/

https://softwareengineering.stackexchange.com/questions/375342/what-are-the-barriers-that-prevent-widespread-adoption-of-formal-methods
Update: measured effort
In the October 2018 issue of Communications of the ACM there is an interesting article, “Formally verified software in the real world”, with some estimates of the effort.

Interestingly (based on OS development for military equipment), it seems that producing formally proved software requires 3.3 times more effort than with traditional engineering techniques. So it's really costly.

On the other hand, it requires 2.3 times less effort to get high security software this way than with traditionally engineered software if you add the effort to make such software certified at a high security level (EAL 7). So if you have high reliability or security requirements there is definitely a business case for going formal.

WHY DON'T PEOPLE USE FORMAL METHODS?: https://www.hillelwayne.com/post/why-dont-people-use-formal-methods/
You can see examples of how all of these look at Let’s Prove Leftpad. HOL4 and Isabelle are good examples of “independent theorem” specs, SPARK and Dafny have “embedded assertion” specs, and Coq and Agda have “dependent type” specs.

If you squint a bit it looks like these three forms of code spec map to the three main domains of automated correctness checking: tests, contracts, and types. This is not a coincidence. Correctness is a spectrum, and formal verification is one extreme of that spectrum. As we reduce the rigour (and effort) of our verification we get simpler and narrower checks, whether that means limiting the explored state space, using weaker types, or pushing verification to the runtime. Any means of total specification then becomes a means of partial specification, and vice versa: many consider Cleanroom a formal verification technique, which primarily works by pushing code review far beyond what’s humanly possible.
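
To get a feel for the "embedded assertion" point on that spectrum, here is the post's leftpad example in plain Python, with the three properties that the formal versions prove demoted to runtime asserts (a sketch, assuming a single-character pad):

    def leftpad(c: str, n: int, s: str) -> str:
        """Pad s on the left with character c to total length n (no-op if longer)."""
        out = c * max(n - len(s), 0) + s
        # The three properties proved in "Let's Prove Leftpad", checked at runtime:
        assert len(out) == max(n, len(s))            # length is exactly max(n, |s|)
        assert out.endswith(s)                       # the original string is a suffix
        assert set(out[: len(out) - len(s)]) <= {c}  # the prefix is all padding
        return out

    print(leftpad("0", 5, "42"))  # -> "00042"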

...

The question, then: “is 90/95/99% correct significantly cheaper than 100% correct?” The answer is very yes. We all are comfortable saying that a codebase we’ve well-tested and well-typed is mostly correct modulo a few fixes in prod, and we’re even writing more than four lines of code a day. In fact, the vast… [more]
ratty  gwern  analysis  essay  realness  truth  correctness  reason  philosophy  math  proofs  formal-methods  cs  programming  engineering  worse-is-better/the-right-thing  intuition  giants  old-anglo  error  street-fighting  heuristic  zooming  risk  threat-modeling  software  lens  logic  inference  physics  differential  geometry  estimate  distribution  robust  speculation  nonlinearity  cost-benefit  convexity-curvature  measure  scale  trivia  cocktail  history  early-modern  europe  math.CA  rigor  news  org:mag  org:sci  miri-cfar  pdf  thesis  comparison  examples  org:junk  q-n-a  stackex  pragmatic  tradeoffs  cracker-prog  techtariat  invariance  DSL  chart  ecosystem  grokkability  heavyweights  CAS  static-dynamic  lower-bounds  complexity  tcs  open-problems  big-surf  ideas  certificates-recognition  proof-systems  PCP  mediterranean  SDP  meta:prediction  epistemic  questions  guessing  distributed  overflow  nibble  soft-question  track-record  big-list  hmm  frontier  state-of-art  move-fast-(and-break-things)  grokkability-clarity  technical-writing  trust 
july 2019 by nhaliday
Antinomia Imediata – experiments in a reaction from the left
https://antinomiaimediata.wordpress.com/lrx/
So, what is the Left Reaction? First of all, it’s reaction: opposition to the modern rationalist establishment, the Cathedral. It opposes the universalist Jacobin program of global government, favoring a fractured geopolitics organized through long-evolved complex systems. It’s profoundly anti-socialist and anti-communist, favoring market economy and individualism. It abhors tribalism and seeks a realistic plan for dismantling it (primarily informed by HBD and HBE). It looks at modernity as a degenerative ratchet, whose only way out is intensification (hence clinging to crypto-marxist market-driven acceleration).

How can any of this still be in the *Left*? It defends equality of power, i.e. freedom. This radical understanding of liberty is deeply rooted in leftist tradition and has been consistently abhorred by the Right. LRx is not democrat, is not socialist, is not progressive and is not even liberal (in its current, American use). But it defends equality of power. Its utopia is individual sovereignty. Its method is paleo-agorism. The anti-hierarchy of hunter-gatherer nomads is its understanding of the only realistic objective of equality.

...

In more cosmic terms, it seeks only to fulfill the Revolution’s side in the left-right intelligence pump: mutation or creation of paths. Proudhon’s antinomy is essentially about this: the collective force of the socius, evinced in moral standards and social organization, vs the creative force of the individuals, who constantly revolutionize and disrupt the social body. The interplay of these forces creates reality (it’s a metaphysics indeed): the Absolute (socius) builds so that the (individualistic) Revolution can destroy so that the Absolute may adapt, and then repeat. The good old formula of ‘solve et coagula’.

Ultimately, if the Neoreaction promises eternal hell, the LRx sneers “but Satan is with us”.

https://antinomiaimediata.wordpress.com/2016/12/16/a-statement-of-principles/
Liberty is to be understood as the ability and right of all sentient beings to dispose of their persons and the fruits of their labor, and nothing else, as they see fit. This stems from their self-awareness and their ability to control and choose the content of their actions.

...

Equality is to be understood as the state of no imbalance of power, that is, of no subjection to another sentient being. This stems from their universal ability for empathy, and from their equal ability for reason.

...

It is important to notice that, contrary to usual statements of these two principles, my standpoint is that Liberty and Equality here are not merely compatible, meaning they could coexist in some possible universe, but rather they are two sides of the same coin, complementary and interdependent. There can be NO Liberty where there is no Equality, for the imbalance of power, the state of subjection, will render sentient beings unable to dispose of their persons and the fruits of their labor[1], and it will limit their ability to choose over their rightful jurisdiction. Likewise, there can be NO Equality without Liberty, for restraining sentient beings’ ability to choose and dispose of their persons and fruits of labor will render some more powerful than the rest, and establish a state of subjection.

https://antinomiaimediata.wordpress.com/2017/04/18/flatness/
equality is the founding principle (and ultimately indistinguishable from) freedom. of course, it’s only in one specific sense of “equality” that this sentence is true.

to try and eliminate the bullshit, let’s turn to networks again:

any node’s degrees of freedom are the number of nodes it is connected to in a network. freedom is maximum when the network is symmetrically connected, i.e., when all nodes are connected to each other and thus there is no topographical hierarchy (middlemen) – in other words, flatness.

in this understanding, the maximization of freedom is the maximization of entropy production, that is, of intelligence. As Land puts it:

https://antinomiaimediata.wordpress.com/category/philosophy/mutualism/
gnon  blog  stream  politics  polisci  ideology  philosophy  land  accelerationism  left-wing  right-wing  paradox  egalitarianism-hierarchy  civil-liberty  power  hmm  revolution  analytical-holistic  mutation  selection  individualism-collectivism  tribalism  us-them  modernity  multi  tradeoffs  network-structure  complex-systems  cybernetics  randy-ayndy  insight  contrarianism  metameta  metabuch  characterization  cooperate-defect  n-factor  altruism  list  coordination  graphs  visual-understanding  cartoons  intelligence  entropy-like  thermo  information-theory  order-disorder  decentralized  distribution  degrees-of-freedom  analogy  graph-theory  extrema  evolution  interdisciplinary  bio  differential  geometry  anglosphere  optimate  nascent-state  deep-materialism  new-religion  cool  mystic  the-classics  self-interest  interests  reason  volo-avolo  flux-stasis  invariance  government  markets  paying-rent  cost-benefit  peace-violence  frontier  exit-voice  nl-and-so-can-you  war  track-record  usa  history  mostly-modern  world-war  military  justice  protestant-cathol 
march 2018 by nhaliday
Uniformitarianism - Wikipedia
Uniformitarianism, also known as the Doctrine of Uniformity,[1] is the assumption that the same natural laws and processes that operate in the universe now have always operated in the universe in the past and apply everywhere.[2][3] It refers to invariance in the principles underpinning science, such as the constancy of causality, or causation, throughout time,[4] but it has also been used to describe invariance of physical laws through time and space.[5] Though an unprovable postulate that cannot be verified using the scientific method, uniformitarianism has been a key first principle of virtually all fields of science.[6]

In geology, uniformitarianism has included the gradualistic concept that "the present is the key to the past" (that events occur at the same rate now as they have always done); many geologists now, however, no longer hold to a strict theory of gradualism.[7] Coined by William Whewell, the word was proposed in contrast to catastrophism[8] by British naturalists in the late 18th century, starting with the work of the geologist James Hutton. Hutton's work was later refined by scientist John Playfair and popularised by geologist Charles Lyell's Principles of Geology in 1830.[9] Today, Earth's history is considered to have been a slow, gradual process, punctuated by occasional natural catastrophic events.
concept  axioms  jargon  homo-hetero  wiki  reference  science  the-trenches  philosophy  invariance  universalism-particularism  time  spatial  religion  christianity  theos  contradiction  noble-lie  thinking  metabuch  reason  rigidity  flexibility  analytical-holistic  systematic-ad-hoc  degrees-of-freedom  absolute-relative  n-factor  explanans  the-great-west-whale  occident  sinosphere  orient  truth  earth  conceptual-vocab  metameta  history  early-modern  britain  anglo  anglosphere  roots  forms-instances  volo-avolo  deep-materialism  new-religion  logos 
january 2018 by nhaliday
Is the speed of light really constant?
So what if the speed of light isn’t the same when moving toward or away from us? Are there any observable consequences? Not to the limits of observation so far. We know, for example, that any one-way speed of light is independent of the motion of the light source to 2 parts in a billion. We know it has no effect on the color of the light emitted to a few parts in 10^20. Aspects such as polarization and interference are also indistinguishable from standard relativity. But that’s not surprising, because you don’t need to assume isotropy for relativity to work. In the 1970s, John Winnie and others showed that all the results of relativity could be modeled with anisotropic light so long as the two-way speed was a constant. The “extra” assumption that the speed of light is a uniform constant doesn’t change the physics, but it does make the mathematics much simpler. Since Einstein’s relativity is the simpler of two equivalent models, it’s the model we use. You could argue that it’s the right one citing Occam’s razor, or you could take Newton’s position that anything untestable isn’t worth arguing over.
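
The usual way to make the anisotropic-light claim precise is Reichenbach's synchrony parameter ε (a textbook formulation, not quoted from the article): take the one-way speeds to be

    c_{+} = \frac{c}{2\varepsilon}, \qquad c_{-} = \frac{c}{2(1-\varepsilon)}, \qquad 0 < \varepsilon < 1.

A round trip over distance L then takes L/c_+ + L/c_- = 2L/c, so the two-way speed is c for every choice of ε; Einstein synchronization is simply ε = 1/2, and no experiment that measures only closed light paths can distinguish between the choices.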

SPECIAL RELATIVITY WITHOUT ONE-WAY VELOCITY ASSUMPTIONS:
https://sci-hub.bz/https://www.jstor.org/stable/186029
https://sci-hub.bz/https://www.jstor.org/stable/186671
nibble  scitariat  org:bleg  physics  relativity  electromag  speed  invariance  absolute-relative  curiosity  philosophy  direction  gedanken  axioms  definition  models  experiment  space  science  measurement  volo-avolo  synchrony  uniqueness  multi  pdf  piracy  study  article 
november 2017 by nhaliday
Power of a point - Wikipedia
The power of a point P (see Figure 1) can be defined equivalently as the product of distances from the point P to the two intersection points of any ray emanating from P.
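
In symbols (the standard definition, stated here for concreteness): for a circle with center O and radius r, the power of P is

    \Pi(P) = |PO|^2 - r^2,

and for any line through P meeting the circle at points A and B, the product of signed lengths satisfies \overline{PA} \cdot \overline{PB} = \Pi(P); with unsigned distances the product is |\Pi(P)|, so the power is positive outside the circle, zero on it, and negative inside.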
nibble  math  geometry  spatial  ground-up  concept  metrics  invariance  identity  atoms  wiki  reference  measure  yoga  calculation 
september 2017 by nhaliday
rotational dynamics - Why do non-rigid bodies try to increase their moment of inertia? - Physics Stack Exchange
This happens to an isolated rotating system that is not a rigid body.

Inside such a body (for example, a steel chain in free fall) the parts move relative to each other, and there is internal friction that dissipates the kinetic energy of the system while angular momentum is conserved. The dissipation goes on until the parts stop moving with respect to each other, so the body rotates as a rigid body, even if it is not rigid by constitution.

The rotating state of the body that has the lowest kinetic energy for a given angular momentum is that in which the body has the greatest moment of inertia (with respect to the center of mass). For example, a long chain thrown into free fall will twist and turn until it is all straight and rotating as a rigid body.

...

If L is constant (net torque of external forces acting on the system is zero) and the constitution and initial conditions allow it, the system's dissipation will work to diminish energy until it has the minimum value, which happens for the maximum I_a possible.
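
The one-line justification (standard rigid-body mechanics, added for clarity): for rotation about a fixed axis with moment of inertia I_a,

    E_{\mathrm{rot}} = \frac{1}{2} I_a \omega^2 = \frac{L^2}{2 I_a}, \qquad L = I_a \omega,

so at fixed L, dissipating kinetic energy necessarily drives I_a up to the largest attainable value.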
nibble  q-n-a  overflow  physics  mechanics  tidbits  spatial  rigidity  flexibility  invariance  direction  stylized-facts  dynamical  volo-avolo  street-fighting  yoga 
august 2017 by nhaliday
Tidal locking - Wikipedia
The Moon's rotation and orbital periods are tidally locked with each other, so no matter when the Moon is observed from Earth the same hemisphere of the Moon is always seen. The far side of the Moon was not seen until 1959, when photographs of most of the far side were transmitted from the Soviet spacecraft Luna 3.[12]

never actually thought about this
nibble  wiki  reference  space  mechanics  gravity  navigation  explanation  flux-stasis  marginal  volo-avolo  spatial  direction  invariance  physics  flexibility  rigidity  time  identity  phase-transition  being-becoming 
august 2017 by nhaliday
Is the U.S. Aggregate Production Function Cobb-Douglas? New Estimates of the Elasticity of Substitution
world-wide: http://www.socsci.uci.edu/~duffy/papers/jeg2.pdf
https://www.weforum.org/agenda/2016/01/is-the-us-labour-share-as-constant-as-we-thought
https://www.economicdynamics.org/meetpapers/2015/paper_844.pdf
We find that IPP capital entirely explains the observed decline of the US labor share, which otherwise is secularly constant over the past 65 years for structures and equipment capital. The labor share decline simply reflects the fact that the US economy is undergoing a transition toward a larger IPP sector.
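
Why the labor share is the test statistic here (a standard derivation, added for context): with competitive factor markets and a Cobb-Douglas aggregate production function,

    Y = A K^{\alpha} L^{1-\alpha}, \qquad wL = \frac{\partial Y}{\partial L}\, L = (1-\alpha)\, Y,

so the labor share wL/Y = 1 − α is constant no matter how K/L moves; a secularly trending labor share is therefore evidence against Cobb-Douglas, i.e., against a unit elasticity of substitution.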
https://ideas.repec.org/p/red/sed015/844.html
http://www.robertdkirkby.com/blog/2015/summary-of-piketty-i/
https://www.brookings.edu/bpea-articles/deciphering-the-fall-and-rise-in-the-net-capital-share/
The Fall of the Labor Share and the Rise of Superstar Firms: http://www.nber.org/papers/w23396
The Decline of the U.S. Labor Share: https://www.brookings.edu/wp-content/uploads/2016/07/2013b_elsby_labor_share.pdf
Table 2 has industry disaggregation
Estimating the U.S. labor share: https://www.bls.gov/opub/mlr/2017/article/estimating-the-us-labor-share.htm

Why Workers Are Losing to Capitalists: https://www.bloomberg.com/view/articles/2017-09-20/why-workers-are-losing-to-capitalists
Automation and offshoring may be conspiring to reduce labor's share of income.
pdf  study  economics  growth-econ  econometrics  usa  data  empirical  analysis  labor  capital  econ-productivity  manifolds  magnitude  multi  world  🎩  piketty  econotariat  compensation  inequality  winner-take-all  org:ngo  org:davos  flexibility  distribution  stylized-facts  regularizer  hmm  history  mostly-modern  property-rights  arrows  invariance  industrial-org  trends  wonkish  roots  synthesis  market-power  efficiency  variance-components  business  database  org:gov  article  model-class  models  automation  nationalism-globalism  trade  news  org:mag  org:biz  org:bv  noahpinion  explanation  summary  methodology  density  polarization  map-territory  input-output 
july 2017 by nhaliday
soft question - Thinking and Explaining - MathOverflow
- good question from Bill Thurston
- great answers by Terry Tao, fedja, Minhyong Kim, gowers, etc.

Terry Tao:
- symmetry as blurring/vibrating/wobbling, scale invariance
- anthropomorphization, adversarial perspective for estimates/inequalities/quantifiers, spending/economy

fedja walks through his thought-process from another answer

Minhyong Kim: anthropology of mathematical philosophizing

Per Vognsen: normality as isotropy
comment: conjugate subgroup gHg^-1 ~ "H but somewhere else in G"

gowers: hidden things in basic mathematics/arithmetic
comment by Ryan Budney: x sin(x) via x -> (x, sin(x)), (x, y) -> xy
I kinda get what he's talking about but needed to use Mathematica to get the initial visualization down.
To remind myself later:
- xy can be easily visualized by juxtaposing the two parabolae x^2 and -x^2 diagonally
- x sin(x) can be visualized along that surface by moving your finger along the line (x, 0) but adding some oscillations in y direction according to sin(x)
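
A quick way to get the same visualization without Mathematica (a sketch assuming numpy and matplotlib):

    import matplotlib.pyplot as plt
    import numpy as np

    x = np.linspace(-8, 8, 400)
    X, Y = np.meshgrid(np.linspace(-8, 8, 80), np.linspace(-1.5, 1.5, 80))

    ax = plt.figure().add_subplot(projection="3d")
    ax.plot_surface(X, Y, X * Y, alpha=0.3)            # the surface z = xy
    ax.plot(x, np.sin(x), x * np.sin(x), color="red")  # (x, sin x) pulled through xy
    ax.set(xlabel="x", ylabel="y", zlabel="xy")
    plt.show()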
q-n-a  soft-question  big-list  intuition  communication  teaching  math  thinking  writing  thurston  lens  overflow  synthesis  hi-order-bits  👳  insight  meta:math  clarity  nibble  giants  cartoons  gowers  mathtariat  better-explained  stories  the-trenches  problem-solving  homogeneity  symmetry  fedja  examples  philosophy  big-picture  vague  isotropy  reflection  spatial  ground-up  visual-understanding  polynomials  dimensionality  math.GR  worrydream  scholar  🎓  neurons  metabuch  yoga  retrofit  mental-math  metameta  wisdom  wordlessness  oscillation  operational  adversarial  quantifiers-sums  exposition  explanation  tricki  concrete  s:***  manifolds  invariance  dynamical  info-dynamics  cool  direction  elegance  heavyweights  analysis  guessing  grokkability-clarity  technical-writing 
january 2017 by nhaliday
