nhaliday + synthesis   165

What are the Laws of Biology?
The core finding of systems biology is that only a very small subset of possible network motifs is actually used and that these motifs recur in all kinds of different systems, from transcriptional to biochemical to neural networks. This is because only those arrangements of interactions effectively perform some useful operation, which underlies some necessary function at a cellular or organismal level. There are different arrangements for input summation, input comparison, integration over time, high-pass or low-pass filtering, negative auto-regulation, coincidence detection, periodic oscillation, bistability, rapid onset response, rapid offset response, turning a graded signal into a sharp pulse or boundary, and so on, and so on.

These are all familiar concepts and designs in engineering and computing, with well-known properties. In living organisms there is one other general property that the designs must satisfy: robustness. They have to work with noisy components, at a scale that’s highly susceptible to thermal noise and environmental perturbations. Of the subset of designs that perform some operation, only a much smaller subset will do it robustly enough to be useful in a living organism. That is, they can still perform their particular functions in the face of noisy or fluctuating inputs or variation in the number of components constituting the elements of the network itself.
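
To make the speed/robustness point concrete, here is a minimal sketch (illustrative parameters and a toy ODE, not taken from the article) of one motif named above, negative auto-regulation: compared with simple constant-rate production, the auto-regulated circuit reaches its own steady state faster, and that steady state is less sensitive to fluctuations in the production rate.

```python
# Minimal sketch (illustrative parameters, not from the article) of negative
# auto-regulation: production falls as the product accumulates,
#   dx/dt = beta*K/(K + x) - alpha*x,
# versus simple regulation, dx/dt = beta - alpha*x.

def simulate(production, alpha=1.0, dt=1e-3, t_max=10.0):
    """Euler-integrate dx/dt = production(x) - alpha*x from x(0) = 0."""
    xs, x = [], 0.0
    for _ in range(int(t_max / dt)):
        x += dt * (production(x) - alpha * x)
        xs.append(x)
    return xs, dt

def half_rise_time(xs, dt):
    """Time to reach half of the (numerical) steady-state level."""
    target = 0.5 * xs[-1]
    for i, x in enumerate(xs):
        if x >= target:
            return i * dt

beta, K = 5.0, 0.5
simple, dt = simulate(lambda x: beta)            # constant production
nar, _ = simulate(lambda x: beta * K / (K + x))  # negative auto-regulation

# 1) Speed: the auto-regulated circuit reaches half of its own steady state sooner.
print("t_1/2 simple:        ", round(half_rise_time(simple, dt), 3))
print("t_1/2 auto-regulated:", round(half_rise_time(nar, dt), 3))

# 2) Robustness: a 50% fluctuation in beta shifts the simple steady state by 50%,
#    but the auto-regulated steady state by much less.
for b in (beta, 1.5 * beta):
    s, _ = simulate(lambda x, b=b: b)
    n, _ = simulate(lambda x, b=b: b * K / (K + x))
    print(f"beta={b}: steady state simple={s[-1]:.2f}, auto-regulated={n[-1]:.2f}")
```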
scitariat  reflection  proposal  ideas  thinking  conceptual-vocab  lens  bio  complex-systems  selection  evolution  flux-stasis  network-structure  structure  composition-decomposition  IEEE  robust  signal-noise  perturbation  interdisciplinary  graphs  circuits  🌞  big-picture  hi-order-bits  nibble  synthesis 
november 2017 by nhaliday
What is the connection between special and general relativity? - Physics Stack Exchange
Special relativity is the "special case" of general relativity where spacetime is flat. The speed of light is essential to both.
nibble  q-n-a  overflow  physics  relativity  explanation  synthesis  hi-order-bits  ground-up  gravity  summary  aphorism  differential  geometry 
november 2017 by nhaliday
What is the difference between general and special relativity? - Quora
General Relativity is, quite simply, needed to explain gravity.

Special Relativity is the special case of GR, when the metric is flat — which means no gravity.

You need General Relativity when the metric gets all curvy, and when things start to experience gravitation.
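
A compact way to state the "flat metric" point (standard textbook facts, added here for context, not part of the answer itself): GR works with a general metric g_{\mu\nu} determined by the Einstein field equations; SR is the case where that metric is the fixed Minkowski metric \eta_{\mu\nu}, i.e., where the curvature vanishes and gravity is absent.

```latex
\[
  ds^2 = g_{\mu\nu}\,dx^\mu dx^\nu
  \quad\longrightarrow\quad
  ds^2 = -c^2\,dt^2 + dx^2 + dy^2 + dz^2
  \qquad\text{when } g_{\mu\nu} = \eta_{\mu\nu}\ \text{(flat spacetime, SR).}
\]
\[
  R_{\mu\nu} - \tfrac{1}{2}R\,g_{\mu\nu} + \Lambda g_{\mu\nu}
  = \frac{8\pi G}{c^4}\,T_{\mu\nu}
  \qquad\text{(Einstein field equations: matter/energy determines } g_{\mu\nu}\text{).}
\]
```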
nibble  q-n-a  qra  explanation  physics  relativity  synthesis  hi-order-bits  ground-up  gravity  summary  aphorism  differential  geometry 
november 2017 by nhaliday
If Quantum Computers are not Possible Why are Classical Computers Possible? | Combinatorics and more
As most of my readers know, I regard quantum computing as unrealistic. You can read more about it in my Notices AMS paper and its extended version (see also this post) and in the discussion of Puzzle 4 from my recent puzzles paper (see also this post). The amazing progress and huge investment in quantum computing (which I presented and update routinely in this post) will put my analysis to the test in the next few years.
tcstariat  mathtariat  org:bleg  nibble  tcs  cs  computation  quantum  volo-avolo  no-go  contrarianism  frontier  links  quantum-info  analogy  comparison  synthesis  hi-order-bits  speedometer  questions  signal-noise 
november 2017 by nhaliday
Benedict Evans on Twitter: ""University can save you from the autodidact tendency to overrate himself. Democracy depends on people who know they don’t know everything.""
“The autodidact’s risk is that they think they know all of medieval history but have never heard of Charlemagne” - Umberto Eco

Facts are the least part of education. The structure and priorities they fit into matter far more, and learning how to learn far more again
techtariat  sv  twitter  social  discussion  rhetoric  info-foraging  learning  education  higher-ed  academia  expert  lens  aphorism  quotes  hi-order-bits  big-picture  synthesis  expert-experience 
october 2017 by nhaliday
Is the U.S. Aggregate Production Function Cobb-Douglas? New Estimates of the Elasticity of Substitution∗
world-wide: http://www.socsci.uci.edu/~duffy/papers/jeg2.pdf
https://www.weforum.org/agenda/2016/01/is-the-us-labour-share-as-constant-as-we-thought
https://www.economicdynamics.org/meetpapers/2015/paper_844.pdf
We find that IPP capital entirely explains the observed decline of the US labor share, which otherwise is secularly constant over the past 65 years for structures and equipment capital. The labor share decline simply reflects the fact that the US economy is undergoing a transition toward a larger IPP sector.
https://ideas.repec.org/p/red/sed015/844.html
http://www.robertdkirkby.com/blog/2015/summary-of-piketty-i/
https://www.brookings.edu/bpea-articles/deciphering-the-fall-and-rise-in-the-net-capital-share/
The Fall of the Labor Share and the Rise of Superstar Firms: http://www.nber.org/papers/w23396
The Decline of the U.S. Labor Share: https://www.brookings.edu/wp-content/uploads/2016/07/2013b_elsby_labor_share.pdf
Table 2 has industry disaggregation
Estimating the U.S. labor share: https://www.bls.gov/opub/mlr/2017/article/estimating-the-us-labor-share.htm

Why Workers Are Losing to Capitalists: https://www.bloomberg.com/view/articles/2017-09-20/why-workers-are-losing-to-capitalists
Automation and offshoring may be conspiring to reduce labor's share of income.
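
Why "is the production function Cobb-Douglas?" and "is the labor share constant?" are the same question (standard textbook algebra, added for context): with competitive factor markets, the Cobb-Douglas exponent on labor *is* the labor share, independent of the capital-labor ratio, while a CES function with elasticity of substitution σ ≠ 1 lets the share move with K/L.

```latex
\[
  Y = A K^{\alpha} L^{1-\alpha}
  \;\Rightarrow\;
  w = \frac{\partial Y}{\partial L} = (1-\alpha)\,\frac{Y}{L}
  \;\Rightarrow\;
  \text{labor share} = \frac{wL}{Y} = 1-\alpha \quad\text{(constant).}
\]
\[
  Y = A\Bigl[\alpha K^{\frac{\sigma-1}{\sigma}} + (1-\alpha) L^{\frac{\sigma-1}{\sigma}}\Bigr]^{\frac{\sigma}{\sigma-1}}
  \quad\text{(CES): the labor share varies with } K/L \text{ unless } \sigma = 1,
  \text{ which recovers Cobb-Douglas.}
\]
```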
pdf  study  economics  growth-econ  econometrics  usa  data  empirical  analysis  labor  capital  econ-productivity  manifolds  magnitude  multi  world  🎩  piketty  econotariat  compensation  inequality  winner-take-all  org:ngo  org:davos  flexibility  distribution  stylized-facts  regularizer  hmm  history  mostly-modern  property-rights  arrows  invariance  industrial-org  trends  wonkish  roots  synthesis  market-power  efficiency  variance-components  business  database  org:gov  article  model-class  models  automation  nationalism-globalism  trade  news  org:mag  org:biz  org:bv  noahpinion  explanation  summary  methodology  density  polarization  map-territory  input-output 
july 2017 by nhaliday
A VERY BRIEF REVIEW OF MEASURE THEORY
A brief philosophical discussion:
Measure theory, as much as any branch of mathematics, is an area where it is important to be acquainted with the basic notions and statements, but not desperately important to be acquainted with the detailed proofs, which are often rather unilluminating. One should always have in mind a place where one could go and look if one ever did need to understand a proof: for me, that place is Rudin’s Real and Complex Analysis (Rudin’s “red book”).
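
Two of the "basic notions and statements" worth knowing in this sense, stated for reference (standard definitions and results, as in Rudin's red book): what a measure is, and the dominated convergence theorem.

```latex
\[
  \text{A measure on a }\sigma\text{-algebra }\mathcal{M}\subseteq 2^{X}\text{ is }
  \mu:\mathcal{M}\to[0,\infty]\text{ with }\mu(\varnothing)=0\text{ and }
  \mu\Bigl(\bigcup_{n} E_n\Bigr)=\sum_{n}\mu(E_n)\text{ for disjoint }E_n\in\mathcal{M}.
\]
\[
  \text{Dominated convergence: if }f_n \to f\text{ a.e., }|f_n|\le g\text{ a.e., and }\int g\,d\mu<\infty,
  \text{ then }\int f_n\,d\mu \to \int f\,d\mu.
\]
```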
gowers  pdf  math  math.CA  math.FA  philosophy  measure  exposition  synthesis  big-picture  hi-order-bits  ergodic  ground-up  summary  roadmap  mathtariat  proofs  nibble  unit  integral  zooming  p:whenever 
february 2017 by nhaliday
What is the relationship between information theory and Coding theory? - Quora
basically:
- finite vs. asymptotic
- combinatorial vs. probabilistic (lotsa overlap there)
- worst-case (Hamming) vs. distributional (Shannon)

Information and coding theory most often appear together in the subject of error correction over noisy channels. Historically, they were born at almost exactly the same time - both Richard Hamming and Claude Shannon were working at Bell Labs when this happened. Information theory tends to heavily use tools from probability theory (together with an "asymptotic" way of thinking about the world), while traditional "algebraic" coding theory tends to employ mathematics that is much more finite-sequence-length/combinatorial in nature, including linear algebra over Galois fields. The emergence of codes over graphs in the late 90s and the first decade of the 2000s blurred this distinction, though, as code classes such as low-density parity-check codes employ both asymptotic analysis and random code selection techniques which have counterparts in information theory.

They do not subsume each other. Information theory touches on many other aspects that coding theory does not, and vice-versa. Information theory also touches on compression (lossy & lossless), statistics (e.g. large deviations), modeling (e.g. Minimum Description Length). Coding theory pays a lot of attention to sphere packing and coverings for finite length sequences - information theory addresses these problems (channel & lossy source coding) only in an asymptotic/approximate sense.
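
A small sketch of the "worst-case (Hamming) vs. distributional (Shannon)" split above (toy code and illustrative parameter, not from the answer): the minimum distance of a block code is the worst-case quantity that guarantees error correction on any codeword, while the capacity of a binary symmetric channel is the asymptotic, probabilistic limit on achievable rate.

```python
# Hamming side: minimum distance d guarantees correction of floor((d-1)/2) errors
# on *any* codeword, however the errors are placed.
# Shannon side: BSC capacity is an asymptotic, probabilistic limit on the rate.
from itertools import combinations
from math import log2, floor

def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

# The [3,1] repetition code: two codewords, minimum distance 3, corrects 1 error.
code = ["000", "111"]
d_min = min(hamming_distance(a, b) for a, b in combinations(code, 2))
print("minimum distance:", d_min,
      "-> corrects", floor((d_min - 1) / 2), "error(s), worst case")

def bsc_capacity(p):
    """Capacity (bits per use) of a binary symmetric channel with crossover probability p."""
    if p in (0.0, 1.0):
        return 1.0
    h = -p * log2(p) - (1 - p) * log2(1 - p)  # binary entropy
    return 1 - h

print("BSC capacity at p=0.11:", round(bsc_capacity(0.11), 3), "bits per channel use")
```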
q-n-a  qra  math  acm  tcs  information-theory  coding-theory  big-picture  comparison  confusion  explanation  linear-algebra  polynomials  limits  finiteness  math.CO  hi-order-bits  synthesis  probability  bits  hamming  shannon  intricacy  nibble  s:null  signal-noise 
february 2017 by nhaliday
electromagnetism - Is Biot-Savart law obtained empirically or can it be derived? - Physics Stack Exchange
Addendum: In mathematics and science it is important to keep in mind the distinction between the historical and the logical development of a subject. Knowing the history of a subject can be useful to get a sense of the personalities involved and sometimes to develop an intuition about the subject. The logical presentation of the subject is the way practitioners think about it. It encapsulates the main ideas in the most complete and simple fashion. From this standpoint, electromagnetism is the study of Maxwell's equations and the Lorentz force law. Everything else is secondary, including the Biot-Savart law.
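
For reference, the "logical presentation" the answer describes (standard statements in SI units, added here for context): Maxwell's equations plus the Lorentz force law, with the Biot-Savart law appearing only as the solution of the magnetostatic special case (steady currents, no time dependence).

```latex
\[
  \nabla\cdot\mathbf{E}=\frac{\rho}{\varepsilon_0},\quad
  \nabla\cdot\mathbf{B}=0,\quad
  \nabla\times\mathbf{E}=-\frac{\partial\mathbf{B}}{\partial t},\quad
  \nabla\times\mathbf{B}=\mu_0\mathbf{J}+\mu_0\varepsilon_0\frac{\partial\mathbf{E}}{\partial t},
  \qquad
  \mathbf{F}=q\,(\mathbf{E}+\mathbf{v}\times\mathbf{B}).
\]
\[
  \text{Magnetostatics }\Bigl(\tfrac{\partial}{\partial t}=0,\ \nabla\cdot\mathbf{J}=0\Bigr):\qquad
  \mathbf{B}(\mathbf{r})=\frac{\mu_0}{4\pi}\int
  \frac{\mathbf{J}(\mathbf{r}')\times(\mathbf{r}-\mathbf{r}')}{|\mathbf{r}-\mathbf{r}'|^{3}}\,d^3r'
  \quad\text{(Biot--Savart).}
\]
```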
q-n-a  overflow  physics  electromag  synthesis  proofs  nibble 
february 2017 by nhaliday
general topology - What should be the intuition when working with compactness? - Mathematics Stack Exchange
http://math.stackexchange.com/questions/485822/why-is-compactness-so-important

The situation with compactness is sort of like the above. It turns out that finiteness, which you think of as one concept (in the same way that you think of "Foo" as one concept above), is really two concepts: discreteness and compactness. You've never seen these concepts separated before, though. When people say that compactness is like finiteness, they mean that compactness captures part of what it means to be finite in the same way that shortness captures part of what it means to be Foo.

--

As many have said, compactness is sort of a topological generalization of finiteness. And this is true in a deep sense, because topology deals with open sets, and this means that we often "care about how something behaves on an open set", and for compact spaces this means that there are only finitely many possible behaviors.

--

Compactness does for continuous functions what finiteness does for functions in general.

If a set A is finite then every function f:A→R has a max and a min, and every function f:A→R^n is bounded. If A is compact, then every continuous function from A to R has a max and a min, and every continuous function from A to R^n is bounded.

If A is finite then every sequence of members of A has a subsequence that is eventually constant, and "eventually constant" is the only kind of convergence you can talk about without talking about a topology on the set. If A is compact, then every sequence of members of A has a convergent subsequence.
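
Stated side by side (standard results, added for reference), the "max and min" remark above is the trivial finite fact next to its compact analogue, the extreme value theorem; this is what makes the slogan "compactness does for continuous functions what finiteness does for functions in general" precise.

```latex
\[
  A\ \text{finite},\ f:A\to\mathbb{R}\ \text{any function}
  \;\Longrightarrow\;
  \exists\,a\in A:\ f(a)=\max_{x\in A} f(x).
\]
\[
  K\ \text{compact},\ f:K\to\mathbb{R}\ \text{continuous}
  \;\Longrightarrow\;
  f(K)\ \text{is compact, hence closed and bounded, so }f\text{ attains a maximum and a minimum on }K.
\]
```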
q-n-a  overflow  math  topology  math.GN  concept  finiteness  atoms  intuition  oly  mathtariat  multi  discrete  gowers  motivation  synthesis  hi-order-bits  soft-question  limits  things  nibble  definition  convergence  abstraction 
january 2017 by nhaliday
Shtetl-Optimized » Blog Archive » Logicians on safari
So what are they then? Maybe it’s helpful to think of them as “quantitative epistemology”: discoveries about the capacities of finite beings like ourselves to learn mathematical truths. On this view, the theoretical computer scientist is basically a mathematical logician on a safari to the physical world: someone who tries to understand the universe by asking what sorts of mathematical questions can and can’t be answered within it. Not whether the universe is a computer, but what kind of computer it is! Naturally, this approach to understanding the world tends to appeal most to people for whom math (and especially discrete math) is reasonably clear, whereas physics is extremely mysterious.

the sequel: http://www.scottaaronson.com/blog/?p=153
tcstariat  aaronson  tcs  computation  complexity  aphorism  examples  list  reflection  philosophy  multi  summary  synthesis  hi-order-bits  interdisciplinary  lens  big-picture  survey  nibble  org:bleg  applications  big-surf  s:*  p:whenever  ideas 
january 2017 by nhaliday
ca.analysis and odes - Why do functions in complex analysis behave so well? (as opposed to functions in real analysis) - MathOverflow
Well, real-valued analytic functions are just as rigid as their complex-valued counterparts. The true question is why complex smooth (or complex differentiable) functions are automatically complex analytic, whilst real smooth (or real differentiable) functions need not be real analytic.
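
The standard example behind this distinction (textbook facts, added for context): a real function can be C^∞ without being analytic, while a single complex derivative on an open set already forces analyticity via Cauchy's integral formula.

```latex
\[
  f(x)=\begin{cases} e^{-1/x^{2}} & x\neq 0\\[2pt] 0 & x=0\end{cases}
  \quad\text{is } C^{\infty}\text{ on }\mathbb{R},\ f^{(n)}(0)=0\ \forall n,
  \text{ so its Taylor series at }0\text{ is }0\neq f:\ \text{smooth but not real analytic.}
\]
\[
  f\ \text{holomorphic on open }U\subseteq\mathbb{C},\ \overline{D(a,r)}\subset U
  \;\Longrightarrow\;
  f(z)=\frac{1}{2\pi i}\oint_{|w-a|=r}\frac{f(w)}{w-z}\,dw
  =\sum_{n\ge 0}\frac{f^{(n)}(a)}{n!}\,(z-a)^{n}\quad\text{on }D(a,r).
\]
```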
q-n-a  overflow  math  math.CA  math.CV  synthesis  curiosity  gowers  oly  mathtariat  tcstariat  comparison  rigidity  smoothness  singularity  regularity  nibble 
january 2017 by nhaliday
ho.history overview - Proofs that require fundamentally new ways of thinking - MathOverflow
my favorite:
Although this has already been said elsewhere on MathOverflow, I think it's worth repeating that Gromov is someone who has arguably introduced more radical thoughts into mathematics than anyone else. Examples involving groups with polynomial growth and holomorphic curves have already been cited in other answers to this question. I have two other obvious ones but there are many more.

I don't remember where I first learned about convergence of Riemannian manifolds, but I had to laugh because there's no way I would have ever conceived of such a notion. To be fair, all of the groundwork for this was laid out in Cheeger's thesis, but it was Gromov who reformulated everything as a convergence theorem and recognized its power.

Another time Gromov made me laugh was when I was reading what little I could understand of his book Partial Differential Relations. This book is probably full of radical ideas that I don't understand. The one I did was his approach to solving the linearized isometric embedding equation. His radical, absurd, but elementary idea was that if the system is sufficiently underdetermined, then the linear partial differential operator could be inverted by another linear partial differential operator. Both the statement and proof are for me the funniest in mathematics. Most of us view solving PDE's as something that requires hard work, involving analysis and estimates, and Gromov manages to do it using only elementary linear algebra. This then allows him to establish the existence of isometric embedding of Riemannian manifolds in a wide variety of settings.
q-n-a  overflow  soft-question  big-list  math  meta:math  history  insight  synthesis  gowers  mathtariat  hi-order-bits  frontier  proofs  magnitude  giants  differential  geometry  limits  flexibility  nibble  degrees-of-freedom  big-picture  novelty  zooming  big-surf  wild-ideas  metameta  courage  convergence  ideas  innovation  the-trenches  discovery  creative 
january 2017 by nhaliday