nhaliday + hamming   10

6.896: Essential Coding Theory
- probabilistic method and Chernoff bound for Shannon coding
- probabilistic method for asymptotically good Hamming codes (Gilbert–Varshamov bound; see the sketch after these notes)
- sparsity used for LDPC codes
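
A minimal sketch of the Gilbert–Varshamov argument (Python with numpy, assumed; the course notes carry no code, and the helper names here are illustrative): at rate about 1 - H(delta), a random binary linear code has minimum distance at or above the GV target delta*n with high probability.

    import numpy as np
    from math import log2

    def binary_entropy(p):
        # H(p) = -p*log2(p) - (1-p)*log2(1-p)
        if p in (0.0, 1.0):
            return 0.0
        return -p * log2(p) - (1 - p) * log2(1 - p)

    def min_distance_random_code(n, k, rng):
        # Exact minimum distance of a random [n, k] binary linear code,
        # by brute force over all 2^k - 1 nonzero messages (small k only).
        G = rng.integers(0, 2, size=(k, n))  # random generator matrix over F_2
        best = n
        for m in range(1, 2 ** k):
            msg = np.array([(m >> i) & 1 for i in range(k)])
            w = int((msg @ G % 2).sum())
            if 0 < w < best:  # skip degenerate zero codewords
                best = w
        return best

    rng = np.random.default_rng(0)
    n, delta = 24, 0.2
    k = int(n * (1 - binary_entropy(delta)))  # GV rate for target distance delta*n
    dists = [min_distance_random_code(n, k, rng) for _ in range(20)]
    print(f"GV target distance: {delta * n:.1f}, empirical minima: {dists}")
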
mit  course  yoga  tcs  complexity  coding-theory  math.AG  fields  polynomials  pigeonhole-markov  linear-algebra  probabilistic-method  lecture-notes  bits  sparsity  concentration-of-measure  linear-programming  linearity  expanders  hamming  pseudorandomness  crypto  rigorous-crypto  communication-complexity  no-go  madhu-sudan  shannon  unit  p:** 
february 2017 by nhaliday
What is the relationship between information theory and Coding theory? - Quora
basically:
- finite vs. asymptotic
- combinatorial vs. probabilistic (lotsa overlap there)
- worst-case (Hamming) vs. distributional (Shannon); see the numeric comparison below
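
A small numeric comparison of the two regimes (Python assumed, standalone): over BSC(p), Shannon permits rate 1 - H(p), while uniquely correcting a p-fraction of adversarial errors requires relative distance > 2p, where Gilbert–Varshamov guarantees rate 1 - H(2p).

    from math import log2

    def binary_entropy(p):
        if p in (0.0, 1.0):
            return 0.0
        return -p * log2(p) - (1 - p) * log2(1 - p)

    for p in (0.01, 0.05, 0.11, 0.2):
        shannon = 1 - binary_entropy(p)  # achievable rate vs. random errors
        gv = max(0.0, 1 - binary_entropy(min(2 * p, 0.5)))  # rate vs. adversarial errors
        print(f"p={p:.2f}  Shannon rate {shannon:.3f}  worst-case (GV) rate {gv:.3f}")
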

Information theory and coding theory most often appear together in the subject of error correction over noisy channels. Historically, they were born at almost exactly the same time: both Richard Hamming and Claude Shannon were working at Bell Labs when this happened. Information theory tends to lean heavily on tools from probability theory (together with an "asymptotic" way of thinking about the world), while traditional "algebraic" coding theory tends to employ mathematics of a much more finite-length, combinatorial nature, including linear algebra over Galois fields. The emergence of codes over graphs in the late 1990s and 2000s blurred this distinction, though: code families such as low-density parity-check (LDPC) codes employ both asymptotic analysis and random code selection, techniques with direct counterparts in information theory.
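A toy illustration of that blurring, assuming Python/numpy (the construction below is a crude stand-in, not Gallager's actual permutation-based design): a (3, 6)-regular LDPC-style parity-check matrix is simultaneously sparse combinatorial data and a randomly selected object.

    import numpy as np

    def regular_ldpc_parity_check(n, col_wt, row_wt, rng):
        # Random m x n binary matrix with col_wt ones per column and
        # row_wt ones per row on average (row weights not enforced exactly).
        m = n * col_wt // row_wt  # number of parity checks
        H = np.zeros((m, n), dtype=np.uint8)
        for j in range(n):
            rows = rng.choice(m, size=col_wt, replace=False)
            H[rows, j] = 1
        return H

    rng = np.random.default_rng(1)
    H = regular_ldpc_parity_check(n=24, col_wt=3, row_wt=6, rng=rng)
    print(H.shape, "density:", H.mean())  # density = col_wt/m, vanishing as n grows
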

Neither subsumes the other. Information theory covers many topics that coding theory does not, and vice versa: it also touches on compression (lossy and lossless), statistics (e.g. large deviations), and modeling (e.g. Minimum Description Length). Coding theory pays a lot of attention to sphere packings and coverings for finite-length sequences; information theory addresses these problems (channel and lossy source coding) only in an asymptotic/approximate sense.
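
For the sphere-packing remark, a short standalone sketch (Python assumed; hamming_bound is an illustrative name) of the finite-length Hamming bound |C| <= 2^n / Vol(n, t) with t = (d-1)//2:

    from math import comb

    def hamming_bound(n, d):
        # Upper bound on the size of a binary code of length n, min distance d:
        # Hamming balls of radius t = (d-1)//2 around codewords must be disjoint.
        t = (d - 1) // 2
        ball = sum(comb(n, i) for i in range(t + 1))
        return 2 ** n // ball

    print(hamming_bound(7, 3))   # 16: met with equality by the [7,4] Hamming code
    print(hamming_bound(23, 7))  # 4096: met by the perfect binary Golay code
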
q-n-a  qra  math  acm  tcs  information-theory  coding-theory  big-picture  comparison  confusion  explanation  linear-algebra  polynomials  limits  finiteness  math.CO  hi-order-bits  synthesis  probability  bits  hamming  shannon  intricacy  nibble  s:null  signal-noise 
february 2017 by nhaliday
