polynomials   133

[1705.00098] Xorshift random number generators from primitive polynomials
A class of xorshift random number generators (RNGs) was introduced by Marsaglia. We propose an algorithm that constructs a full-period xorshift RNG from a given primitive polynomial. We show that there is a weakness in those RNGs and suggest an improvement. We also propose a separate algorithm that returns a full-period xorshift generator with a desired number of xorshift operations. We also introduce the notion of a tweaked primitive multiple-recursive matrix method with improved linear complexity.
cryptography  algorithms  number-theory  polynomials  rather-interesting  performance-measure  nudge-targets  consider:stress-testing 
may 2017 by Vaguery
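The abstract does not spell out the construction; as a point of reference, here is a minimal sketch of Marsaglia's classic xorshift32 step with the well-known (13, 17, 5) shift triple. The word size and shift constants are illustrative choices, not taken from the paper, whose algorithm derives shift patterns from a chosen primitive polynomial.

```python
def xorshift32(state):
    """One step of a Marsaglia-style xorshift generator on 32-bit words,
    using the classic (13, 17, 5) shift triple (illustrative only)."""
    state ^= (state << 13) & 0xFFFFFFFF
    state ^= state >> 17
    state ^= (state << 5) & 0xFFFFFFFF
    return state

# Any nonzero 32-bit seed cycles through all 2**32 - 1 nonzero states.
s = 1
for _ in range(3):
    s = xorshift32(s)
    print(s)
```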
[1109.2396] New Solutions of $d=2x^3+y^3+z^3$
We discuss finding large integer solutions of $d = 2x^3 + y^3 + z^3$ by using Elsenhans and Jahnel's adaptation of Elkies' LLL-reduction method. We find 28 first solutions for $|d| < 10000$.
number-theory  algebra  polynomials  constraint-satisfaction  rather-interesting  to-write-about  nudge-targets  consider:looking-to-see  stamp-collecting  algorithms 
april 2017 by Vaguery
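The LLL-reduction search itself is involved; as a sanity check on the equation, here is a naive brute-force sketch (not the Elsenhans–Jahnel/Elkies method) that finds small solutions for a given d. The search bound is an arbitrary illustrative choice.

```python
from itertools import product

def small_solutions(d, bound=20):
    """Naive search for integer solutions of 2*x**3 + y**3 + z**3 == d
    with |x|, |y|, |z| <= bound; a toy check, not the paper's LLL method."""
    rng = range(-bound, bound + 1)
    return [(x, y, z) for x, y, z in product(rng, repeat=3)
            if 2 * x**3 + y**3 + z**3 == d]

print(small_solutions(3))  # includes (1, 1, 0), since 2*1 + 1 + 0 == 3
```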
Peter Norvig, the meaning of polynomials, debugging as psychotherapy | Quomodocumque
He briefly showed a demo where, given values of a polynomial, a machine can put together a few lines of code that successfully computes the polynomial. But the code looks weird to a human eye. To compute some quadratic, it nests for-loops and adds things up in a funny way that ends up giving the right output. So has it really “learned” the polynomial? I think in computer science, you typically feel you’ve learned a function if you can accurately predict its value on a given input. For an algebraist like me, a function determines but isn’t determined by the values it takes; to me, there’s something about that quadratic polynomial the machine has failed to grasp. I don’t think there’s a right or wrong answer here, just a cultural difference to be aware of. Relevant: Norvig’s description of “the two cultures” at the end of this long post on natural language processing (which is interesting all the way through!)
mathtariat  org:bleg  nibble  tech  ai  talks  summary  philosophy  lens  comparison  math  cs  tcs  polynomials  nlp  debugging  psychology  cog-psych  complex-systems  deep-learning  analogy  legibility  interpretability 
march 2017 by nhaliday
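To make the "nests for-loops and adds things up" observation concrete, here is a hypothetical example in that spirit (the specific quadratic x² + x is invented for illustration, not Norvig's actual demo output): the program returns the right values while hiding the algebraic structure.

```python
def quadratic_by_loops(x):
    # Hypothetical machine-style program: computes x**2 + x for x >= 0
    # by counting in nested loops rather than evaluating the polynomial.
    total = 0
    for _ in range(x):
        for _ in range(x):
            total += 1   # contributes x * x
        total += 1       # contributes x
    return total

assert quadratic_by_loops(7) == 7**2 + 7  # correct values, opaque structure
```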
6.896: Essential Coding Theory
- probabilistic method and Chernoff bound for Shannon coding
- probabilistic method for asymptotically good Hamming codes (Gilbert coding)
- sparsity used for LDPC codes
mit  course  yoga  tcs  complexity  coding-theory  math.AG  fields  polynomials  pigeonhole-markov  linear-algebra  probabilistic-method  lecture-notes  bits  sparsity  concentration-of-measure  linear-programming  linearity  expanders  hamming  pseudorandomness  crypto  rigorous-crypto  communication-complexity  no-go  madhu-sudan  shannon  unit  p:** 
february 2017 by nhaliday
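As a toy companion to the Gilbert item in the topic list above, here is a greedy (Gilbert-style) construction sketch: scan all binary words of length n and keep any word at Hamming distance at least d from every codeword kept so far. This illustrates the existence argument behind the Gilbert bound; it is not taken from the course notes.

```python
from itertools import product

def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

def gilbert_greedy_code(n, d):
    """Greedy (Gilbert-style) code over {0,1}^n: keep a word if it lies at
    distance >= d from every codeword chosen so far. The resulting code has
    at least 2**n / V(n, d-1) words, where V is the Hamming-ball volume."""
    code = []
    for word in product((0, 1), repeat=n):
        if all(hamming_distance(word, c) >= d for c in code):
            code.append(word)
    return code

print(len(gilbert_greedy_code(7, 3)))  # at least 128 / 29 words by the bound
```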
What is the relationship between information theory and Coding theory? - Quora
basically:
- finite vs. asymptotic
- combinatorial vs. probabilistic (lots of overlap there)
- worst-case (Hamming) vs. distributional (Shannon)

Information and coding theory most often appear together in the subject of error correction over noisy channels. Historically, they were born at almost exactly the same time: both Richard Hamming and Claude Shannon were working at Bell Labs when this happened. Information theory tends to rely heavily on tools from probability theory (together with an "asymptotic" way of thinking about the world), while traditional "algebraic" coding theory tends to employ mathematics that is much more combinatorial and tied to finite sequence lengths, including linear algebra over Galois fields. The emergence of codes over graphs in the late 1990s and 2000s blurred this distinction, though, as code classes such as low-density parity-check codes employ both asymptotic analysis and random code selection, techniques which have counterparts in information theory.

They do not subsume each other. Information theory touches on many aspects that coding theory does not, and vice versa. Information theory also touches on compression (lossy & lossless), statistics (e.g. large deviations), and modeling (e.g. Minimum Description Length). Coding theory pays a lot of attention to sphere packings and coverings for finite-length sequences; information theory addresses these problems (channel & lossy source coding) only in an asymptotic/approximate sense.
q-n-a  qra  math  acm  tcs  information-theory  coding-theory  big-picture  comparison  confusion  explanation  linear-algebra  polynomials  limits  finiteness  math.CO  hi-order-bits  synthesis  probability  bits  hamming  shannon  intricacy  nibble  s:null  signal-noise 
february 2017 by nhaliday
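A small sketch of the contrast the answer draws: Hamming distance is the finite, worst-case, combinatorial quantity, while the Shannon capacity of a binary symmetric channel is the asymptotic, probabilistic one. The sample values are illustrative only.

```python
import math

def hamming_distance(a, b):
    # Worst-case / combinatorial view: positions where two words differ.
    return sum(x != y for x, y in zip(a, b))

def bsc_capacity(p):
    # Asymptotic / probabilistic view: Shannon capacity, in bits per use,
    # of a binary symmetric channel with crossover probability p.
    if p in (0.0, 1.0):
        return 1.0
    return 1.0 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

print(hamming_distance("1011001", "1001101"))  # 2
print(round(bsc_capacity(0.11), 2))            # about 0.5 bits per channel use
```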
Ehrhart polynomial - Wikipedia
In mathematics, an integral polytope has an associated Ehrhart polynomial that encodes the relationship between the volume of a polytope and the number of integer points the polytope contains. The theory of Ehrhart polynomials can be seen as a higher-dimensional generalization of Pick's theorem in the Euclidean plane.
math  math.MG  trivia  polynomials  discrete  wiki  reference  atoms  geometry  spatial  nibble  curvature  convexity-curvature 
january 2017 by nhaliday
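A minimal worked example (the unit square, not from the article): the Ehrhart polynomial of [0,1]^2 is L(t) = (t + 1)^2, and a direct lattice-point count confirms it for small dilations.

```python
def lattice_points_in_dilated_unit_square(t):
    # Integer points (x, y) with 0 <= x, y <= t, i.e. in the dilate t*[0,1]^2.
    return sum(1 for x in range(t + 1) for y in range(t + 1))

# Ehrhart polynomial of the unit square: L(t) = (t + 1)**2
for t in range(1, 8):
    assert lattice_points_in_dilated_unit_square(t) == (t + 1) ** 2
```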
