linear-algebra (766)

Relationship between SVD and PCA. How to use SVD to perform PCA? - Cross Validated
Principal component analysis (PCA) is usually explained via an eigen-decomposition of the covariance matrix. However, it can also be performed via singular value decomposition (SVD) of the data matrix X. How does it work? What is the connection between these two approaches? What is the relationship between SVD and PCA? – Answer by amoeba
principal-component-analysis  data-analysis  statistical-methods  linear-algebra  mathematics  stackexchange  methodology 
9 days ago by haikara
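The connection the question asks about: if the centered data matrix has SVD X = USVᵀ, then the covariance matrix is XᵀX/(n−1) = V·(S²/(n−1))·Vᵀ, so the columns of V are the principal directions and S²/(n−1) are the eigenvalues of the covariance matrix. A minimal NumPy sketch of both routes on toy data (the array names and the random data are illustrative, not from the linked answer):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))  # toy data: 100 samples, 3 features

# PCA requires centering: subtract the column means.
Xc = X - X.mean(axis=0)
n = len(Xc)

# SVD route: Xc = U S V^T. Columns of V are principal directions.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores_svd = U * S                  # principal component scores (Xc projected onto V)
explained_var = S**2 / (n - 1)      # eigenvalues of the covariance matrix, descending

# Eigen-decomposition route, for comparison.
cov = Xc.T @ Xc / (n - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
eigvals = eigvals[::-1]             # eigh returns ascending order; reverse to descending

assert np.allclose(explained_var, eigvals)
assert np.allclose(Xc @ Vt.T, scores_svd)
```

The SVD route is generally preferred in practice: it avoids forming XᵀX explicitly, which squares the condition number of the problem.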
Linear algebra cheat sheet for deep learning – Towards Data Science – Medium
While participating in Jeremy Howard’s excellent deep learning course I realized I was a little rusty on the prerequisites and my fuzziness was impacting my ability to understand concepts like…
linear-algebra  python  numpy 
5 weeks ago by sidmitra

