The matrix calculus you need for deep learning

65 bookmarks. First posted by dlkinney july 2018.

Jeremy's courses show how to become a world-class deep learning practitioner with only a minimal level of scalar calculus, thanks to leveraging the automatic differentiation built into modern deep learning libraries. But if you really want to understand what's going on under the hood of these libraries, and grok academic papers discussing the latest advances in model training techniques, you'll need to understand certain bits of the field of matrix calculus.
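The automatic differentiation mentioned above can be illustrated with a toy forward-mode implementation using dual numbers — a minimal sketch of the idea, not how PyTorch or TensorFlow are actually implemented:

```python
# Forward-mode automatic differentiation via dual numbers: a value carries
# its derivative along with it, and arithmetic rules propagate both.
class Dual:
    """Represents a + b*eps where eps**2 == 0; `deriv` carries df/dx."""
    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__


def derivative(f, x):
    """Evaluate df/dx at x by seeding the dual part with 1."""
    return f(Dual(x, 1.0)).deriv


# f(x) = 3x^2 + 2x has f'(x) = 6x + 2, so f'(4) = 26
print(derivative(lambda x: 3 * x * x + 2 * x, 4.0))  # -> 26.0
```

Real libraries mostly use reverse-mode (backpropagation) rather than this forward-mode scheme, but the principle — derivatives computed mechanically by rule, not by symbolic algebra — is the same.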

calculus
13 days ago by yizhexu

deeplearning
math
learning
howto
guides
machinelearning
neuralnetworks
13 days ago by wesleythill

13 days ago
by snafubar
> This paper is an attempt to explain all the matrix calculus you need in order to understand the training of deep neural networks. We assume no math knowledge beyond what you learned in calculus 1, and provide links to help you refresh the necessary math where needed. Note that you do not need to understand this material before you start learning to train and use deep learning in practice; rather, this material is for those who are already familiar with the basics of neural networks, and wish to deepen their understanding of the underlying math. Don't worry if you get stuck at some point along the way---just go back and reread the previous section, and try writing down and working through some examples. And if you're still stuck, we're happy to answer your questions in the Theory category at forums.fast.ai. Note: There is a reference section at the end of the paper summarizing all the key matrix calculus rules and terminology discussed here.
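The "writing down and working through some examples" advice from the abstract can be made concrete. Below is a hedged sketch of the kind of result the paper derives — the gradient of a single ReLU unit's squared error with respect to its weight vector, checked against a finite-difference approximation (the names w, x, b, y are generic, not taken from the paper):

```python
# Loss for one ReLU neuron: L = (max(0, w.x + b) - y)^2
def loss(w, x, b, y):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b  # affine: z = w.x + b
    a = max(0.0, z)                                # ReLU activation
    return (a - y) ** 2


def analytic_grad(w, x, b, y):
    """Chain rule: dL/dw_i = 2(a - y) * relu'(z) * x_i, relu'(z) = [z > 0]."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    a = max(0.0, z)
    g = 2.0 * (a - y) * (1.0 if z > 0 else 0.0)
    return [g * xi for xi in x]


def numeric_grad(w, x, b, y, h=1e-6):
    """Central finite differences, one perturbed weight at a time."""
    grad = []
    for i in range(len(w)):
        wp = list(w); wp[i] += h
        wm = list(w); wm[i] -= h
        grad.append((loss(wp, x, b, y) - loss(wm, x, b, y)) / (2 * h))
    return grad


w, x, b, y = [0.5, -0.3], [1.0, 2.0], 0.2, 1.0
ga, gn = analytic_grad(w, x, b, y), numeric_grad(w, x, b, y)
assert all(abs(a - n) < 1e-4 for a, n in zip(ga, gn))
print(ga)
```

This check — derive the gradient by hand, then confirm it numerically — is exactly the kind of exercise the abstract recommends when you get stuck.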

Co-written by Terence Parr of ANTLR fame

15 days ago
by briandk
from instapaper
16 days ago by rogerhsueh

16 days ago
by kas

(We teach in University of San Francisco's MS in Data Science program and have other nefarious projects underway. You might know Terence as the creator of the ANTLR parser generator. For more material, see Jeremy's fast.

IFTTT
Pocket
august 2019 by domingogallardo

maths
machinelearning
july 2019 by sandipb

from instapaper
march 2019 by matttrent

Reviews matrix derivatives.

calculus
matrix-calculus
neural-networks
!M-⚽-methods-data-analysis-bayesian-statistics
january 2019 by beyondseven

calculus
math
january 2019 by force

january 2019
by vqc

IFTTT
Pocket
december 2018 by timothyarnold

july 2018 by dlkinney

tags

!m-⚽-methods-data-analysis-bayesian-statistics ai algebra bigdata calculus data-science deep-learning deeplearning deep_learning dl education field gpreps guides howto ifttt learn learning linear-algebra linear linearalgebra machine-learning machine machinelearning machine_learning math mathematics maths matrix-algebra matrix-calculus matrix ml mlt network neural-networks neural neuralnets neuralnetworks pocket python reference stats:machine-learning tensor textbook theory toread tutorial tutorials vector