convolutions (28 bookmarks)

GitHub - vdumoulin/conv_arithmetic: A technical report on convolution arithmetic in the context of deep learning
cnn  deep-learning  deeplearning  visualization  convolution  convnet  convolutional  convolutions  deconvolution  deep_learning 
july 2018 by ohnice
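The report behind this repo works through convolution output-size arithmetic; below is a minimal sketch of the standard per-axis relation (my own illustration, not code from the repo), assuming symmetric padding and a single spatial axis:

def conv_output_size(i: int, k: int, s: int = 1, p: int = 0) -> int:
    # Standard relation per spatial axis: o = floor((i + 2p - k) / s) + 1,
    # for input size i, kernel size k, stride s, padding p.
    return (i + 2 * p - k) // s + 1

# A 3x3 kernel with stride 1 and padding 1 preserves size; stride 2 roughly halves it.
assert conv_output_size(28, k=3, s=1, p=1) == 28
assert conv_output_size(28, k=3, s=2, p=1) == 14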
Intuitively Understanding Convolutions for Deep Learning
Convolutions as a concept are fascinatingly powerful and highly extensible. In this post we break down the mechanics of the convolution operation step by step, relate it to the standard fully connected network, and explore how convolutional layers build up a strong visual hierarchy, making them powerful feature extractors for images.
deeplearning  neuralnetworks  convolutions  tutorials  datascience 
june 2018 by areich
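A rough NumPy sketch of the sliding-window mechanics the post describes (single channel, "valid" padding, stride 1); the function name and kernel here are my own illustration, not the post's:

import numpy as np

def conv2d_valid(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for y in range(oh):
        for x in range(ow):
            # Each output pixel is a dot product of the kernel with one patch,
            # i.e. a tiny fully connected layer whose weights are shared at every location.
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

edge_kernel = np.array([[-1., 0., 1.]] * 3)                   # simple vertical-edge detector
print(conv2d_valid(np.random.rand(8, 8), edge_kernel).shape)  # (6, 6)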
deep learning - What do you mean by 1D, 2D and 3D Convolutions in CNN? - Stack Overflow
Can anyone clearly explain the difference between 1D, 2D and 3D convolutions in CNNs (deep learning), with examples? In a nutshell: the direction the kernel slides and the resulting output shape are what distinguish them.
Archive  1d  2d  3d  convolutions  deeplearning 
june 2018 by leninworld
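For concreteness, a quick shape check (assuming PyTorch is available) of what 1D/2D/3D mean here: the kernel slides along one, two, or three spatial/temporal axes respectively.

import torch
import torch.nn as nn

x1 = torch.randn(1, 16, 100)        # (batch, channels, length)   e.g. audio or text
x2 = torch.randn(1, 3, 32, 32)      # (batch, channels, H, W)     e.g. images
x3 = torch.randn(1, 1, 16, 32, 32)  # (batch, channels, D, H, W)  e.g. video or volumes

print(nn.Conv1d(16, 8, kernel_size=3)(x1).shape)  # torch.Size([1, 8, 98])
print(nn.Conv2d(3, 8, kernel_size=3)(x2).shape)   # torch.Size([1, 8, 30, 30])
print(nn.Conv3d(1, 8, kernel_size=3)(x3).shape)   # torch.Size([1, 8, 14, 30, 30])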
locuslab/TCN: Sequence modeling benchmarks and temporal convolutional networks
cnn  nlp  deep-learning  temporal  convolution  convolutional  convolutions 
march 2018 by nharbour
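Not the repo's code, but a minimal sketch (assuming PyTorch) of the dilated causal 1D convolution that TCN layers are built from: left-pad the time axis by (kernel_size - 1) * dilation so the output at time t never sees inputs after t.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalConv1d(nn.Module):
    def __init__(self, in_ch: int, out_ch: int, kernel_size: int, dilation: int = 1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Pad only on the left of the time axis to keep the convolution causal.
        return self.conv(F.pad(x, (self.pad, 0)))

x = torch.randn(1, 4, 50)                                      # (batch, channels, time)
print(CausalConv1d(4, 8, kernel_size=3, dilation=2)(x).shape)  # torch.Size([1, 8, 50])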
Case Study: A world class image classifier for dogs and cats (err.., anything)
It is amazing how far computer vision has come in the last couple of years. Problems that are insanely intractable for classical machine learning methods are a piece of cake for the emerging field of…
deep-learning  convolutions  convolutional-neural-networks  neural-networks  differential-learning-rates  learning-rate  kaggle  fast.ai  transfer-learning  from pocket
february 2018 by rishaanp
Going beyond full utilization: The inside scoop on Nervana's Winograd kernels - Nervana | Nervana
By Urs Köster and Scott Gray. This is part 2 of a series of posts on how Nervana uses the Winograd algorithm to make convolutional networks faster than e…
DeepLearning  Winograd  convolutions  CVPR  CVPR2016 
july 2016 by rseymour
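A hedged NumPy illustration (not Nervana's kernels) of the 1D Winograd F(2,3) minimal-filtering identity these posts build on, per Lavin & Gray: two outputs of a 3-tap correlation from four multiplies instead of six.

import numpy as np

# Transform matrices for F(2,3): Y = A^T [ (G g) * (B^T d) ]
BT = np.array([[1,  0, -1,  0],
               [0,  1,  1,  0],
               [0, -1,  1,  0],
               [0,  1,  0, -1]], dtype=float)
G = np.array([[1.0,  0.0, 0.0],
              [0.5,  0.5, 0.5],
              [0.5, -0.5, 0.5],
              [0.0,  0.0, 1.0]])
AT = np.array([[1, 1,  1,  0],
               [0, 1, -1, -1]], dtype=float)

d = np.random.rand(4)  # input tile of 4 samples
g = np.random.rand(3)  # 3-tap filter

winograd = AT @ ((G @ g) * (BT @ d))           # only 4 elementwise multiplies
direct = np.array([d[0:3] @ g, d[1:4] @ g])    # plain sliding-window correlation
assert np.allclose(winograd, direct)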

related tags

1d  2d  3d  adam  alternative  animation  animations  archive  article  blogs  calculus  cnn  cnns  computer-vision  convnet  convolution  convolutional-neural-networks  convolutional  cool  cuda  cvpr  cvpr2016  data-augmentation  datascience  dataset  datasets  deconvolution  deep-learning  deep_learning  deeplearning  differential-learning-rates  distributions  dropout  electromagnetism  fast.ai  fft  gradient-boosting  gradients  image-recognition  image-similarity  image  interactive  javascript  kaggle  learning-rate-annealing  learning-rate  long-short-term-memory_networks  machine-learning  mathematics  max-pooling  maxpooling  neural-net  neural-networks  neuralnetworks  nlp  normals  physics  probability  rectified-linear-unit  recurrent-neural-networks  reference  reinforcement-learning  rnn  rnns  sgd  sgdr  signal_processing  temporal  test-time-augmentation  theano  transfer-learning  translation  tutorial  tutorials  types  visualization  wikipedia  winograd 
