rishaanp + deep-learning (9)

Deep Learning, NLP, and Representations - colah's blog
This post reviews some remarkable results in applying deep neural networks to natural language processing (NLP). In doing so, I hope to make accessible one promising answer as to why deep neural networks work. I think it’s a very elegant perspective.
deep-learning  NLP  machine-learning  embeddings  word-embeddings  t-sne  recurrent-neural-networks  rnns  shared-representation  bilingual-word-embeddings  modular-network 
february 2018 by rishaanp
Transfer Learning using differential learning rates
In this post, I will be sharing what transfer learning is and how one can use transfer learning with the differential learning rates approach to create powerful custom models.
transfer-learning  fast.ai  cnns  convolutional-neural-networks  differential-learning-rates  neural-networks  machine-learning  deep-learning 
february 2018 by rishaanp
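As a rough illustration of the differential learning rates idea in the entry above, here is a minimal sketch in plain PyTorch (not the fast.ai library itself); the layer split and the rate values are made up for illustration, not the post's exact setup.

```python
# A minimal sketch, assuming plain PyTorch rather than the fast.ai library:
# fine-tune a pretrained ResNet with a different learning rate per layer group.
import torch
from torchvision import models

model = models.resnet34(pretrained=True)

# Illustrative split into three layer groups (not the post's exact grouping).
early = (list(model.conv1.parameters()) + list(model.bn1.parameters())
         + list(model.layer1.parameters()) + list(model.layer2.parameters()))
middle = list(model.layer3.parameters()) + list(model.layer4.parameters())
head = list(model.fc.parameters())

# Earlier layers hold generic pretrained features, so they get smaller learning
# rates; the freshly initialized head gets the largest one.
optimizer = torch.optim.SGD(
    [
        {"params": early, "lr": 1e-4},
        {"params": middle, "lr": 1e-3},
        {"params": head, "lr": 1e-2},
    ],
    momentum=0.9,
)
```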
Visualizing and Understanding Convolutional Networks
Large Convolutional Network models have recently demonstrated impressive classification performance on the ImageNet benchmark. However there is no clear understanding of why they perform so well, or how they might be improved. In this paper we address both issues. We introduce a novel visualization technique that gives insight into the function of intermediate feature layers and the operation of the classifier. We also perform an ablation study to discover the performance contribution from different model layers.
deep-learning  arxiv  convolutional-neural-networks  cnns  visualization  interpretability  interpretation 
february 2018 by rishaanp
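As a much simpler stand-in for the paper's deconvnet-based visualizations, the sketch below only grabs an intermediate layer's feature maps with a forward hook so they can be inspected; the model and layer index are arbitrary choices, not the paper's setup.

```python
# This is not the deconvnet technique from the paper; it is a minimal sketch of
# pulling out an intermediate convolutional layer's feature maps with a forward hook.
import torch
from torchvision import models

model = models.vgg16(pretrained=True).eval()
activations = {}

def save_activation(name):
    def hook(module, inputs, output):
        activations[name] = output.detach()
    return hook

# Register the hook on an arbitrary mid-level conv layer (index chosen for illustration).
model.features[10].register_forward_hook(save_activation("conv_mid"))

with torch.no_grad():
    _ = model(torch.randn(1, 3, 224, 224))  # stand-in for a real preprocessed image

print(activations["conv_mid"].shape)  # torch.Size([1, 256, 56, 56])
```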
Do smoother areas of the error surface lead to better generalization?
In the first lecture of the outstanding Deep Learning Course (linking to version 1, which is also superb; v2 becomes available in early 2018), we learned how to train a state-of-the-art model using…
deep-learning  generalization  neural-networks 
february 2018 by rishaanp
Case Study: A world class image classifier for dogs and cats (err.., anything)
It is amazing how far computer vision has come in the last couple of years. Problems that are insanely intractable for classical machine learning methods are a piece of cake for the emerging field of…
deep-learning  convolutions  convolutional-neural-networks  neural-networks  differential-learning-rates  learning-rate  kaggle  fast.ai  transfer-learning  from pocket
february 2018 by rishaanp
Decoding the ResNet architecture // teleported.in
A blog where I share my intuitions about artificial intelligence, machine learning, and deep learning.
resnet  shortcut-connections  network-architecture  convolutional-neural-networks  cnns  deep-learning  fast.ai  from pocket
february 2018 by rishaanp
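The shortcut connections tagged above are the heart of the ResNet design; below is a minimal sketch of a basic residual block, assuming stride 1 and matching input/output channels so no projection on the skip path is needed.

```python
# A minimal sketch of a ResNet basic block with an identity shortcut connection.
import torch
import torch.nn as nn

class BasicBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        identity = x                      # the shortcut carries the input forward unchanged
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + identity              # add the skip connection before the final ReLU
        return self.relu(out)

block = BasicBlock(64)
print(block(torch.randn(1, 64, 56, 56)).shape)  # torch.Size([1, 64, 56, 56])
```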
Improving the way we work with learning rate. – techburst
Most optimization algorithms (such as SGD, RMSprop, and Adam) require setting the learning rate — the most important hyper-parameter for training deep neural networks. A naive method for choosing the learning…
deep-learning  learning-rate  cyclical-learning-rate  fast.ai  learning-rate-annealing  from pocket
february 2018 by rishaanp
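The cyclical learning rate mentioned in the tags can be sketched with PyTorch's built-in CyclicLR scheduler; the model, bounds, and step sizes below are illustrative values, not the post's.

```python
# A minimal sketch of a cyclical (triangular) learning rate schedule on a toy model.
import torch
import torch.nn as nn

model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
scheduler = torch.optim.lr_scheduler.CyclicLR(
    optimizer, base_lr=1e-4, max_lr=1e-2, step_size_up=200
)

for step in range(1000):
    x, y = torch.randn(32, 10), torch.randint(0, 2, (32,))  # random stand-in batch
    loss = nn.functional.cross_entropy(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    scheduler.step()  # advance the learning rate along the triangular cycle each batch
```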
Batch normalization in Neural Networks
This article explains batch normalization in a simple way. I wrote it based on what I learned from Fast.ai and deeplearning.ai. I will start with why we need it, how it works, and then how to…
batch-normalization  transfer-learning  deep-learning  neural-networks  fast.ai  from pocket
february 2018 by rishaanp
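A minimal sketch of what batch normalization computes for a mini-batch of activations, covering only the training-time forward pass (inference uses running statistics, and the learned gamma/beta here are just initial values):

```python
# Normalize each feature over the mini-batch, then scale and shift with gamma and beta.
import torch

def batch_norm(x, gamma, beta, eps=1e-5):
    mean = x.mean(dim=0)                    # per-feature mean over the batch
    var = x.var(dim=0, unbiased=False)      # per-feature variance over the batch
    x_hat = (x - mean) / torch.sqrt(var + eps)
    return gamma * x_hat + beta

x = torch.randn(32, 8)                      # 32 examples, 8 features
gamma, beta = torch.ones(8), torch.zeros(8)
out = batch_norm(x, gamma, beta)
print(out.mean(dim=0))                      # roughly zero per feature
print(out.std(dim=0))                       # roughly one per feature
```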
