Ten Techniques Learned From
1. Use the library
2. Don’t use one learning rate, use many
3. How to find the right learning rate
4. Cosine annealing
5. Stochastic Gradient Descent with restarts
6. Anthropomorphise your activation functions
7. Transfer learning is hugely effective in NLP
8. Deep learning can challenge ML in tackling structured data
9. A game-winning bundle: building up sizes, dropout and TTA
10. Creativity is key
dl  ml  deeplearning  transferlearning  learningrate 
14 days ago by drmeme
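Several of the techniques in the list above (cosine annealing, SGD with restarts) come down to a learning-rate schedule. A minimal sketch of an SGDR-style schedule in plain Python — the function name, default values, and the cycle-doubling rule are illustrative assumptions, not the article's exact code:

```python
import math

def sgdr_lr(step, cycle_len, lr_max=1e-2, lr_min=1e-5, mult=2):
    """Cosine-annealed learning rate with warm restarts (SGDR-style sketch).

    Within each cycle the rate decays from lr_max to lr_min along a
    half cosine; at the end of a cycle it jumps back to lr_max
    (the "restart"), and the next cycle is `mult` times longer.
    All parameter names/defaults here are hypothetical.
    """
    t, length = step, cycle_len
    while t >= length:          # find position inside the current cycle
        t -= length
        length *= mult          # assumed: each cycle doubles in length
    frac = t / length           # fraction of the current cycle elapsed
    return lr_min + 0.5 * (lr_max - lr_min) * (1 + math.cos(math.pi * frac))
```

For example, `sgdr_lr(0, 10)` returns the full `lr_max`, the rate decays toward `lr_min` over the first 10 steps, then restarts at `lr_max` on step 10.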
Tensorboard Callback for Fastai - Deep Learning - Deep Learning Course Forums
I created a callback that logs model and training information that can be viewed in tensorboard.
Tensorboard is a visualization tool that can help debug and explore your model. Tensorboa…
tensorboard  fastai  deep-learning 
28 days ago by nharbour
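The callback idea described above — hooking into the training loop to record scalars for later visualization — can be sketched as follows. This is a generic stand-in, not the forum author's code: the class and method names are hypothetical, and a real implementation would write to TensorBoard via something like `torch.utils.tensorboard.SummaryWriter` rather than an in-memory list:

```python
class TensorBoardLogger:
    """Minimal sketch of a training callback that records scalar metrics.

    The in-memory `history` list stands in for a real SummaryWriter;
    the on_*_end hook names mimic the common callback pattern but are
    assumptions, not fastai's actual API.
    """
    def __init__(self):
        self.history = []                # (step, metric_name, value) tuples
        self.step = 0

    def on_batch_end(self, loss):        # called by the training loop per batch
        self.history.append((self.step, "train/loss", float(loss)))
        self.step += 1

    def on_epoch_end(self, metrics):     # called once per epoch with a dict
        for name, value in metrics.items():
            self.history.append((self.step, f"valid/{name}", float(value)))

# Hypothetical usage inside a training loop:
logger = TensorBoardLogger()
for batch_loss in [0.9, 0.7, 0.5]:
    logger.on_batch_end(batch_loss)
logger.on_epoch_end({"accuracy": 0.81})
```

The design point is that logging lives entirely in the callback, so the training loop itself stays free of visualization code.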
fastai/devise.ipynb at master · fastai/fastai
knn  nearest-neighbour  k-nearest-neighbours  deep-learning  nmslib 
7 weeks ago by nharbour
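The tags above point at the notebook's retrieval step: given embedding vectors, find the k nearest neighbours (the notebook uses the approximate-index library nmslib). As a hedged stand-in, here is a brute-force cosine kNN in NumPy — the function name and interface are assumptions for illustration:

```python
import numpy as np

def knn_cosine(query, index_vecs, k=3):
    """Brute-force cosine k-nearest-neighbour search.

    A simple exact stand-in for an approximate index such as nmslib's
    HNSW; fine for small collections, too slow for millions of vectors.
    """
    q = query / np.linalg.norm(query)
    idx = index_vecs / np.linalg.norm(index_vecs, axis=1, keepdims=True)
    sims = idx @ q                  # cosine similarity of query to each row
    top = np.argsort(-sims)[:k]    # indices of the k most similar rows
    return top, sims[top]

# Hypothetical usage: three orthogonal "embeddings", query closest to row 0.
top, sims = knn_cosine(np.array([1.0, 0.1, 0.0]), np.eye(3), k=2)
```

An approximate index trades a little recall for sub-linear query time, which is why a library like nmslib matters once the collection grows.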