neural_networks   1018

[1902.06720] Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent
"A longstanding goal in deep learning research has been to precisely characterize training and generalization. However, the often complex loss landscapes of neural networks have made a theory of learning dynamics elusive. In this work, we show that for wide neural networks the learning dynamics simplify considerably and that, in the infinite width limit, they are governed by a linear model obtained from the first-order Taylor expansion of the network around its initial parameters. Furthermore, mirroring the correspondence between wide Bayesian neural networks and Gaussian processes, gradient-based training of wide neural networks with a squared loss produces test set predictions drawn from a Gaussian process with a particular compositional kernel. While these theoretical results are only exact in the infinite width limit, we nevertheless find excellent empirical agreement between the predictions of the original network and those of the linearized version even for finite practically-sized networks. This agreement is robust across different architectures, optimization methods, and loss functions."
to:NB  neural_networks  optimization 
2 days ago by cshalizi
Do ImageNet Classifiers Generalize to ImageNet?
"We build new test sets for the CIFAR-10 and ImageNet datasets. Both benchmarks have been
the focus of intense research for almost a decade, raising the danger of overfitting to excessively
re-used test sets. By closely following the original dataset creation processes, we test to what
extent current classification models generalize to new data. We evaluate a broad range of models
and find accuracy drops of 3% – 15% on CIFAR-10 and 11% – 14% on ImageNet. However,
accuracy gains on the original test sets translate to larger gains on the new test sets. Our results
suggest that the accuracy drops are not caused by adaptivity, but by the models’ inability to
generalize to slightly “harder” images than those found in the original test sets."

--- The astonishing thing to me is the _linear_ relationship between accuracy on the old and new data-set versions. It's uncannily good. (Also: tiny changes in data-preparation make a big difference!)
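One hedged way to read that, in notation not from the paper: if new-test accuracy is roughly an affine function of original-test accuracy, $\mathrm{acc}_{\mathrm{new}} \approx \beta \, \mathrm{acc}_{\mathrm{old}} + \alpha$, then a slope $\beta > 1$ is exactly what makes gains on the original test set translate into larger gains on the new one, as the abstract reports.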
to:NB  have_read  classifiers  neural_networks  data_sets  to_teach:data-mining 
5 days ago by cshalizi
NLP Learning Series: Text Preprocessing Methods for Deep Learning
Recently I started on an NLP competition on Kaggle, the Quora Question Insincerity challenge. It is an NLP challenge on text classification, and the problem became clearer after working through the competition as well as by going through the invaluable kernels put up by the Kaggle ...
neural_networks  natural_language_processing  machine_learning 
5 weeks ago by n8henrie
Size-Independent Sample Complexity of Neural Networks
"We study the sample complexity of learning neural networks, by providing new bounds on their Rademacher complexity assuming norm constraints on the parameter matrix of each layer. Compared to previous work, these complexity bounds have improved dependence on the network depth, and under some additional assumptions, are fully independent of the network size (both depth and width). These results are derived using some novel techniques, which may be of independent interest."
to:NB  to_read  neural_networks  learning_theory  rakhlin.sasha  via:? 
5 weeks ago by cshalizi
mit-deep-learning
This repository is a collection of tutorials for MIT Deep Learning courses. More added as courses progress.
machine_learning  deep_learning  neural_networks  AI 
6 weeks ago by naijeru
GitHub - BrainJS/brain.js: 🤖 Neural networks in JavaScript

brain.js is a library of Neural Networks written in JavaScript.

NEW! A fun and practical introduction to Brain.js

💡 Note: This is a continuation of the harthur/brain repository (which is not maintained anymore). For more details, check out this issue.
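A minimal usage sketch, based on the XOR example in the brain.js documentation; treat the exact import form and option defaults as assumptions to check against the current README:

    // Train a small feed-forward network on XOR and evaluate it on one input.
    import { NeuralNetwork } from 'brain.js';  // assumed named export; older docs use require('brain.js')

    const net = new NeuralNetwork();           // default layer sizes and training options
    net.train([
      { input: [0, 0], output: [0] },
      { input: [0, 1], output: [1] },
      { input: [1, 0], output: [1] },
      { input: [1, 1], output: [0] },
    ]);

    console.log(net.run([1, 0]));              // an array with a single value close to 1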
for_future_projects  brain_candy  brain_candy/start_here  neural_networks  javascript  artificial_intelligence  machine_learning 
7 weeks ago by psawaya
A Style-Based Generator Architecture for Generative Adversarial Networks - YouTube
Just some kind of monstrous thing: creating and modifying some images based on others using generative adversarial networks.
Paper: http://stylegan.xyz/paper
On the algorithm: https://ru.wikipedia.org/wiki/%D0%93%D0%B5%D0%BD%D0%B5%D1%80%D0%B0%D1%82%D0%B8%D0%B2%D0%BD%D0%BE-%D1%81%D0%BE%D1%81%D1%82%D1%8F%D0%B7%D0%B0%D1%82%D0%B5%D0%BB%D1%8C%D0%BD%D0%B0%D1%8F_%D1%81%D0%B5%D1%82%D1%8C
programming  neural_networks  machine_learning  deep_learning 
8 weeks ago by gevorg
Neural networks - YouTube
A mini video course on neural networks
video  courses  neural_networks  deep_learning  machine_learning 
8 weeks ago by gevorg
torch-rnn options & settings
Efficient, reusable RNNs and LSTMs for torch.
neural_networks  torch_rnn 
10 weeks ago by jkeefe
