seq2seq (100 bookmarks)

dreasysnail/textCNN_public
tensorflow  convolutional  seq2seq  autoencoder  deep-learning  github 
6 weeks ago by nharbour
ymym3412/textcnn-conv-deconv-pytorch: text convolution-deconvolution auto-encoder model in PyTorch
pytorch  convolutional  seq2seq  autoencoder  deep-learning  github 
6 weeks ago by nharbour
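
For orientation, a minimal sketch of what a text convolution-deconvolution autoencoder looks like: embed tokens, compress the sequence with strided Conv1d layers, then reconstruct it with mirrored ConvTranspose1d layers. All names and layer sizes below are illustrative assumptions, not code from this repo.

import torch
import torch.nn as nn

class ConvDeconvAE(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Encoder: two strided convolutions halve the sequence length twice.
        self.encoder = nn.Sequential(
            nn.Conv1d(emb_dim, hidden, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
        )
        # Decoder: transposed convolutions mirror the encoder, restoring length.
        self.decoder = nn.Sequential(
            nn.ConvTranspose1d(hidden, hidden, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose1d(hidden, emb_dim, kernel_size=4, stride=2, padding=1),
        )
        self.out = nn.Linear(emb_dim, vocab_size)   # per-position token logits

    def forward(self, tokens):                      # tokens: (batch, seq_len)
        x = self.embed(tokens).transpose(1, 2)      # (batch, emb_dim, seq_len)
        z = self.encoder(x)                         # compressed latent code
        y = self.decoder(z).transpose(1, 2)         # (batch, seq_len, emb_dim)
        return self.out(y)                          # reconstruction logits

tokens = torch.randint(0, 10000, (8, 32))           # seq_len divisible by 4
logits = ConvDeconvAE()(tokens)                     # (8, 32, 10000)

Training would minimize cross-entropy between these logits and the original token ids.
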
Seq2Seq-PyTorch/nmt_autoencoder.py at master · MaximumEntropy/Seq2Seq-PyTorch
autoencoder  pytorch  nlp  lstm  deep-learning  seq2seq 
6 weeks ago by nharbour
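
The file above is an LSTM-based variant of the same idea. A minimal sketch, assuming a standard setup (not the repo's actual interface): the encoder's final (h, c) state seeds a decoder trained with teacher forcing to reproduce the input sentence.

import torch
import torch.nn as nn

class LSTMAutoencoder(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, src, tgt_in):
        # src: source tokens; tgt_in: BOS-shifted copy of src (teacher forcing).
        _, state = self.encoder(self.embed(src))    # state = (h_n, c_n)
        dec, _ = self.decoder(self.embed(tgt_in), state)
        return self.out(dec)                        # (batch, seq_len, vocab)

src = torch.randint(1, 10000, (4, 20))
bos = torch.zeros(4, 1, dtype=torch.long)           # assume token id 0 = <bos>
logits = LSTMAutoencoder()(src, torch.cat([bos, src[:, :-1]], dim=1))
loss = nn.CrossEntropyLoss()(logits.reshape(-1, 10000), src.reshape(-1))
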
Attention Is All You Need
The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration. The best performing models also connect the encoder and decoder through an attention mechanism. We propose a new simple network architecture, the Transformer, based solely on attention mechanisms, dispensing with recurrence and convolutions entirely. Experiments on two machine translation tasks show these models to be superior in quality while being more parallelizable and requiring significantly less time to train. Our model achieves 28.4 BLEU on the WMT 2014 English-to-German translation task, improving over the existing best results, including ensembles, by over 2 BLEU. On the WMT 2014 English-to-French translation task, our model establishes a new single-model state-of-the-art BLEU score of 41.8 after training for 3.5 days on eight GPUs, a small fraction of the training costs of the best models from the literature. We show that the Transformer generalizes well to other tasks by applying it successfully to English constituency parsing both with large and limited training data.
nlp  rnn  cnn  research  paper  seq2seq  AI  machinecomprehension  machinelearning 
10 weeks ago by sachaa
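
The core operation behind this abstract is scaled dot-product attention: Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A minimal sketch of just that function (the full Transformer adds multi-head projections, masking, positional encodings, and the whole encoder-decoder stack):

import math
import torch

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (batch, seq_len, d_k); returns attention-weighted sum of values.
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)   # (batch, len_q, len_k)
    weights = torch.softmax(scores, dim=-1)             # rows sum to 1
    return weights @ v

q = k = v = torch.randn(2, 10, 64)                      # self-attention case
out = scaled_dot_product_attention(q, k, v)             # (2, 10, 64)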

related tags

act  adversarial  ai  algorithms  amazing  asr-error  asr  attention  aug1  autoencoder  blog  bot  bots  brenden-lake  causal-convnet  chainer  chat  chatbot  chollet  cnn  code  colab  convnet  convolutional  debugging  decoder  deep-learning  deep  deep_learning  deeplearning  dialog  dialogue  discrete  discussion  dothis  dtw  e2e  encoder  encoding  fast.ai  finance  financial  forecast  forecasting  generalization  generative  geometry  github  google  graves  history  howto  icml2017  information  keras  layperson  learning  legaldeck  library  libs  links  lstm  machine-translation  machine_learning  machine_translation  machinecomprehension  machinelearning  ml  models  mt  nearest  neighbor  network  neural-mt  neural-net  neural_networks  nlp  nlproc  nmt  nn  note  paper  papers  python  pytorch  q&a  reading-lists  readit  reddit  reinforcementlearning  research  rnn  search  segphrase  seqm  sequence  sequential-modeling  speech  speech_recognition  summarisation  summariser  summarization  summarizer  tensorflow  textsum  tf  theano  time-series  tips  torch  transducer  transformer  tts  tutorial  tutorials  visualization  vqvae  wavenet 
