nharbour + seq2seq   12

Why you should care about byte-level sequence-to-sequence models in NLP
This blog post explains how byte-level models work, where their benefits come from, and how they relate to other models, character-level and word-level models in particular.
rnn  seq2seq  byte  bytes  deep-learning  oov  nlp 
november 2018 by nharbour
Attention? Attention!
Attention has been a fairly popular concept and a useful tool in the deep learning community in recent years. In this post, we are going to look into how attent...
tutorial  tutorials  attention  deep-learning  seq2seq 
september 2018 by nharbour
dreasysnail/textCNN_public
GitHub is where people build software. More than 28 million people use GitHub to discover, fork, and contribute to over 85 million projects.
tensorflow  convolutional  seq2seq  autoencoder  deep-learning  github 
july 2018 by nharbour
ymym3412/textcnn-conv-deconv-pytorch: text convolution-deconvolution auto-encoder model in PyTorch
pytorch  convolutional  seq2seq  autoencoder  deep-learning  github 
july 2018 by nharbour
Seq2Seq-PyTorch/nmt_autoencoder.py at master · MaximumEntropy/Seq2Seq-PyTorch
autoencoder  pytorch  nlp  lstm  deep-learning  seq2seq 
july 2018 by nharbour