summarization   536


[1811.01824] Structured Neural Summarization
Summarization of long sequences into a concise statement is a core problem in natural language processing, requiring non-trivial understanding of the input. Based on the promising results of graph neural networks on highly structured data, we develop a framework to extend existing sequence encoders with a graph component that can reason about long-distance relationships in weakly structured data such as text. In an extensive evaluation, we show that the resulting hybrid sequence-graph models outperform both pure sequence models and pure graph models on a range of summarization tasks.
natural-language-processing  summarization  complicated  algorithms  neural-networks  to-write-about  consider:performance-measures  consider:benchmarks 
8 weeks ago by Vaguery
Unsupervised Text Summarization using Sentence Embeddings
Overview of an approach to performing text summarization in Python
writiny  engine  research  summarization  email  python  howto 
11 weeks ago by jaygooby
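The sentence-embedding approach in the post above generally follows three steps: embed each sentence, cluster the embeddings, and keep the sentence nearest each cluster centroid. A minimal sketch of that pipeline, using toy term-frequency vectors as a stand-in for real embeddings (all function names and the clustering details here are illustrative, not from the linked post):

```python
import numpy as np

def sentence_vectors(sentences):
    # Toy stand-in for real sentence embeddings: raw term-frequency
    # vectors over the corpus vocabulary. A real pipeline would use
    # pretrained word vectors or a sentence encoder here instead.
    vocab = sorted({w for s in sentences for w in s.lower().split()})
    index = {w: i for i, w in enumerate(vocab)}
    vecs = np.zeros((len(sentences), len(vocab)))
    for row, s in enumerate(sentences):
        for w in s.lower().split():
            vecs[row, index[w]] += 1.0
    return vecs

def extractive_summary(sentences, n_clusters=2, iters=10, seed=0):
    # k-means over sentence vectors; the summary keeps, per cluster,
    # the sentence closest to its centroid, in original document order.
    vecs = sentence_vectors(sentences)
    rng = np.random.default_rng(seed)
    centroids = vecs[rng.choice(len(vecs), n_clusters, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(vecs[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(n_clusters):
            if (labels == k).any():
                centroids[k] = vecs[labels == k].mean(axis=0)
    # Final assignment, then pick one representative per cluster.
    dists = np.linalg.norm(vecs[:, None] - centroids[None], axis=2)
    labels = dists.argmin(axis=1)
    picks = []
    for k in range(n_clusters):
        members = np.where(labels == k)[0]
        if members.size:
            picks.append(int(members[dists[members, k].argmin()]))
    return [sentences[i] for i in sorted(set(picks))]
```

With `n_clusters=2` on a document mixing two topics, this returns at most one representative sentence per topical cluster.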
Get To The Point: Summarization with Pointer-Generator Networks
Neural sequence-to-sequence models have provided a viable new approach for abstractive text summarization (meaning they are not restricted to simply selecting and rearranging passages from the original text). However, these models have two shortcomings: they are liable to reproduce factual details inaccurately, and they tend to repeat themselves. In this work we propose a novel architecture that augments the standard sequence-to-sequence attentional model in two orthogonal ways. First, we use a hybrid pointer-generator network that can copy words from the source text via pointing, which aids accurate reproduction of information, while retaining the ability to produce novel words through the generator. Second, we use coverage to keep track of what has been summarized, which discourages repetition. We apply our model to the CNN/Daily Mail summarization task, outperforming the current abstractive state-of-the-art by at least 2 ROUGE points.
writiny  engine  research  summarization  paper  Pointer.Generator.Networks 
11 weeks ago by jaygooby
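The copy mechanism in the abstract above mixes the generator's vocabulary distribution with an attention-derived copy distribution, gated by a generation probability p_gen. A numerical sketch of just that mixing step (all the values below are made up for illustration; in the real model the distributions and the gate are produced by learned networks):

```python
import numpy as np

# Toy extended vocabulary: ids 0-3 are in-vocabulary words; id 4 is a
# source-only word, reachable only by copying.
p_vocab = np.array([0.4, 0.3, 0.2, 0.1, 0.0])  # generator distribution
attention = np.array([0.7, 0.3])               # over 2 source positions
src_ids = np.array([1, 4])                     # vocab ids of source tokens
p_gen = 0.6                                    # learned gate in [0, 1]

# Copy distribution: scatter attention mass onto the tokens it points at
# (ufunc.at accumulates correctly even when src_ids repeat).
p_copy = np.zeros_like(p_vocab)
np.add.at(p_copy, src_ids, attention)

# Final output distribution interpolates generating and copying.
p_final = p_gen * p_vocab + (1 - p_gen) * p_copy
```

The out-of-vocabulary word (id 4) ends up with nonzero probability purely through the copy path, which is how the model reproduces rare names and numbers accurately.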
A Deep Reinforced Model for Abstractive Summarization
Attentional, RNN-based encoder-decoder models for abstractive summarization have achieved good performance on short input and output sequences. For longer documents and summaries, however, these models often include repetitive and incoherent phrases. We introduce a neural network model with a novel intra-attention that attends over the input and continuously generated output separately, and a new training method that combines standard supervised word prediction and reinforcement learning (RL). Models trained only with supervised learning often exhibit "exposure bias": they assume ground truth is provided at each step during training. However, when standard word prediction is combined with the global sequence prediction training of RL, the resulting summaries become more readable. We evaluate this model on the CNN/Daily Mail and New York Times datasets. Our model obtains a 41.16 ROUGE-1 score on the CNN/Daily Mail dataset, an improvement over previous state-of-the-art models. Human evaluation also shows that our model produces higher quality summaries.
writiny  engine  research  reinforcement.learning  summarization  paper 
11 weeks ago by jaygooby
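The mixed training objective described above combines the usual maximum-likelihood loss with a self-critical RL term, weighted by a mixing coefficient gamma. A toy numeric sketch of that combination (the function name, the inputs, and the default gamma are illustrative; the paper's actual gamma is close to 1 and the reward is ROUGE computed against the reference summary):

```python
import numpy as np

def mixed_loss(log_probs, reward_sampled, reward_baseline, gamma=0.99):
    # log_probs: per-token log-likelihoods of a sampled summary.
    # L_ml: the usual teacher-forced negative log-likelihood.
    l_ml = -np.sum(log_probs)
    # L_rl: self-critical surrogate -- scale the sequence log-likelihood
    # by how far the sampled summary's reward (e.g. ROUGE) falls short of
    # a greedy-decoding baseline. Minimizing it pushes probability toward
    # samples that beat the baseline.
    l_rl = (reward_baseline - reward_sampled) * np.sum(log_probs)
    # Mixed objective: gamma * L_rl + (1 - gamma) * L_ml.
    return gamma * l_rl + (1 - gamma) * l_ml
```

With gamma = 0 this reduces to plain supervised training; pushing gamma toward 1 lets the sequence-level reward dominate, which is what the abstract credits for the improved readability.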


related tags

abstraction  ai  algorithm  algorithms  article-rewriter  attention  automatic  bq  chat  citation  code  comments  communication  companies  complicated  computer-vision  consider:benchmarks  consider:performance-measures  content-samurai  datascience  dataset  datasets  datatsets  deep-learning  deepai  deeplearning  deliberation  demos  design  dialogue  digital-libraries  document  download  email  embeddings  engine  evaluation  events  extraction  extractive  factcheck  frase  free  from:pocket  games  generation  github  glove  google  hn  howto  intro  keras  layperson  legaldeck  library  machine-learning  machinecomprehension  machinelearning  metric  ml  model  monitoring  mt  narrative  natural-language-processing  network  neural-net  neural-networks  neuralnets  news  newsletter  nlg  nlp  nlproc  nltk  package  paper  papers  paraphrase  people  pointer.generator.networks  pointer.networks  pointer  programming  python  pytorch  r-project  reddit  reference  references  reinforcement-learning  reinforcement.learning  research  researchers  rewriter  rl  rouge  semantic-embedding  semantics  sentiment  seq2seq  side-information  spacy  storytelling  summarisation  summariser  summarize  summarizer  summary  survey  tensorflow  text-summarization  text  textgeneration  textrank  to-write-about  tool  tutorial  tutorials  vectors  video  vis  web  wikipedia  word  writing  writiny 
