Neural Paraphrase Generation with Stacked Residual LSTM Networks

* Prakash 2016


1. Introduction

* Three forms of paraphrasing

  • recognition
  • extraction
  • generation

2. Traditional PG methods

A survey of traditional methods

  • Madnani 2010
  • Generating Phrasal and Sentential Paraphrases: A Survey of Data-driven Methods

Hand-crafted rule-based methods

  • McKeown 1983
  • Paraphrasing Questions Using Given and New Information

* Automatically learning complex patterns

  • Zhao 2009
  • Application-driven Statistical Paraphrase Generation

* Using knowledge bases

  • Hassan 2007
  • UNT: SubFinder: Combining Knowledge Sources for Automatic Lexical Substitution.

* Semantic analysis

  • Kozlowski 2003
  • Generation of Single-sentence Paraphrases from Predicate/Argument Structure Using Lexico-grammatical Resources

* Machine translation-based methods

  • Quirk 2004
  • Monolingual Machine Translation for Paraphrase Generation

  • Wubben 2010
  • Paraphrase Generation As Monolingual Translation: Data and Evaluation


3. Deep learning-based methods

* The seq2seq model

  • Sutskever 2014
  • Sequence to Sequence Learning with Neural Networks

* Studied in many domains, but applications to PG are still rare

* Uses several existing seq2seq models

* Proposes a new stacked residual LSTM network for the PG problem (see the sketch at the end of this section)

  • Inspired by He 2015
  • Deep Residual Learning for Image Recognition

* Based on LSTM

  • Cho 2014
  • Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation
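
For intuition, here is a minimal sketch of the core idea: stacking LSTM layers with identity (residual) connections between them, in the spirit of He 2015. This assumes PyTorch; the depth, hidden sizes, and exact placement of the additions are illustrative assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn

class StackedResidualLSTM(nn.Module):
    """Sketch of stacked LSTM layers with identity (residual) connections.

    Illustrative only: depth, sizes, and where the additions are placed are
    assumptions, not the paper's exact configuration.
    """
    def __init__(self, input_size, hidden_size, num_layers=4):
        super().__init__()
        # First layer maps the input size to the hidden size; the remaining
        # layers keep the hidden size so the residual addition is shape-compatible.
        self.layers = nn.ModuleList(
            [nn.LSTM(input_size, hidden_size, batch_first=True)] +
            [nn.LSTM(hidden_size, hidden_size, batch_first=True)
             for _ in range(num_layers - 1)]
        )

    def forward(self, x):
        out = x
        for i, lstm in enumerate(self.layers):
            residual = out            # input to this layer
            out, _ = lstm(out)        # LSTM output, shape (batch, time, hidden)
            if i > 0:                 # skip layer 0, where shapes may differ
                out = out + residual  # identity connection, as in He 2015
        return out

# Toy usage: batch of 2 sequences, length 5, embedding dim 64
x = torch.randn(2, 5, 64)
model = StackedResidualLSTM(input_size=64, hidden_size=64)
print(model(x).shape)  # torch.Size([2, 5, 64])
```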


4. Deep learning background

* RNN

  • Sutskever 2011
  • Generating Text with Recurrent Neural Networks

* RNN variants (see the sketch after this list)

  • LSTM
  • GRU
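
As a quick reference, a minimal sketch (assuming PyTorch; the toy tensor sizes are made up) showing that the LSTM and GRU are drop-in recurrent layers differing mainly in gating: the LSTM carries a separate cell state, the GRU does not.

```python
import torch
import torch.nn as nn

x = torch.randn(2, 5, 32)              # (batch, time, features), toy sizes

lstm = nn.LSTM(32, 64, batch_first=True)
gru = nn.GRU(32, 64, batch_first=True)

lstm_out, (h_n, c_n) = lstm(x)         # LSTM returns hidden state h and cell state c
gru_out, h_n_gru = gru(x)              # GRU returns only a hidden state

print(lstm_out.shape, gru_out.shape)   # both torch.Size([2, 5, 64])
```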


5. Model description

* seq2seq (a minimal sketch follows the reference below)

  • Sutskever 2014
  • Sequence to Sequence Learning with Neural Networks
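
Below is a minimal encoder-decoder sketch in the spirit of Sutskever 2014, assuming PyTorch; the vocabulary size, dimensions, and teacher-forcing setup are illustrative assumptions, not the paper's setup. The encoder compresses the source sentence into its final hidden state, and the decoder generates the target conditioned on that state.

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    """Minimal encoder-decoder sketch; sizes and decoding setup are illustrative."""
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # Encode the source sentence; keep only the final (h, c) state.
        _, state = self.encoder(self.embed(src_ids))
        # Decode the (shifted) target sequence conditioned on that state
        # (teacher forcing during training).
        dec_out, _ = self.decoder(self.embed(tgt_ids), state)
        return self.out(dec_out)        # (batch, tgt_len, vocab) logits

# Toy usage with random token ids
model = Seq2Seq(vocab_size=1000)
src = torch.randint(0, 1000, (2, 7))    # 2 source sentences of length 7
tgt = torch.randint(0, 1000, (2, 6))    # 2 target sentences of length 6
print(model(src, tgt).shape)            # torch.Size([2, 6, 1000])
```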
