Ref:
e.g.: 0-ACL17-Ng-Paper_title.pdf
- Abstract
- Overview:
- Advantage:
- Disadvantage:
- What can I do? / Can I employ its idea?
- Experiments
- DataSet:
- Toolkit:
- Baseline:
- Result:
[TOC]
Task, QA-LSTM, Inner-Attention
- A Deep Reinforced Model for Abstractive Summarization
- ACL17-Stanford-Get to the point
- COLING'16-AttSum: Joint Learning of Focusing and Summarization with Neural Attention
- IJCAI'16-Agreement-Based Joint Training for Bidirectional Attention-Based Neural Machine Translation
- NIPS15-Pointer Networks
- Neural Relation Extraction with Multi-lingual Attention