I read papers, and here are my highlights.
- Learning Word Meta-Embeddings
- Frustratingly Easy Meta-Embedding – Computing Meta-Embeddings by Averaging Source Word Embeddings
- Teaching Machines to Read and Comprehend
- Attention-over-Attention Neural Networks for Reading Comprehension
- Consensus Attention-based Neural Networks for Chinese Reading Comprehension
- Convolutional Neural Networks for Sentence Classification
- Deep contextualized word representations
- A Sensitivity Analysis of (and Practitioners’ Guide to) Convolutional Neural Networks for Sentence Classification
- Attention Is All You Need
- Neural Machine Translation by Jointly Learning to Align and Translate
- U-Net: Convolutional Networks for Biomedical Image Segmentation
- Transforming Auto-encoders
- Text Understanding with the Attention Sum Reader Network
- An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling
- Show, Attend and Tell: Neural Image Caption Generation with Visual Attention
- Self-Attention with Relative Position Representations
- Deep Residual Learning for Image Recognition
- Memory Networks
- Hierarchical Attention Networks for Document Classification
- Graph Attention Networks
- Grammar as a Foreign Language
- Effective Approaches to Attention-based Neural Machine Translation
- Distance-based Self-Attention Network for Natural Language Inference
- Convolutional Sequence to Sequence Learning
- A Structured Self-attentive Sentence Embedding
- DiSAN: Directional Self-Attention Network for RNN/CNN-Free Language Understanding