Papers about sequence encoders, graph networks, CTR, and new NLP methods.

1. PAPER LEARNING

1.1. NLP Trends

  • ELMo
  • GPT
  • BERT

1.2. Sequence Encoder

  • TCNN
  • Transformer

1.3. Graph Network

  • GCN

1.4. Graph Embedding

  • Node2vec/DeepWalk

1.5. Attention

  • Multiplicative attention
  • Additive attention
  • Dot-product attention
  • Subtractive attention
  • Self-attention (Attention Is All You Need)

    • single-head attention
    • multi-head attention
  • Comparison
    The attention proposed in "Attention Is All You Need" unifies the attention framework: all of the earlier attention variants can be expressed in the (Q, K, V) formulation, and multi-head attention further improves the model's expressive power (a minimal code sketch is given below).

Below we compare the performance of these attention mechanisms on a text classification task. For details, see
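As a concrete reference point for the (Q, K, V) framework described above, here is a minimal NumPy sketch of scaled dot-product attention and multi-head attention. The function names, shapes, and the head-slicing shortcut are illustrative assumptions, not code from this repository; the original paper additionally uses learned per-head projections and an output projection.

```python
import numpy as np


def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)


def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q: (len_q, d_k), K: (len_k, d_k), V: (len_k, d_v).
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # dot-product scores, scaled by sqrt(d_k)
    weights = softmax(scores, axis=-1)   # attention distribution over the keys
    return weights @ V                   # weighted sum of the values


def multi_head_attention(Q, K, V, num_heads):
    """Run attention in num_heads subspaces and concatenate the results.

    For brevity this sketch only slices the model dimension; the original
    paper uses learned projections W_i^Q, W_i^K, W_i^V per head and a final
    output projection W^O.
    """
    d_model = Q.shape[-1]
    assert d_model % num_heads == 0, "d_model must be divisible by num_heads"
    d_head = d_model // num_heads
    heads = []
    for h in range(num_heads):
        s = slice(h * d_head, (h + 1) * d_head)
        heads.append(scaled_dot_product_attention(Q[:, s], K[:, s], V[:, s]))
    return np.concatenate(heads, axis=-1)


# Self-attention: queries, keys, and values all come from the same sequence.
x = np.random.randn(5, 8)                        # 5 tokens, model dimension 8
out = multi_head_attention(x, x, x, num_heads=2)
print(out.shape)                                 # (5, 8)
```

Single-head attention is the num_heads=1 case; the multi-head variant lets each subspace attend to different positions, which is the expressiveness gain the comparison above refers to.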

1.6. CTR

  • DeepFM
  • DIN

New Papers

  • AutoCross

Author: Lym
Email: [email protected]
Keep calm and carry on!
