
# Neural Tangent Kernel

- **Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent**: PDF Highlight.

- **Every Model Learned by Gradient Descent Is Approximately a Kernel Machine** (University of Washington): Paper Explained Video.

- **Finding sparse trainable neural networks through Neural Tangent Transfer**: Blog, Paper.

- **Deep Neural Networks as Gaussian Processes** (ICLR 2018, 370 citations): Paper, PDF Highlight. Proposes the NNGP correspondence between infinitely wide networks and Gaussian processes.
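As a hedged illustration of the NNGP idea (function name and hyperparameters here are my own, not from the paper's code): for ReLU activations the layerwise covariance recursion has a closed form, the Cho and Saul arc-cosine expectation, so the depth-L NNGP kernel can be computed by iterating it.

```python
import numpy as np

def nngp_relu(x1, x2, depth=3, sigma_w2=2.0, sigma_b2=0.1):
    """Sketch of the ReLU NNGP kernel recursion over `depth` hidden layers."""
    # Input-layer covariances (normalized inner products).
    k11 = sigma_b2 + sigma_w2 * (x1 @ x1) / len(x1)
    k22 = sigma_b2 + sigma_w2 * (x2 @ x2) / len(x2)
    k12 = sigma_b2 + sigma_w2 * (x1 @ x2) / len(x1)
    for _ in range(depth):
        c = np.clip(k12 / np.sqrt(k11 * k22), -1.0, 1.0)
        theta = np.arccos(c)
        # Closed form for E[relu(u) relu(v)], (u, v) ~ N(0, [[k11, k12], [k12, k22]]).
        ev = np.sqrt(k11 * k22) * (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / (2 * np.pi)
        k12 = sigma_b2 + sigma_w2 * ev
        k11 = sigma_b2 + sigma_w2 * k11 / 2  # same formula specialized to u = v
        k22 = sigma_b2 + sigma_w2 * k22 / 2
    return k12

x1 = np.array([1.0, 0.5, -0.2])
x2 = np.array([0.3, -1.0, 0.8])
print(nngp_relu(x1, x2))
```

The recursion is symmetric in its arguments, and for identical inputs the off-diagonal update collapses to the diagonal one, as expected of a valid kernel.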

- **Generalization bounds of stochastic gradient descent for wide and deep neural networks** (ICML 2019, 129 citations): Paper, PDF Highlight. Proposes the Neural Tangent Random Feature (NTRF).

- **A Convergence Theory for Deep Learning via Over-Parameterization** (ICML 2019, 619 citations): Paper, PDF Highlight.

- **Learning and Generalization in Overparameterized Neural Networks, Going Beyond Two Layers** (NeurIPS 2019, 407 citations): Paper, PDF Highlight.

- **On Exact Computation with an Infinitely Wide Neural Net** (344 citations): Paper, PDF Highlight. Proposes the Convolutional Neural Tangent Kernel (CNTK).

- **Neural Tangent Kernel: Convergence and Generalization in Neural Networks**: Paper, PDF Highlight. Proposes the Neural Tangent Kernel (NTK).
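For a finite network, the empirical NTK is the Gram matrix of parameter gradients, K(x, x') = ⟨∇_θ f(x), ∇_θ f(x')⟩. A minimal sketch for a one-hidden-layer ReLU network with manual gradients (the architecture and names are illustrative, not from the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)
m, d = 512, 3                       # width, input dimension
W = rng.normal(size=(m, d))         # hidden-layer weights
a = rng.normal(size=m)              # output weights
# Network: f(x) = (1 / sqrt(m)) * a @ relu(W @ x)

def grad_f(x):
    """Gradient of f w.r.t. all parameters (W flattened, then a)."""
    z = W @ x                       # pre-activations, shape (m,)
    h = np.maximum(z, 0.0)          # relu
    dh = (z > 0).astype(float)      # relu'
    gW = (a * dh)[:, None] * x[None, :] / np.sqrt(m)  # df/dW
    ga = h / np.sqrt(m)                               # df/da
    return np.concatenate([gW.ravel(), ga])

def ntk(x1, x2):
    """Empirical NTK: inner product of parameter gradients."""
    return grad_f(x1) @ grad_f(x2)

x = rng.normal(size=d)
print(ntk(x, x))
```

As width m grows, the NTK paper shows this random kernel concentrates around a deterministic limit that stays fixed during training.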

- **Neural Contextual Bandits with UCB-based Exploration**: Proposes a way to use NTK features for UCB-style exploration/exploitation.
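The idea can be sketched as linear UCB over a feature map, where the paper uses network-gradient (NTK) features; this toy version substitutes a fixed random feature matrix, and all names here are illustrative, not from the paper's code:

```python
import numpy as np

rng = np.random.default_rng(0)
n_arms, p = 5, 8
Phi = rng.normal(size=(n_arms, p))       # per-arm feature vectors (stand-in for NTK features)
true_theta = rng.normal(size=p)          # unknown reward parameter

lam, beta = 1.0, 1.0
A = lam * np.eye(p)                      # regularized design matrix
b = np.zeros(p)

for t in range(200):
    A_inv = np.linalg.inv(A)
    theta_hat = A_inv @ b                # ridge estimate of the reward parameter
    # Optimistic score: estimated mean plus an exploration bonus that shrinks
    # as an arm's feature direction is observed more often.
    bonus = beta * np.sqrt(np.einsum('ij,jk,ik->i', Phi, A_inv, Phi))
    arm = int(np.argmax(Phi @ theta_hat + bonus))
    reward = Phi[arm] @ true_theta + 0.1 * rng.normal()
    A += np.outer(Phi[arm], Phi[arm])    # rank-one update of the design matrix
    b += reward * Phi[arm]
```

NeuralUCB replaces Phi with the gradients of a trained network at each context, so the bonus reflects uncertainty in NTK feature space.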

- **What Can ResNet Learn Efficiently, Going Beyond Kernels?** (NeurIPS 2019, 75 citations): Full Paper (Version 3), NIPS Version. Shows that the NTK may not capture the full learning capacity of a deep neural network.


## Reference