From Gaussian Process to Neural Tangent Kernel - A Guide to Infinitely Wide Neural Networks
4.2K views · 3 years ago
An introduction to infinitely wide neural network algorithms: #NNGP and #NTK.
https://www.seevid.ir/fa/w/VUX2bsrYag8 From Gaussian Process to Neural Tangent Kernel

Outline:
- Introduction to Gaussian Process
- Gaussian Process Regression (GPR)
- Example of GPR
- GPR with Kernel
- Deep Neural Network as Gaussian Process and the NNGP Algorithm
- Single-Layer Neural Network
- Arc-Cos Kernel
- Multilayer Neural Network
- Bayesian Inference with Gaussian Process and the NNGP Numerical Algorithm
- NNGP Experiment
- Neural Tangent Kernel (NTK)
- Dual Space and Bilinear Form
- NTK at Initialization
- NTK during Training
- NTK Extension
- Neural Tangents: NNGP and NTK Library
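The GPR items in the outline can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the talk: it uses a squared-exponential (RBF) kernel and the standard GP posterior formulas; the data, noise level, and length scale are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0):
    """Squared-exponential (RBF) kernel matrix between two sets of 1-D inputs."""
    d = X1[:, None] - X2[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gpr_predict(X_train, y_train, X_test, noise=1e-2, length_scale=1.0):
    """GP regression posterior mean and pointwise variance with an RBF kernel."""
    K = rbf_kernel(X_train, X_train, length_scale) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_test, X_train, length_scale)
    K_ss = rbf_kernel(X_test, X_test, length_scale)
    # Posterior mean: K_s K^{-1} y; posterior covariance: K_ss - K_s K^{-1} K_s^T.
    alpha = np.linalg.solve(K, y_train)
    mean = K_s @ alpha
    cov = K_ss - K_s @ np.linalg.solve(K, K_s.T)
    return mean, np.diag(cov)

# Fit noisy samples of sin(x) and predict at new points (toy data, assumed here).
rng = np.random.default_rng(0)
X_train = np.linspace(0, 2 * np.pi, 20)
y_train = np.sin(X_train) + 0.05 * rng.standard_normal(20)
X_test = np.linspace(0, 2 * np.pi, 5)
mean, var = gpr_predict(X_train, y_train, X_test)
```

The NNGP algorithm covered later in the talk follows the same two posterior formulas; only the kernel changes, from the RBF above to the kernel induced by an infinitely wide network.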
References:
https://github.com/google/neural-tang...
https://github.com/brain-research/nngp
A. Jacot et al. 2018, Neural Tangent Kernel: Convergence and Generalization in Neural Networks
S. Arora et al. 2019, On Exact Computation with an Infinitely Wide Neural Net
R. Novak et al. 2019, Neural Tangents: Fast and Easy Infinite Neural Networks in Python
J. Lee et al. 2018, Deep Neural Networks as Gaussian Processes
Z. Li et al. 2019, Enhanced Convolutional Neural Tangent Kernels
Liu et al. 2019, When Gaussian Process Meets Big Data: A Review of Scalable GPs
A. Rahimi et al. 2007, Random Features for Large-Scale Kernel Machines
Y. Cho et al. 2009, Kernel Methods for Deep Learning
J. Lee et al. 2019, Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent
J. Hron et al. 2020, Infinite Attention: NNGP and NTK for Deep Attention Networks
A. G. Matthews et al. 2018, Gaussian Process Behaviour in Wide Deep Neural Networks
T. Beckers 2020, An introduction to Gaussian Process Models
U.H. Gerlach 2016, The Dual of a Vector Space: From the Concrete to the Abstract to the Concrete
https://www.maths.tcd.ie/~pete/ma1212...
https://www.cs.princeton.edu/courses/...
https://rajatvd.github.io/NTK/
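The "NTK at Initialization" idea from the outline can also be sketched directly: the empirical NTK is the inner product of parameter gradients, k(x, x') = ∇_θ f(x) · ∇_θ f(x'), and for wide networks it concentrates around a fixed kernel at initialization. Below is a minimal NumPy sketch, not code from the talk or the Neural Tangents library: a one-hidden-layer ReLU network in the NTK parameterization with hand-written gradients; the width and inputs are illustrative assumptions.

```python
import numpy as np

def ntk_empirical(x1, x2, W, v):
    """Empirical NTK of a one-hidden-layer ReLU net in NTK parameterization,
    f(x) = (1/sqrt(m)) * sum_j v_j * relu(W_j * x), for scalar inputs x."""
    m = len(W)

    def grads(x):
        pre = W * x                            # pre-activations W_j * x
        dv = np.maximum(pre, 0.0) / np.sqrt(m) # df/dv_j = relu(W_j x)/sqrt(m)
        dW = v * (pre > 0) * x / np.sqrt(m)    # df/dW_j = v_j 1{W_j x > 0} x / sqrt(m)
        return np.concatenate([dv, dW])

    # NTK = inner product of parameter gradients at the two inputs.
    return grads(x1) @ grads(x2)

rng = np.random.default_rng(0)
m = 10000                        # width; the kernel concentrates as m grows
W = rng.standard_normal(m)       # standard Gaussian init
v = rng.standard_normal(m)
k = ntk_empirical(1.0, 1.0, W, v)
```

For x = x' = 1 with standard Gaussian initialization, the two gradient terms contribute E[relu(w)²] = 1/2 and E[v²] P(w > 0) = 1/2, so k is close to 1 at this width, matching the infinite-width (arc-cosine-type) limit discussed in the talk.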
Published on 1400/03/18 (8 June 2021).