Generalisation dynamics of online learning in over-parameterised neural networks - CEA - Commissariat à l'énergie atomique et aux énergies alternatives
Preprint / Working Paper. Year: 2019

Generalisation dynamics of online learning in over-parameterised neural networks

Abstract

Deep neural networks achieve stellar generalisation on a variety of problems, despite often being large enough to easily fit all their training data. Here we study the generalisation dynamics of two-layer neural networks in a teacher-student setup, where one network, the student, is trained using stochastic gradient descent (SGD) on data generated by another network, called the teacher. We show that, for this problem, the dynamics of SGD are captured by a set of differential equations. In particular, we demonstrate analytically that the generalisation error of the student increases linearly with the network size, with all other relevant parameters held constant. Our results indicate that achieving good generalisation in neural networks depends on the interplay of at least the algorithm, its learning rate, the model architecture, and the data set.
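The teacher-student setup described in the abstract can be sketched in a few lines of NumPy. The sketch below is illustrative only: the network sizes, learning rate, and number of steps are hypothetical choices, and the second-layer weights are fixed to 1 (a soft-committee-machine simplification), which is not necessarily the paper's exact protocol. Online SGD here means every update uses a fresh i.i.d. Gaussian input, so no sample is ever revisited.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes, not taken from the paper: input dimension N,
# M teacher hidden units, K student hidden units with K > M
# (the over-parameterised regime).
N, M, K = 500, 2, 4
lr = 1.0          # SGD learning rate (hypothetical value)
steps = 50_000    # online steps; steps / N plays the role of training time

g = np.tanh       # activation function (illustrative choice)

# Teacher and student as soft committee machines,
#   f(x) = sum_k g(w_k . x / sqrt(N)),
# i.e. two-layer networks whose second layer is fixed to 1.
W_teacher = rng.standard_normal((M, N))
W_student = 0.1 * rng.standard_normal((K, N))  # small initialisation

def output(W, x):
    return g(W @ x / np.sqrt(N)).sum()

def generalisation_error(n_test=2_000):
    """Half the mean squared student-teacher gap on fresh inputs."""
    X = rng.standard_normal((n_test, N))
    errs = [(output(W_student, x) - output(W_teacher, x)) ** 2 for x in X]
    return 0.5 * float(np.mean(errs))

err_start = generalisation_error()

# Online SGD on the per-sample loss 0.5 * delta**2:
# each step draws a brand-new i.i.d. Gaussian input.
for _ in range(steps):
    x = rng.standard_normal(N)
    h = W_student @ x / np.sqrt(N)             # student preactivations
    delta = g(h).sum() - output(W_teacher, x)  # prediction error
    # Gradient of 0.5 * delta**2 w.r.t. W_student (tanh' = 1 - tanh^2).
    W_student -= lr * np.outer(delta * (1.0 - g(h) ** 2), x) / np.sqrt(N)

err_end = generalisation_error()
```

Tracking `generalisation_error()` along such a run, for different student sizes K, is the kind of experiment whose averaged dynamics the paper's differential equations describe analytically.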
Main file
generalisation.pdf (750.7 KB)
Origin: Files produced by the author(s)

Dates and versions

cea-02009764, version 1 (06-02-2019)

Identifiers

Cite

Sebastian Goldt, Madhu S. Advani, Andrew M. Saxe, Florent Krzakala, Lenka Zdeborova. Generalisation dynamics of online learning in over-parameterised neural networks. 2019. ⟨cea-02009764⟩
312 Views
100 Downloads

