Generalisation dynamics of online learning in over-parameterised neural networks

Abstract: Deep neural networks achieve stellar generalisation on a variety of problems, despite often being large enough to easily fit all their training data. Here we study the generalisation dynamics of two-layer neural networks in a teacher-student setup, where one network, the student, is trained using stochastic gradient descent (SGD) on data generated by another network, called the teacher. We show how for this problem, the dynamics of SGD are captured by a set of differential equations. In particular, we demonstrate analytically that the generalisation error of the student increases linearly with the network size, with other relevant parameters held constant. Our results indicate that achieving good generalisation in neural networks depends on the interplay of at least the algorithm, its learning rate, the model architecture, and the data set.
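To make the setup concrete, the following is a minimal numerical sketch of the teacher-student scenario the abstract describes: a fixed two-layer "teacher" network generates labels, and an over-parameterised "student" (more hidden units than the teacher) is trained by online SGD on a fresh input at every step. This is an illustrative reconstruction, not the authors' code; the tanh activation, the 1/√D preactivation scaling, the choice to train only the student's first layer, and all parameter values are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 50       # input dimension
M = 2        # teacher hidden units
K = 4        # student hidden units (over-parameterised: K > M)
lr = 0.1     # SGD learning rate
steps = 10000

g = np.tanh  # activation (assumed; the paper's choice may differ)

# Fixed teacher: y* = sum_m v_m g(w_m . x / sqrt(D))
w_teacher = rng.standard_normal((M, D))
v_teacher = np.ones(M)

# Student: small random first layer; second layer held fixed for simplicity
w = 0.1 * rng.standard_normal((K, D))
v = np.ones(K) / K

def forward(W, V, x):
    """Two-layer network output and hidden activations."""
    h = g(W @ x / np.sqrt(D))
    return V @ h, h

def gen_error(n=2000):
    """Generalisation error: mean squared gap on fresh test inputs."""
    X = rng.standard_normal((n, D))
    y_t = g(X @ w_teacher.T / np.sqrt(D)) @ v_teacher
    y_s = g(X @ w.T / np.sqrt(D)) @ v
    return 0.5 * np.mean((y_t - y_s) ** 2)

e0 = gen_error()
for _ in range(steps):
    x = rng.standard_normal(D)              # fresh sample: online learning
    y, h = forward(w, v, x)
    y_star, _ = forward(w_teacher, v_teacher, x)
    err = y - y_star
    # Gradient of 0.5*err^2 w.r.t. first-layer weights (tanh' = 1 - h^2)
    w -= lr * err * (v * (1 - h ** 2))[:, None] * x[None, :] / np.sqrt(D)
e1 = gen_error()
```

Because each SGD step uses a never-before-seen sample, the training error and the generalisation error coincide in expectation, which is what makes the dynamics tractable as differential equations in the high-dimensional limit; in this toy run, `e1` should fall well below the initial error `e0`.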
Document type: Preprints, Working Papers, ...
Cited literature: 45 references
Contributor: Emmanuelle De Laborderie
Submitted on: Wednesday, February 6, 2019 - 3:31:44 PM
Last modification on: Friday, March 18, 2022 - 3:37:59 AM
Long-term archiving on: Tuesday, May 7, 2019 - 4:02:33 PM


Files produced by the author(s)


  • HAL Id: cea-02009764, version 1
  • arXiv: 1901.09085


Sebastian Goldt, Madhu S. Advani, Andrew M. Saxe, Florent Krzakala, Lenka Zdeborova. Generalisation dynamics of online learning in over-parameterised neural networks. 2019. ⟨cea-02009764⟩


