The committee machine: Computational to statistical gaps in learning a two-layers neural network

Abstract: Heuristic tools from statistical physics have been used in the past to locate the phase transitions and compute the optimal learning and generalization errors in the teacher-student scenario in multi-layer neural networks. In this contribution, we provide a rigorous justification of these approaches for a two-layers neural network model called the committee machine. We also introduce a version of the approximate message passing (AMP) algorithm for the committee machine that allows optimal learning to be performed in polynomial time for a large set of parameters. We find that there are regimes in which a low generalization error is information-theoretically achievable while the AMP algorithm fails to deliver it, strongly suggesting that no efficient algorithm exists for those cases, and unveiling a large computational gap.
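To make the setting concrete, here is a minimal sketch of the committee machine and the teacher-student data it learns from, as described in the abstract. This is an illustrative toy implementation, not the authors' code: the sign activations, Gaussian weights, and 1/sqrt(n) scaling are standard conventions for this model, and all variable names are ours.

```python
import numpy as np

def committee_machine(W, x):
    """Committee machine with K hidden units and sign activations:
    each hidden unit votes sign(w_k . x / sqrt(n)), and the output
    is the majority vote, i.e. the sign of the sum of the votes."""
    n = x.shape[0]
    hidden = np.sign(W @ x / np.sqrt(n))
    return np.sign(hidden.sum())

# Teacher-student scenario: a "teacher" machine with random Gaussian
# weights labels i.i.d. Gaussian inputs; the student must infer weights
# that reproduce these labels (here we only generate the dataset).
rng = np.random.default_rng(0)
n, K, m = 100, 3, 10              # input dim, hidden units, samples
W_teacher = rng.standard_normal((K, n))
X = rng.standard_normal((m, n))
y = np.array([committee_machine(W_teacher, x) for x in X])
```

With K odd, the majority vote is never tied, so every label is ±1; the AMP algorithm studied in the paper is an iterative scheme for recovering `W_teacher` from such (X, y) pairs.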
Document type: Preprints, Working Papers, ...

Cited literature: [46 references]

https://hal-cea.archives-ouvertes.fr/cea-01933130
Contributor: Emmanuelle de Laborderie
Submitted on: Friday, November 23, 2018 - 3:21:48 PM
Last modification on: Tuesday, June 25, 2019 - 9:54:06 AM

File

comittee_machine.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: cea-01933130, version 1

Citation

Benjamin Aubin, Antoine Maillard, Jean Barbier, Florent Krzakala, Nicolas Macris, et al.. The committee machine: Computational to statistical gaps in learning a two-layers neural network. 2018. ⟨cea-01933130⟩
