Journal Article, Physical Review X, 2020

Marvels and pitfalls of the Langevin algorithm in noisy high-dimensional inference

Abstract

Gradient-descent-based algorithms and their stochastic versions have widespread applications in machine learning and statistical inference. In this work we perform an analytic study of the performance of one of them, the Langevin algorithm, in the context of noisy high-dimensional inference. We employ the Langevin algorithm to sample the posterior probability measure of the spiked matrix-tensor model. The typical behaviour of this algorithm is described by a system of integro-differential equations that we call the Langevin state evolution, whose solution is compared with that of the state evolution of approximate message passing (AMP). Our results show that, remarkably, the algorithmic threshold of the Langevin algorithm is suboptimal with respect to the one given by AMP. We conjecture that this phenomenon is due to the residual glassiness present in that region of parameters. Finally, we show how a landscape-annealing protocol, which uses the Langevin algorithm but violates the Bayes-optimality condition, can approach the performance of AMP.
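
To illustrate the sampling procedure described in the abstract, below is a minimal sketch of one discretized (Euler-Maruyama) Langevin update for a spiked matrix-tensor posterior with a p = 3 tensor and a spherical constraint. The prefactors, parameter names (Delta2, Deltap, dt), and the helper langevin_step are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

def langevin_step(x, Y, T, Delta2, Deltap, dt, rng):
    """One Euler-Maruyama step of Langevin dynamics on a spiked
    matrix-tensor posterior (p = 3 tensor; prefactors are schematic)."""
    N = x.shape[0]
    # Gradient of the Hamiltonian: matrix channel plus order-3 tensor channel.
    grad_H = -(Y @ x) / (np.sqrt(N) * Delta2) \
             - np.einsum('ijk,j,k->i', T, x, x) / (N * Deltap)
    # Lagrange multiplier enforcing the spherical constraint |x|^2 = N on average.
    mu = -np.dot(x, grad_H) / N
    # Gradient step plus thermal noise at unit temperature.
    x_new = x + dt * (-grad_H - mu * x) + np.sqrt(2.0 * dt) * rng.standard_normal(N)
    # Re-normalize to absorb the discretization error of the constraint.
    return x_new * np.sqrt(N) / np.linalg.norm(x_new)
```

Iterating such an update from a random initialization on the sphere and tracking the overlap with the planted signal is the kind of finite-size experiment whose typical high-dimensional behaviour the Langevin state evolution equations describe analytically.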
Main file: marvels.pdf (1.83 MB). Origin: files produced by the author(s).

Dates and versions

cea-02009687, version 1 (06-02-2019)

Cite

Stefano Sarao Mannelli, Giulio Biroli, Chiara Cammarota, Florent Krzakala, Pierfrancesco Urbani, et al. Marvels and pitfalls of the Langevin algorithm in noisy high-dimensional inference. Physical Review X, 2020, 10, pp.011057. ⟨cea-02009687⟩