Mutual information for symmetric rank-one matrix estimation: A proof of the replica formula

Abstract: Factorizing low-rank matrices has many applications in machine learning and statistics. For probabilistic models in the Bayes-optimal setting, a general expression for the mutual information has been proposed using heuristic statistical physics computations, and proven in a few specific cases. Here, we show how to rigorously prove the conjectured formula for the symmetric rank-one case. This allows us to express the minimal mean-square error and to characterize the detectability phase transitions in a large set of estimation problems ranging from community detection to sparse PCA. We also show that for a large set of parameters, an iterative algorithm called approximate message-passing is Bayes optimal. There exists, however, a gap between what currently known polynomial algorithms can do and what is expected information-theoretically. Additionally, the proof technique is of interest in its own right and exploits three essential ingredients: the interpolation method introduced in statistical physics by Guerra, the analysis of the approximate message-passing algorithm, and the theory of spatial coupling and threshold saturation in coding. Our approach is generic and applicable to other open problems in statistical estimation where heuristic statistical physics predictions are available.
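To illustrate the kind of estimation problem the abstract refers to, here is a minimal sketch of an approximate message-passing (AMP) iteration for a symmetric rank-one spiked model with a ±1 (Rademacher) signal. All parameter values, the scaling convention, and the informative initialization are illustrative choices for this demo, not taken from the paper; the `tanh` denoiser is the posterior mean for the ±1 prior under the effective Gaussian channel that AMP produces.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam, T = 2000, 1.5, 50          # size, signal-to-noise ratio, iterations

# Ground truth: Rademacher (+/-1) spike, a toy stand-in for e.g. community labels
x = rng.choice([-1.0, 1.0], size=n)

# Symmetric rank-one spiked model: A = (lam/n) x x^T + W, with GOE-like noise
W = rng.normal(0.0, 1.0, (n, n)) / np.sqrt(n)
W = (W + W.T) / np.sqrt(2.0)
A = (lam / n) * np.outer(x, x) + W

# AMP with the posterior-mean denoiser f(u) = tanh(lam * u) for the +/-1 prior.
# An informative initialization sidesteps the global +/- sign symmetry in this demo.
u = 0.2 * x + 0.1 * rng.normal(size=n)
m_old = np.zeros(n)
m = np.tanh(lam * u)
for _ in range(T):
    b = lam * np.mean(1.0 - np.tanh(lam * u) ** 2)  # Onsager correction term
    u = A @ m - b * m_old                           # effective Gaussian channel
    m_old, m = m, np.tanh(lam * u)

overlap = abs(np.dot(m, x)) / n  # approaches the state-evolution fixed point for lam > 1
```

Above the transition (here `lam > 1`), the overlap between the AMP estimate and the hidden spike converges to a nontrivial fixed point of the state-evolution recursion; below it, no estimator correlates with the signal.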

Contributor: Emmanuelle De Laborderie <>
Submitted on: Tuesday 25 July 2017 - 15:15:23
Last modified on: Wednesday 21 March 2018 - 18:56:41


Files produced by the author(s)


  • HAL Id : cea-01568705, version 1
  • arXiv: 1606.04142


Jean Barbier, Mohamad Dia, Nicolas Macris, Florent Krzakala, Thibault Lesieur, et al. Mutual information for symmetric rank-one matrix estimation: A proof of the replica formula. t16/125. 2016. 〈cea-01568705〉


