Streaming Bayesian inference: theoretical limits and mini-batch approximate message-passing

Abstract: In statistical learning for real-world large-scale data problems, one must often resort to "streaming" algorithms which operate sequentially on small batches of data. In this work, we present an analysis of the information-theoretic limits of mini-batch inference in the context of generalized linear models and low-rank matrix factorization. In a controlled Bayes-optimal setting, we characterize the optimal performance and phase transitions as a function of mini-batch size. We base part of our results on a detailed analysis of a mini-batch version of the approximate message-passing algorithm (Mini-AMP), which we introduce. Additionally, we show that this theoretical optimality carries over into real-data problems by illustrating that Mini-AMP is competitive with standard streaming algorithms for clustering.
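The streaming setting described above, where the posterior after each mini-batch serves as the prior for the next, can be illustrated with a toy example. The sketch below is not the paper's Mini-AMP algorithm; it uses exact conjugate Gaussian updates for linear regression (a special case where streaming inference is tractable in closed form), purely to show the sequential mini-batch update pattern. All variable names and sizes are illustrative assumptions.

```python
import numpy as np

# Toy sketch of streaming Bayesian inference (not Mini-AMP): conjugate
# Gaussian linear regression updated one mini-batch at a time. The
# posterior after each batch becomes the prior for the next batch.

rng = np.random.default_rng(0)
d, n, batch_size = 5, 200, 20          # dimensions chosen for illustration
w_true = rng.standard_normal(d)
X = rng.standard_normal((n, d))
noise_var = 0.01
y = X @ w_true + np.sqrt(noise_var) * rng.standard_normal(n)

prec = np.eye(d)                       # prior precision: standard normal prior on w
mean = np.zeros(d)                     # prior mean

for start in range(0, n, batch_size):
    Xb, yb = X[start:start + batch_size], y[start:start + batch_size]
    # Gaussian conjugate update: add the batch's precision contribution,
    # then solve for the new posterior mean.
    new_prec = prec + Xb.T @ Xb / noise_var
    mean = np.linalg.solve(new_prec, prec @ mean + Xb.T @ yb / noise_var)
    prec = new_prec

# Because the model is conjugate, the streaming posterior matches the
# full-batch posterior exactly (up to floating-point error).
```

In this conjugate special case the mini-batch size has no effect on the final posterior; the point of the paper's analysis is precisely that for non-conjugate models such as generalized linear models, where only approximate updates (e.g. AMP) are available, the batch size does change the achievable performance.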
Contributor: Emmanuelle De Laborderie
Submitted on: Monday, July 3, 2017 - 16:04:57
Last modified: Tuesday, April 24, 2018 - 17:20:04


Files produced by the author(s)


  • HAL Id: cea-01553517, version 1
  • arXiv: 1706.00705


Andre Manoel, Florent Krzakala, Eric W. Tramel, Lenka Zdeborová. Streaming Bayesian inference: theoretical limits and mini-batch approximate message-passing. Technical report T14/110, 19 pages, 4 figures, 2017. ⟨cea-01553517⟩


