Asymptotic Performance Analysis of Subspace Adaptive Algorithms Introduced in the Neural Network Literature - Télécom SudParis
Journal article, IEEE Transactions on Signal Processing, 1998


Abstract

In the neural network literature, many algorithms have been proposed for estimating the eigenstructure of covariance matrices. We first show that, when presented in a common framework, many of these algorithms exhibit great similarities with the gradient-like stochastic algorithms usually encountered in the signal processing literature. We then derive the asymptotic distribution of these different recursive subspace estimators. Closed-form expressions for the asymptotic covariances of the eigenvector and associated projection matrix estimators are given and analyzed; in particular, closed-form expressions for the mean square error of these estimators are provided. These covariance matrices are found to have a structure very similar to those describing batch estimation techniques. The accuracy of our asymptotic analysis is checked by numerical simulations and is found to be valid not only for "small" step sizes but over a very large domain. Finally, the convergence speed and the deviation from orthonormality of the different algorithms are compared, and several tradeoffs are analyzed.
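The recursive subspace estimators analyzed here are gradient-like stochastic updates. As a hedged illustration only (not code from the paper), the sketch below runs an Oja-type subspace rule, a well-known representative of this algorithm family; the data model, step size `mu`, and iteration count are assumptions chosen for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 4-dimensional samples whose covariance has a
# dominant 2-dimensional signal subspace spanned by the columns of A.
n, r = 4, 2
A = rng.normal(size=(n, r))

def sample():
    # Signal in span(A) plus small isotropic noise (assumed model).
    return A @ rng.normal(size=r) + 0.05 * rng.normal(size=n)

# Oja-type stochastic subspace update with step size mu:
#   W <- W + mu * (x y^T - W y y^T),  where y = W^T x
W = np.linalg.qr(rng.normal(size=(n, r)))[0]  # random orthonormal start
mu = 0.01
for _ in range(20000):
    x = sample()
    y = W.T @ x
    W += mu * (np.outer(x, y) - W @ np.outer(y, y))

# The columns of W should now approximately span the dominant subspace
# of E[x x^T]; the singular values of Q_true^T W are the cosines of the
# principal angles between the estimated and true subspaces.
Q_true = np.linalg.qr(A)[0]
s = np.linalg.svd(Q_true.T @ W)[1]
print(np.min(s))
```

A smallest singular value near 1 indicates subspace alignment, and `W.T @ W` staying close to the identity illustrates the deviation-from-orthonormality behavior that the paper quantifies for such recursions.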

Dates and versions

hal-03435761 , version 1 (18-11-2021)

Identifiers

  • HAL Id : hal-03435761 , version 1

Cite

Jean-Pierre Delmas, Florence Alberge. Asymptotic Performance Analysis of Subspace Adaptive Algorithms Introduced in the Neural Network Literature. IEEE Transactions on Signal Processing, 1998. ⟨hal-03435761⟩