Non-negative non-redundant tensor decomposition - Archive ouverte HAL
Journal article, Frontiers of Mathematics in China, 2013

Non-negative non-redundant tensor decomposition

Abstract

Non-negative tensor decomposition allows us to analyze data in their 'native' form and to present results as a sum of rank-1 tensors that does not nullify any part of the factors. In this paper, we propose a geometrical structure of basis vector frames for sum-of-rank-1 decompositions of real-valued non-negative tensors. The proposed decomposition reinterprets the orthogonality of the singular vectors of matrices as a geometric constraint on the rank-1 matrix bases, which leads to a geometrically constrained singular vector frame. Relaxing the orthogonality requirement, we develop a set of structured bases that can decompose any tensor into a similarly constrained sum-of-rank-1 form. The proposed approach is essentially a reparametrization and yields an upper bound on tensor rank. We first describe the general case of tensor decomposition and then extend it to its non-negative form. Finally, we present numerical results that conform to the proposed tensor model and apply it to non-negative data decomposition.
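As a minimal illustration of the sum-of-rank-1 form the abstract builds on, the sketch below assembles a 3-way non-negative tensor as T = Σ_r a_r ⊗ b_r ⊗ c_r with NumPy. This shows only the generic CP/PARAFAC representation with non-negative factors, not the authors' geometrically constrained frame; the dimensions and rank R are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
R = 3                    # number of rank-1 terms (example choice)
A = rng.random((4, R))   # non-negative factor vectors a_r
B = rng.random((5, R))   # non-negative factor vectors b_r
C = rng.random((6, R))   # non-negative factor vectors c_r

# Assemble T = sum_r outer(a_r, b_r, c_r) in one einsum call.
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# Non-negativity of the factors carries over to the tensor entries.
assert T.min() >= 0

# Summing the rank-1 terms one by one reproduces the same tensor.
T_terms = sum(np.einsum('i,j,k->ijk', A[:, r], B[:, r], C[:, r])
              for r in range(R))
assert np.allclose(T, T_terms)
```

Because no term is negative, no part of any factor cancels another, which is the "non-nullifying" property the abstract highlights.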
Main file: article_OlexiyKyrgyzov_final.pdf (719.17 KB)
Origin: Files produced by the author(s)

Dates and versions

cea-01802385 , version 1 (23-03-2020)

Identifiers

Cite

Olexiy Kyrgyzov, Deniz Erdogmus. Non-negative non-redundant tensor decomposition. Frontiers of Mathematics in China, 2013, 8 (1), pp.41 - 61. ⟨10.1007/s11464-012-0261-y⟩. ⟨cea-01802385⟩