Non-negative non-redundant tensor decomposition

Abstract: Non-negative tensor decomposition allows us to analyze data in their ‘native’ form and to present the results as a sum of rank-1 tensors that does not nullify any part of the factors. In this paper, we propose a geometrical structure of a basis-vector frame for sum-of-rank-1 decompositions of real-valued non-negative tensors. The proposed decomposition reinterprets the orthogonality of the singular vectors of matrices as a geometric constraint on the rank-1 matrix bases, which leads to a geometrically constrained singular-vector frame. By relaxing the orthogonality requirement, we develop a set of structured bases that can decompose any tensor into a similarly constrained sum-of-rank-1 form. The proposed approach is essentially a reparametrization and yields an upper bound on the tensor rank. We first describe the general case of tensor decomposition and then extend it to its non-negative form. Finally, we present numerical results that conform to the proposed tensor model and apply it to non-negative data decomposition.
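As context for the abstract, a minimal sketch of the classical matrix case it builds on: the SVD writes any matrix as a sum of rank-1 outer products whose singular vectors are orthogonal, which is exactly the constraint the paper reinterprets geometrically and then relaxes. The sketch below only illustrates this standard sum-of-rank-1 structure with NumPy; it does not reproduce the authors' constrained frame construction.

```python
import numpy as np

# Illustrative only: the orthogonal sum-of-rank-1 decomposition of a
# matrix via SVD, i.e. A = sum_i s_i * u_i v_i^T. The paper's method
# relaxes the orthogonality of u_i, v_i; that construction is not shown.
rng = np.random.default_rng(0)
A = rng.random((4, 3))  # a non-negative test matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Rebuild A as a sum of rank-1 terms, one per singular value.
A_hat = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s)))

assert np.allclose(A, A_hat)  # the rank-1 terms sum back to A
```

Truncating the sum after the largest singular values gives the usual low-rank approximation; the paper's contribution is a frame of rank-1 bases that keeps a similar sum structure for tensors while preserving non-negativity of the factors.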
Document type :
Journal articles
Cited literature: 19 references

https://hal-cea.archives-ouvertes.fr/cea-01802385
Contributor: Bruno Savelli
Submitted on: Monday, March 23, 2020 - 10:06:59 PM
Last modification on: Tuesday, March 24, 2020 - 10:57:52 AM
Long-term archiving on: Wednesday, June 24, 2020 - 4:36:25 PM

File

article_OlexiyKyrgyzov_final.p...
Files produced by the author(s)

Citation

Olexiy Kyrgyzov, Deniz Erdogmus. Non-negative non-redundant tensor decomposition. Frontiers of Mathematics in China, Springer Verlag, 2013, 8 (1), pp. 41-61. ⟨10.1007/s11464-012-0261-y⟩. ⟨cea-01802385⟩
