Authors
Christopher Strohmeier, Hanbaek Lyu, Deanna Needell
Description
We introduce a novel online nonnegative tensor factorization (NTF) algorithm that learns a CANDECOMP/PARAFAC (CP) basis from a given stream of tensor-valued data under general constraints. In particular, using nonnegativity constraints, the learned CP modes also give localized dictionary atoms that respect the tensor structure in multi-modal data. On the theoretical side, we prove that our algorithm converges to the set of stationary points of the objective function under the hypothesis that the sequence of data tensors has functional Markovian dependence. This assumption covers a wide range of application contexts, including data streams generated by independent or MCMC sampling. On the application side, we demonstrate the efficiency of our online algorithm compared to standard offline algorithms on both synthetic and real-world tensor data, and, through video data applications, we illustrate the advantage of flexibly reshaping multi-modal tensor data and learning CP-dictionary atoms jointly for any desired group of modes.
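To make the setting concrete, below is a minimal illustrative sketch of online nonnegative CP factorization of a streaming sequence of 3-mode tensors, using projected gradient updates on each factor matrix. This is not the authors' algorithm (which comes with the convergence guarantee described above); all function names, the rank, and the step-size rule here are hypothetical choices for illustration only.

```python
# Toy sketch (not the paper's method): update nonnegative CP factors
# one streamed tensor at a time via projected gradient steps.
import numpy as np

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product of A (I x R) and B (J x R) -> (I*J x R)."""
    return np.einsum('ir,jr->ijr', A, B).reshape(A.shape[0] * B.shape[0], -1)

def unfold(X, mode):
    """Mode-n unfolding consistent with the Khatri-Rao ordering used below."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def online_ncp_step(X, factors, n_inner=5):
    """Refine nonnegative CP factors (A, B, C) on a single incoming tensor X."""
    A, B, C = factors
    for _ in range(n_inner):
        for mode, (target, others) in enumerate(
            [(A, (B, C)), (B, (A, C)), (C, (A, B))]
        ):
            M = khatri_rao(*others)                     # (prod. of other dims) x R
            G = (target @ M.T - unfold(X, mode)) @ M    # grad of 0.5*||X_(n) - T M^T||^2
            step = 1.0 / (np.linalg.norm(M.T @ M, 2) + 1e-12)  # Lipschitz step size
            target -= step * G
            np.maximum(target, 0.0, out=target)         # project onto nonnegative orthant
    return A, B, C

# Hypothetical usage: a stream of nonnegative 10x8x6 tensors, rank-5 CP dictionary.
rng = np.random.default_rng(0)
I, J, K, R = 10, 8, 6, 5
factors = [rng.random((d, R)) for d in (I, J, K)]
for t in range(100):
    X_t = rng.random((I, J, K))                         # stand-in for the data stream
    factors = online_ncp_step(X_t, factors)
```

In this sketch the nonnegativity constraint is enforced by projecting each factor onto the nonnegative orthant after every gradient step; the resulting columns of the factor matrices play the role of the CP-dictionary atoms discussed in the abstract.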