Incremental Regularized Least Squares for Dimensionality Reduction of Large-Scale Data

Published date : 23 Feb 2016

Over the past few decades, much attention has been drawn to large-scale incremental data analysis, where researchers are faced with huge amounts of high-dimensional data acquired incrementally. In such a case, conventional algorithms that compute the result from scratch whenever a new sample arrives are highly inefficient. To handle this problem, we propose a new incremental algorithm, IRLS, that incrementally computes the solution to the regularized least squares (RLS) problem with multiple columns on the right-hand side. More specifically, for an RLS problem with c (c > 1) columns on the right-hand side, we update its unique solution by solving an RLS problem with a single column on the right-hand side whenever a new sample arrives, instead of solving an RLS problem with c columns on the right-hand side from scratch. As an application, we apply the newly proposed IRLS to supervised dimensionality reduction of large-scale data and focus on linear discriminant analysis (LDA). We first propose a new batch LDA model that is closely related to the RLS problem, and then apply IRLS to develop a new incremental LDA algorithm. Experimental results on real-world datasets demonstrate the effectiveness and efficiency of our algorithms.
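The abstract does not spell out the paper's update rule, but the general idea of maintaining the solution to a multi-column RLS problem as samples stream in can be sketched with a standard rank-1 (Sherman-Morrison) update of the regularized normal equations. The class below is an illustrative sketch under that assumption, not the paper's actual IRLS algorithm; the names `batch_rls` and `IncrementalRLS` are hypothetical.

```python
import numpy as np

def batch_rls(X, Y, lam):
    """Solve min_W ||X W - Y||_F^2 + lam ||W||_F^2 from scratch
    via the normal equations (X^T X + lam I) W = X^T Y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

class IncrementalRLS:
    """Maintain the RLS solution with c right-hand-side columns under
    one-sample-at-a-time updates. A generic sketch: we keep
    (X^T X + lam I)^{-1} and X^T Y, and apply the Sherman-Morrison
    rank-1 identity per new sample, which costs O(d^2 + d c) instead
    of re-solving a d x d system from scratch."""

    def __init__(self, d, c, lam):
        # With no samples yet, X^T X = 0, so the inverse is I / lam.
        self.Ainv = np.eye(d) / lam   # (X^T X + lam I)^{-1}
        self.B = np.zeros((d, c))     # X^T Y

    def add_sample(self, x, y):
        """Incorporate one new sample x (length d) with target row y (length c)."""
        x = x.reshape(-1, 1)
        y = y.reshape(1, -1)
        # Sherman-Morrison: (A + x x^T)^{-1}
        #   = A^{-1} - A^{-1} x x^T A^{-1} / (1 + x^T A^{-1} x)
        Ax = self.Ainv @ x
        self.Ainv -= (Ax @ Ax.T) / (1.0 + float(x.T @ Ax))
        self.B += x @ y

    @property
    def W(self):
        """Current solution W = (X^T X + lam I)^{-1} X^T Y."""
        return self.Ainv @ self.B
```

Feeding samples one at a time and comparing against `batch_rls` on the same data should agree up to floating-point error, which is the correctness property an incremental solver of this kind must satisfy.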

SIAM Journal on Scientific Computing 2016