The hmatrix-nipals package
NIPALS (Nonlinear Iterative Partial Least Squares, http://en.wikipedia.org/wiki/NIPALS) is a method for iteratively finding the leading singular vectors of a large matrix. In other words, it discovers the largest principal component (http://en.wikipedia.org/wiki/Principal_component) of a set of mean-centred samples, along with the score (the magnitude of the principal component) for each sample and the residual of each sample that is orthogonal to that principal component. Repeating the procedure on the residuals yields the second principal component, and so on.
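The iteration described above can be sketched in a few lines of NumPy. This is an illustration of the algorithm only, not this package's Haskell API; the function name `nipals_component` and its parameters are hypothetical. Each pass alternates between projecting onto the scores to refine the loading (principal component direction) and projecting onto the loading to refine the scores, until the scores stop changing:

```python
import numpy as np

def nipals_component(X, tol=1e-10, max_iter=500):
    """One round of NIPALS: extract the leading principal component of a
    mean-centred sample matrix X (rows are samples).
    Returns (scores, loading, residual)."""
    t = X[:, 0].copy()              # initial guess for the score vector
    for _ in range(max_iter):
        p = X.T @ t                 # project samples onto scores -> loading
        p /= np.linalg.norm(p)      # keep the loading at unit length
        t_new = X @ p               # project samples onto loading -> scores
        if np.linalg.norm(t_new - t) < tol:
            t = t_new
            break
        t = t_new
    # subtracting the rank-one component leaves the orthogonal residual
    return t, p, X - np.outer(t, p)

# mean-centred random sample matrix, one sample per row
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 4))
X -= X.mean(axis=0)
t, p, R = nipals_component(X)

# the residual carries no component along the extracted loading
assert np.allclose(R @ p, 0, atol=1e-8)
```

Calling `nipals_component` again on the residual `R` would extract the second principal component, which is how the deflation step mentioned above proceeds.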
The advantage of NIPALS over more traditional methods such as SVD is that it is memory-efficient and can stop early when only a small number of principal components are needed. It is also simple to implement correctly. Additionally, because it does not pre-condition the sample matrix in any way, each iteration needs only two sequential passes through the sample data, which is far more efficient than random access when the data set is too large to fit in memory.
NIPALS is not generally recommended, because it converges very slowly on sample matrices whose largest eigenvalues are close in magnitude. Because of this convergence issue, Lanczos methods (http://en.wikipedia.org/wiki/Lanczos_algorithm) or other truncated singular value decomposition algorithms are usually preferred, but these often require the sample matrix to fit in memory, or must store large conditioning matrices, which is not always feasible. If you know of a free and memory-efficient implementation of one of these more sophisticated algorithms, please contact the author with a pointer.
- No changelog available
- Dependencies: base (>=3 && <5), hmatrix (>=0.11)
- Copyright: Copyright (c) 2011 Alan Falloon
- Upload date: Tue Feb 8 04:15:08 UTC 2011
- hmatrix-nipals-0.1.tar.gz (Cabal source package)
- Package description (included in the package)