You are working with the text-only light edition of "H. Lohninger: Teach/Me Data Analysis, Springer-Verlag, Berlin-New York-Tokyo, 1999. ISBN 3-540-14743-8".
The NIPALS Algorithm
See also: Eigenvectors and Eigenvalues - Advanced Discussion, Singular Value Decomposition
The NIPALS algorithm ("Nonlinear Iterative Partial Least Squares") was developed by H. Wold, first for PCA and later for PLS. It is the most commonly used method for calculating the principal components of a data set. It gives numerically more accurate results than the SVD of the covariance matrix, but is slower to calculate.
Assuming that the data to be analyzed are stored in a matrix X, the steps to calculate the scores u and loadings v of the principal components are as follows:
Step | Math | Explanation
1. | u := x_i | Select a column vector x_i of the matrix X and copy it to the score vector u.
2. | v := (X'u)/(u'u) | Project the matrix X onto u in order to find the corresponding loading v.
3. | v := v/|v| | Normalize the loading vector v to length 1.
4. | u_old := u, u := (Xv)/(v'v) | Store the current score vector in u_old, then project the matrix X onto v in order to find the new score vector u.
5. | d := u_old - u | To check for convergence, calculate the difference vector d between the previous and the current scores. If |d| is larger than a predefined threshold (e.g. 10^-8), return to step 2.
6. | E := X - uv' | Remove the estimated PCA component (the outer product of the scores and the loadings) from X.
7. | X := E | To estimate further PCA components, repeat this procedure from step 1 using the matrix E as the new X.
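The steps above can be sketched in a few lines of code. The following is a minimal illustration (not part of the original text), assuming NumPy, a mean-centered data matrix, and the hypothetical function name nipals_pca; the initial column of X is taken as the starting score vector, and an iteration cap guards against slow convergence:

```python
import numpy as np

def nipals_pca(X, n_components, tol=1e-8, max_iter=500):
    """Extract principal components of a (mean-centered) matrix X
    via NIPALS. Returns scores U (n x k) and loadings V (m x k)."""
    X = np.asarray(X, dtype=float).copy()
    n, m = X.shape
    U = np.zeros((n, n_components))
    V = np.zeros((m, n_components))
    for k in range(n_components):
        # Step 1: copy a column of X into the score vector u
        u = X[:, 0].copy()
        for _ in range(max_iter):
            # Step 2: project X onto u to find the loading v
            v = X.T @ u / (u @ u)
            # Step 3: normalize v to length 1
            v /= np.linalg.norm(v)
            # Step 4: store the old scores, project X onto v for new scores
            u_old = u
            u = X @ v / (v @ v)
            # Step 5: stop when the scores no longer change
            if np.linalg.norm(u_old - u) < tol:
                break
        # Step 6: deflate X by the rank-one component u v'
        X -= np.outer(u, v)
        # Step 7: the loop continues with the deflated X
        U[:, k] = u
        V[:, k] = v
    return U, V
```

Because each component is deflated out of X before the next is extracted, the loadings come out one at a time in order of decreasing explained variance, and they should agree (up to sign) with the right singular vectors of the centered data matrix.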
Last Update: 2005-Apr-12