FAQ: Difference between a loading and a weighting


Issue:

What is the difference between a loading and a weighting?

Possible Solutions:

When performing Principal Components Analysis (PCA), you get loadings, P, which form an orthonormal basis that can be used to calculate scores, T = X*P, or to estimate the data, X = T*P'.

These operations are mutually consistent (projecting the estimate T*P' back onto the loadings returns the same scores T) because the loadings are the orthonormal eigenvectors of X'X, so P'*P = I.
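For illustration, here is a minimal MATLAB sketch (not PLS_Toolbox code; the use of svd and the variable names X, P, T are assumptions) showing how orthonormal loadings give scores, a data estimate, and a consistent round trip:

<pre>
% Minimal PCA sketch using the SVD of a mean-centered data matrix.
X  = randn(20, 5);               % example data: 20 samples, 5 variables
Xc = X - mean(X);                % mean-center (column-wise)

[~, ~, V] = svd(Xc, 'econ');     % right singular vectors = eigenvectors of Xc'*Xc
k = 2;                           % number of principal components to keep
P = V(:, 1:k);                   % loadings (orthonormal: P'*P = eye(k))

T    = Xc * P;                   % scores
Xhat = T * P';                   % rank-k estimate of the (centered) data

% Round-trip consistency: projecting the estimate back onto P
% returns the same scores, because P'*P = eye(k).
Tcheck = Xhat * P;
disp(norm(T - Tcheck))           % ~0 up to rounding error
</pre>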

When using Partial Least Squares (PLS), you get loadings, P, but also weights, W, because the decomposition is based on X'Y. The weights and loadings must be used together to calculate scores: T = X*W*pinv(P'*W). From a phenomenological point of view, the weights represent features in X that are related to the original Y values, while the loadings represent features in X that are related to the scores, T, which are each factor's estimate of Y.
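As a rough sketch of where W and P come from, the following NIPALS-style PLS loop (an assumption for illustration only; the PLS_Toolbox pls function handles this internally) shows that the scores accumulated during deflation equal the scores recovered directly from the original X via X*W*pinv(P'*W):

<pre>
% Minimal NIPALS PLS1 sketch (single y column, a few latent variables).
X0 = randn(30, 6);  y0 = randn(30, 1);
X  = X0 - mean(X0);  y = y0 - mean(y0);   % mean-center

nLV = 3;
[n, m] = size(X);
W = zeros(m, nLV);  P = zeros(m, nLV);  T = zeros(n, nLV);  q = zeros(1, nLV);

Xd = X;  yd = y;                      % deflated copies
for a = 1:nLV
    w = Xd' * yd;  w = w / norm(w);   % weight: direction in X related to y
    t = Xd * w;                       % score for this latent variable
    p = Xd' * t / (t' * t);           % loading: direction in X explained by t
    q(a) = yd' * t / (t' * t);        % y-loading
    Xd = Xd - t * p';                 % deflate X
    yd = yd - t * q(a);               % deflate y
    W(:, a) = w;  P(:, a) = p;  T(:, a) = t;
end

% Scores recovered directly from the *original* (centered) X:
Tdirect = X * W * pinv(P' * W);
disp(norm(T - Tdirect))               % ~0 up to rounding error
</pre>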

Note, by the way, that it is the weights that are used to calculate the regression vector (the vector used to make predictions). The loadings are only used when calculating scores and, of course, Hotelling's T².
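Continuing the sketch above (this reuses X, W, P, q, and T from it and is again only an illustration, not PLS_Toolbox code), the weights enter the regression vector while the loadings enter only the scores and Hotelling's T²:

<pre>
% Regression vector: built from the weights via R = W*pinv(P'*W) and the
% y-loadings q; the loadings appear only inside the (P'*W) correction.
R = W * pinv(P' * W);
b = R * q';                          % regression vector for centered data
yhat = X * b;                        % predictions (centered scale)

% Hotelling's T^2: uses the scores (and hence the loadings), not the weights.
% The NIPALS score columns are orthogonal, so a diagonal scaling suffices.
lambda = var(T);                     % variance captured by each score column
T2hot  = sum((T .^ 2) ./ lambda, 2); % one T^2 value per sample
</pre>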