What are Loadings in PCA?

 

In this tutorial, you’ll learn what loadings are and how to interpret them in the context of Principal Component Analysis (PCA).


 

Loadings in PCA

Principal Component Analysis (PCA) is a widely used technique for dimensionality reduction and data simplification. One of the core outputs of PCA is the derivation of principal components. These components are linear combinations of the original features or variables in your dataset. The coefficients, or weights, assigned to these original variables within these linear combinations are termed loadings.

Definition: Loadings offer valuable insights into the nature of the principal components. Specifically, they tell us how much each original variable contributes to the respective principal component.

Interpretation: The magnitude of a loading points to the importance or relevance of its corresponding original variable to that component. A larger absolute value suggests that the variable significantly influences the component. Furthermore, the sign of the loading — whether positive or negative — denotes the directionality of this influence. A positive sign implies that the variable and the principal component are positively related, moving in the same direction. Conversely, a negative sign indicates an inverse relationship, where one increases as the other decreases.
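As a minimal sketch of these ideas, the following Python code computes loadings as eigenvectors of the correlation matrix on a small made-up dataset (all variable names and data values are hypothetical, for illustration only). Two of the three variables are constructed to be strongly correlated, so their loadings on the first component should be large and of the same sign, while the third variable’s loading stays small:

```python
# Minimal sketch: obtain PCA loadings via the correlation matrix.
# The dataset below is made up purely for illustration.
import numpy as np

rng = np.random.default_rng(42)
# 100 observations of 3 variables; x1 and x2 are strongly correlated
x1 = rng.normal(size=100)
x2 = 0.8 * x1 + 0.2 * rng.normal(size=100)
x3 = rng.normal(size=100)
X = np.column_stack([x1, x2, x3])

# Standardize, then take eigenvectors of the correlation matrix
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
corr = np.corrcoef(Xs, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)

# eigh returns ascending eigenvalues; reverse so PC1 comes first
order = np.argsort(eigvals)[::-1]
loadings = eigvecs[:, order]  # rows = variables, columns = PCs

print(loadings[:, 0])  # loadings of x1, x2, x3 on PC1
```

Inspecting `loadings[:, 0]` shows the correlated pair x1 and x2 with large, same-sign loadings on the first component and x3 with a much smaller one, matching the magnitude and sign interpretation above.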

Matrix Representation: In the mathematical underpinnings of PCA, loadings correspond to the eigenvectors of the covariance or correlation matrix of the data. These eigenvectors provide the loadings for the resultant principal components. When arranged in matrix form, each column corresponds to a specific principal component, and each row aligns with an original variable from the dataset.
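To illustrate this correspondence, the sketch below compares a plain eigendecomposition of the covariance matrix with scikit-learn’s `PCA` on hypothetical random data. Note that eigenvectors are only defined up to sign, so the comparison is on absolute values; `components_` stores one component per row, so it is transposed to get the variables-by-components layout described above:

```python
# Sketch: PCA loadings correspond to eigenvectors of the covariance
# matrix. Data is randomly generated for illustration; eigenvector
# signs may flip between implementations, so we compare magnitudes.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4)) @ rng.normal(size=(4, 4))  # correlated data

# Eigendecomposition of the covariance matrix, sorted by eigenvalue
cov = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
eigvecs = eigvecs[:, np.argsort(eigvals)[::-1]]

# scikit-learn: components_ has one PC per row, one variable per column
pca = PCA().fit(X)
loadings_matrix = pca.components_.T  # rows = variables, cols = PCs

# Agreement up to per-component sign flips
agree = np.allclose(np.abs(loadings_matrix), np.abs(eigvecs))
print(agree)
```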

Orthogonality: An intriguing aspect of PCA is the orthogonality of the principal components, meaning they remain uncorrelated with each other. Consequently, the loadings associated with different principal components are also orthogonal.
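This orthogonality is easy to verify numerically. In the sketch below (hypothetical data again), the dot product of the loading vectors of the first two principal components is zero up to floating-point precision:

```python
# Sketch: loading vectors of different principal components are
# orthogonal (their dot product is ~0). Made-up data for illustration.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(150, 3))
X[:, 1] += 0.5 * X[:, 0]  # introduce some correlation

corr = np.corrcoef(X, rowvar=False)
_, eigvecs = np.linalg.eigh(corr)

pc1, pc2 = eigvecs[:, -1], eigvecs[:, -2]  # top two components
print(round(float(pc1 @ pc2), 10))  # 0.0 up to rounding
```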

Scaling: The choice of scaling before applying PCA is another critical aspect. If PCA is performed on a correlation matrix, each variable was standardized to have a mean of 0 and a variance of 1 before the analysis. In this scenario, the relationships between variables are evaluated based on their correlations. On the other hand, if PCA is performed on the covariance matrix, the relationships are interpreted based on covariances. In the latter case, variables with larger variances will naturally have more substantial loadings.
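The effect of scaling can be demonstrated with a small sketch (hypothetical data): one variable is put on a much larger scale, so covariance-based loadings are dominated by it, while correlation-based loadings (i.e., PCA after standardization) instead reflect the actual correlation structure:

```python
# Sketch: covariance-based PCA lets a large-scale variable dominate
# the loadings; correlation-based PCA (standardized data) does not.
# Data values are made up for illustration.
import numpy as np

rng = np.random.default_rng(7)
base = rng.normal(size=300)
x1 = base + 0.3 * rng.normal(size=300)   # correlated with x2
x2 = base + 0.3 * rng.normal(size=300)
x3 = rng.normal(size=300) * 100.0        # independent, huge scale
X = np.column_stack([x1, x2, x3])

def first_pc(M):
    """Loading vector of the first principal component of M."""
    vals, vecs = np.linalg.eigh(np.cov(M, rowvar=False))
    return vecs[:, np.argmax(vals)]

cov_loadings = np.abs(first_pc(X))  # dominated by the large-scale x3
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
corr_loadings = np.abs(first_pc(Xs))  # driven by the x1-x2 correlation

print(cov_loadings)
print(corr_loadings)
```

On the raw data, nearly all of the first component’s weight sits on x3; after standardization, x1 and x2 carry the weight instead.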

Contribution to Variance: A valuable insight from loadings is their contribution to variance. Squaring a loading gives the proportion of a variable’s variance captured by a principal component (this interpretation is exact when the variables are standardized and the loadings are scaled by the square roots of the eigenvalues). For instance, if the loading of a variable on the first principal component is 0.8, then 64% (0.8 squared) of that variable’s variance is captured by the first principal component.
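The following sketch makes this concrete on standardized, hypothetical data, using the convention of scaling each eigenvector by the square root of its eigenvalue. Under that convention, the squared loadings of a variable across all components sum to 1, i.e., they partition the variable’s (standardized) variance among the components:

```python
# Sketch: squared scaled loadings = share of a variable's variance
# captured by each component. Data is made up for illustration.
import numpy as np

rng = np.random.default_rng(3)
z = rng.normal(size=250)
X = np.column_stack([z + 0.5 * rng.normal(size=250),
                     z + 0.5 * rng.normal(size=250),
                     rng.normal(size=250)])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

corr = np.corrcoef(Xs, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Scale each eigenvector column by the sqrt of its eigenvalue
loadings = eigvecs * np.sqrt(eigvals)

shares_pc1 = loadings[:, 0] ** 2          # variance share on PC1
totals = (loadings ** 2).sum(axis=1)      # shares across all PCs

print(np.round(shares_pc1, 3))
print(np.round(totals, 6))  # [1. 1. 1.]
```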

In summary, loadings in PCA provide insights into how the original variables are combined to create each principal component, helping to interpret the nature and meaning of the principal components in the context of the original data.

 

Video & Further Resources

Would you like to know more about the concept of loadings in PCA? Then you may have a look at the following video on my YouTube channel.

 

 

Furthermore, you may have a look at the other tutorials on this homepage.

 

At this point you should know how to interpret loadings in PCA. Let me know in the comments, in case you have any further questions.

 

Cansu Kebabci R Programmer & Data Scientist

This page was created in collaboration with Cansu Kebabci. Have a look at Cansu’s author page to get more information about her professional background, a list of all her tutorials, as well as an overview of her other tasks on Statistics Globe.

 
