Scaling brings all the variables of a dataset onto the same scale. It normalises the data within a particular range and, for some algorithms, can also speed up the calculations.

The most common techniques of feature scaling are **Normalization** and **Standardization**.

Normalization is used when we want to bound our values between two numbers, typically [0, 1] or [-1, 1]. Standardization, on the other hand, transforms the data to have zero mean and unit variance. Both techniques make the data **unitless**.
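As a minimal sketch of both techniques with NumPy (the sample values are made up for illustration):

```python
import numpy as np

# Hypothetical feature values
x = np.array([2.0, 10.0, 4.0, 8.0, 6.0])

# Normalization (min-max scaling): rescales values into [0, 1]
x_norm = (x - x.min()) / (x.max() - x.min())

# Standardization (z-score): subtract the mean, divide by the std
x_std = (x - x.mean()) / x.std()

print(x_norm)            # all values now lie in [0, 1]
print(x_std.mean(), x_std.std())  # mean ≈ 0, standard deviation ≈ 1
```

Libraries such as scikit-learn wrap the same arithmetic in `MinMaxScaler` and `StandardScaler`, which also remember the training-set statistics so the identical transform can be applied to new data.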

In many machine learning algorithms, we need to scale the features so that they are on an equal footing and no single feature dominates the model simply because of its large magnitude.
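To see why magnitude matters, consider a distance-based model on two hypothetical features with very different scales (the income/age columns below are invented for illustration):

```python
import numpy as np

# Hypothetical data: column 0 is income in dollars, column 1 is age in years
X = np.array([[50000.0, 25.0],
              [51000.0, 60.0],
              [49000.0, 40.0]])

# The raw Euclidean distance between the first two points is dominated
# by the income column; the 35-year age gap barely registers
raw = np.linalg.norm(X[0] - X[1])

# After standardizing each column, both features contribute comparably
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
scaled = np.linalg.norm(Xs[0] - Xs[1])

print(raw)     # ≈ 1000.6 — essentially just the income difference
print(scaled)  # ≈ 2.73 — age and income both influence the distance
```

This is exactly the situation where an unscaled k-nearest-neighbours or k-means model would effectively ignore the small-magnitude feature.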
