Difference between L1 and L2 regularization
Regularization is an important technique in machine learning for preventing overfitting. Mathematically, it adds a penalty term to the loss function that discourages the coefficients from fitting the training data too perfectly.
The difference between L1 and L2 is that the L2 penalty is the sum of the squares of the weights, while the L1 penalty is the sum of the absolute values of the weights.
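Writing the weights as $w_i$ and the regularization strength as $\lambda$ (notation chosen here, since the text uses no symbols), the two regularized objectives can be sketched as:

```latex
J_{L2} = \mathrm{Loss} + \lambda \sum_i w_i^2
\qquad
J_{L1} = \mathrm{Loss} + \lambda \sum_i |w_i|
```

The squared term in L2 penalizes large weights heavily but barely touches small ones, while the absolute-value term in L1 applies the same pressure everywhere, which is what pushes small weights exactly to zero.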
| L2 regularization | L1 regularization |
| --- | --- |
| Computationally efficient due to having an analytical solution | Computationally inefficient in non-sparse cases |
| Non-sparse outputs | Sparse outputs |
| No built-in feature selection | Built-in feature selection |
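The sparsity and feature-selection rows in the table can be seen directly with scikit-learn, where `Lasso` uses an L1 penalty and `Ridge` uses an L2 penalty (a small illustrative sketch; the data and `alpha` value are arbitrary choices):

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

# Toy data: y depends only on the first feature; the other four are noise.
rng = np.random.RandomState(0)
X = rng.randn(100, 5)
y = 3.0 * X[:, 0] + 0.1 * rng.randn(100)

lasso = Lasso(alpha=0.1).fit(X, y)   # L1 penalty
ridge = Ridge(alpha=0.1).fit(X, y)   # L2 penalty

# L1 drives the irrelevant coefficients exactly to zero (sparse output,
# i.e. built-in feature selection); L2 only shrinks them toward zero.
print("Lasso coefficients:", lasso.coef_)
print("Ridge coefficients:", ridge.coef_)
```

Running this, the Lasso coefficients for the four noise features come out exactly zero, while the Ridge coefficients are merely small nonzero values.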