- Regularization discourages overly complex models by penalizing the loss function
- Lasso and Ridge are two commonly used regularization techniques
- In Ridge regression, the cost function is modified by adding a penalty term proportional to the sum of the squared coefficient magnitudes
- Ridge regression is often also referred to as L2 Norm Regularization
- Lasso regression is very similar to Ridge regression, except that the coefficient magnitudes are not squared in the penalty term; their absolute values are summed instead
- Lasso regression is often also referred to as L1 Norm Regularization
- AIC and BIC are two criteria that give you a comprehensive measure of model performance while accounting for the number of features, penalizing models with more parameters
- The lower the AIC and/or BIC, the better the model
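To make the two penalties above concrete, here is a minimal sketch using scikit-learn (assumed available; the synthetic data and `alpha` values are illustrative, not from the lesson):

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

# Synthetic data: only the first 2 of 10 features actually matter.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=200)

ridge = Ridge(alpha=1.0).fit(X, y)  # L2 penalty: alpha * sum(coef**2)
lasso = Lasso(alpha=0.5).fit(X, y)  # L1 penalty: alpha * sum(|coef|)

# Ridge shrinks every coefficient toward zero but (with continuous data)
# essentially never to exactly zero; Lasso drives the irrelevant
# coefficients to exactly zero.
print("Ridge zero coefficients:", int((ridge.coef_ == 0).sum()))
print("Lasso zero coefficients:", int((lasso.coef_ == 0).sum()))
```

Because the L1 penalty can shrink coefficients all the way to zero, Lasso doubles as a feature-selection method, whereas Ridge only shrinks coefficients toward zero.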
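The AIC/BIC comparison can be sketched with plain NumPy using the Gaussian-likelihood forms AIC = n·ln(RSS/n) + 2k and BIC = n·ln(RSS/n) + k·ln(n); the data and feature counts below are illustrative:

```python
import numpy as np

def aic_bic(y, y_pred, k):
    """Gaussian-likelihood AIC and BIC from the residual sum of squares.
    k counts the fitted parameters, including the intercept."""
    n = len(y)
    rss = np.sum((y - y_pred) ** 2)
    aic = n * np.log(rss / n) + 2 * k
    bic = n * np.log(rss / n) + k * np.log(n)
    return aic, bic

rng = np.random.default_rng(1)
x = rng.normal(size=(300, 2))
y = 2 * x[:, 0] + rng.normal(size=300)

# Model A: intercept + the one informative feature.
# Model B: adds a pure-noise feature, so it has one extra parameter.
design_a = np.c_[np.ones(300), x[:, :1]]
design_b = np.c_[np.ones(300), x]
coef_a, *_ = np.linalg.lstsq(design_a, y, rcond=None)
coef_b, *_ = np.linalg.lstsq(design_b, y, rcond=None)

aic_a, bic_a = aic_bic(y, design_a @ coef_a, k=2)
aic_b, bic_b = aic_bic(y, design_b @ coef_b, k=3)
# The smaller model typically scores lower (better) on both criteria,
# since the noise feature improves the fit less than its penalty costs.
print("Model A:", aic_a, bic_a)
print("Model B:", aic_b, bic_b)
```

Note that BIC's per-parameter penalty, ln(n), exceeds AIC's constant 2 once n > e², so BIC favors smaller models more aggressively as the sample grows.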