Performance Metrics

The metrics used to evaluate the results of the prediction are: Mean Squared Error (MSE), Mean Absolute Error (MAE), Root Mean Squared Error (RMSE), and Adjusted R².

Mean Squared Error: Mean Squared Error is one of the most commonly used metrics for regression tasks. MSE is simply the average of the squared differences between the target values and the values predicted by the regression model:

MSE = (1/n) * Σ (yᵢ - ŷᵢ)²

where y is the actual value and ŷ (y-hat) is the value predicted by the model. Because it squares the differences, it penalizes even small errors, which can lead to over-estimating how bad the model is. It is preferred over some other metrics because it is differentiable and hence can be optimized more easily.

RMSE (Root Mean Squared Error): This is the same as MSE, except that the square root of the value is taken when assessing the model. It is preferred in some cases because the errors are first squared before averaging, which penalizes large errors heavily, and taking the square root brings the metric back to the same units as the target variable.
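To make these definitions concrete, here is a minimal Python sketch of how the metrics can be computed. The function and variable names (regression_metrics, y_true, y_pred, n_features) are my own for illustration, not from the original article; it assumes NumPy arrays and uses the standard Adjusted R² formula.

```python
import numpy as np

def regression_metrics(y_true, y_pred, n_features):
    """Compute MSE, MAE, RMSE, R² and Adjusted R² for a regression model."""
    residuals = y_true - y_pred
    n = len(y_true)

    mse = np.mean(residuals ** 2)        # average squared error
    mae = np.mean(np.abs(residuals))     # average absolute error
    rmse = np.sqrt(mse)                  # same units as the target variable

    # R² compares the model's error to a mean-only baseline.
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    r2 = 1 - ss_res / ss_tot

    # Adjusted R² penalizes predictors that do not improve the fit.
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - n_features - 1)

    return {"MSE": mse, "MAE": mae, "RMSE": rmse, "R2": r2, "Adj_R2": adj_r2}

# Example usage with made-up values:
y_true = np.array([3.0, 5.0, 2.5, 7.0])
y_pred = np.array([2.8, 5.3, 2.9, 6.4])
print(regression_metrics(y_true, y_pred, n_features=2))
```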
BOOSTING

What is Boosting? Boosting and Bagging algorithms belong to the family of ensemble learning techniques. Boosting refers to converting weak learners into strong learners.

What are weak learners? A weak learner is a base model that performs only slightly better than random guessing, for example a spam classifier that labels some spam mails as not spam. Note: the base learner may be any model, such as a decision tree, SVM, etc. To convert weak learners into strong learners we generally apply weights.

Let's take an example. Suppose we have a dataset with 100 records. In the ensemble technique we have several base learners, and we feed a sample of the data to the first base learner. The first base learner predicts some of these records incorrectly. These wrongly predicted records are given higher weights so that the next base learner focuses on them, as shown in the sketch below.
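Here is a minimal sketch of the weighting idea, written in the spirit of AdaBoost. The toy dataset, the decision-stump base learner, and all variable names are assumptions for illustration rather than the article's own code; the key step is that wrongly predicted records get larger weights before the next weak learner is trained.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Toy dataset standing in for the spam / not-spam example.
X, y = make_classification(n_samples=100, n_features=10, random_state=0)

n_samples = len(y)
weights = np.full(n_samples, 1.0 / n_samples)   # start with equal weights

learners, alphas = [], []
for round_ in range(5):
    # Each weak learner is a shallow decision tree (a "stump").
    stump = DecisionTreeClassifier(max_depth=1, random_state=round_)
    stump.fit(X, y, sample_weight=weights)
    pred = stump.predict(X)

    # Weighted error of this weak learner and its say in the final vote.
    miss = pred != y
    err = np.sum(weights[miss]) / np.sum(weights)
    alpha = 0.5 * np.log((1 - err) / (err + 1e-10))

    # Increase the weights of wrongly predicted records so the next
    # learner focuses on them; decrease the weights of correct ones.
    weights *= np.exp(alpha * np.where(miss, 1.0, -1.0))
    weights /= weights.sum()

    learners.append(stump)
    alphas.append(alpha)

# The strong learner is a weighted vote of the weak learners.
scores = sum(a * np.where(l.predict(X) == 1, 1.0, -1.0)
             for a, l in zip(alphas, learners))
final_pred = (scores > 0).astype(int)
print("training accuracy:", np.mean(final_pred == y))
```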