BOOSTING
What is Boosting?
Boosting and Bagging algorithms both belong to the family of ensemble learning techniques.
Boosting refers to converting weak learners into strong learners.
What are weak learners?
A weak learner is a base model that performs only slightly better than random guessing, for example, a classifier that misclassifies some spam mails as not spam. On its own it is not accurate enough, but it still captures some useful signal.
To convert weak learners into strong learners, we generally apply weights to the training samples, so that each new learner pays more attention to the points the previous learners got wrong.
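This re-weighting step can be sketched in a few lines. The example below is an AdaBoost-style update (the numbers and arrays are purely illustrative, not from any real dataset): misclassified points get their weight increased, correct ones decreased.

```python
import numpy as np

# Illustrative AdaBoost-style sample re-weighting (labels in {-1, +1}).
y_true = np.array([1, -1, 1, 1, -1])
y_pred = np.array([1, 1, 1, -1, -1])   # this weak learner makes 2 mistakes

weights = np.full(len(y_true), 1 / len(y_true))  # start with equal weights

# Weighted error of this weak learner
err = weights[y_true != y_pred].sum()
alpha = 0.5 * np.log((1 - err) / err)  # this learner's "say" in the final vote

# Increase weights on misclassified points, decrease on correct ones
weights *= np.exp(-alpha * y_true * y_pred)
weights /= weights.sum()  # renormalize so the weights sum to 1
```

After this update, the misclassified points carry more weight, so the next base learner focuses on them.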
Let's take an example.
Note:- The base learner may be any model, such as a decision tree, SVM, etc.
Suppose we have a dataset with 100 records. In an ensemble technique we train a series of base learners, and we start by giving a sample of the data to the first base learner.
Our first base learner will predict some of these records incorrectly. These wrongly predicted records are the ones this weak learner struggles with, so they are given higher weights.
The next base learner takes these highly weighted records along with other samples from the dataset, and we repeat the same procedure; this learner, in turn, will misclassify some other records.
After applying base learners sequentially in this way, at some stage we get a strong combined model.
To convert the weak learners into a strong learner, we combine the predictions of all the weak learners using methods like:
> Weighted average.
> Majority voting (taking the prediction that gets the most votes).
In this way, the boosting algorithm converts weak learners into a strong learner.
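The combination step can be sketched directly. Below, three weak learners vote on four points, and each learner's vote is weighted by a hypothetical per-learner weight (all values here are made up for illustration):

```python
import numpy as np

# Predictions of three weak learners on four points (class labels in {0, 1})
preds = np.array([
    [1, 0, 1, 1],   # learner 1
    [1, 1, 0, 1],   # learner 2
    [0, 0, 1, 1],   # learner 3
])
alphas = np.array([0.6, 0.3, 0.5])  # hypothetical per-learner weights

# Weighted vote: each learner contributes its weight toward its predicted class
score_for_1 = alphas @ preds              # total weight voting for class 1
score_for_0 = alphas.sum() - score_for_1  # remaining weight votes for class 0
final = (score_for_1 > score_for_0).astype(int)
print(final)  # prints [1 0 1 1]
```

With equal weights this reduces to plain majority voting; unequal weights let the more accurate learners have a bigger say.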
THE END