
Chapter 4

Bagging

Bootstrap aggregating, often abbreviated as bagging, involves having each model in the ensemble vote with equal weight. In order to promote model variance, bagging trains each model in the ensemble on a randomly drawn subset of the training set. As an example, the random forest algorithm combines random decision trees with bagging to achieve very high classification accuracy.
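The equal-weight vote described above can be sketched as follows. This is a minimal illustration, not any particular library's implementation: the prediction arrays for the three models are made-up values standing in for the outputs of models trained on different bootstrap samples.

```python
import numpy as np

# Hypothetical class predictions (0 or 1) from three ensemble members
# on the same five test points.
preds = np.array([
    [0, 1, 1, 0, 1],   # model 1
    [0, 1, 0, 0, 1],   # model 2
    [1, 1, 1, 0, 0],   # model 3
])

# Equal-weight voting: for each test point (column), the class
# predicted by the most models wins.
votes = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, preds)
print(votes)  # → [0 1 1 0 1]
```

Note that no model's vote counts more than any other's; that uniformity is what distinguishes bagging from weighted schemes such as boosting.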

The simplest method of combining classifiers is known as bagging, which stands for bootstrap aggregating, the statistical description of the method. This is fine if you know what a bootstrap is, but fairly useless if you don't. A bootstrap sample is a sample taken from the original dataset with replacement, so that we may get some data points several times and others not at all.

Example - the features of the party/study decision problem and their possible values:

Deadline: Urgent, Near, None
Is there a party? Yes, No
Lazy: Yes, No
Activity (the target): Party, Study
