The Bagging metanode doesn't actually do bagging...

According to the sources, bagging should, at each iteration, use a training set of the same size as the original, built by randomly sampling with replacement from the original training set. So if the original training set contains, let's say, 8 samples, each new training set should also contain 8 samples, but some samples will appear multiple times while others may be omitted.

EXAMPLE:

Original set: A, B, C, D, E, F, G, H

Bag 1: A, C, A, G, E, C, H, H

Bag 2: C, A, H, F, D, H, D, B

Bag 3: F, E, D, D, A, E, B, A
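Outside KNIME, the bags above are easy to reproduce; here is a minimal plain-Python sketch of bootstrap sampling (the names `bootstrap_bag` and `original` are just illustrative):

```python
import random

original = ["A", "B", "C", "D", "E", "F", "G", "H"]

def bootstrap_bag(data):
    # Sample len(data) items WITH replacement: duplicates are
    # expected, and some of the originals will be omitted.
    return [random.choice(data) for _ in range(len(data))]

random.seed(42)  # fix the seed so the bags are reproducible
bags = [bootstrap_bag(original) for _ in range(3)]
for i, bag in enumerate(bags, 1):
    print(f"Bag {i}: {bag}")
```

Each bag has the full size of the original set, but is drawn with replacement, just like Bags 1 to 3 above.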

The Bagging metanode doesn't work like this: it uses a chunking loop, so it trains the classifiers on randomly chosen, disjoint subsets of the training set, without any bootstrapping.

 

...and WEKA nodes still don't work!

You can easily achieve the correct behaviour by replacing the Chunk Loop Start with a Counting Loop Start followed by a Bootstrap Sampling node.
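To make the difference concrete, here is a plain-Python analogy of the two workflows. The node names are KNIME's, but the functions below are illustrative sketches of what each loop effectively does with the rows, not KNIME internals:

```python
import random

data = ["A", "B", "C", "D", "E", "F", "G", "H"]

def chunked_subsets(data, n_chunks):
    # Roughly what the metanode's Chunk Loop Start does: split the
    # (shuffled) table into disjoint chunks. Every row appears at
    # most once per chunk, so this is sampling WITHOUT replacement.
    shuffled = data[:]
    random.shuffle(shuffled)
    size = len(data) // n_chunks
    return [shuffled[i * size:(i + 1) * size] for i in range(n_chunks)]

def bootstrap_iteration(data):
    # Roughly what Counting Loop Start + Bootstrap Sampling gives
    # you: one full-size sample WITH replacement per iteration.
    return [random.choice(data) for _ in range(len(data))]
```

With chunking, no subset can ever contain a repeated sample, and the subsets are smaller than the original table; with the bootstrap, every iteration sees a full-size resampled set, which is what the bagging literature calls for.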

 

Great, but why doesn't the predefined metanode work the way it's supposed to?