AdaBoost Decision Stump

MSTRPC (Member, Posts: 2, Newbie)
Hey all,
I have a question about decision stumps in the AdaBoost algorithm, because the literature recommends using a "weak learner".

First I implemented a decision stump in the AdaBoost operator with 10 iterations, but the trees looked identical and my results weren't as expected. Then I saw that the tutorial process for the AdaBoost operator uses a decision tree with a depth of 10. But isn't the advantage of AdaBoost that you use a weak learner to get better results through iterative learning?

With the default decision tree the results are good, but I don't understand why a normal decision tree can be used here.
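To make clear what comparison I have in mind, here is a minimal sketch using scikit-learn's AdaBoostClassifier instead of the RapidMiner operator (so the parameter names are sklearn's, not RapidMiner's, and this is just my illustration, not the tutorial process):

```python
# Minimal sketch (not the RapidMiner process itself): AdaBoost with
# decision stumps (max_depth=1) vs. a deeper tree like the tutorial's
# (max_depth=10), both with 10 boosting iterations.
# Assumes scikit-learn >= 1.2 (older versions use base_estimator=).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for depth in (1, 10):
    model = AdaBoostClassifier(
        estimator=DecisionTreeClassifier(max_depth=depth),
        n_estimators=10,
        random_state=0,
    )
    model.fit(X_train, y_train)
    print(f"max_depth={depth}: test accuracy = {model.score(X_test, y_test):.3f}")
```

With stumps I would expect the 10 boosted trees to differ from iteration to iteration (because of reweighting), which is why the identical trees in my process confused me.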


After that process I got the precision of the model, and in the results there is a "w" with a value. Is this the sum of the weights per stump? I couldn't find any explanation. Sorry if this question has been asked before; I am fairly new to RapidMiner.
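In case it helps anyone answering: my current guess, based on the standard AdaBoost formulation (an assumption on my part, I couldn't confirm it in the RapidMiner docs), is that "w" is the per-model weight computed from the weighted training error, something like:

```python
# My assumption of what "w" could be: the standard AdaBoost model
# weight (often called alpha) for a weak learner with weighted
# training error err. A guess, not confirmed RapidMiner behavior.
import math

def model_weight(err):
    # alpha = 0.5 * ln((1 - err) / err): more accurate stumps get
    # larger weights in the final weighted vote.
    return 0.5 * math.log((1 - err) / err)

print(round(model_weight(0.3), 3))  # a stump with 30% error -> 0.424
```

If that is right, "w" would be the weight of each individual model in the final vote rather than a sum over stumps.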


Greetings :)

MSTRPC

Best Answer

Answers

  • MSTRPC (Member, Posts: 2, Newbie)
    Hello,

    thank you for the answer, this really helps me solve the problem :)

    Greetings,

    MSTRPC