[SOLVED] Unexpected output for linear regression operator

dysprosium Member Posts: 7 Contributor I
edited August 2019 in Help
I'm very new to RapidMiner and trying linear regression for the first time. I'm applying the Linear Regression operator to a training data set, then outputting a regression model which is input to the Apply Model operator. Then I will apply the model to an unlabelled data set.
There is one special attribute (label) and 3 regular attributes in the training data set. It has only 7 examples at the moment (easier for me to see what’s happening). The attributes are all integers.
In the attribute weights output from the Linear Regression operator, I'm expecting all 3 regular attributes to have a weight greater than zero. However, when I run the process only one attribute has a weight greater than 0 (its weight is 0.268); the other two attributes have a weight of 0. It seems as if the Linear Regression operator is ignoring those two attributes. Why?
The reason I expect all of the weights from the Linear Regression to be non-zero is that when I input exactly the same training set to the Vector Linear Regression operator, I get either positive or negative weights for all three regular attributes.





Answers

  • homburg Moderator, Employee, Member Posts: 114 RM Data Scientist
    Hi Dy,

    the Linear Regression operator in RapidMiner offers a few built-in features such as feature selection and collinear feature elimination. Please set the feature selection parameter to "none" (the default is "M5 prime") and disable the "eliminate colinear features" check box. The algorithm should then use all three of your attributes.

    Cheers,
    Helge
  • dysprosium Member Posts: 7 Contributor I
    Hi Helge,
    I corrected the feature selection settings and I'm now getting weights for all the attributes.
    Just one more question .... the results I get (for this data set) from the Linear Regression operator are exactly the same (apart from the formatting) as the results from Vector Linear Regression. How do the two algorithms differ?
    Thanks!
    Dy
  • homburg Moderator, Employee, Member Posts: 114 RM Data Scientist
    Hi Dy,

    the algorithms differ only in the feature selection options you just disabled. The vector version does the same job as the linear regression, but for a vector label (in this case, several numerical label attributes at once). If you input a single label you will receive more or less the same model.

    Cheers,
    Helge
  • dysprosium Member Posts: 7 Contributor I
    I'm glad to hear that's the case. I prefer the equation that I get from the Vector Linear Regression - it's more intuitive than the table from Linear Regression. I'll get on with applying the model now.

    Cheers,
    Dy
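To illustrate why collinear feature elimination can produce zero weights, here is a minimal numpy sketch (not RapidMiner's implementation, and the 7-example data set is made up, not the poster's). The attribute x3 is constructed to be collinear with x1 + x2; a fit that drops the collinear attribute reports a weight of 0 for it, even though a plain least-squares fit over all attributes would assign it a non-zero weight.

```python
import numpy as np

# Hypothetical 7-example training set with 3 integer attributes.
# x3 = x1 + x2 in every row, so x3 is perfectly collinear.
X = np.array([
    [1, 2, 3],
    [2, 1, 3],
    [3, 4, 7],
    [4, 2, 6],
    [5, 5, 10],
    [6, 1, 7],
    [7, 3, 10],
], dtype=float)
y = 2.0 * X[:, 0] + 1.0 * X[:, 1] + 0.5 * X[:, 2]  # synthetic label

# Plain least squares over all attributes (plus an intercept column).
A = np.column_stack([X, np.ones(len(X))])
w_full, *_ = np.linalg.lstsq(A, y, rcond=None)

# "Collinear elimination": drop x3 and refit; its weight becomes 0.
A_red = np.column_stack([X[:, :2], np.ones(len(X))])
w_red, *_ = np.linalg.lstsq(A_red, y, rcond=None)

print("all attributes kept:", np.round(w_full, 3))
print("x3 eliminated (weight 0):", np.round(w_red, 3))
```

Both fits predict the label perfectly here; the eliminated attribute's contribution is simply absorbed by x1 and x2, which is why the reported weight table shows a 0.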
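Helge's point that the vector version reduces to ordinary linear regression for a single label can also be sketched in numpy (again an illustrative sketch with invented data, not RapidMiner's code): solving least squares against a matrix of label columns fits each label independently, so the first column of the vector solution matches the single-label fit.

```python
import numpy as np

# Invented data: 7 examples, 3 integer attributes, plus an intercept.
rng = np.random.default_rng(0)
X = rng.integers(1, 10, size=(7, 3)).astype(float)
A = np.column_stack([X, np.ones(len(X))])

y1 = X @ [2.0, -1.0, 0.5] + 3.0   # first label
y2 = X @ [0.5, 4.0, -2.0] - 1.0   # second label
Y = np.column_stack([y1, y2])     # "vector label": both at once

# Vector regression: one lstsq call, one weight column per label.
W_vec, *_ = np.linalg.lstsq(A, Y, rcond=None)

# Plain regression on the first label only.
w_single, *_ = np.linalg.lstsq(A, y1, rcond=None)

print("weights match:", np.allclose(W_vec[:, 0], w_single))
```

With a single label column the two computations are the same least-squares problem, which is why the thread's two operators produced identical results once feature selection was disabled.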




