Queued two Linear Regression models
Hi everybody,
I am about to analyze data from the KDD98 dataset (http://archive.ics.uci.edu/ml/databases/kddcup98/kddcup98.html). I use the “optimize parameters” operator and, within that, a cross-validation. For training, I queued two linear regression models to predict a feature (the label). The first LRM works on all features and I do not optimize any parameter, while the second LRM works only on 50 features and I optimize the ridge factor.
However, once I run the process and RapidMiner starts to execute the second LRM, the following error message pops up:
WARNING: Error during calculation: Matrix is singular.: Increasing ridge factor from 0.0 to 1.0E-7
Does anybody have an idea where this could come from or what I would have to change?
Thanks in advance!
Ralf
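For readers more at home in Python than in the RapidMiner GUI, the setup is roughly equivalent to the sketch below (a scikit-learn analogue with made-up data, feature counts, and grid values, not the actual KDD98 process): one plain linear regression on all features, and one cross-validated grid search over the ridge factor on a 50-feature subset.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import GridSearchCV, cross_val_score

rng = np.random.default_rng(42)
X_all = rng.normal(size=(500, 80))           # stand-in for "all features"
X_sub = X_all[:, :50]                        # stand-in for the 50-feature subset
y = X_all @ rng.normal(size=80) + rng.normal(scale=0.5, size=500)

# First model: plain linear regression on all features, no parameter optimization.
lr_scores = cross_val_score(LinearRegression(), X_all, y, cv=10)

# Second model: ridge regression on the 50 features, optimizing the ridge factor.
grid = GridSearchCV(Ridge(), {"alpha": [1e-8, 1e-6, 1e-4, 1e-2, 1.0]}, cv=10)
grid.fit(X_sub, y)
print(lr_scores.mean(), grid.best_params_)
```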
Answers
Well, you said it yourself: if the ridge factor becomes too small, the matrices within the linear regression become singular and hence cannot be inverted, which prevents the model from being created. For that reason RapidMiner automatically increases the ridge factor so that the calculation works again. You can simply ignore the message, or you can adjust the optimization ranges so that those regions for the factor are no longer hit.
Cheers,
Ingo
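To make the mechanics concrete outside of RapidMiner, here is a minimal NumPy sketch (the data is synthetic, not from KDD98, and this is not RapidMiner's internal code): a duplicated column makes X^T X singular, so the plain least-squares normal equations cannot be solved, while adding a small ridge term of the same order as RapidMiner's fallback (1.0E-7) makes the system solvable again.

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)
X = np.column_stack([x1, x1, x2])            # first two columns are identical
y = 3.0 * x1 - 2.0 * x2 + rng.normal(scale=0.1, size=200)

XtX = X.T @ X
print("rank of X^T X:", np.linalg.matrix_rank(XtX))   # 2 < 3, i.e. singular

try:
    beta_ols = np.linalg.solve(XtX, X.T @ y)          # plain least squares fails
except np.linalg.LinAlgError as err:
    print("ordinary least squares:", err)

# Adding lambda * I to X^T X makes it invertible for any lambda > 0,
# which is what the warning about increasing the ridge factor refers to.
lam = 1e-7
beta_ridge = np.linalg.solve(XtX + lam * np.eye(X.shape[1]), X.T @ y)
print("ridge solution:", beta_ridge)
```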