"Non-linear regression"
Is there an operator for non-linear regression, e.g. polynomial? I couldn't find anything like this.
Answers
You can find many examples in the "sample" directory of RapidMiner; there should also be one for a general regression setting. For a polynomial LibSVM, you have to set the SVM type to one of the two "SVR" types, select the kernel type "polynomial", and define an appropriate degree and value for C. Which parameter values are appropriate can be evaluated with one of the parameter optimization operators (please also refer to the sample directory). Here is a simple setup (the model is applied to the training data - never do this in real life ;-): However, I would usually prefer an RBF kernel or (additional) feature construction (for example with YAGGA2) instead, but if the polynomial kernel works for your data, that is of course fine.
Cheers,
Ingo
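A minimal sketch of an equivalent polynomial-SVR setup in Python with scikit-learn (which, like RapidMiner's LibSVM operator, wraps LibSVM); the data, degree and C values below are illustrative assumptions, not settings from this thread:

```python
import numpy as np
from sklearn.svm import SVR

# Illustrative (assumed) data: a cubic trend plus noise.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = 0.5 * X[:, 0] ** 3 - 2.0 * X[:, 0] + rng.normal(scale=0.5, size=200)

# epsilon-SVR with a polynomial kernel: degree and C have to be tuned.
model = SVR(kernel="poly", degree=3, C=10.0, epsilon=0.1)
model.fit(X, y)

# Scored on the training data here purely for illustration,
# mirroring the "never do this in real life" caveat above.
print("Training R^2:", model.score(X, y))
```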
My operator tree is the following: I hope someone can help me solve these problems or explain how to calculate this model...
If the runtime is too high, you could try to reduce the value of "C". If the results are not satisfying, I would always try the RBF kernel with an optimized value for gamma / sigma; this often leads to much better fits. Instead of introducing the non-linearity in the learner, you could also construct additional (polynomial) features before learning and simply apply a linear regression scheme afterwards. This is often faster and leads to more understandable models.
Cheers,
Ingo
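A sketch of the second alternative mentioned above (polynomial feature construction followed by plain linear regression), again in scikit-learn with assumed data and an assumed feature degree of 3:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Same illustrative (assumed) data as before.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = 0.5 * X[:, 0] ** 3 - 2.0 * X[:, 0] + rng.normal(scale=0.5, size=200)

# Expand X into [x, x^2, x^3] and fit ordinary least squares on top.
model = make_pipeline(PolynomialFeatures(degree=3, include_bias=False),
                      LinearRegression())
model.fit(X, y)

# Unlike kernel SVR, the resulting coefficients are directly readable.
print("Coefficients:", model[-1].coef_)
```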
Here are the basic settings for an RBF SVM:
For the parameter optimization, you could have a look into the sample directory (..._Meta.../...ParameterOptimization.xml).
Cheers,
Ingo
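A sketch of the RBF-SVR settings combined with a parameter-optimization step, analogous to the ParameterOptimization sample mentioned above; the grid values for C and gamma are assumptions for illustration:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

# Illustrative (assumed) data.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = 0.5 * X[:, 0] ** 3 - 2.0 * X[:, 0] + rng.normal(scale=0.5, size=200)

# Optimize C and gamma by cross-validated grid search.
grid = GridSearchCV(
    SVR(kernel="rbf", epsilon=0.1),
    param_grid={"C": [1, 10, 100], "gamma": [0.01, 0.1, 1.0]},
    cv=5,
)
grid.fit(X, y)
print("Best parameters:", grid.best_params_)
print("Cross-validated R^2:", grid.best_score_)
```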
But could it be that setting the degree of the function doesn't have any influence on the result?
The kernel parameter "degree" is only used for a polynomial kernel; the parameters "sigma" / "gamma" are only used for RBF kernels.
Cheers,
Ingo
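A quick sketch of this point in scikit-learn (also LibSVM-based): with an RBF kernel the "degree" parameter is ignored, so two different degree settings give identical predictions. The data here is assumed for illustration:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=100)

pred_a = SVR(kernel="rbf", gamma=0.5, degree=2).fit(X, y).predict(X)
pred_b = SVR(kernel="rbf", gamma=0.5, degree=7).fit(X, y).predict(X)

# True: degree only matters for kernel="poly".
print(np.allclose(pred_a, pred_b))
```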