Outlier detection algorithms comparison
Hello, I'm new to RapidMiner and I'm having a bit of trouble here.
I'm trying to compare outlier detection algorithms such as LOF and LoOP in terms of performance... and I have no clue how to do it.
Best Answers
- yyhuang (Administrator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, Member) Posts: 363 — RM Data Scientist

  Hi @zzM,

  It is not possible to get the performance of unsupervised outlier detection if we have no label for the ground truth.

  Check out this research paper for a comprehensive overview of the anomaly detection models which are available in the Anomaly Detection extension.

  YY
- Telcontar120 (Moderator, RapidMiner Certified Analyst, RapidMiner Certified Expert, Member) Posts: 1,635 — Unicorn

  Without a binary classification problem that has a priori answers (the label) to which you are comparing a prediction (the score), it is not possible to produce the ROC/AUC performance metric. So the only way to produce that would be to separately label all cases as to whether they were in fact outliers, in your opinion, based on whatever criteria you are using, and then treat the output from the different outlier algorithms as though they were predictive models. This is the main difference between supervised and unsupervised machine learning problems, which is what @yyhuang was talking about before. So the short answer to your question is "not unless you dramatically change the nature of the problem."
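  To make the "treat the scores as predictions" idea concrete, here is a minimal sketch in plain Python. It assumes you have manually labeled each example as outlier (1) or inlier (0) and exported each algorithm's outlier scores; the label and score values below are made up for illustration. AUC is computed directly from its rank-based definition: the probability that a randomly chosen outlier receives a higher score than a randomly chosen inlier.

  ```python
  # Sketch: comparing outlier detectors by AUC against hand-made ground-truth labels.
  # Assumes scores were exported (e.g. from the Anomaly Detection extension) alongside
  # your own 0/1 outlier labels; all values below are hypothetical.

  def auc(labels, scores):
      """Rank-based AUC: fraction of (outlier, inlier) pairs where the
      outlier scores higher; ties count as half a win."""
      pos = [s for s, y in zip(scores, labels) if y == 1]
      neg = [s for s, y in zip(scores, labels) if y == 0]
      if not pos or not neg:
          raise ValueError("need at least one outlier and one inlier label")
      wins = 0.0
      for p in pos:
          for n in neg:
              if p > n:
                  wins += 1.0
              elif p == n:
                  wins += 0.5
      return wins / (len(pos) * len(neg))

  # Hypothetical labels and scores for two algorithms on the same ten points:
  labels      = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1]
  lof_scores  = [1.0, 1.1, 0.9, 1.0, 1.2, 1.1, 1.0, 1.3, 2.5, 3.1]
  loop_scores = [0.1, 0.2, 0.1, 0.3, 0.2, 0.4, 0.1, 0.6, 0.5, 0.9]

  print(auc(labels, lof_scores))   # 1.0: both labeled outliers outrank every inlier
  print(auc(labels, loop_scores))  # 0.9375: one outlier is out-scored by one inlier
  ```

  The same comparison can be done inside RapidMiner by mapping each score to a prediction/confidence against your label attribute, but once labels exist, any ROC/AUC implementation will do; the quality of the comparison rests entirely on how trustworthy your manual outlier labels are.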
Answers
One more thing: is there a way to compare the outlier detection algorithms using AUC as a performance measure?