PASCAL Agnostic Learning vs. Prior Knowledge Challenge (IJCNN07)

The challenge is now over, but it remains open for post-challenge submissions!


IMPORTANT: Entries made since February 1st, 2007 may be using the validation data, which is now available for training.

liknon feature selection + state of the art (3)

Submitted by Erinija

An ensemble of linear discriminants is built by splitting the training data n times into training and monitoring subsets. For each split, LIKNON is solved for increasing values of the regularization parameter, and the best model for that split is selected by minimum monitoring error. The weights of the resulting discriminants identify useful features, and other classification rules are then trained on the selected feature subsets. Only on Gina did kNN (k=3) outperform the ensemble.
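The following is a minimal sketch of this procedure, assuming LIKNON is approximated by an L1-regularized linear SVM (scikit-learn's LinearSVC with penalty='l1'); the exact LIKNON linear program, the number of splits, and the regularization grid used in the entry are not specified here, so those choices are illustrative only.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

def liknon_ensemble(X, y, n_splits=10, C_grid=(0.01, 0.1, 1.0, 10.0), seed=0):
    """Repeatedly split into training/monitoring parts, fit an L1-regularized
    linear model for each regularization value, and keep the model with the
    lowest monitoring error from each split."""
    rng = np.random.RandomState(seed)
    ensemble = []
    for _ in range(n_splits):
        X_tr, X_mon, y_tr, y_mon = train_test_split(
            X, y, test_size=0.3, random_state=rng.randint(1 << 30), stratify=y)
        best_err, best_clf = np.inf, None
        for C in C_grid:  # sweep over the regularization parameter
            clf = LinearSVC(penalty='l1', dual=False, C=C, max_iter=5000)
            clf.fit(X_tr, y_tr)
            err = np.mean(clf.predict(X_mon) != y_mon)  # monitoring error
            if err < best_err:
                best_err, best_clf = err, clf
        ensemble.append(best_clf)
    # Features with a non-zero weight in any selected discriminant are kept.
    weights = np.vstack([clf.coef_.ravel() for clf in ensemble])
    selected = np.where(np.any(np.abs(weights) > 1e-8, axis=0))[0]
    return ensemble, selected

# Other rules (e.g. 3-nearest-neighbours, as mentioned for Gina) can then be
# trained on the reduced feature set:
#   from sklearn.neighbors import KNeighborsClassifier
#   ensemble, feats = liknon_ensemble(X_train, y_train)
#   knn3 = KNeighborsClassifier(n_neighbors=3).fit(X_train[:, feats], y_train)
```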

These models had median performance in 10-fold cross-validation:

Model                          gina8   hiva3   ada8   sylva4   nova9
Predicted misclassifications   1964    9151    7444   2565     1403
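One plausible way to obtain such a predicted misclassification count (not necessarily the procedure used in this entry) is to scale the mean 10-fold cross-validation error rate by the size of the unlabeled test set; the function names and the accuracy-based scoring below are assumptions for illustration.

```python
import numpy as np
from sklearn.model_selection import cross_val_score

def predicted_misclassifications(clf, X, y, n_test, cv=10):
    """Estimate the expected number of test misclassifications by scaling
    the mean cross-validated error rate to the test-set size."""
    acc = cross_val_score(clf, X, y, cv=cv, scoring='accuracy')
    return int(round((1.0 - acc.mean()) * n_test))

def median_model(candidates, X, y, cv=10):
    """Return the candidate model whose cross-validated error is the median
    among the candidates (cf. the 'median performance' models above)."""
    errs = [1.0 - cross_val_score(c, X, y, cv=cv).mean() for c in candidates]
    return candidates[int(np.argsort(errs)[len(errs) // 2])]
```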

Dataset   Balanced Error Rate (BER)     Area Under Curve (AUC)        Track
          Train    Valid    Test        Train    Valid    Test
ada       0.1787   0.1932   0.1833      0.8728   0.8629   0.8706      agnostic
gina      0.0286   0.0255   0.0533      0.9968   0.9897   0.9740      agnostic
hiva      0.1641   0.1317   0.3053      0.9139   0.9266   0.7561      agnostic
nova      0.0376   0.0580   0.0813      0.9901   0.9979   0.9663      agnostic
sylva     0.0198   0.0106   0.0195      0.9948   0.9982   0.9947      agnostic
Overall   0.0858   0.0838   0.1285      0.9537   0.9551   0.9123      agnostic

This is a complete entry in the agnostic learning track.