TY - JOUR
T1 - Feature Selection Inspired Classifier Ensemble Reduction
AU - Diao, Ren
AU - Chao, Fei
AU - Peng, Taoxin
AU - Snooke, Neal
AU - Shen, Qiang
N1 - This is the author accepted manuscript. The final version is available from IEEE via http://dx.doi.org/10.1109/TCYB.2013.2281820
PY - 2014/07/15
Y1 - 2014/07/15
N2 - Classifier ensembles constitute one of the main research directions in machine learning and data mining. The use of multiple classifiers generally allows better predictive performance than that achievable with a single model. Several approaches exist in the literature that provide means to construct and aggregate such ensembles. However, these ensemble systems contain redundant members that, if removed, may further increase group diversity and produce better results. Smaller ensembles also relax the memory and storage requirements, reducing the system's run-time overhead while improving overall efficiency. This paper extends the ideas developed for feature selection problems to support classifier ensemble reduction, by transforming ensemble predictions into training samples and treating classifiers as features. The global heuristic harmony search is then used to select a reduced subset of such artificial features, while attempting to maximize the feature subset evaluation. The resulting technique is systematically evaluated using high-dimensional and large benchmark datasets, showing superior classification performance compared with both the original, unreduced ensembles and randomly formed subsets.
AB - Classifier ensembles constitute one of the main research directions in machine learning and data mining. The use of multiple classifiers generally allows better predictive performance than that achievable with a single model. Several approaches exist in the literature that provide means to construct and aggregate such ensembles. However, these ensemble systems contain redundant members that, if removed, may further increase group diversity and produce better results. Smaller ensembles also relax the memory and storage requirements, reducing the system's run-time overhead while improving overall efficiency. This paper extends the ideas developed for feature selection problems to support classifier ensemble reduction, by transforming ensemble predictions into training samples and treating classifiers as features. The global heuristic harmony search is then used to select a reduced subset of such artificial features, while attempting to maximize the feature subset evaluation. The resulting technique is systematically evaluated using high-dimensional and large benchmark datasets, showing superior classification performance compared with both the original, unreduced ensembles and randomly formed subsets.
UR - http://hdl.handle.net/2160/30655
U2 - 10.1109/TCYB.2013.2281820
DO - 10.1109/TCYB.2013.2281820
M3 - Article
C2 - 24108493
SN - 2168-2267
VL - 44
SP - 1259
EP - 1268
JO - IEEE Transactions on Cybernetics
JF - IEEE Transactions on Cybernetics
IS - 8
ER -