TY - GEN
T1 - Comparison of ensemble techniques for incremental learning of new concept classes under hostile non-stationary environments
AU - Mohammed, Hussein Syed
AU - Leander, James
AU - Marbach, Matthew
AU - Polikar, Robi
PY - 2006/1/1
Y1 - 2006/1/1
N2 - We have recently introduced Learn++, an incremental learning algorithm inspired by the multiple-classifier structure of AdaBoost. Both algorithms generate an ensemble of classifiers trained on bootstrapped replicates of the training data, and the classifiers are then combined through a voting process. Learn++, however, generates additional ensembles as new data become available, and uses a different distribution update rule to resample the data. While AdaBoost was originally designed to improve the performance of a weak classifier, whether it can also achieve incremental learning through its ensemble structure remains an open question. In this paper, we compare the incremental learning ability of AdaBoost.M1 and Learn++ under very hostile nonstationary learning environments, which may introduce new concept classes. We also compare the algorithms under several combination rules to determine which of the three key components - ensemble structure, resampling procedure, or combination rule - has the primary impact on incremental learning in nonstationary environments.
AB - We have recently introduced Learn++, an incremental learning algorithm inspired by the multiple-classifier structure of AdaBoost. Both algorithms generate an ensemble of classifiers trained on bootstrapped replicates of the training data, and the classifiers are then combined through a voting process. Learn++, however, generates additional ensembles as new data become available, and uses a different distribution update rule to resample the data. While AdaBoost was originally designed to improve the performance of a weak classifier, whether it can also achieve incremental learning through its ensemble structure remains an open question. In this paper, we compare the incremental learning ability of AdaBoost.M1 and Learn++ under very hostile nonstationary learning environments, which may introduce new concept classes. We also compare the algorithms under several combination rules to determine which of the three key components - ensemble structure, resampling procedure, or combination rule - has the primary impact on incremental learning in nonstationary environments.
UR - http://www.scopus.com/inward/record.url?scp=34548128463&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=34548128463&partnerID=8YFLogxK
U2 - 10.1109/ICSMC.2006.385071
DO - 10.1109/ICSMC.2006.385071
M3 - Conference contribution
SN - 1424401003
SN - 9781424401000
T3 - Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics
SP - 4838
EP - 4844
BT - 2006 IEEE International Conference on Systems, Man and Cybernetics
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2006 IEEE International Conference on Systems, Man and Cybernetics
Y2 - 8 October 2006 through 11 October 2006
ER -