TY - GEN
T1 - Incremental learning in nonstationary environments with controlled forgetting
AU - Elwell, Ryan
AU - Polikar, Robi
PY - 2009
Y1 - 2009
AB - We have recently introduced an incremental learning algorithm, called Learn++.NSE, designed for Non-Stationary Environments (concept drift), where the underlying data distribution changes over time. With each dataset drawn from a new environment, Learn++.NSE generates a new classifier to form an ensemble of classifiers. The ensemble members are combined through dynamically weighted majority voting, where voting weights are determined based on classifiers' age-adjusted accuracy on current and past environments. Unlike other ensemble-based concept drift algorithms, Learn++.NSE does not discard prior classifiers, allowing potentially cyclical environments to be learned more effectively. While Learn++.NSE has been shown to work well on a variety of concept drift problems, a potential shortcoming of this approach is the cumulative nature of the ensemble size. In this contribution, we expand our analysis of the algorithm to include various ensemble pruning methods to introduce controlled forgetting. Error-based or age-based pruning methods have been integrated into the algorithm to prevent potential outvoting from irrelevant classifiers, or simply to save memory over an extended period of time. Here, we analyze the tradeoff between these precautions and the desire to handle recurring contexts (cyclical data). Comparisons are made using several scenarios that introduce various types of drift.
UR - http://www.scopus.com/inward/record.url?scp=70449436537&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=70449436537&partnerID=8YFLogxK
DO - 10.1109/IJCNN.2009.5178779
M3 - Conference contribution
AN - SCOPUS:70449436537
SN - 9781424435531
T3 - Proceedings of the International Joint Conference on Neural Networks
SP - 771
EP - 778
BT - 2009 International Joint Conference on Neural Networks, IJCNN 2009
T2 - 2009 International Joint Conference on Neural Networks, IJCNN 2009
Y2 - 14 June 2009 through 19 June 2009
ER -