TY - GEN
T1 - Incremental learning of variable rate concept drift
AU - Elwell, Ryan
AU - Polikar, Robi
PY - 2009
Y1 - 2009
N2 - We have recently introduced an incremental learning algorithm, Learn++.NSE, for Non-Stationary Environments, where the data distribution changes over time due to concept drift. Learn++.NSE is an ensemble-of-classifiers approach, training a new classifier on each consecutive batch of data that becomes available and combining them through an age-adjusted, dynamic error-based weighted majority voting. Prior work has shown the algorithm's ability to track gradually changing environments, as well as its ability to retain former knowledge in cases of cyclical or recurring data by retaining and appropriately weighting all classifiers generated thus far. In this contribution, we extend the analysis of the algorithm to more challenging environments experiencing varying drift rates; more importantly, we present preliminary results on the ability of the algorithm to accommodate the addition or subtraction of classes over time. Furthermore, we also present comparative results of a variation of the algorithm that employs error-based pruning in cyclical environments.
AB - We have recently introduced an incremental learning algorithm, Learn++.NSE, for Non-Stationary Environments, where the data distribution changes over time due to concept drift. Learn++.NSE is an ensemble-of-classifiers approach, training a new classifier on each consecutive batch of data that becomes available and combining them through an age-adjusted, dynamic error-based weighted majority voting. Prior work has shown the algorithm's ability to track gradually changing environments, as well as its ability to retain former knowledge in cases of cyclical or recurring data by retaining and appropriately weighting all classifiers generated thus far. In this contribution, we extend the analysis of the algorithm to more challenging environments experiencing varying drift rates; more importantly, we present preliminary results on the ability of the algorithm to accommodate the addition or subtraction of classes over time. Furthermore, we also present comparative results of a variation of the algorithm that employs error-based pruning in cyclical environments.
UR - http://www.scopus.com/inward/record.url?scp=70349330806&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=70349330806&partnerID=8YFLogxK
U2 - 10.1007/978-3-642-02326-2_15
DO - 10.1007/978-3-642-02326-2_15
M3 - Conference contribution
AN - SCOPUS:70349330806
SN - 3642023258
SN - 9783642023255
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 142
EP - 151
BT - Multiple Classifier Systems - 8th International Workshop, MCS 2009, Proceedings
T2 - 8th International Workshop on Multiple Classifier Systems, MCS 2009
Y2 - 10 June 2009 through 12 June 2009
ER -