TY - GEN
T1 - Incremental learning in non-stationary environments with concept drift using a multiple classifier based approach
AU - Karnick, Matthew
AU - Muhlbaier, Michael D.
AU - Polikar, Robi
PY - 2008
Y1 - 2008
N2 - We outline an incremental learning algorithm designed for nonstationary environments where the underlying data distribution changes over time. With each dataset drawn from a new environment, we generate a new classifier. Classifiers are combined through dynamically weighted majority voting, where voting weights are determined based on classifiers' age and accuracy on current and past environments. The most recent and relevant classifiers are weighted higher, allowing the algorithm to appropriately adapt to drifting concepts. This algorithm does not discard prior classifiers, allowing efficient learning of potentially cyclical environments. The algorithm learns incrementally, i.e., without access to previous data. Finally, the algorithm can use any supervised classifier as its base model, including those not normally capable of incremental learning. We present the algorithm and its performance using different base learners in different environments with varying types of drift.
UR - http://www.scopus.com/inward/record.url?scp=77957948203&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=77957948203&partnerID=8YFLogxK
DO - 10.1109/icpr.2008.4761062
M3 - Conference contribution
AN - SCOPUS:77957948203
SN - 9781424421756
T3 - Proceedings - International Conference on Pattern Recognition
BT - 2008 19th International Conference on Pattern Recognition, ICPR 2008
PB - Institute of Electrical and Electronics Engineers Inc.
ER -