TY - GEN
T1 - Semi-supervised learning in initially labeled non-stationary environments with gradual drift
AU - Dyer, Karl B.
AU - Polikar, Robi
PY - 2012
Y1 - 2012
N2 - Semi-supervised learning (SSL) in non-stationary environments has received relatively little attention in machine learning, despite a growing number of applications that can benefit from a properly configured SSL algorithm. Previous work on learning from non-stationary data has analyzed cases where both labeled and unlabeled instances are received at every time step and/or at regular intervals; however, to the best of our knowledge, no work has investigated the case where labeled instances are received only at the initial time step, followed by unlabeled instances in subsequent time steps. In this proof-of-concept work, we propose a new framework for learning in a non-stationary environment that provides only unlabeled data after the initial time step, which we refer to as an initially labeled environment. At each time step, the proposed framework generates labels for previously unlabeled data, which are then combined with incoming unlabeled data, possibly drawn from a drifting distribution, using a compacted polytope sample extraction algorithm. We have conducted two experiments to demonstrate the feasibility and reliability of the approach. This proof of concept is presented in two dimensions; however, the algorithm can be extended to higher dimensions with appropriate modifications.
AB - Semi-supervised learning (SSL) in non-stationary environments has received relatively little attention in machine learning, despite a growing number of applications that can benefit from a properly configured SSL algorithm. Previous work on learning from non-stationary data has analyzed cases where both labeled and unlabeled instances are received at every time step and/or at regular intervals; however, to the best of our knowledge, no work has investigated the case where labeled instances are received only at the initial time step, followed by unlabeled instances in subsequent time steps. In this proof-of-concept work, we propose a new framework for learning in a non-stationary environment that provides only unlabeled data after the initial time step, which we refer to as an initially labeled environment. At each time step, the proposed framework generates labels for previously unlabeled data, which are then combined with incoming unlabeled data, possibly drawn from a drifting distribution, using a compacted polytope sample extraction algorithm. We have conducted two experiments to demonstrate the feasibility and reliability of the approach. This proof of concept is presented in two dimensions; however, the algorithm can be extended to higher dimensions with appropriate modifications.
UR - http://www.scopus.com/inward/record.url?scp=84865066012&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84865066012&partnerID=8YFLogxK
U2 - 10.1109/IJCNN.2012.6252541
DO - 10.1109/IJCNN.2012.6252541
M3 - Conference contribution
AN - SCOPUS:84865066012
SN - 9781467314909
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - 2012 International Joint Conference on Neural Networks, IJCNN 2012
T2 - 2012 Annual International Joint Conference on Neural Networks, IJCNN 2012, Part of the 2012 IEEE World Congress on Computational Intelligence, WCCI 2012
Y2 - 10 June 2012 through 15 June 2012
ER -