Core support extraction for learning from initially labeled nonstationary environments using COMPOSE

Robert Capo, Anthony Sanchez, Robi Polikar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

11 Scopus citations

Abstract

Learning in nonstationary environments, also called concept drift, requires an algorithm to track and learn from streaming data drawn from a nonstationary (drifting) distribution. When data arrive continuously, a concept drift algorithm is required to maintain an up-to-date hypothesis that evolves with the changing environment. A more difficult problem that has received less attention, however, is learning from so-called initially labeled nonstationary environments, where the environment provides only unlabeled data after initialization. Since the labels for such data never become available, learning in this setting is also referred to as extreme verification latency, where the algorithm must use only unlabeled data to keep the hypothesis current. In this contribution, we analyze COMPOSE, a framework recently proposed for learning in such environments. One of the central processes of COMPOSE is core support extraction, where the algorithm predicts which data instances will be useful and relevant for classification in future time steps. We compare two options for core support extraction, namely Gaussian mixture model (GMM) based maximum a posteriori sampling and α-shape compaction, and analyze their effects on both the accuracy and the computational complexity of the algorithm. Our findings point to a trade-off, as is the case in most engineering problems: α-shapes are more versatile in most situations, but they are far more computationally complex, especially as the dimensionality of the dataset increases. Our proposed GMM procedure allows COMPOSE to operate on datasets of substantially larger dimensionality without affecting its classification performance.
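
The GMM-based core support extraction described in the abstract can be read as fitting a mixture model to each class's currently labeled (or label-propagated) instances and retaining the instances of highest posterior density as core supports for the next time step. The sketch below illustrates that idea using scikit-learn; the function name, the `compaction_ratio` parameter, and the example data are hypothetical illustrations and not the authors' reference implementation.

```python
# A minimal sketch of GMM-based core support extraction, assuming scikit-learn
# and NumPy. Names and parameters here are illustrative assumptions, not the
# COMPOSE reference code.

import numpy as np
from sklearn.mixture import GaussianMixture


def extract_core_supports_gmm(X_class, n_components=3, compaction_ratio=0.5,
                              random_state=0):
    """Return the densest fraction of one class's instances as core supports.

    X_class          : (n_samples, n_features) instances labeled with one class
    n_components     : number of Gaussian components fit to the class
    compaction_ratio : fraction of instances retained as core supports
    """
    gmm = GaussianMixture(n_components=n_components, random_state=random_state)
    gmm.fit(X_class)

    # Log-density of each instance under the fitted mixture; larger values
    # correspond to instances nearer the dense "core" of the class distribution.
    log_density = gmm.score_samples(X_class)

    n_keep = max(1, int(compaction_ratio * len(X_class)))
    core_idx = np.argsort(log_density)[-n_keep:]   # keep the densest instances
    return X_class[core_idx]


# Usage sketch: core supports extracted at one time step would be carried
# forward and combined with the next step's unlabeled data for semi-supervised
# labeling, per the COMPOSE framework described in the abstract.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(loc=0.0, scale=1.0, size=(200, 2))
    cores = extract_core_supports_gmm(X, n_components=2, compaction_ratio=0.3)
    print(cores.shape)  # (60, 2)
```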

Original language: English (US)
Title of host publication: Proceedings of the International Joint Conference on Neural Networks
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 602-608
Number of pages: 7
ISBN (Electronic): 9781479914845
DOIs
State: Published - Sep 3 2014
Externally published: Yes
Event: 2014 International Joint Conference on Neural Networks, IJCNN 2014 - Beijing, China
Duration: Jul 6 2014 to Jul 11 2014

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks

Other

Other: 2014 International Joint Conference on Neural Networks, IJCNN 2014
Country/Territory: China
City: Beijing
Period: 7/6/14 to 7/11/14

All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence
