Domain adaptation bounds for multiple expert systems under concept drift

Gregory Ditzler, Gail Rosen, Robi Polikar

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    9 Scopus citations

    Abstract

    The ability to learn incrementally from streaming data, whether in an online or batch setting, is of crucial importance for a prediction algorithm operating in environments that generate vast amounts of data, where it is impractical or simply infeasible to store all historical data. Learning from streaming data becomes increasingly difficult, however, when the probability distribution generating the data stream evolves over time, which renders a classification model built from previously seen data suboptimal or potentially useless. Ensemble systems that employ multiple classifiers can mitigate this effect, but even then some classifiers (experts) become less knowledgeable than others for predicting on the drifted domains as the distribution changes. A further complication arises when labeled data from the prediction (target) domain are not immediately available, causing predictions on the target domain to be suboptimal. In this work, we provide upper bounds, which hold with high probability, on the loss of a multiple expert system trained in such a nonstationary environment with verification latency. Furthermore, we show why a single-model selection strategy can lead to undesirable results when learning in such nonstationary streaming settings. We support our analytical results with experiments on simulated as well as real-world data sets, comparing several ensemble approaches to a single model.
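
    The paper's bounds themselves are not reproduced on this page. As a point of reference only, analyses of this kind typically build on the classical single-source domain adaptation bound of Ben-David et al., which for a hypothesis $h$, source distribution $\mathcal{D}_S$, and target distribution $\mathcal{D}_T$ has the form

    \[
        \epsilon_T(h) \;\le\; \epsilon_S(h) \;+\; \tfrac{1}{2}\, d_{\mathcal{H}\Delta\mathcal{H}}(\mathcal{D}_S, \mathcal{D}_T) \;+\; \lambda ,
    \]

    where $\epsilon_S$ and $\epsilon_T$ are the source and target errors, $d_{\mathcal{H}\Delta\mathcal{H}}$ measures the divergence between the two distributions, and $\lambda$ is the error of the best hypothesis on both domains jointly. Extending a bound of this type to a convex combination of experts $h_1, \dots, h_k$ trained on earlier portions of a drifting stream, with mixture weights $w_1, \dots, w_k$, is the kind of analysis the abstract refers to; the exact statement, the treatment of verification latency, and the high-probability form are given in the paper itself.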

    Original language: English (US)
    Title of host publication: Proceedings of the International Joint Conference on Neural Networks
    Publisher: Institute of Electrical and Electronics Engineers Inc.
    Pages: 595-601
    Number of pages: 7
    ISBN (Electronic): 9781479914845
    DOIs
    State: Published - Sep 3, 2014
    Event: 2014 International Joint Conference on Neural Networks, IJCNN 2014 - Beijing, China
    Duration: Jul 6, 2014 - Jul 11, 2014

    Publication series

    Name: Proceedings of the International Joint Conference on Neural Networks

    Other

    Other: 2014 International Joint Conference on Neural Networks, IJCNN 2014
    Country: China
    City: Beijing
    Period: 7/6/14 - 7/11/14

    All Science Journal Classification (ASJC) codes

    • Software
    • Artificial Intelligence
