Can AdaBoost.M1 learn incrementally? A comparison to Learn++ under different combination rules

Hussein Syed Mohammed, James Leander, Matthew Marbach, Robi Polikar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We had previously introduced Learn++, inspired in part by the ensemble-based AdaBoost algorithm, for incrementally learning from new data, including new concept classes, without forgetting what had been previously learned. In this effort, we compare the incremental learning performance of Learn++ and AdaBoost under several combination schemes, including their native weighted majority voting. We show on several databases that changing AdaBoost's distribution update rule from a hypothesis-based update to an ensemble-based update yields significantly more efficient incremental learning, regardless of the rule used to combine the classifiers.
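
To make the distinction the abstract draws concrete, below is a minimal sketch (not the paper's code) of the two distribution update rules. It assumes NumPy arrays `D` (the current instance distribution), `y` (true labels), and predicted labels from either the latest hypothesis (`h_pred`) or the composite ensemble's weighted majority vote (`H_pred`); all function and variable names are illustrative.

```python
import numpy as np

def hypothesis_based_update(D, h_pred, y, eps=1e-10):
    """AdaBoost.M1-style update: reweight the training distribution D
    using only the latest hypothesis's predictions h_pred."""
    err = np.sum(D[h_pred != y])                # weighted error of h_t
    beta = max(err, eps) / max(1.0 - err, eps)  # beta_t = err_t / (1 - err_t)
    D = np.where(h_pred == y, D * beta, D)      # down-weight correctly classified points
    return D / D.sum()                          # renormalize to a distribution

def ensemble_based_update(D, H_pred, y, eps=1e-10):
    """Learn++-style update: identical algebra, but driven by the
    composite ensemble's weighted-majority-vote predictions H_pred,
    so the distribution tracks what the whole ensemble has not yet
    learned (e.g., instances of a newly introduced class)."""
    E = np.sum(D[H_pred != y])                  # weighted error of ensemble H_t
    B = max(E, eps) / max(1.0 - E, eps)
    D = np.where(H_pred == y, D * B, D)
    return D / D.sum()
```

The only change between the two rules is which predictions drive the reweighting. When new data introduce a class the ensemble has never seen, the ensemble initially misclassifies every instance of that class, so the ensemble-based rule concentrates the distribution on those instances; this is what supports incremental learning without discarding previously trained classifiers.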

Original language: English (US)
Title of host publication: Artificial Neural Networks, ICANN 2006 - 16th International Conference, Proceedings
Publisher: Springer Verlag
Pages: 254-263
Number of pages: 10
ISBN (Print): 3540386254, 9783540386254
DOIs
State: Published - 2006
Externally published: Yes
Event: 16th International Conference on Artificial Neural Networks, ICANN 2006 - Athens, Greece
Duration: Sep 10, 2006 - Sep 14, 2006

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 4131 LNCS - I
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 16th International Conference on Artificial Neural Networks, ICANN 2006
Country/Territory: Greece
City: Athens
Period: 9/10/06 - 9/14/06

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • General Computer Science
