We have recently introduced Learn++, an incremental learning algorithm inspired by the multiple-classifier structure of AdaBoost. Both algorithms generate an ensemble of classifiers trained on bootstrapped replicates of the training data, and the classifiers are then combined through a voting process. Learn++, however, generates additional ensembles as new data become available, and uses a different distribution update rule to resample the data. While AdaBoost was originally designed to improve the performance of a weak classifier, whether it can also achieve incremental learning through its ensemble structure remains an open question. In this paper, we compare the incremental learning ability of AdaBoost.M1 and Learn++ in very hostile nonstationary learning environments, which may introduce new concept classes. We also compare the algorithms under several combination rules to determine which of the three key components - the ensemble structure, the resampling procedure, or the combination rule - has the primary impact on incremental learning in nonstationary environments.
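The shared ensemble structure described above - classifiers trained on bootstrapped replicates of each data batch, combined by voting, with a new ensemble added per batch - can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the weak learner (`train_stump`), the helper names, and the toy one-dimensional data are all assumptions, and the weighted distribution-update rules of AdaBoost.M1 and Learn++ are deliberately omitted.

```python
import random
from collections import Counter

def train_stump(data, rng):
    """Fit a one-feature threshold classifier (a weak learner) on (x, y) pairs.
    Illustrative stand-in for the base classifier; not from the paper."""
    best = None
    for thresh in sorted({x for x, _ in data}):
        for sign in (1, -1):
            acc = sum((1 if sign * (x - thresh) >= 0 else 0) == y
                      for x, y in data) / len(data)
            if best is None or acc > best[0]:
                best = (acc, thresh, sign)
    _, thresh, sign = best
    return lambda x: 1 if sign * (x - thresh) >= 0 else 0

def build_ensemble(data, n_classifiers, rng):
    """Train classifiers on bootstrapped replicates of one data batch."""
    return [train_stump([rng.choice(data) for _ in data], rng)
            for _ in range(n_classifiers)]

def predict(ensembles, x):
    """Simple majority vote across all classifiers of all ensembles."""
    votes = Counter(clf(x) for ens in ensembles for clf in ens)
    return votes.most_common(1)[0][0]

rng = random.Random(0)
ensembles = []
# Incremental learning: each new batch of data yields a new ensemble,
# and earlier ensembles are retained rather than retrained.
for batch in ([(0.1, 0), (0.2, 0), (0.8, 1), (0.9, 1)],
              [(0.3, 0), (0.7, 1), (0.6, 1), (0.4, 0)]):
    ensembles.append(build_ensemble(batch, n_classifiers=5, rng=rng))

print(predict(ensembles, 0.15), predict(ensembles, 0.85))
```

The key difference studied in the paper lies in how the sampling distribution is updated between rounds and how the votes are combined; this sketch uses plain bootstrapping and unweighted majority voting only to make the shared ensemble skeleton concrete.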