Multiple classifier systems tend to suffer from out-voting when new concept classes must be learned incrementally: existing classifiers cannot recognize a new class until enough new classifiers have been added to influence the ensemble decision. This problem of learning new classes was explicitly addressed in our previous work, Learn++.NC, where ensemble members dynamically adjust their own voting weights by consulting with one another, based on their individual and collective confidence in classifying each concept class. Learn++.NC works remarkably well for learning new concept classes while requiring few ensemble members to do so. However, Learn++.NC cannot cope with class imbalance, as it was not designed to do so. Yet class imbalance is a common and important problem in machine learning, made even more challenging in an incremental learning setting. In this paper, we extend Learn++.NC so that it can incrementally learn new concept classes even when their instances are drawn from severely imbalanced class distributions. We show that the proposed algorithm is quite robust compared to other state-of-the-art algorithms.
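To make the out-voting problem and the "consulting" remedy concrete, the following is a minimal illustrative sketch, not the published Learn++.NC algorithm: each ensemble member carries the set of classes it was trained on, members that know a candidate class compute a collective confidence for it, and members unaware of a strongly supported class down-weight their own votes. All function and variable names here are hypothetical.

```python
from collections import defaultdict

def dynamic_weighted_vote(predictions, base_weights, known_classes):
    """Toy confidence-based dynamic weighting (illustrative only).

    predictions:   one predicted label per ensemble member
    base_weights:  one static weight per member (e.g., from training error)
    known_classes: known_classes[i] is the set of classes member i saw
    """
    # Step 1: collective confidence in each candidate class, computed
    # only among the members that actually know that class.
    support = {}
    for label in set(predictions):
        aware = [(p, w) for p, w, k in
                 zip(predictions, base_weights, known_classes) if label in k]
        total = sum(w for _, w in aware)
        votes = sum(w for p, w in aware if p == label)
        support[label] = votes / total if total else 0.0
    # Step 2: the "consulting" step -- a member that does not know a
    # strongly supported class reduces its own voting weight, so old
    # members cannot out-vote the (few) members that learned the new class.
    scores = defaultdict(float)
    for pred, w, known in zip(predictions, base_weights, known_classes):
        penalty = max((support[c] for c in support if c not in known),
                      default=0.0)
        scores[pred] += w * (1.0 - penalty)
    return max(scores, key=scores.get)

# Three older members trained only on classes {0, 1}; one newer member
# also knows the new class 2 and confidently votes for it.
preds = [0, 0, 0, 2]
weights = [1.0, 1.0, 1.0, 1.0]
known = [{0, 1}, {0, 1}, {0, 1}, {0, 1, 2}]
print(dynamic_weighted_vote(preds, weights, known))  # -> 2
```

Under plain weighted majority voting the three older members would win 3-to-1; with the consultation step they abstain on a class they have never seen, so the new class prevails despite the ensemble being dominated by older members.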