Ensemble learning of classifiers has been a hot topic in pattern recognition for the past two decades, because a standalone classifier often fails to improve performance when the dataset suffers from class imbalance. Ensemble learning is generally based on boosting and bagging techniques; boosting combines multiple classifiers of the same type, each trained on a weighted sample set. Our aim is to improve the general boosting algorithm by using diverse kinds of classifiers to build the ensemble. Two different kinds of classifiers, BP-MLP and RBFNN, are considered for constructing the initial ensemble in our algorithm. The strategy is to assign an adaptive weight to each type of classifier based on its individual performance, so that the better-performing of the two kinds is boosted. Benchmark datasets from the UCI repository are used for the analysis, which confirms that our method outperforms boosting based on a single type of learner.
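The heterogeneous boosting idea described above can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: instead of BP-MLP and RBFNN, it uses two toy stand-in learner types (a decision stump and a weighted nearest-centroid rule, the latter loosely RBF-flavoured), and the "adaptive weight across classifier types" is realised in the simplest possible way, by training one candidate of each type per round on the current sample weights and keeping the one with lower weighted error, so whichever type performs better accumulates more ensemble weight.

```python
import numpy as np

class Stump:
    """Weighted decision stump (stand-in for the first base-learner type)."""
    def fit(self, X, y, w):
        best = (np.inf, 0, 0.0, 1)
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                for s in (1, -1):           # try both threshold polarities
                    pred = np.where(X[:, j] < t, -s, s)
                    err = w[pred != y].sum()
                    if err < best[0]:
                        best = (err, j, t, s)
        self.err, self.j, self.t, self.s = best
        return self

    def predict(self, X):
        return np.where(X[:, self.j] < self.t, -self.s, self.s)

class Centroid:
    """Weighted nearest-centroid learner (RBF-flavoured stand-in)."""
    def fit(self, X, y, w):
        self.c_pos = np.average(X[y == 1], axis=0, weights=w[y == 1])
        self.c_neg = np.average(X[y == -1], axis=0, weights=w[y == -1])
        self.err = w[self.predict(X) != y].sum()
        return self

    def predict(self, X):
        d_pos = np.linalg.norm(X - self.c_pos, axis=1)
        d_neg = np.linalg.norm(X - self.c_neg, axis=1)
        return np.where(d_pos < d_neg, 1, -1)

def hetero_boost(X, y, rounds=10):
    """AdaBoost-style loop over a pool of two learner types.

    Each round trains one candidate of each type on the current sample
    weights and keeps the lower-error one; its vote weight alpha grows
    as its weighted error shrinks, boosting the better-performing type.
    Labels y must be in {-1, +1}.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)          # uniform sample weights to start
    ensemble = []
    for _ in range(rounds):
        candidates = [Stump().fit(X, y, w), Centroid().fit(X, y, w)]
        h = min(candidates, key=lambda c: c.err)
        err = max(h.err, 1e-10)      # avoid log(0) on a perfect round
        if err >= 0.5:               # no better than chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / err)
        pred = h.predict(X)
        w *= np.exp(-alpha * y * pred)   # up-weight misclassified samples
        w /= w.sum()
        ensemble.append((alpha, h))
    return ensemble

def predict(ensemble, X):
    """Weighted-vote prediction of the boosted ensemble."""
    score = sum(alpha * h.predict(X) for alpha, h in ensemble)
    return np.where(score >= 0, 1, -1)
```

On a real task the two candidate classes would be replaced by the MLP and RBF network trainers, each fitted on the weighted (or weight-resampled) training set, with the same weighted-error comparison deciding which type's hypothesis enters the ensemble at each round.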