TY - GEN
T1 - Variance Guided Continual Learning in a Convolutional Neural Network Gaussian Process Single Classifier Approach for Multiple Tasks in Noisy Images
AU - Javed, Mahed
AU - Mihaylova, Lyudmila
AU - Bouaynaya, Nidhal
N1 - Publisher Copyright:
© 2021 International Society of Information Fusion (ISIF).
PY - 2021
Y1 - 2021
N2 - This work provides a single-classifier continual learning solution for multiple classification tasks with various data sets. A Gaussian process (GP) is combined with a Convolutional Neural Network (CNN) feature extractor architecture (CNNGP). Post-softmax samples are used to estimate the variance. The variance characterises the impact of uncertainties and is part of the update process for the learning rate parameters. Within the proposed framework, two learning approaches are adopted: 1) in the first, the weights of the CNN are deterministic and only the GP learning rate is updated; 2) in the second, prior distributions are adopted for the CNN weights, and both the CNN and GP learning rates are updated. The algorithm is trained on two variants of the MNIST dataset, split-MNIST and permuted-MNIST. Results are compared with the Uncertainty-Guided Continual Bayesian Networks (UCB) multi-classifier approach [1]. The validation shows that the proposed algorithm in the Bayesian setting outperforms UCB on tasks subject to Gaussian image noise and demonstrates robustness.
AB - This work provides a single-classifier continual learning solution for multiple classification tasks with various data sets. A Gaussian process (GP) is combined with a Convolutional Neural Network (CNN) feature extractor architecture (CNNGP). Post-softmax samples are used to estimate the variance. The variance characterises the impact of uncertainties and is part of the update process for the learning rate parameters. Within the proposed framework, two learning approaches are adopted: 1) in the first, the weights of the CNN are deterministic and only the GP learning rate is updated; 2) in the second, prior distributions are adopted for the CNN weights, and both the CNN and GP learning rates are updated. The algorithm is trained on two variants of the MNIST dataset, split-MNIST and permuted-MNIST. Results are compared with the Uncertainty-Guided Continual Bayesian Networks (UCB) multi-classifier approach [1]. The validation shows that the proposed algorithm in the Bayesian setting outperforms UCB on tasks subject to Gaussian image noise and demonstrates robustness.
UR - http://www.scopus.com/inward/record.url?scp=85123411307&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85123411307&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85123411307
T3 - Proceedings of 2021 IEEE 24th International Conference on Information Fusion, FUSION 2021
BT - Proceedings of 2021 IEEE 24th International Conference on Information Fusion, FUSION 2021
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 24th IEEE International Conference on Information Fusion, FUSION 2021
Y2 - 1 November 2021 through 4 November 2021
ER -