TY - GEN
T1 - OrderNet
T2 - 2021 International Joint Conference on Neural Networks, IJCNN 2021
AU - Hess, Samuel
AU - Ditzler, Gregory
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021/7/18
Y1 - 2021/7/18
N2 - Neural networks have shown remarkable classification performance in recent years, often outperforming humans in many tasks. Unfortunately, there are some tasks where conventional neural networks and their training methods have under-performed. One of these areas is learning from a small number of samples, which has been partially addressed by the development of few-shot neural networks. Few-shot neural networks contrast with traditional classification networks by learning classification tasks from datasets with only a few samples per class; however, these few-shot techniques are trained collectively on a large number of labeled samples. Hence, few-shot learning approaches only address the low-sample-per-class learning problem, whereas the task of learning strictly from a low sample size remains mostly unresolved. In this contribution, we address the challenge of learning from high dimensional, low sample data by recasting the problem as a data ordering task. Specifically, we have designed OrderNet, a novel network design and training approach that can take a relatively small number (fewer than 200 samples) of ordered high dimensional, low sample data and organize many more unseen samples. To the best of our knowledge, OrderNet is the first neural network to address high dimensional, low sample data using techniques adopted from few-shot learning. We evaluate OrderNet on its ability to order images of analog clocks by time as well as images of profile pictures by age. Additionally, we demonstrate that OrderNet has superior performance over a conventional regression neural network in the low sample regime.
AB - Neural networks have shown remarkable classification performance in recent years, often outperforming humans in many tasks. Unfortunately, there are some tasks where conventional neural networks and their training methods have under-performed. One of these areas is learning from a small number of samples, which has been partially addressed by the development of few-shot neural networks. Few-shot neural networks contrast with traditional classification networks by learning classification tasks from datasets with only a few samples per class; however, these few-shot techniques are trained collectively on a large number of labeled samples. Hence, few-shot learning approaches only address the low-sample-per-class learning problem, whereas the task of learning strictly from a low sample size remains mostly unresolved. In this contribution, we address the challenge of learning from high dimensional, low sample data by recasting the problem as a data ordering task. Specifically, we have designed OrderNet, a novel network design and training approach that can take a relatively small number (fewer than 200 samples) of ordered high dimensional, low sample data and organize many more unseen samples. To the best of our knowledge, OrderNet is the first neural network to address high dimensional, low sample data using techniques adopted from few-shot learning. We evaluate OrderNet on its ability to order images of analog clocks by time as well as images of profile pictures by age. Additionally, we demonstrate that OrderNet has superior performance over a conventional regression neural network in the low sample regime.
UR - http://www.scopus.com/inward/record.url?scp=85116422092&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85116422092&partnerID=8YFLogxK
U2 - 10.1109/IJCNN52387.2021.9533766
DO - 10.1109/IJCNN52387.2021.9533766
M3 - Conference contribution
AN - SCOPUS:85116422092
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - IJCNN 2021 - International Joint Conference on Neural Networks, Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 18 July 2021 through 22 July 2021
ER -