Subset selection for tuning of hyper-parameters in artificial neural networks


Date

2017

Publisher

IEEE

Access Rights

info:eu-repo/semantics/closedAccess

Abstract

Hyper-parameters of a machine learning architecture define its design. Tuning of hyper-parameters is costly and, for large data sets, outright impractical, whether it is performed manually or algorithmically. In this study we propose a Neocognitron-based method for reducing the training set to a fraction of its size while preserving the dynamics and complexity of the domain. Our approach does not require processing the entire training set, making it feasible for larger data sets. In our experiments we successfully reduced the MNIST training data set to less than 2.5% (1,489 images) while processing less than 10% of the 60K images. We showed that the reduced data set can be used for tuning the number of hidden neurons in a multi-layer perceptron.
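The tuning use case from the abstract can be sketched as a grid search over hidden-layer widths evaluated only on a reduced subset. This is a minimal illustration, not the paper's method: the Neocognitron-based reduction is replaced by a random 5% subsample, the data is synthetic rather than MNIST, and the one-hidden-layer MLP and all sizes are placeholder choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a large training set (the paper uses MNIST).
X_full = rng.normal(size=(2000, 8))
y_full = (X_full[:, 0] + X_full[:, 1] > 0).astype(int)

# Subset selection: a random 5% subsample as a placeholder for the
# paper's Neocognitron-based reduction.
idx = rng.choice(len(X_full), size=100, replace=False)
X_sub, y_sub = X_full[idx], y_full[idx]

def train_eval(n_hidden, X, y, epochs=200, lr=0.1):
    """Train a one-hidden-layer sigmoid MLP; return training accuracy."""
    r = np.random.default_rng(42)
    W1 = r.normal(scale=0.5, size=(X.shape[1], n_hidden))
    W2 = r.normal(scale=0.5, size=(n_hidden, 1))
    for _ in range(epochs):
        h = 1 / (1 + np.exp(-X @ W1))            # hidden activations
        p = 1 / (1 + np.exp(-(h @ W2).ravel()))  # output probabilities
        g = (p - y)[:, None] / len(y)            # log-loss gradient at output
        gW2 = h.T @ g
        gW1 = X.T @ (g @ W2.T * h * (1 - h))
        W2 -= lr * gW2
        W1 -= lr * gW1
    h = 1 / (1 + np.exp(-X @ W1))
    p = 1 / (1 + np.exp(-(h @ W2).ravel()))
    return float(((p > 0.5).astype(int) == y).mean())

# Tune the hyper-parameter (number of hidden neurons) on the subset only.
candidates = [2, 4, 8, 16]
scores = {n: train_eval(n, X_sub, y_sub) for n in candidates}
best = max(scores, key=scores.get)
print(best, scores[best])
```

The selected width would then be used to train the final network on the full data set; the point of the paper is that the subset makes this search affordable.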

Keywords

Optimization, Neocognitron, Recognition, Learning systems, Neural networks, Tuning, Hyper-parameter, Large datasets, Multi layer perceptron, Number of hidden neurons, Reduced data, Subset selection, Training data sets, Training sets, Data reduction, Convolution, Convolutional neural

Source

24th IEEE International Conference on Electronics, Circuits and Systems

WoS Q Value

N/A

Scopus Q Value

N/A

Volume

2018-01

Citation

Aki, K. K. E., Erkoç, T. & Eskil, M. T. (2017). Subset selection for tuning of hyper-parameters in artificial neural networks. Paper presented at the 24th IEEE International Conference on Electronics, Circuits and Systems, 2018-January, 144-147. doi:10.1109/ICECS.2017.8292105