Subset selection for tuning of hyper-parameters in artificial neural networks

dc.authorid: 0000-0001-9033-8934
dc.authorid: 0000-0003-0298-0690
dc.contributor.author: Aki, K. K. Emre
dc.contributor.author: Erkoç, Tuğba
dc.contributor.author: Eskil, Mustafa Taner
dc.date.accessioned: 2019-03-27T03:32:27Z
dc.date.available: 2019-03-27T03:32:27Z
dc.date.issued: 2017
dc.department: Işık Üniversitesi, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü
dc.department: Işık University, Faculty of Engineering, Department of Computer Engineering
dc.description.abstract: Hyper-parameters of a machine learning architecture define its design. Tuning hyper-parameters is costly, and for large data sets outright impractical, whether performed manually or algorithmically. In this study we propose a Neocognitron-based method for reducing the training set to a fraction of its size while preserving the dynamics and complexity of the domain. Our approach does not require processing the entire training set, making it feasible for larger data sets. In our experiments we successfully reduced the MNIST training set to less than 2.5% (1,489 images) while processing less than 10% of its 60K images. We showed that the reduced data set can be used to tune the number of hidden neurons in a multi-layer perceptron.
dc.description.version: Publisher's Version
dc.identifier.citation: Aki, K. K. E., Erkoç, T. & Eskil, M. T. (2017). Subset selection for tuning of hyper-parameters in artificial neural networks. Paper presented at the 24th IEEE International Conference on Electronics, Circuits and Systems, 2018-January, 144-147. doi:10.1109/ICECS.2017.8292105
dc.identifier.doi: 10.1109/ICECS.2017.8292105
dc.identifier.endpage: 147
dc.identifier.isbn: 9781538619117
dc.identifier.scopus: 2-s2.0-85047324387
dc.identifier.scopusquality: N/A
dc.identifier.startpage: 144
dc.identifier.uri: https://hdl.handle.net/11729/1511
dc.identifier.uri: http://dx.doi.org/10.1109/ICECS.2017.8292105
dc.identifier.volume: 2018-01
dc.identifier.wos: WOS:000426974200034
dc.identifier.wosquality: N/A
dc.indekslendigikaynak: Web of Science
dc.indekslendigikaynak: Scopus
dc.indekslendigikaynak: Conference Proceedings Citation Index – Science (CPCI-S)
dc.institutionauthor: Aki, K. K. Emre
dc.institutionauthor: Erkoç, Tuğba
dc.institutionauthor: Eskil, Mustafa Taner
dc.institutionauthorid: 0000-0001-9033-8934
dc.institutionauthorid: 0000-0003-0298-0690
dc.language.iso: en
dc.peerreviewed: Yes
dc.publicationstatus: Published
dc.publisher: IEEE
dc.relation.ispartof: 24th IEEE International Conference on Electronics, Circuits and Systems
dc.relation.publicationcategory: Conference Item - International - Institutional Faculty Member
dc.rights: info:eu-repo/semantics/closedAccess
dc.subject: Optimization
dc.subject: Neocognitron
dc.subject: Recognition
dc.subject: Learning systems
dc.subject: Neural networks
dc.subject: Tuning
dc.subject: Hyper-parameter
dc.subject: Large datasets
dc.subject: Multi layer perceptron
dc.subject: Number of hidden neurons
dc.subject: Reduced data
dc.subject: Subset selection
dc.subject: Training data sets
dc.subject: Training sets
dc.subject: Data reduction
dc.subject: Convolution
dc.subject: Convolutional neural
dc.title: Subset selection for tuning of hyper-parameters in artificial neural networks
dc.type: Conference Object
dspace.entity.type: Publication
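The abstract's core idea, tuning a hyper-parameter on a small but representative subset instead of the full training set, can be illustrated with a generic sketch. Note the hedge: the greedy farthest-point selection and the toy scoring function below are illustrative stand-ins, not the paper's Neocognitron-based procedure, and all names here are hypothetical.

```python
# Illustrative sketch only (NOT the paper's Neocognitron method):
# greedy farthest-point selection picks a small, diverse subset of a
# data set, which then stands in for the full set when sweeping a
# hyper-parameter such as the number of hidden neurons in an MLP.
import math
import random


def farthest_point_subset(points, k):
    """Greedily pick k points that cover the data: each new pick is the
    point whose nearest already-chosen point is farthest away."""
    chosen = [points[0]]
    while len(chosen) < k:
        best, best_dist = None, -1.0
        for p in points:
            d = min(math.dist(p, c) for c in chosen)
            if d > best_dist:
                best, best_dist = p, d
        chosen.append(best)
    return chosen


random.seed(0)
data = [(random.random(), random.random()) for _ in range(200)]
subset = farthest_point_subset(data, 10)  # 5% of the data


def score(hidden_neurons, train_set):
    """Toy stand-in objective. In a real tuning loop this would be the
    validation accuracy of an MLP with `hidden_neurons` hidden units
    trained on `train_set` instead of the full data set."""
    return -abs(hidden_neurons - 32) - 0.01 * len(train_set)


# Sweep candidate hidden-layer sizes on the cheap subset.
best_h = max([8, 16, 32, 64, 128], key=lambda h: score(h, subset))
```

Because the subset is small, each candidate configuration is cheap to evaluate, which is the practical payoff the abstract claims for large data sets.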

Files

Original bundle
Showing 1 - 1 of 1
Name: 1511.pdf
Size: 210.12 KB
Format: Adobe Portable Document Format
Description: Publisher's Version

License bundle
Showing 1 - 1 of 1
Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed upon to submission
Description: