Subset selection for tuning of hyper-parameters in artificial neural networks
dc.authorid | 0000-0001-9033-8934 | |
dc.authorid | 0000-0003-0298-0690 | |
dc.contributor.author | Aki, K.K.Emre | en_US |
dc.contributor.author | Erkoç, Tuğba | en_US |
dc.contributor.author | Eskil, Mustafa Taner | en_US |
dc.date.accessioned | 2019-03-27T03:32:27Z | |
dc.date.available | 2019-03-27T03:32:27Z | |
dc.date.issued | 2017 | |
dc.department | Işık Üniversitesi, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü | en_US |
dc.department | Işık University, Faculty of Engineering, Department of Computer Engineering | en_US |
dc.description.abstract | Hyper-parameters of a machine learning architecture define its design. Tuning these hyper-parameters is costly and, for large data sets, outright impractical, whether performed manually or algorithmically. In this study we propose a Neocognitron-based method for reducing the training set to a small fraction while preserving the dynamics and complexity of the domain. Our approach does not require processing the entire training set, making it feasible for larger data sets. In our experiments we successfully reduced the MNIST training data set to less than 2.5% (1,489 images) by processing less than 10% of its 60K images. We showed that the reduced data set can be used for tuning the number of hidden neurons in a multi-layer perceptron. | en_US |
dc.description.version | Publisher's Version | en_US |
dc.identifier.citation | Aki, K. K. E., Erkoç, T. & Eskil, M. T. (2017). Subset selection for tuning of hyper-parameters in artificial neural networks. Paper presented at the 24th IEEE International Conference on Electronics, Circuits and Systems, 2018-January, 144-147. doi:10.1109/ICECS.2017.8292105 | en_US |
dc.identifier.doi | 10.1109/ICECS.2017.8292105 | |
dc.identifier.endpage | 147 | |
dc.identifier.isbn | 9781538619117 | |
dc.identifier.scopus | 2-s2.0-85047324387 | |
dc.identifier.scopusquality | N/A | |
dc.identifier.startpage | 144 | |
dc.identifier.uri | https://hdl.handle.net/11729/1511 | |
dc.identifier.uri | http://dx.doi.org/10.1109/ICECS.2017.8292105 | |
dc.identifier.volume | 2018-01 | |
dc.identifier.wos | WOS:000426974200034 | |
dc.identifier.wosquality | N/A | |
dc.indekslendigikaynak | Web of Science | en_US |
dc.indekslendigikaynak | Scopus | en_US |
dc.indekslendigikaynak | Conference Proceedings Citation Index – Science (CPCI-S) | en_US |
dc.institutionauthor | Aki, K.K.Emre | en_US |
dc.institutionauthor | Erkoç, Tuğba | en_US |
dc.institutionauthor | Eskil, Mustafa Taner | en_US |
dc.institutionauthorid | 0000-0001-9033-8934 | |
dc.institutionauthorid | 0000-0003-0298-0690 | |
dc.language.iso | en | en_US |
dc.peerreviewed | Yes | en_US |
dc.publicationstatus | Published | en_US |
dc.publisher | IEEE | en_US |
dc.relation.ispartof | 24th IEEE International Conference on Electronics, Circuits and Systems | en_US |
dc.relation.publicationcategory | Conference Object - International - Institutional Academic Staff | en_US |
dc.rights | info:eu-repo/semantics/closedAccess | en_US |
dc.subject | Optimization | en_US |
dc.subject | Neocognitron | en_US |
dc.subject | Recognition | en_US |
dc.subject | Learning systems | en_US |
dc.subject | Neural networks | en_US |
dc.subject | Tuning | en_US |
dc.subject | Hyper-parameter | en_US |
dc.subject | Large datasets | en_US |
dc.subject | Multi-layer perceptron | en_US |
dc.subject | Number of hidden neurons | en_US |
dc.subject | Reduced data | en_US |
dc.subject | Subset selection | en_US |
dc.subject | Training data sets | en_US |
dc.subject | Training sets | en_US |
dc.subject | Data reduction | en_US |
dc.subject | Convolution | en_US |
dc.subject | Convolutional neural | en_US |
dc.title | Subset selection for tuning of hyper-parameters in artificial neural networks | en_US |
dc.type | Conference Object | en_US |
dspace.entity.type | Publication |