Adaptive locally connected recurrent unit (ALCRU)
dc.authorid | 0000-0003-3903-7356 | |
dc.authorid | 0000-0002-8649-6013 | |
dc.contributor.author | Özçelik, Şuayb Talha | en_US |
dc.contributor.author | Tek, Faik Boray | en_US |
dc.date.accessioned | 2025-09-15T07:00:04Z | |
dc.date.available | 2025-09-15T07:00:04Z | |
dc.date.issued | 2025-07-03 | |
dc.department | Işık Üniversitesi, Lisansüstü Eğitim Enstitüsü, Bilgisayar Mühendisliği Doktora Programı | en_US |
dc.department | Işık University, School of Graduate Studies, Ph.D. in Computer Engineering | en_US |
dc.description.abstract | Research has shown that adaptive locally connected neurons outperform their fully connected (dense) counterparts, motivating this study on the development of the Adaptive Locally Connected Recurrent Unit (ALCRU). ALCRU modifies the Simple Recurrent Neuron Model (SimpleRNN) by incorporating spatial coordinate spaces for input and hidden state vectors, facilitating the learning of parametric local receptive fields. These modifications add four trainable parameters per neuron, resulting in a minor increase in computational complexity. ALCRU is implemented using standard frameworks and trained with back-propagation-based optimizers. We evaluate the performance of ALCRU using diverse benchmark datasets, including IMDb for sentiment analysis, AdditionRNN for sequence modelling, and the Weather dataset for time-series forecasting. Results show that ALCRU achieves accuracy and loss metrics comparable to GRU and LSTM while consistently outperforming SimpleRNN. In particular, experiments with longer sequence lengths on AdditionRNN and increased input dimensions on IMDb highlight ALCRU’s superior scalability and efficiency in processing complex data sequences. In terms of computational efficiency, ALCRU demonstrates a considerable speed advantage over gated models like LSTM and GRU, though it is slower than SimpleRNN. These findings suggest that adaptive local connectivity enhances both the accuracy and efficiency of recurrent neural networks, offering a promising alternative to standard architectures. | en_US |
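The abstract describes augmenting a simple recurrent cell with spatial coordinate spaces over the input and hidden vectors so each neuron learns a parametric local receptive field, at a cost of four trainable parameters per neuron. The paper's exact parameterization is not given here, so the following is a minimal NumPy sketch under one common assumption from the related "focusing neuron" literature: each neuron carries a Gaussian aperture (a center `mu` and width `sigma`) over each coordinate space, which soft-masks its dense weight rows. The class name `ALCRUCellSketch` and all parameter names are illustrative, not the authors' implementation.

```python
import numpy as np

def gaussian_aperture(coords, mu, sigma):
    """Soft local mask: one Gaussian window per neuron over a 1-D coordinate space.

    coords: (n_src,) positions of source units in [0, 1]
    mu, sigma: (n_neurons,) per-neuron aperture center and width
    returns: (n_neurons, n_src) mask in (0, 1]
    """
    d = coords[None, :] - mu[:, None]
    return np.exp(-0.5 * (d / sigma[:, None]) ** 2)

class ALCRUCellSketch:
    """Hypothetical sketch of an adaptive locally connected recurrent cell.

    Four extra trainable scalars per neuron, matching the abstract's count:
    (mu_x, sigma_x) over the input coordinate space and
    (mu_h, sigma_h) over the hidden coordinate space.
    """
    def __init__(self, n_in, n_hid, seed=0):
        rng = np.random.default_rng(seed)
        self.W_x = rng.normal(0.0, 0.1, (n_hid, n_in))
        self.W_h = rng.normal(0.0, 0.1, (n_hid, n_hid))
        self.b = np.zeros(n_hid)
        # Fixed, normalized coordinate spaces for input and hidden units.
        self.cx = np.linspace(0.0, 1.0, n_in)
        self.ch = np.linspace(0.0, 1.0, n_hid)
        # Trainable aperture parameters (initialized to spread-out local fields).
        self.mu_x = np.linspace(0.0, 1.0, n_hid)
        self.sigma_x = np.full(n_hid, 0.25)
        self.mu_h = np.linspace(0.0, 1.0, n_hid)
        self.sigma_h = np.full(n_hid, 0.25)

    def step(self, x, h):
        """One recurrent step: masked dense weights emulate learned local connectivity."""
        mx = gaussian_aperture(self.cx, self.mu_x, self.sigma_x)
        mh = gaussian_aperture(self.ch, self.mu_h, self.sigma_h)
        pre = (self.W_x * mx) @ x + (self.W_h * mh) @ h + self.b
        return np.tanh(pre)
```

Because the masks stay differentiable in `mu` and `sigma`, such a cell trains with ordinary backpropagation-based optimizers, consistent with the abstract's claim of only a minor overhead relative to SimpleRNN.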
dc.description.version | Publisher's Version | en_US |
dc.identifier.citation | Özçelik, Ş. T. & Tek, F. B. (2025). Adaptive locally connected recurrent unit (ALCRU). International Journal of Machine Learning and Cybernetics, 16(9), 6903-6914. https://doi.org/10.1007/s13042-025-02652-7 | en_US |
dc.identifier.doi | 10.1007/s13042-025-02652-7 | |
dc.identifier.endpage | 6914 | |
dc.identifier.issn | 1868-8071 | |
dc.identifier.issn | 1868-808X | |
dc.identifier.issue | 9 | |
dc.identifier.scopus | 2-s2.0-105009623173 | |
dc.identifier.scopusquality | Q1 | |
dc.identifier.startpage | 6903 | |
dc.identifier.uri | https://hdl.handle.net/11729/6700 | |
dc.identifier.uri | https://doi.org/10.1007/s13042-025-02652-7 | |
dc.identifier.volume | 16 | |
dc.identifier.wos | WOS:001522431100001 | |
dc.identifier.wosquality | Q3 | |
dc.indekslendigikaynak | Scopus | en_US |
dc.indekslendigikaynak | Web of Science | en_US |
dc.indekslendigikaynak | Science Citation Index Expanded (SCI-EXPANDED) | en_US |
dc.institutionauthor | Özçelik, Şuayb Talha | en_US |
dc.institutionauthorid | 0000-0003-3903-7356 | |
dc.language.iso | en | en_US |
dc.peerreviewed | Yes | en_US |
dc.publicationstatus | Published | en_US |
dc.publisher | Springer Science and Business Media Deutschland GmbH | en_US |
dc.relation.ispartof | International Journal of Machine Learning and Cybernetics | en_US |
dc.relation.publicationcategory | Article - International Peer-Reviewed Journal - Student | en_US |
dc.rights | info:eu-repo/semantics/openAccess | en_US |
dc.subject | Adaptive | en_US |
dc.subject | Focusing neuron | en_US |
dc.subject | Receptive field | en_US |
dc.subject | RNN | en_US |
dc.subject | Backpropagation | en_US |
dc.subject | Benchmarking | en_US |
dc.subject | Complex networks | en_US |
dc.subject | Data accuracy | en_US |
dc.subject | Data handling | en_US |
dc.subject | Long short-term memory | en_US |
dc.subject | Network architecture | en_US |
dc.subject | Neurons | en_US |
dc.subject | Sentiment analysis | en_US |
dc.subject | Time series analysis | en_US |
dc.subject | Coordinate space | en_US |
dc.subject | Hidden state | en_US |
dc.subject | Input state | en_US |
dc.subject | Neuron modeling | en_US |
dc.subject | Receptive fields | en_US |
dc.subject | Simple++ | en_US |
dc.subject | Spatial coordinates | en_US |
dc.subject | Computational efficiency | en_US |
dc.subject | Neural-networks | en_US |
dc.subject | Model | en_US |
dc.subject | LSTM | en_US |
dc.title | Adaptive locally connected recurrent unit (ALCRU) | en_US |
dc.type | Article | en_US |
dspace.entity.type | Publication | en_US |