LYAPUNOV THEORY BASED ADAPTIVE LEARNING ALGORITHM FOR MULTILAYER NEURAL NETWORKS

dc.authorid: 0000-0003-1186-3058
dc.contributor.author: Acır, Nurettin
dc.contributor.author: Menguc, Engin Cemal
dc.date.accessioned: 2021-03-20T20:15:40Z
dc.date.available: 2021-03-20T20:15:40Z
dc.date.issued: 2014
dc.department: BTÜ, Faculty of Engineering and Natural Sciences, Department of Electrical and Electronics Engineering
dc.description: Menguc, Engin Cemal/0000-0002-0619-549X
dc.description.abstract: This paper presents a novel weight-updating algorithm for training multilayer neural networks (MLNNs). The MLNN system is first linearized, and the design procedure is then formulated as an inequality-constrained optimization problem. A well-chosen Lyapunov function is integrated into the constraint function to guarantee asymptotic stability in the sense of Lyapunov. The convergence of the training algorithm is thereby improved through a new analytical adaptation gain rate, which adjusts itself adaptively according to a sequential squared-error rate. The proposed algorithm is compared with two types of backpropagation algorithms and a Lyapunov-theory-based MLNN algorithm on three benchmark problems: XOR, 3-bit parity, and the 8-3 encoder. The results are compared in terms of the number of learning iterations and the computational time required for a specified convergence rate, and they clearly indicate that the proposed algorithm converges much faster than the other three. The proposed algorithm is also tested comparatively on a real iris image database for a multiple-input, multiple-output classification problem, where the effect of the adaptation gain rate on faster convergence and higher performance is verified.
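The abstract's central idea, an adaptation gain that rescales itself from the ratio of successive squared errors while a quadratic Lyapunov function drives the error downward, can be sketched for a single linear neuron. This is a minimal illustration under assumed rules, not the authors' MLNN algorithm: the function `lyapunov_adaptive_train`, the gain rule `eta = eta0 * min(1, e²/prev_e²)`, and the normalized update are all invented here for demonstration.

```python
# Illustrative sketch (NOT the paper's algorithm): a Lyapunov-style adaptive
# update for one linear neuron. With the candidate Lyapunov function
# V_k = e_k^2 / 2, the normalized step below makes each per-sample error
# shrink whenever 0 < eta < 2, and the gain eta adapts with the ratio of
# successive squared errors (a simplified stand-in for the paper's rule).
def lyapunov_adaptive_train(X, d, epochs=50, eta0=0.5, eps=1e-8):
    w = [0.0] * len(X[0])
    eta, prev_sq = eta0, None
    for _ in range(epochs):
        for x, target in zip(X, d):
            e = target - sum(wi * xi for wi, xi in zip(w, x))  # instantaneous error
            # Sequential squared-error ratio throttles the gain when the
            # error is already shrinking fast (assumed adaptation rule).
            if prev_sq is not None and prev_sq > eps:
                eta = eta0 * min(1.0, e * e / prev_sq)
            # Normalized-gradient step keeps V = e^2/2 non-increasing here.
            norm = sum(xi * xi for xi in x) + eps
            w = [wi + eta * e * xi / norm for wi, xi in zip(w, x)]
            prev_sq = e * e
    return w

# Usage: fit the AND function (a bias input plus two binary inputs).
X = [[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]]
d = [0.0, 0.0, 0.0, 1.0]
w = lyapunov_adaptive_train(X, d)
```

For a full MLNN the paper instead linearizes the network and derives the gain analytically via Lagrange multipliers under the Lyapunov constraint; the sketch above only conveys the flavor of an error-ratio-driven gain.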
dc.description.sponsorship: Nigde University; Omer Halis Demir University [FEB-2010/32]; Scientific Research Project Unit [FEB-2010/32]
dc.description.sponsorship: This study was partially supported by Nigde University, Scientific Research Project Unit, under project number FEB-2010/32. We would also like to thank the authorities of the Multimedia University MMU1 Iris Image Database.
dc.identifier.doi: 10.14311/NNW.2014.24.035
dc.identifier.endpage: 636
dc.identifier.issn: 1210-0552
dc.identifier.issue: 6
dc.identifier.scopusquality: Q4
dc.identifier.startpage: 619
dc.identifier.uri: http://doi.org/10.14311/NNW.2014.24.035
dc.identifier.uri: https://hdl.handle.net/20.500.12885/1221
dc.identifier.volume: 24
dc.identifier.wos: WOS:000348408100004
dc.identifier.wosquality: Q4
dc.indekslendigikaynak: Web of Science
dc.indekslendigikaynak: Scopus
dc.institutionauthor: Acır, Nurettin
dc.language.iso: en
dc.publisher: Acad Sciences Czech Republic, Inst Computer Science
dc.relation.ispartof: Neural Network World
dc.relation.publicationcategory: Article - International Refereed Journal - Institutional Faculty Member
dc.rights: info:eu-repo/semantics/openAccess
dc.subject: Lyapunov stability theory
dc.subject: multilayer neural network
dc.subject: Lagrange multiplier theory
dc.subject: adaptive learning
dc.title: LYAPUNOV THEORY BASED ADAPTIVE LEARNING ALGORITHM FOR MULTILAYER NEURAL NETWORKS
dc.type: Article

Files

Original bundle
Name: 2014_24_032.pdf
Size: 62.35 KB
Format: Adobe Portable Document Format
Description: Full Text