|Appears in Collections:||Computing Science and Mathematics Journal Articles|
|Peer Review Status:||Refereed|
|Title:||Multi-layered Echo State Machine: A Novel Architecture and Algorithm (Forthcoming/Available Online)|
|Author(s):||Malik, Zeeshan; Hussain, Amir; Wu, Qingming Jonathan|
|Keyword(s):||multiple layer network; time series neural network; Biological neural networks; Recurrent neural networks|
|Citation:||Malik Z, Hussain A & Wu QJ (2016) Multi-layered Echo State Machine: A Novel Architecture and Algorithm (Forthcoming/Available Online), IEEE Transactions on Cybernetics.|
|Abstract:||In this paper, we present a novel architecture and learning algorithm for a multilayered echo state machine (ML-ESM). Traditional echo state networks (ESNs) are a particular type of reservoir computing (RC) architecture. They constitute an effective approach to recurrent neural network (RNN) training: the (RNN-based) reservoir is generated randomly, and only the readout is trained using a simple, computationally efficient algorithm. ESNs have greatly facilitated the real-time application of RNNs, and have been shown to outperform classical approaches in a number of benchmark tasks. In this paper, we introduce a novel criterion for integrating multiple layers of reservoirs within the ML-ESM. The addition of multiple reservoir layers is shown to provide a more robust alternative to conventional RC networks. We demonstrate the comparative merits of this approach in a number of applications, considering both benchmark datasets and real-world applications.|
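The training scheme the abstract describes for a standard ESN (random fixed reservoir, only the linear readout trained) can be sketched as follows. This is a minimal illustration of the classical single-reservoir ESN, not the paper's ML-ESM; the reservoir size, spectral radius, washout length, and ridge parameter are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100

# Random input and reservoir weights; the reservoir is rescaled to a
# spectral radius below 1, a common sufficient condition for the echo
# state property. These weights stay fixed and are never trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u_seq):
    """Drive the reservoir with the input sequence, return the state matrix."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.arange(0, 30, 0.1)
u, y = np.sin(t[:-1]), np.sin(t[1:])

X = run_reservoir(u)
washout = 50  # discard initial transient states

# Only the readout is trained, here by ridge regression -- the "simple,
# computationally efficient algorithm" in place of full RNN backpropagation.
ridge = 1e-8
A = X[washout:]
W_out = np.linalg.solve(A.T @ A + ridge * np.eye(n_res), A.T @ y[washout:])

pred = A @ W_out
mse = np.mean((pred - y[washout:]) ** 2)
```

The ML-ESM of the paper extends this idea by integrating multiple such reservoir layers; the integration criterion itself is the paper's contribution and is not reproduced here.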
|Rights:||(c) 2016 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.|
|IEEE_Trans_Cybernetics_revised(accepted)-2016.pdf||5.04 MB||Adobe PDF||View/Open|
This item is protected by original copyright
Items in the Repository are protected by copyright, with all rights reserved, unless otherwise indicated.
If you believe that any material held in STORRE infringes copyright, please contact firstname.lastname@example.org providing details and we will remove the Work from public display in STORRE and investigate your claim.