Appears in Collections:Computing Science and Mathematics Journal Articles
Title: Deep and sparse learning in speech and language processing: An overview
Author(s): Wang, Dong
Zhou, Qiang
Hussain, Amir
Keywords: Deep learning
Sparse coding
Speech processing
Language processing
Issue Date: 2016
Citation: Wang D, Zhou Q & Hussain A (2016) Deep and sparse learning in speech and language processing: An overview. In: Liu C, Hussain A, Luo B, Tan K, Zeng Y & Zhang Z (eds.) Advances in Brain Inspired Cognitive Systems. BICS 2016. Lecture Notes in Computer Science, 10023. BICS 2016: 8th International Conference on Brain-Inspired Cognitive Systems, Beijing, China, 28.11.2016-30.11.2016. Cham, Switzerland: Springer, pp. 171-183.
Series/Report no.: Lecture Notes in Computer Science, 10023
Abstract: Large-scale deep neural models, e.g., deep neural networks (DNN) and recurrent neural networks (RNN), have demonstrated significant success in solving various challenging tasks of speech and language processing (SLP), including speech recognition, speech synthesis, document classification and question answering. This growing impact corroborates the neurobiological evidence concerning the presence of layer-wise deep processing in the human brain. On the other hand, sparse coding representation has also gained similar success in SLP, particularly in signal processing, demonstrating sparsity as another important neurobiological characteristic. Recently, research in these two directions has led to increasing cross-fertilisation of ideas; thus, a unified Sparse Deep or Deep Sparse learning framework warrants much attention. This paper aims to provide an overview of growing interest in this unified framework, and also outlines future research possibilities in this multi-disciplinary area.
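To make the "sparse coding" half of the abstract concrete, the following is a minimal, self-contained sketch (not taken from the paper itself) of sparse coding via ISTA, the classic iterative soft-thresholding algorithm: given a dictionary D and a signal x, it finds a sparse code z approximately minimising ||x - Dz||^2 + lam * ||z||_1. All names (`soft_threshold`, `ista`, the toy dictionary) are illustrative assumptions, not from the original work.

```python
import numpy as np

def soft_threshold(v, t):
    """Element-wise soft-thresholding: the proximal operator of the L1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(D, x, lam=0.1, n_iter=200):
    """Recover a sparse code z such that x is approximately D @ z (ISTA)."""
    L = np.linalg.norm(D, 2) ** 2        # Lipschitz constant of the gradient step
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ z - x)         # gradient of the quadratic data term
        z = soft_threshold(z - grad / L, lam / L)
    return z

# Toy example: a random unit-norm dictionary and a signal built from 3 atoms.
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)
z_true = np.zeros(128)
z_true[[5, 40, 99]] = [1.5, -2.0, 1.0]
x = D @ z_true

z_hat = ista(D, x, lam=0.05, n_iter=500)
print("active atoms:", np.count_nonzero(np.abs(z_hat) > 1e-3))
print("reconstruction error:", np.linalg.norm(D @ z_hat - x))
```

The recovered code is sparse (far fewer active atoms than the 128 dictionary columns) while reconstructing the signal closely, which is the representational property the survey connects to deep architectures.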
DOI Link: 10.1007/978-3-319-49685-6_16
Rights: The publisher does not allow this work to be made publicly available in this Repository. Please use the Request a Copy feature at the foot of the Repository record to request a copy directly from the author. You can only request a copy if you wish to use this work for your own research or private study.

Files in This Item:
File: Wang_etal_LNCS_2016.pdf
Description: Fulltext - Published Version
Size: 401.36 kB
Format: Adobe PDF
Embargo: Under Embargo until 3000-10-14

Note: If any of the files in this item are currently embargoed, you can request a copy directly from the author by clicking the padlock icon above. However, this facility is dependent on the depositor still being contactable at their original email address.

This item is protected by original copyright

Items in the Repository are protected by copyright, with all rights reserved, unless otherwise indicated.

If you believe that any material held in STORRE infringes copyright, please contact us, providing details, and we will remove the Work from public display in STORRE and investigate your claim.