Please use this identifier to cite or link to this item: http://hdl.handle.net/1893/24675
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Yang, Xi [en_UK]
dc.contributor.author: Huang, Kaizhu [en_UK]
dc.contributor.author: Zhang, Rui [en_UK]
dc.contributor.author: Hussain, Amir [en_UK]
dc.contributor.editor: Hirose, A [en_UK]
dc.contributor.editor: Ozawa, S [en_UK]
dc.contributor.editor: Doya, K [en_UK]
dc.contributor.editor: Ikeda, K [en_UK]
dc.contributor.editor: Lee, M [en_UK]
dc.contributor.editor: Liu, D [en_UK]
dc.date.accessioned: 2017-11-10T04:29:13Z
dc.date.available: 2017-11-10T04:29:13Z
dc.date.issued: 2016-09-29 [en_UK]
dc.identifier.uri: http://hdl.handle.net/1893/24675
dc.description.abstract: Non-negative Matrix Factorization (NMF) has been widely exploited to learn latent features from data. However, previous NMF models often assume a fixed number of features, say p features, where p is simply searched by experiments. Moreover, it is even difficult to learn binary features, since binary matrix factorization involves more challenging optimization problems. In this paper, we propose a new Bayesian model called the infinite non-negative binary matrix tri-factorization model (iNBMT), capable of automatically learning the latent binary features as well as the feature number, based on the Indian Buffet Process (IBP). Moreover, iNBMT engages a tri-factorization process that decomposes a non-negative matrix into the product of three components: two binary matrices and a non-negative real matrix. Compared with traditional bi-factorization, tri-factorization can better reveal the latent structures among items (samples) and attributes (features). Specifically, we impose an IBP prior on the two infinite binary matrices, while a truncated Gaussian distribution is assumed on the weight matrix. To optimize the model, we develop an efficient modified maximization-expectation algorithm (ME algorithm), whose iteration complexity is one order lower than that of the recently proposed Maximization-Expectation-IBP model [9]. We present the model definition, detail the optimization, and finally conduct a series of experiments. Experimental results demonstrate that the proposed iNBMT model significantly outperforms the comparison algorithms on both synthetic and real data. [en_UK]
dc.language.iso: en [en_UK]
dc.publisher: Springer [en_UK]
dc.relation: Yang X, Huang K, Zhang R & Hussain A (2016) Learning latent features with infinite non-negative binary matrix tri-factorization. In: Hirose A, Ozawa S, Doya K, Ikeda K, Lee M & Liu D (eds.) Neural Information Processing: 23rd International Conference, ICONIP 2016, Kyoto, Japan, October 16–21, 2016, Proceedings, Part I. Lecture Notes in Computer Science, 9947. ICONIP 2016: 23rd International Conference on Neural Information Processing, Kyoto, Japan, 16.10.2016-21.10.2016. Cham, Switzerland: Springer, pp. 587-596. https://doi.org/10.1007/978-3-319-46687-3_65 [en_UK]
dc.relation.ispartofseries: Lecture Notes in Computer Science, 9947 [en_UK]
dc.rights: Published in Neural Information Processing: 23rd International Conference, ICONIP 2016, Kyoto, Japan, October 16–21, 2016, Proceedings, Part I, ed. by Hirose A, Ozawa S, Doya K, Ikeda K, Lee M, Liu D, published by Springer. The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-319-46687-3_65 [en_UK]
dc.subject: Infinite non-negative binary matrix tri-factorization [en_UK]
dc.subject: Infinite latent feature model [en_UK]
dc.subject: Indian Buffet Process prior [en_UK]
dc.title: Learning latent features with infinite non-negative binary matrix tri-factorization [en_UK]
dc.type: Conference Paper [en_UK]
dc.rights.embargodate: 2016-12-14 [en_UK]
dc.identifier.doi: 10.1007/978-3-319-46687-3_65 [en_UK]
dc.citation.issn: 0302-9743 [en_UK]
dc.citation.spage: 587 [en_UK]
dc.citation.epage: 596 [en_UK]
dc.citation.publicationstatus: Published [en_UK]
dc.citation.peerreviewed: Refereed [en_UK]
dc.type.status: AM - Accepted Manuscript [en_UK]
dc.author.email: ahu@cs.stir.ac.uk [en_UK]
dc.citation.btitle: Neural Information Processing: 23rd International Conference, ICONIP 2016, Kyoto, Japan, October 16–21, 2016, Proceedings, Part I [en_UK]
dc.citation.conferencedates: 2016-10-16 - 2016-10-21 [en_UK]
dc.citation.conferencelocation: Kyoto, Japan [en_UK]
dc.citation.conferencename: ICONIP 2016: 23rd International Conference on Neural Information Processing [en_UK]
dc.citation.date: 30/09/2016 [en_UK]
dc.citation.isbn: 978-3-319-46686-6 [en_UK]
dc.citation.isbn: 978-3-319-46687-3 [en_UK]
dc.publisher.address: Cham, Switzerland [en_UK]
dc.contributor.affiliation: Xi’an Jiaotong University [en_UK]
dc.contributor.affiliation: Xi’an Jiaotong University [en_UK]
dc.contributor.affiliation: Xi’an Jiaotong University [en_UK]
dc.contributor.affiliation: Computing Science [en_UK]
dc.identifier.isi: WOS:000389805900065 [en_UK]
dc.identifier.scopusid: 2-s2.0-84992646563 [en_UK]
dc.identifier.wtid: 543089 [en_UK]
dc.contributor.orcid: 0000-0002-8080-082X [en_UK]
dc.date.accepted: 2016-07-10 [en_UK]
dcterms.dateAccepted: 2016-07-10 [en_UK]
dc.date.filedepositdate: 2016-12-14 [en_UK]
rioxxterms.apc: not required [en_UK]
rioxxterms.type: Conference Paper/Proceeding/Abstract [en_UK]
rioxxterms.version: AM [en_UK]
local.rioxx.author: Yang, Xi| [en_UK]
local.rioxx.author: Huang, Kaizhu| [en_UK]
local.rioxx.author: Zhang, Rui| [en_UK]
local.rioxx.author: Hussain, Amir|0000-0002-8080-082X [en_UK]
local.rioxx.project: Internal Project|University of Stirling|https://isni.org/isni/0000000122484331 [en_UK]
local.rioxx.contributor: Hirose, A| [en_UK]
local.rioxx.contributor: Ozawa, S| [en_UK]
local.rioxx.contributor: Doya, K| [en_UK]
local.rioxx.contributor: Ikeda, K| [en_UK]
local.rioxx.contributor: Lee, M| [en_UK]
local.rioxx.contributor: Liu, D| [en_UK]
local.rioxx.freetoreaddate: 2016-12-14 [en_UK]
local.rioxx.licence: http://www.rioxx.net/licenses/all-rights-reserved|2016-12-14| [en_UK]
local.rioxx.filename: paper488.pdf [en_UK]
local.rioxx.filecount: 1 [en_UK]
local.rioxx.source: 978-3-319-46687-3 [en_UK]
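The abstract describes a tri-factorization that decomposes a non-negative data matrix into the product of two binary matrices and a non-negative real weight matrix, X ≈ Z W Fᵀ. The NumPy snippet below is a minimal sketch of that factorization structure only; the variable names and dimensions are illustrative assumptions, and the paper's IBP-based inference and ME algorithm are not implemented here.

```python
import numpy as np

# Illustrative sketch of the tri-factorization structure X ≈ Z W F^T:
# Z (n x k) and F (d x k) are binary indicator matrices, and
# W (k x k) is a non-negative real weight matrix.
# k is fixed here; in the paper the number of latent features is
# inferred automatically via the Indian Buffet Process prior.
rng = np.random.default_rng(0)

n, d, k = 6, 5, 3                       # samples, attributes, latent features

Z = rng.integers(0, 2, size=(n, k))     # binary sample-feature assignments
F = rng.integers(0, 2, size=(d, k))     # binary attribute-feature assignments
W = np.abs(rng.normal(size=(k, k)))     # non-negative weights (the paper places
                                        # a truncated Gaussian prior on W)

X = Z @ W @ F.T                         # reconstructed non-negative data matrix

# The product of non-negative factors is itself non-negative.
assert X.shape == (n, d)
assert (X >= 0).all()
```

Because both outer factors are binary, each entry of X is a sum of weights shared by a sample's active features and an attribute's active features, which is what lets the tri-factorization expose latent structure among items and attributes simultaneously.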
Appears in Collections: Computing Science and Mathematics Conference Papers and Proceedings

Files in This Item:
File | Description | Size | Format
paper488.pdf | Fulltext - Accepted Version | 615.85 kB | Adobe PDF


This item is protected by original copyright



Items in the Repository are protected by copyright, with all rights reserved, unless otherwise indicated.

The metadata of the records in the Repository are available under the CC0 public domain dedication: No Rights Reserved https://creativecommons.org/publicdomain/zero/1.0/

If you believe that any material held in STORRE infringes copyright, please contact library@stir.ac.uk providing details and we will remove the Work from public display in STORRE and investigate your claim.