Please use this identifier to cite or link to this item: http://hdl.handle.net/1893/24030
Full metadata record
DC Field | Value | Language
dc.contributor.author | Ofek, Nir | en_UK
dc.contributor.author | Poria, Soujanya | en_UK
dc.contributor.author | Rokach, Lior | en_UK
dc.contributor.author | Cambria, Erik | en_UK
dc.contributor.author | Hussain, Amir | en_UK
dc.contributor.author | Shabtai, Asaf | en_UK
dc.date.accessioned | 2016-08-16T02:15:07Z | -
dc.date.available | 2016-08-16T02:15:07Z | -
dc.date.issued | 2016-06 | en_UK
dc.identifier.uri | http://hdl.handle.net/1893/24030 | -
dc.description.abstract | Sentiment analysis in natural language text is a challenging task that requires a deep understanding of both syntax and semantics. Leveraging the polarity of multiword expressions, or concepts, rather than single words can mitigate this difficulty, as such expressions carry more contextual information than isolated words. This contextual information is key to understanding both the syntactic and semantic structure of natural language text and is therefore useful in tasks such as sentiment analysis. In this work, we propose a new method to enrich SenticNet (a publicly available knowledge base for concept-level sentiment analysis) with domain-level concepts composed of aspect and sentiment word pairs, together with a measure of their polarity. We process a set of unlabeled texts and, using statistical co-occurrence information, generate a directed acyclic graph (DAG) of concepts. The polarity scores of known concepts are propagated through this graph and used to compute polarity scores for new concepts (an illustrative sketch of this propagation step follows the metadata table below). Because the algorithm is exhaustive by design, a seed set containing only two sentiment words (good and bad) suffices. In an evaluation conducted on a dataset of hotel reviews, SenticNet was enriched by a factor of three (from 30,000 to nearly 90,000 concepts). The experiments demonstrate the merit of the discovered concepts for improving sentence-level and aspect-level sentiment analysis. A two-factor ANOVA test confirmed, at the 95% confidence level, that these improvements are statistically significant. | en_UK
dc.language.iso | en | en_UK
dc.publisher | Springer | en_UK
dc.relation | Ofek N, Poria S, Rokach L, Cambria E, Hussain A & Shabtai A (2016) Unsupervised Commonsense Knowledge Enrichment for Domain-Specific Sentiment Analysis. Cognitive Computation, 8 (3), pp. 467-477. http://link.springer.com/article/10.1007/s12559-015-9375-3; https://doi.org/10.1007/s12559-015-9375-3 | en_UK
dc.rights | The publisher does not allow this work to be made publicly available in this Repository. Please use the Request a Copy feature at the foot of the Repository record to request a copy directly from the author. You can only request a copy if you wish to use this work for your own research or private study. | en_UK
dc.rights.uri | http://www.rioxx.net/licenses/under-embargo-all-rights-reserved | en_UK
dc.subject | Sentiment analysis | en_UK
dc.subject | Sentiment lexicon | en_UK
dc.subject | SenticNet | en_UK
dc.subject | Sentic patterns | en_UK
dc.title | Unsupervised Commonsense Knowledge Enrichment for Domain-Specific Sentiment Analysis | en_UK
dc.type | Journal Article | en_UK
dc.rights.embargodate | 2999-12-13 | en_UK
dc.rights.embargoreason | [CogComp-paper-PDF-June2016.pdf] The publisher does not allow this work to be made publicly available in this Repository, therefore there is an embargo on the full text of the work. | en_UK
dc.identifier.doi | 10.1007/s12559-015-9375-3 | en_UK
dc.citation.jtitle | Cognitive Computation | en_UK
dc.citation.issn | 1866-9964 | en_UK
dc.citation.issn | 1866-9956 | en_UK
dc.citation.volume | 8 | en_UK
dc.citation.issue | 3 | en_UK
dc.citation.spage | 467 | en_UK
dc.citation.epage | 477 | en_UK
dc.citation.publicationstatus | Published | en_UK
dc.citation.peerreviewed | Refereed | en_UK
dc.type.status | VoR - Version of Record | en_UK
dc.contributor.funder | The Royal Society of Edinburgh | en_UK
dc.identifier.url | http://link.springer.com/article/10.1007/s12559-015-9375-3 | en_UK
dc.author.email | ahu@cs.stir.ac.uk | en_UK
dc.citation.date | 12/02/2016 | en_UK
dc.contributor.affiliation | Ben-Gurion University of the Negev | en_UK
dc.contributor.affiliation | University of Stirling | en_UK
dc.contributor.affiliation | Ben-Gurion University of the Negev | en_UK
dc.contributor.affiliation | Nanyang Technological University | en_UK
dc.contributor.affiliation | Computing Science | en_UK
dc.contributor.affiliation | Ben-Gurion University of the Negev | en_UK
dc.identifier.isi | WOS:000376284900007 | en_UK
dc.identifier.scopusid | 2-s2.0-84957999058 | en_UK
dc.identifier.wtid | 554157 | en_UK
dc.contributor.orcid | 0000-0002-8080-082X | en_UK
dc.date.accepted | 2015-12-07 | en_UK
dcterms.dateAccepted | 2015-12-07 | en_UK
dc.date.filedepositdate | 2016-08-15 | en_UK
dc.relation.funderproject | Cognitive SenticNet and Multimodal Topic Structure Parsing Techniques for Both Chinese and English Languages | en_UK
dc.relation.funderref | ABEL/NNS/INT | en_UK
rioxxterms.apc | not required | en_UK
rioxxterms.type | Journal Article/Review | en_UK
rioxxterms.version | VoR | en_UK
local.rioxx.author | Ofek, Nir| | en_UK
local.rioxx.author | Poria, Soujanya| | en_UK
local.rioxx.author | Rokach, Lior| | en_UK
local.rioxx.author | Cambria, Erik| | en_UK
local.rioxx.author | Hussain, Amir|0000-0002-8080-082X | en_UK
local.rioxx.author | Shabtai, Asaf| | en_UK
local.rioxx.project | ABEL/NNS/INT|The Royal Society of Edinburgh| | en_UK
local.rioxx.freetoreaddate | 2999-12-13 | en_UK
local.rioxx.licence | http://www.rioxx.net/licenses/under-embargo-all-rights-reserved|| | en_UK
local.rioxx.filename | CogComp-paper-PDF-June2016.pdf | en_UK
local.rioxx.filecount | 1 | en_UK
local.rioxx.source | 1866-9956 | en_UK
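
Illustrative sketch of the polarity-propagation step described in the abstract. The record does not specify how the concept DAG is built or how co-occurrence weights and scores are combined, so the weighted-average scheme, the function name, and the hotel-review edges below are assumptions for illustration only, not the authors' implementation.

# Illustrative sketch only: graph construction, weighting, and score combination
# are assumptions; the paper's exact rules are not given in this record.
from collections import defaultdict

def propagate_polarity(edges, seed_scores):
    """Propagate polarity scores through a concept DAG.

    edges: iterable of (known_concept, new_concept, cooccurrence_weight) tuples.
    seed_scores: polarity of the seed concepts, e.g. {"good": 1.0, "bad": -1.0}.
    Returns a dict of polarity scores for every concept that becomes reachable.
    """
    # Group incoming edges by target concept.
    incoming = defaultdict(list)
    for src, dst, weight in edges:
        incoming[dst].append((src, weight))

    scores = dict(seed_scores)
    # Score any concept whose predecessors are all scored; on a DAG this
    # loop terminates (an explicit topological sort would do the same job).
    changed = True
    while changed:
        changed = False
        for dst, preds in incoming.items():
            if dst in scores:
                continue
            if all(src in scores for src, _ in preds):
                total_weight = sum(w for _, w in preds)
                # Co-occurrence-weighted average of predecessor polarities (assumed scheme).
                scores[dst] = sum(scores[src] * w for src, w in preds) / total_weight
                changed = True
    return scores

if __name__ == "__main__":
    # Hypothetical hotel-review concepts and co-occurrence weights.
    edges = [
        ("good", "clean room", 3.0),
        ("good", "friendly staff", 5.0),
        ("bad", "noisy room", 4.0),
        ("clean room", "spotless bathroom", 2.0),
    ]
    print(propagate_polarity(edges, {"good": 1.0, "bad": -1.0}))

Starting from only the two seeds, each pass scores the concepts one hop further from the seed set, which mirrors how a seed lexicon of two words can grow into a much larger domain-specific concept lexicon.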
Appears in Collections: Computing Science and Mathematics Journal Articles

Files in This Item:
File | Description | Size | Format | Access
CogComp-paper-PDF-June2016.pdf | Fulltext - Published Version | 567.37 kB | Adobe PDF | Under Embargo until 2999-12-13 (Request a copy)

