Please use this identifier to cite or link to this item: http://hdl.handle.net/1893/20574
Full metadata record
dc.contributor.author: Poria, Soujanya
dc.contributor.author: Gelbukh, Alexander
dc.contributor.author: Cambria, Erik
dc.contributor.author: Hussain, Amir
dc.contributor.author: Huang, Guang-Bin
dc.date.accessioned: 2014-12-15T23:11:38Z
dc.date.available: 2014-12-15T23:11:38Z
dc.date.issued: 2014-10
dc.identifier.uri: http://hdl.handle.net/1893/20574
dc.description.abstract: Emotions play a key role in natural language understanding and sensemaking. Pure machine learning usually fails to recognize and interpret emotions in text. The need for knowledge bases that give access to semantics and sentics (the conceptual and affective information) associated with natural language is growing exponentially in the context of big social data analysis. To this end, this paper proposes EmoSenticSpace, a new framework for affective common-sense reasoning that extends WordNet-Affect and SenticNet by providing both emotion labels and polarity scores for a large set of natural language concepts. The framework is built by means of fuzzy c-means clustering and support-vector-machine classification, and takes into account different similarity measures, such as point-wise mutual information and emotional affinity. EmoSenticSpace was tested on three emotion-related natural language processing tasks, namely sentiment analysis, emotion recognition, and personality detection. In all cases, the proposed framework outperforms the state of the art. In particular, the direct evaluation of EmoSenticSpace against the psychological features provided in the ISEAR dataset shows a 92.15% agreement. [en_UK]
dc.language.iso: en
dc.publisher: Elsevier
dc.relation: Poria S, Gelbukh A, Cambria E, Hussain A & Huang G (2014) EmoSenticSpace: A novel framework for affective common-sense reasoning, Knowledge-Based Systems, 69, pp. 108-123.
dc.rights: Published in Knowledge-Based Systems by Elsevier; Elsevier believes that individual authors should be able to distribute their AAMs for their personal voluntary needs and interests, e.g. posting to their websites or their institution’s repository, e-mailing to colleagues. However, our policies differ regarding the systematic aggregation or distribution of AAMs to ensure the sustainability of the journals to which AAMs are submitted. Therefore, deposit in, or posting to, subject-oriented or centralized repositories (such as PubMed Central), or institutional repositories with systematic posting mandates is permitted only under specific agreements between Elsevier and the repository, agency or institution, and only consistent with the publisher’s policies concerning such repositories. Voluntary posting of AAMs in the arXiv subject repository is permitted.
dc.subject: Sentic computing [en_UK]
dc.subject: opinion mining [en_UK]
dc.subject: sentiment analysis [en_UK]
dc.subject: emotion detection [en_UK]
dc.subject: personality detection [en_UK]
dc.subject: fuzzy clustering [en_UK]
dc.title: EmoSenticSpace: A novel framework for affective common-sense reasoning [en_UK]
dc.type: Journal Article [en_UK]
dc.identifier.doi: http://dx.doi.org/10.1016/j.knosys.2014.06.011
dc.citation.jtitle: Knowledge-Based Systems
dc.citation.issn: 0950-7051
dc.citation.volume: 69
dc.citation.spage: 108
dc.citation.epage: 123
dc.citation.publicationstatus: Published
dc.citation.peerreviewed: Refereed
dc.type.status: Post-print (author final draft post-refereeing)
dc.author.email: amir.hussain@stir.ac.uk
dc.contributor.affiliation: Nanyang Technological University
dc.contributor.affiliation: National Polytechnic Institute
dc.contributor.affiliation: Nanyang Technological University
dc.contributor.affiliation: Computing Science - CSM Dept
dc.contributor.affiliation: Nanyang Technological University
dc.identifier.isi: 000344131100011
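The abstract above names the framework's main technical ingredients: point-wise mutual information (PMI) as a similarity measure, fuzzy c-means clustering, and support-vector-machine classification. As a rough, self-contained sketch of how such pieces can fit together (not the paper's actual pipeline; the toy data, cluster count, feature layout, and labels below are all hypothetical), one might write:

```python
# Illustrative sketch only: toy versions of the building blocks named in the
# abstract (PMI similarity, fuzzy c-means clustering, SVM classification).
# Data, cluster count, and labels are invented, not taken from the paper.
import numpy as np
from sklearn.svm import SVC

def pmi(n_xy, n_x, n_y, n_total):
    """Point-wise mutual information of two concepts from co-occurrence counts."""
    return np.log2((n_xy / n_total) / ((n_x / n_total) * (n_y / n_total)))

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means; returns the soft membership matrix U (n x c)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)            # each row sums to 1
    for _ in range(iters):
        W = U ** m                               # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-9
        U = d ** (-2.0 / (m - 1.0))              # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return U

print(pmi(20, 50, 40, 1000))                     # PMI of one concept pair, ~3.32

# Toy per-concept feature vectors standing in for PMI/affinity similarities.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc, 0.3, (20, 4)) for loc in (-1.0, 0.0, 1.0)])
U = fuzzy_cmeans(X, c=6)                         # soft emotion-cluster memberships
y = np.repeat([0, 1, 2], 20)                     # hypothetical emotion labels
clf = SVC(kernel="rbf").fit(U, y)                # SVM over the fuzzy memberships
print(clf.score(U, y))
```

In the paper the features come from similarity measures between natural language concepts and the outputs are emotion labels and polarity scores; here random blobs merely stand in so the sketch runs end to end.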
Appears in Collections: Computing Science and Mathematics Journal Articles

Files in This Item:
File: emosenticspace.pdf (1.41 MB, Adobe PDF)

This item is protected by original copyright

Items in the Repository are protected by copyright, with all rights reserved, unless otherwise indicated.

If you believe that any material held in STORRE infringes copyright, please contact library@stir.ac.uk with details; we will remove the work from public display in STORRE and investigate your claim.