Please use this identifier to cite or link to this item: http://hdl.handle.net/1893/16515
Full metadata record
DC Field | Value | Language
dc.contributor.author | Cambria, Erik | en_UK
dc.contributor.author | Hupont, Isabelle | en_UK
dc.contributor.author | Hussain, Amir | en_UK
dc.contributor.author | Cerezo, Eva | en_UK
dc.contributor.author | Baldassarri, Sandra | en_UK
dc.contributor.editor | Esposito, A | en_UK
dc.contributor.editor | Esposito, AM | en_UK
dc.contributor.editor | Martone, R | en_UK
dc.contributor.editor | Muller, VC | en_UK
dc.contributor.editor | Scarpetta, G | en_UK
dc.date.accessioned | 2014-05-27T23:10:02Z | -
dc.date.available | 2014-05-27T23:10:02Z | en_UK
dc.date.issued | 2011 | en_UK
dc.identifier.uri | http://hdl.handle.net/1893/16515 | -
dc.description.abstract | The capability of perceiving and expressing emotions through different modalities is a key issue for the enhancement of human-computer interaction. In this paper we present a novel architecture for the development of intelligent multimodal affective interfaces. It is based on the integration of Sentic Computing, a new opinion mining and sentiment analysis paradigm based on AI and Semantic Web techniques, with a facial emotional classifier and Maxine, a powerful multimodal animation engine for managing virtual agents and 3D scenarios. One of the main distinguishing features of the system is that it does not simply perform emotional classification in terms of a set of discrete emotional labels but operates in a continuous 2D emotional space, enabling the integration of the different affective extraction modules in a simple and scalable way. | en_UK
dc.language.iso | en | en_UK
dc.publisher | Springer | en_UK
dc.relation | Cambria E, Hupont I, Hussain A, Cerezo E & Baldassarri S (2011) Sentic Avatar: Multimodal Affective Conversational Agent with Common Sense. In: Esposito A, Esposito A, Martone R, Muller V & Scarpetta G (eds.) Toward Autonomous, Adaptive, and Context-Aware Multimodal Interfaces. Theoretical and Practical Issues: Third COST 2102 International Training School, Caserta, Italy, March 15-19, 2010, Revised Selected Papers. Lecture Notes in Computer Science, 6456. Berlin Heidelberg: Springer, pp. 81-95. http://link.springer.com/chapter/10.1007%2F978-3-642-18184-9_8; https://doi.org/10.1007/978-3-642-18184-9_8 | en_UK
dc.relation.ispartofseries | Lecture Notes in Computer Science, 6456 | en_UK
dc.rights | The publisher does not allow this work to be made publicly available in this Repository. Please use the Request a Copy feature at the foot of the Repository record to request a copy directly from the author; you can only request a copy if you wish to use this work for your own research or private study. | en_UK
dc.rights.uri | http://www.rioxx.net/licenses/under-embargo-all-rights-reserved | en_UK
dc.subject | AI | en_UK
dc.subject | Sentic Computing | en_UK
dc.subject | NLP | en_UK
dc.subject | Facial Expression Analysis | en_UK
dc.subject | Sentiment Analysis | en_UK
dc.subject | Multimodal Affective HCI | en_UK
dc.subject | Conversational Agents | en_UK
dc.subject | Internet Moral and ethical aspects. | en_UK
dc.subject | Social Environment | en_UK
dc.subject | Biometric Identification | en_UK
dc.title | Sentic Avatar: Multimodal Affective Conversational Agent with Common Sense | en_UK
dc.type | Part of book or chapter of book | en_UK
dc.rights.embargodate | 3000-12-01 | en_UK
dc.rights.embargoreason | [Sentic Avatar.pdf] The publisher does not allow this work to be made publicly available in this Repository therefore there is an embargo on the full text of the work. | en_UK
dc.identifier.doi | 10.1007/978-3-642-18184-9_8 | en_UK
dc.citation.issn | 0302-9743 | en_UK
dc.citation.spage | 81 | en_UK
dc.citation.epage | 95 | en_UK
dc.citation.publicationstatus | Published | en_UK
dc.citation.peerreviewed | Refereed | en_UK
dc.type.status | VoR - Version of Record | en_UK
dc.identifier.url | http://link.springer.com/chapter/10.1007%2F978-3-642-18184-9_8 | en_UK
dc.author.email | ahu@cs.stir.ac.uk | en_UK
dc.citation.btitle | Toward Autonomous, Adaptive, and Context-Aware Multimodal Interfaces. Theoretical and Practical Issues: Third COST 2102 International Training School, Caserta, Italy, March 15-19, 2010, Revised Selected Papers | en_UK
dc.citation.isbn | 978-3-642-18183-2 | en_UK
dc.publisher.address | Berlin Heidelberg | en_UK
dc.contributor.affiliation | University of Stirling | en_UK
dc.contributor.affiliation | Aragon Institute of Technology | en_UK
dc.contributor.affiliation | Computing Science | en_UK
dc.contributor.affiliation | University of Zaragoza | en_UK
dc.contributor.affiliation | University of Zaragoza | en_UK
dc.identifier.wtid | 830023 | en_UK
dc.contributor.orcid | 0000-0002-8080-082X | en_UK
dcterms.dateAccepted | 2011-12-31 | en_UK
dc.date.filedepositdate | 2011-07-12 | en_UK
rioxxterms.type | Book chapter | en_UK
rioxxterms.version | VoR | en_UK
local.rioxx.author | Cambria, Erik| | en_UK
local.rioxx.author | Hupont, Isabelle| | en_UK
local.rioxx.author | Hussain, Amir|0000-0002-8080-082X | en_UK
local.rioxx.author | Cerezo, Eva| | en_UK
local.rioxx.author | Baldassarri, Sandra| | en_UK
local.rioxx.project | Internal Project|University of Stirling|https://isni.org/isni/0000000122484331 | en_UK
local.rioxx.contributor | Esposito, A| | en_UK
local.rioxx.contributor | Esposito, AM| | en_UK
local.rioxx.contributor | Martone, R| | en_UK
local.rioxx.contributor | Muller, VC| | en_UK
local.rioxx.contributor | Scarpetta, G| | en_UK
local.rioxx.freetoreaddate | 3000-12-01 | en_UK
local.rioxx.licence | http://www.rioxx.net/licenses/under-embargo-all-rights-reserved|| | en_UK
local.rioxx.filename | Sentic Avatar.pdf | en_UK
local.rioxx.filecount | 1 | en_UK
local.rioxx.source | 978-3-642-18183-2 | en_UK
Appears in Collections: Computing Science and Mathematics Book Chapters and Sections

Files in This Item:
File | Description | Size | Format
Sentic Avatar.pdf | Fulltext - Published Version | 5.1 MB | Adobe PDF (Under Embargo until 3000-12-01)


This item is protected by original copyright
Items in the Repository are protected by copyright, with all rights reserved, unless otherwise indicated.

The metadata of the records in the Repository is available under the CC0 public domain dedication (No Rights Reserved): https://creativecommons.org/publicdomain/zero/1.0/

If you believe that any material held in STORRE infringes copyright, please contact library@stir.ac.uk providing details, and we will remove the Work from public display in STORRE and investigate your claim.