Please use this identifier to cite or link to this item:
http://hdl.handle.net/1893/32381
Appears in Collections: | Computing Science and Mathematics Journal Articles |
Peer Review Status: | Refereed |
Title: | A Novel Context-Aware Multimodal Framework for Persian Sentiment Analysis |
Author(s): | Dashtipour, Kia; Gogate, Mandar; Cambria, Erik; Hussain, Amir |
Contact Email: | kia.dashtipour@stir.ac.uk |
Keywords: | Multimodal Sentiment Analysis; Persian Sentiment Analysis |
Issue Date: | 7-Oct-2021 |
Date Deposited: | 8-Mar-2021 |
Citation: | Dashtipour K, Gogate M, Cambria E & Hussain A (2021) A Novel Context-Aware Multimodal Framework for Persian Sentiment Analysis. Neurocomputing, 457, pp. 377-388. https://doi.org/10.1016/j.neucom.2021.02.020 |
Abstract: | Most recent work on sentiment analysis has exploited the text modality. However, the millions of hours of video recordings posted on social media platforms every day hold vital unstructured information that can be exploited to gauge public perception more effectively. Multimodal sentiment analysis offers an innovative solution for computationally understanding and harvesting sentiments from videos by contextually exploiting audio, visual and textual cues. In this paper, we first present a first-of-its-kind Persian multimodal dataset comprising more than 800 utterances, as a benchmark resource for researchers to evaluate multimodal sentiment analysis approaches in the Persian language. Second, we present a novel context-aware multimodal sentiment analysis framework that simultaneously exploits acoustic, visual and textual cues to determine the expressed sentiment more accurately. We employ both decision-level (late) and feature-level (early) fusion methods to integrate affective cross-modal information. Experimental results demonstrate that the contextual integration of multimodal features (textual, acoustic and visual) delivers better performance (91.39%) than unimodal features (89.24%). |
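The abstract contrasts feature-level (early) and decision-level (late) fusion. As a minimal sketch of the distinction only (the feature dimensions, probabilities and averaging rule below are hypothetical illustrations, not the paper's actual architecture):

```python
import numpy as np

# Hypothetical per-utterance feature vectors for each modality
# (dimensions are arbitrary, chosen only for illustration).
rng = np.random.default_rng(0)
text_feat = rng.random(16)     # e.g. textual embedding
audio_feat = rng.random(8)     # e.g. acoustic descriptors
visual_feat = rng.random(12)   # e.g. facial-expression features

# Feature-level (early) fusion: concatenate the modality features
# into one vector, which a single classifier would then consume.
early_fused = np.concatenate([text_feat, audio_feat, visual_feat])
print(early_fused.shape)  # (36,)

# Decision-level (late) fusion: each modality has its own classifier;
# their output distributions are combined, here by simple averaging.
p_text = np.array([0.2, 0.8])    # [negative, positive] from a text model
p_audio = np.array([0.4, 0.6])   # from an audio model
p_visual = np.array([0.3, 0.7])  # from a visual model
late_fused = (p_text + p_audio + p_visual) / 3
sentiment = "positive" if late_fused[1] > late_fused[0] else "negative"
print(late_fused, sentiment)  # [0.3 0.7] positive
```

Early fusion lets one model learn cross-modal interactions directly, while late fusion keeps per-modality models independent and combines only their decisions; the paper evaluates both.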
DOI Link: | 10.1016/j.neucom.2021.02.020 |
Rights: | This item has been embargoed for a period. During the embargo please use the Request a Copy feature at the foot of the Repository record to request a copy directly from the author. You can only request a copy if you wish to use this work for your own research or private study. Accepted refereed manuscript of: Dashtipour K, Gogate M, Cambria E & Hussain A (2021) A Novel Context-Aware Multimodal Framework for Persian Sentiment Analysis. Neurocomputing, 457, pp. 377-388. https://doi.org/10.1016/j.neucom.2021.02.020 © 2021, Elsevier. Licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International http://creativecommons.org/licenses/by-nc-nd/4.0/ |
Licence URL(s): | http://creativecommons.org/licenses/by-nc-nd/4.0/ |
Files in This Item:
File | Description | Size | Format
---|---|---|---
Persian_MMD.pdf | Fulltext - Accepted Version | 3.24 MB | Adobe PDF
This item is protected by original copyright
A file in this item is licensed under a Creative Commons License
Items in the Repository are protected by copyright, with all rights reserved, unless otherwise indicated.
The metadata of the records in the Repository are available under the CC0 public domain dedication: No Rights Reserved https://creativecommons.org/publicdomain/zero/1.0/
If you believe that any material held in STORRE infringes copyright, please contact library@stir.ac.uk providing details and we will remove the Work from public display in STORRE and investigate your claim.