http://hdl.handle.net/1893/28063
Appears in Collections: Computing Science and Mathematics Conference Papers and Proceedings
Author(s): Walber, Tina; Neuhaus, Chantal; Scherp, Ansgar
Contact Email: ansgar.scherp@stir.ac.uk
Title: Tagging-by-search: Automatic image region labeling using gaze information obtained from image search
Citation: Walber T, Neuhaus C & Scherp A (2014) Tagging-by-search: Automatic image region labeling using gaze information obtained from image search. In: Proceedings of the 19th International Conference on Intelligent User Interfaces (IUI '14), Haifa, Israel, 24.02.2014-27.02.2014. New York: ACM, pp. 257-266. https://doi.org/10.1145/2557500.2557517
Issue Date: 31-Dec-2014
Date Deposited: 22-Oct-2018
Conference Name: 19th International Conference on Intelligent User Interfaces (IUI '14)
Conference Dates: 2014-02-24 - 2014-02-27
Conference Location: Haifa, Israel
Abstract: Labelled image regions provide valuable information that can be used in different settings such as image search. The manual creation of region labels is a tedious task, while fully automatic approaches fail to understand image content sufficiently due to the huge variety of depicted objects. Our approach benefits from the expected spread of eye-tracking hardware and uses gaze information obtained from users performing image search tasks to automatically label image regions. This allows us to exploit human capabilities in the visual perception of image content while users perform daily routine tasks. In an experiment with 23 participants, we show that it is possible to assign search terms to photo regions by means of gaze analysis, with an average precision of 0.56 and an average F-measure of 0.38 over 361 photos. The participants performed different search tasks while their gaze was recorded. The results of the experiment show that the gaze-based approach performs significantly better than a baseline approach based on saliency maps.
Status: VoR - Version of Record
Rights: The publisher does not allow this work to be made publicly available in this Repository. Please use the Request a Copy feature at the foot of the Repository record to request a copy directly from the author. You can only request a copy if you wish to use this work for your own research or private study.
Licence URL(s): http://www.rioxx.net/licenses/under-embargo-all-rights-reserved
| File | Description | Size | Format | Access |
| --- | --- | --- | --- | --- |
| Walber et al 2014.pdf | Fulltext - Published Version | 2.67 MB | Adobe PDF | Under Permanent Embargo (Request a copy) |
This item is protected by original copyright.
Items in the Repository are protected by copyright, with all rights reserved, unless otherwise indicated.
The metadata of the records in the Repository are available under the CC0 public domain dedication: No Rights Reserved https://creativecommons.org/publicdomain/zero/1.0/
If you believe that any material held in STORRE infringes copyright, please contact library@stir.ac.uk providing details and we will remove the Work from public display in STORRE and investigate your claim.