Please use this identifier to cite or link to this item: http://hdl.handle.net/1893/54
Full metadata record
DC Field | Value | Language
dc.contributor.advisor | Hancock, Peter J B | -
dc.contributor.author | Frowd, Charlie David | -
dc.date.accessioned | 2005-11-14T13:06:04Z | -
dc.date.available | 2005-11-14T13:06:04Z | -
dc.date.issued | 2001 | -
dc.identifier.uri | http://hdl.handle.net/1893/54 | -
dc.description.abstract | This thesis details the development and evaluation of a new photofitting approach. The motivation for this work is that current photofit systems used by the police - whether manual or computerized - do not appear to work very well. Part of the problem is that these approaches involve a single facial representation that necessitates a verbal interaction. When multiple faces are presented instead, our innate ability to recognize faces is capitalized upon (and the potentially disruptive effect of the verbal component is reduced). The approach works by employing Genetic Algorithms to evolve a small group of faces to be more like a desired target. The main evolutionary influence is user input specifying the similarity of the presented images to the target under construction. The thesis follows three main phases of development. The first involves a simple system modelling the internal components of a face (eyes, eyebrows, nose and mouth), with the features held in a fixed relationship to each other. The second phase adds external facial features (hair and ears), along with an appropriate head shape and changes in the relationship between features. The underlying model is based on Principal Components Analysis, which captures the statistics of how faces vary in terms of shading, shape and the relationship between features. Modelling was carried out in this way to create more realistic-looking photofits and to guard against the implausible featural relationships that are possible with traditional approaches. The encouraging results of these two phases prompted the development of a full photofit system: EvoFIT. This software is shown to have continued promise both in the lab and in a real case. Future work is directed particularly at resolving issues concerning the anonymity of the database faces and the creation of photofits from the subject's memory of a target. | en
dc.format.extent | 7723910 bytes | -
dc.format.mimetype | application/pdf | -
dc.language.iso | en | -
dc.publisher | University of Stirling | en
dc.relation.hasversion | Frowd, C.D., Hancock, P.J.B., & Carson, D. (2004). EvoFIT: A holistic, evolutionary facial imaging technique for creating composites. ACM Transactions on Applied Perception (TAP), 1, 1-21. | en
dc.subject.lcsh | Face perception | en
dc.subject.lcsh | Witnesses | en
dc.subject.lcsh | Photomontage | en
dc.subject.lcsh | Face Physiology | en
dc.subject.other | Facial composite | en
dc.subject.other | Witness | en
dc.subject.other | Holistic | en
dc.subject.other | Crime | en
dc.subject.other | E-FIT | en
dc.subject.other | EvoFIT | en
dc.title | EvoFIT: A Holistic, Evolutionary Facial Imaging System | en
dc.type | Thesis or Dissertation | -
dc.contributor.sponsor | The Engineering and Physical Sciences Research Council | en
dc.type.qualificationlevel | Doctoral | -
dc.type.qualificationname | Doctor of Philosophy (PHD(R)) | -
dc.contributor.affiliation | School of Natural Sciences | -
dc.contributor.affiliation | Psychology | -
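
For illustration only, and not taken from the thesis or from EvoFIT itself: the abstract above describes evolving a small group of faces, encoded as Principal Component coefficients, with a Genetic Algorithm whose only fitness signal is the user's similarity ratings. The minimal Python sketch below renders that loop under stated assumptions; the population size, rating scale, blending scheme, mutation strength and function names are hypothetical choices, not details from the system.

```python
import numpy as np

# Hypothetical sketch of the evolutionary loop described in the abstract:
# each face is a vector of Principal Component coefficients, and the
# witness's similarity ratings serve as the fitness values. All parameter
# values below are illustrative assumptions.

RNG = np.random.default_rng(0)
N_COMPONENTS = 50    # length of a PCA coefficient vector (assumed)
POP_SIZE = 6         # size of the small group of faces shown at once (assumed)
MUTATION_SD = 0.3    # strength of Gaussian mutation (assumed)

def random_face():
    """Draw a face as standardised PCA coefficients."""
    return RNG.normal(0.0, 1.0, N_COMPONENTS)

def evolve_generation(population, ratings):
    """Breed the next group of faces: parents are selected in proportion
    to the user's ratings, their coefficient vectors are blended, and
    Gaussian mutation adds variation."""
    weights = np.asarray(ratings, dtype=float)
    weights = weights / weights.sum()
    children = []
    for _ in range(len(population)):
        i, j = RNG.choice(len(population), size=2, p=weights)
        mix = RNG.random(N_COMPONENTS)              # per-coefficient blend
        child = mix * population[i] + (1.0 - mix) * population[j]
        child += RNG.normal(0.0, MUTATION_SD, N_COMPONENTS)
        children.append(child)
    return children

# Usage: in a real session the ratings would come from the witness after
# viewing rendered faces; they are hard-coded here only so the sketch runs.
population = [random_face() for _ in range(POP_SIZE)]
ratings = [3, 7, 5, 9, 2, 6]
population = evolve_generation(population, ratings)
```

Reconstructing an actual face image from a coefficient vector would additionally require the PCA shading and shape model learned from the face database, which is beyond the scope of this sketch.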
Appears in Collections: Psychology eTheses

Files in This Item:
File | Description | Size | Format
CharlieFrowdThesisPlusCorrectionsDR.pdf |  | 7.54 MB | Adobe PDF


This item is protected by original copyright

Items in the Repository are protected by copyright, with all rights reserved, unless otherwise indicated.

The metadata of the records in the Repository are available under the CC0 public domain dedication (No Rights Reserved): https://creativecommons.org/publicdomain/zero/1.0/

If you believe that any material held in STORRE infringes copyright, please contact library@stir.ac.uk providing details and we will remove the Work from public display in STORRE and investigate your claim.