Shape and Texture Combined Face Recognition for Detection of Forged ID Documents

Daniel Sáez-Trigueros, Heinz Hertlein, Li Meng, Margaret Hartnett

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

This paper proposes a face recognition system that can be used to effectively match a face image scanned from an identity (ID) document against the face image stored in the biometric chip of such a document. The purpose of this specific face recognition algorithm is to aid the automatic detection of forged ID documents where the photograph printed on the document’s surface has been altered or replaced. The proposed algorithm uses a novel combination of texture and shape features together with sub-space representation techniques. In addition, the robustness of the proposed algorithm when dealing with more general face recognition tasks has been proven with the Good, the Bad & the Ugly (GBU) dataset, one of the most challenging datasets containing frontal faces. The proposed algorithm has been complemented with a novel method that adopts two operating points to enhance the reliability of the algorithm’s final verification decision.
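The abstract does not specify which texture and shape descriptors, sub-space method, or thresholds are used, so the sketch below is only an illustrative reading of the approach, not the authors' implementation. It assumes LBP histograms for texture, centred facial-landmark coordinates (an (n_points, 2) array) for shape, a PCA subspace with cosine similarity for matching, and two illustrative thresholds that realise a two-operating-point decision (accept / reject / refer for manual inspection). All names and values are hypothetical.

```python
import numpy as np


def lbp_histogram(gray, bins=256):
    """Texture descriptor: histogram of basic 8-neighbour LBP codes
    (a stand-in; the paper's exact texture features are not given here)."""
    g = gray.astype(np.int32)
    centre = g[1:-1, 1:-1]
    code = np.zeros_like(centre)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
        code |= (neighbour >= centre).astype(np.int32) << bit
    hist, _ = np.histogram(code, bins=bins, range=(0, bins))
    return hist / max(hist.sum(), 1)


def combined_feature(gray, landmarks):
    """Concatenate texture (LBP histogram) and shape (centred, unit-norm
    landmark coordinates) into a single feature vector."""
    shape = (landmarks - landmarks.mean(axis=0)).ravel()
    shape = shape / (np.linalg.norm(shape) + 1e-8)
    return np.concatenate([lbp_histogram(gray), shape])


class SubspaceMatcher:
    """PCA projection plus cosine similarity, standing in for the
    sub-space representation techniques mentioned in the abstract."""

    def __init__(self, n_components=64):
        self.n_components = n_components

    def fit(self, features):
        # features: (n_samples, n_dims) matrix of combined feature vectors
        self.mean_ = features.mean(axis=0)
        _, _, vt = np.linalg.svd(features - self.mean_, full_matrices=False)
        self.basis_ = vt[:self.n_components]
        return self

    def score(self, x, y):
        # Cosine similarity between the two projected feature vectors
        a = self.basis_ @ (x - self.mean_)
        b = self.basis_ @ (y - self.mean_)
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))


def verify(score, t_reject=0.35, t_accept=0.65):
    """Two operating points (illustrative thresholds): confident accept,
    confident reject, and an in-between band referred for inspection."""
    if score >= t_accept:
        return "accept"
    if score < t_reject:
        return "reject"
    return "refer"
```

In this reading, the second operating point creates a "refer" band so that borderline scores are escalated rather than decided automatically, which is one plausible way to interpret how two operating points could enhance the reliability of the final verification decision.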
Original language: English
Title of host publication: The 39th Intl. ICT (Information and Communication Technology) Convention MIPRO 2016
Place of publication: Croatia
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 1343-1348
ISBN (Print): 978-953-233-087-8
Publication status: Published - 28 Jul 2016
Event: The 39th International ICT Convention - MIPRO 2016 - Grand Hotel Adriatic Congress Centre and Hotel Admiral, Opatija, Croatia
Duration: 30 May 2016 - 1 Jun 2016
Internet address: http://mipro-proceedings.com/

Conference

Conference: The 39th International ICT Convention - MIPRO 2016
Country/Territory: Croatia
City: Opatija
Period: 30/05/16 - 1/06/16
