FaceMatch: visual search of missing persons during a disaster event.

Borovikov E, Vajda S, Lingappa G, Candemir S, Antani SK, Gill MJ, Thoma GR
NIH Intramural Research Festival, Bethesda MD, November 6-8, 2013.
Abstract: 

We report on our FaceMatch research and development, which aims to provide robust near-duplicate image detection and face localization/matching on digital photos of variable quality, as an integral part of PEOPLE LOCATOR (PL), a Web-based system developed by NLM for family reunification during natural or man-made disasters. PL collects photos and brief text metadata (name, age, etc.) of missing or found persons. The currently supported text queries may be insufficient because the text data are often incomplete or inconsistent; adding an image search capability can significantly improve the user experience. Face localization is performed by a gray-scale face detector enhanced with skin-tone and landmark cues, which is more accurate than many open-source and commercial detectors. Face matching uses an ensemble of image descriptors (HAAR, LBPH, SIFT, SURF, ORB) combined by a smart re-ranking procedure. We describe the integration of our face matching system with PL and report on its performance. Unlike other face recognition systems, which often have many well-illuminated, good-quality sample images for each person, ours can handle the lack of training examples for individual faces, since such samples are unlikely in a disaster setting.
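The abstract mentions combining an ensemble of descriptors (HAAR, LBPH, SIFT, SURF, ORB) through a "smart re-ranking procedure" but does not specify it. As a minimal, hypothetical illustration of the general idea, the sketch below fuses the per-descriptor candidate rankings by mean rank; the function name `fuse_rankings` and the candidate IDs are invented for this example, and this is not FaceMatch's actual algorithm.

```python
# Hedged sketch: generic mean-rank fusion of per-descriptor rankings.
# This is an illustrative stand-in, not the re-ranking procedure used
# by FaceMatch, which the abstract does not detail.

def fuse_rankings(rank_lists):
    """Combine per-descriptor candidate rankings into one ranking.

    rank_lists: list of lists, each an ordering of candidate IDs from
    best match to worst, one list per descriptor (e.g. SIFT, ORB, ...).
    Returns candidate IDs sorted by their mean rank across descriptors.
    """
    totals = {}
    counts = {}
    for ranking in rank_lists:
        for rank, cand in enumerate(ranking):
            totals[cand] = totals.get(cand, 0) + rank
            counts[cand] = counts.get(cand, 0) + 1
    # Average rank; a candidate absent from some lists is averaged
    # only over the lists that ranked it.
    return sorted(totals, key=lambda c: totals[c] / counts[c])

# Example: three descriptors rank four candidate faces differently.
sift_rank = ["p2", "p1", "p3", "p4"]
orb_rank  = ["p1", "p2", "p4", "p3"]
lbph_rank = ["p2", "p4", "p1", "p3"]
print(fuse_rankings([sift_rank, orb_rank, lbph_rank]))
# → ['p2', 'p1', 'p4', 'p3']
```

A real system would weight descriptors by their reliability on the query image (e.g., LBPH for low resolution, SIFT for scale changes) rather than averaging them uniformly.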
