FaceMatch: visual search for pictures of missing persons during a disaster event.

Borovikov E
NIH Research Festival. October 2012.
Abstract: 

NLM's People Locator (PL) system allows posting photos and simple metadata (name, age, location) for persons missing (or found) in the wake of a disaster. To extend the current text-based search with a visual search for people's faces, we developed FaceMatch, a system that matches faces in a query image against those in the stored photos. Face matching is a two-stage process: faces in query photos are first localized using an improved Viola-Jones face detector; then image features (SIFT, SURF, ORB and HAAR) are extracted, combined, and matched against an index of features extracted from the stored photos. Face matching in this context is challenging because of the lack of training data, low-resolution photos, wide variability in lighting, facial expression, head pose and ethnicity, occlusions, and faces deformed by injury. Ongoing research includes exploring more discriminating features, modeling skin color for more accurate face localization, a Haar wavelet-based technique for eliminating near-duplicate photos, and image normalization. The approach is tested on images collected from the 2010 earthquake in Haiti (HEPL collection) and on the Labeled Faces in the Wild (LFW) dataset. Current FaceMatch speed and accuracy results are presented.
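The second stage of the pipeline described above can be sketched in simplified form: query descriptors are compared against an index of descriptors from stored photos, and the photo that accumulates the most nearest-neighbor votes wins. This is an illustrative outline only, not the FaceMatch implementation; the index layout, the voting scheme, and the synthetic descriptors are assumptions made to keep the sketch self-contained. Binary descriptors compared by Hamming distance are used here, as ORB produces 32-byte binary descriptors.

```python
import numpy as np

def hamming(a, b):
    """Bitwise Hamming distance between two binary descriptors
    (uint8 arrays), the standard metric for ORB descriptors."""
    return int(np.unpackbits(np.bitwise_xor(a, b)).sum())

def match_query(query_descs, index):
    """Match each query descriptor against an index of
    (photo_id, descriptor) pairs; return the photo whose descriptors
    win the most nearest-neighbor votes.
    The flat-list index layout is hypothetical, for illustration."""
    votes = {}
    for q in query_descs:
        best_id, best_d = None, None
        for photo_id, d in index:
            dist = hamming(q, d)
            if best_d is None or dist < best_d:
                best_id, best_d = photo_id, dist
        votes[best_id] = votes.get(best_id, 0) + 1
    return max(votes, key=votes.get)

# Toy example with synthetic 32-byte descriptors (ORB-sized).
rng = np.random.default_rng(0)
stored = rng.integers(0, 256, (4, 32), dtype=np.uint8)
index = [("photo_a", stored[0]), ("photo_a", stored[1]),
         ("photo_b", stored[2]), ("photo_b", stored[3])]

# Query descriptors: near-copies of photo_b's, with one bit flipped each.
query = stored[2:4].copy()
query[:, 0] ^= 1
print(match_query(query, index))  # -> photo_b
```

A production index would of course replace the linear scan with an approximate nearest-neighbor structure, and real descriptors would come from a detector/extractor run on the localized face regions.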
