Abstract
Face de-identification (or “masking”) algorithms have been developed in response to the prevalent use of video recordings in public places. We evaluated the success of face identity masking for human perceivers and a deep convolutional neural network (DCNN). Eight de-identification algorithms were applied to videos of drivers' faces recorded while they actively operated a motor vehicle. These masks were pre-selected to be applicable to low-quality video and to maintain coarse information about facial actions. Humans studied high-resolution images to learn driver identities and were tested on their recognition of active drivers in low-resolution videos. Faces in the videos were either unmasked or were masked by one of the eight algorithms. When participants were tested immediately after learning (Experiment 1), all masks reduced identification, with six of the eight masks reducing identification to extremely poor performance. In a second experiment, two of the most effective masks were tested after a delay of 7 or 28 days. The delay did not further reduce identification of the masked faces. In all masked conditions, participants maintained stringent decision criteria, with low confidence in recognition, further indicating the effectiveness of the masks. Next, the DCNN performed an identity-matching task between high-resolution images and masked videos, a task analogous to the one performed by the human participants. The pattern of accuracy for the DCNN mirrored some, but not all, aspects of human performance, highlighting the need to test the effectiveness of identity masking for both humans and machines. The DCNN was also tested on its ability to match identity between masked and unmasked versions of the same video, based only on the face. DCNN performance for the eight masks offers insight into the nature of the information in faces that is coded in these networks.
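For readers unfamiliar with how DCNN-based identity matching is typically scored, the following is a minimal Python sketch that compares pre-computed face embeddings with cosine similarity. The embedding dimensionality, the decision threshold, the function names, and the randomly generated "embeddings" are illustrative assumptions; they are not the network or decision procedure used in the article.

```python
# Illustrative sketch: matching a probe face embedding (e.g., from a masked,
# low-resolution video frame) against a gallery of learned identities
# (e.g., from high-resolution images) using cosine similarity.
# All numbers and names here are placeholder assumptions.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_identity(probe: np.ndarray, gallery: dict, threshold: float = 0.4):
    """Return the best-matching gallery identity and its score, or None if
    no similarity exceeds the (assumed) decision threshold."""
    scores = {name: cosine_similarity(probe, emb) for name, emb in gallery.items()}
    best = max(scores, key=scores.get)
    return (best, scores[best]) if scores[best] >= threshold else (None, scores[best])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim = 512  # a common face-embedding dimensionality
    # Gallery: one embedding per learned identity (from high-resolution images).
    gallery = {f"driver_{i}": rng.normal(size=dim) for i in range(3)}
    # Probe: simulated as a noisy copy of one gallery identity, standing in for
    # an embedding extracted from a masked video frame.
    probe = gallery["driver_1"] + 0.5 * rng.normal(size=dim)
    print(match_identity(probe, gallery))
```

For the masked-versus-unmasked video comparison described in the abstract, the same similarity computation could be applied between embeddings of the two video versions, for example after averaging frame-level embeddings; this, too, is a sketch rather than the authors' pipeline.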
Original language | English |
---|---|
Article number | 3 |
Journal | ACM Transactions on Applied Perception |
Volume | 18 |
Issue number | 1 |
DOIs | https://doi.org/10.1145/3422988 |
State | Published - Jan 2021 |
Funding
K. D. Orsten Hooge and A. Baragchizadeh contributed equally to this research. This work was supported through collaboration with Oak Ridge National Laboratory and the Federal Highway Administration under the Exploratory Advanced Research Program (Contracting Officer’s Representative: Lincoln Cobb). The human experiments and analyses were subcontracted to the University of Texas at Dallas from Oak Ridge National Laboratory. The CNN feature extraction was carried out at the University of Maryland by C. D. Castillo, who was supported by the Intelligence Advanced Research Projects Activity (IARPA). The UMD part of the research is based upon work supported by the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA), via IARPA R&D Contract No. 2014-14071600012.

This work is motivated in part by the Second Strategic Highway Research Program (SHRP2) Naturalistic Driving Study (NDS) [12]. The focus of that project was to directly observe and understand driver behavior. To do so, over 3,000 drivers were digitally recorded during their daily driving routines, using a low-cost data acquisition system (DAS) featuring four inexpensive analog cameras. No lighting control was included in the study, except for an infrared light that allowed night-time exposures and a corresponding filter that accentuates the infrared light (and that also alters the wavelengths captured by the camera in daytime images).
Funders | Funder number |
---|---|
Second Strategic Highway Research Program | |
Oak Ridge National Laboratory | |
Federal Highway Administration | |
Office of the Director of National Intelligence | 2014-14071600012 |
Intelligence Advanced Research Projects Activity | |
Keywords
- DCNN
- De-identification
- Masking algorithms
- Privacy