Technical and Legal Challenges of the Use of Automated Facial Recognition Technologies for Law Enforcement and Forensic Purposes

Chapter in Artificial Intelligence, Social Harms and Human Rights

Abstract

Biometrics covers a variety of technologies used for the identification and authentication of individuals based on their behavioral and biological characteristics. A number of new biometric technologies have been developed, taking advantage of our improved understanding of the human body and advanced sensing techniques. They are increasingly being automated to eliminate the need for human verification. As computational power and techniques improve and the resolution of camera images increases, it seems clear that many benefits could be derived through the application of a wider range of biometric techniques for security and surveillance purposes in Europe. Facial recognition technology (FRT) makes it possible to compare digital facial images to determine whether they are of the same person. However, there are many difficulties in using such evidence to secure convictions in criminal cases. Some are related to the technical shortcomings of facial biometric systems, which impact their utility as an undisputed identification system and as reliable evidence; others pertain to legal challenges in terms of data privacy and dignity rights. While FRT is coveted as a mechanism to address the perceived need for increased security, there are concerns that the absence of sufficiently stringent regulations endangers fundamental rights to human dignity and privacy. In fact, its use presents a unique host of legal and ethical concerns. The lack of both transparency and lawfulness in the acquisition, processing and use of personal data can lead to physical, tangible and intangible damages, such as identity theft, discrimination or identity fraud, with serious personal, economic or social consequences. Evidence obtained by unlawful means can also be subject to challenge when adduced in court. This paper looks at the technical and legal challenges of automated FRT, focusing on its use for law enforcement and forensic purposes in criminal matters. 
The combination of technical and legal approaches is necessary to recognize and identify the main potential risks arising from the use of FRT, in order to prevent errors or misuses due both to mistaken technological assumptions and to threats to fundamental rights, particularly—but not only—the right to privacy and the presumption of innocence. On the one hand, a good part of the controversies and contingencies surrounding the credibility and reliability of automated FRT is intimately related to its technical shortcomings. On the other hand, data protection, database custody, transparency, accountability and trust are relevant legal issues that may raise problems when using FRT. The aim of this paper is to improve the usefulness of automated FRT in criminal investigations and as forensic evidence within the criminal procedure.


Notes

  1.

    FRT in relation to criminal investigations has been implemented in 11 EU member states and in two international police cooperation organizations, Europol and Interpol. A further seven member states expect to implement it by 2022 (TELEFI, 2021, p. 22). FRT is used much more widely in the USA: already in 2012 the FBI launched the Interstate Photo System Facial Recognition Pilot project in three states; the system was fully deployed as of June 2014 and is now integrated into the Next Generation Identification System, which provides the US criminal justice community with the world’s largest electronic repository of biometric and criminal history information. For other applications at state and local level, see New York City Bar Association (2020).

  2.

    Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1.

  3.

    Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA [2016] OJ L119/89.

  4.

    This section concentrates on system vulnerabilities which are part of the biometric processing itself. Since biometric systems are implemented on server computers, they are vulnerable to all cryptographic, virus and other attacks which plague any computer system. For example, biometric data may be stored locally on hardware within the organization, or externally at an unknown location within the cloud (Tomova, 2009), both vulnerable to hacking. The training dataset may be subject to intentional manipulations, such as data poisoning attacks (Papernot et al., 2018) and backdoor injections (Chen et al., 2017). Vulnerabilities of data storage concern modifying the storage (adding, modifying or removing templates or raw data), copying data for secondary uses (identity theft or directly inputting the information at another stage of the system to achieve authentication) and modifying the identity to which the biometric is assigned. We are aware of these issues, but do not intend to cover them in this paper.

  5.

    For example, in S & Marper v United Kingdom [2008] ECHR 1581, the European Court of Human Rights found the retention by the British police of DNA samples of individuals who had been arrested but later acquitted, or who had had the charges against them dropped, to be a violation of their right to privacy under Article 8 ECHR (Sampson 2018). Furthermore, in some EU member states the indefinite retention of biometric samples, including DNA evidence and fingerprints of data subjects, has been successfully challenged, except in exceptional circumstances. See, for the (pre-Brexit) UK, R (on the application of GC & C) v The Commissioner of Police of the Metropolis [2011] UKSC 21. Also in the UK, the High Court of Justice (England and Wales) was called upon to determine whether the current legal regime in that country was ‘adequate to ensure the appropriate and non-arbitrary use of automated facial recognition in a free and civilized society’. In R (Bridges) v Chief Constable of the South Wales Police [2019] EWHC 2341 (Admin), the judgment was that the use of FRT was not ‘in accordance with law’, implied a breach of Article 8(1) and (2) ECHR and of data protection law, and failed to comply with the public sector equality duty.

  6.

    The Constitutional Court in France stated that the keeping of a database with biometric identity information allowing identification interferes with the fundamental right to respect of privacy. Cons. const. (France) no. 2012–652, 22 March 2012 (Loi protection de l’identité), Article 6.

  7.

    ECtHR, S. and Marper v. United Kingdom, nos. 30562/04 and 30566/04, 4 December 2008, paras. 84 and 86.

  8.

    ECtHR, M.K. v. France, no. 19522/09, 18 April 2013, paras. 44–46 (‘ECtHR, M.K. 2013’).

References

  • Abdurrahim, S. H., Samad, S. A., & Huddin, A. B. (2018). Review on the effects of age, gender, and race demographics on automatic face recognition. The Visual Computer, 34, 1617–1630. https://doi.org/10.1007/s00371-017-1428-z

  • Arigbabu, O. A., Ahmad, S. M. S., Adnan, W. A. W., & Yussof, S. (2015). Recent advances in facial soft biometrics. The Visual Computer, 31, 513–525. https://doi.org/10.1007/s00371-014-0990-x

  • Baggili, I., & Behzadan, V. (2019). Founding the domain of AI forensics. arXiv:1912.06497v1.

  • Benzaoui, A., Adjabi, I., & Boukrouche, A. (2017). Experiments and improvements of ear recognition based on local texture descriptors. Optical Engineering, 56, 043109. https://doi.org/10.1117/1.OE.56.4.043109

  • Beveridge, J. R., Givens, G. H., Phillips, P. J., & Draper, B. A. (2009). Factors that influence algorithm performance in the face recognition grand challenge. Computer Vision and Image Understanding, 113(6), 750–762. https://doi.org/10.1016/j.cviu.2008.12.007

  • Bichard, M. (2004). The Bichard Inquiry. Report (No. HC 653). The Stationery Office.

  • Blackstone, W. (1893). Commentaries on the laws of England (Original work published 1769). J. B. Lippincott Co.

  • Bonastre, J.-F., Kahn, J., Rossato, S., & Ajili, M. (2015). Forensic speaker recognition: Mirages and reality. In S. Fuchs, D. Pape, C. Petrone, & P. Perrier (Eds.), Individual Differences in Speech Production and Perception (pp. 255–285). Peter Lang.

  • Bouchrika, I. (2016). Evidence Evaluation of Gait Biometrics for Forensic Investigation. In A. E. Hassanien, M. M. Fouad, A. A. Manaf, M. Zamani, R. Ahmad, & J. Kacprzyk (Eds.), Multimedia Forensics and Security: Foundations, Innovations, and Applications (pp. 307–326). Springer.

  • Browne, S. (2015). B®anding Blackness: Biometric Technology and the Surveillance of Blackness. In S. Browne (Ed.), Dark Matters: On the Surveillance of Blackness (pp. 89–130). Duke University Press.

  • Champod, C., & Tistarelli, M. (2017). Biometric Technologies for Forensic Science and Policing: State of the Art. In M. Tistarelli & C. Champod (Eds.), Handbook of Biometrics for Forensic Science (pp. 1–15). Springer.

  • Chen, X., Liu, C., Li, B., Lu, K., & Song, D. (2017). Targeted backdoor attacks on deep learning systems using data poisoning. arXiv preprint arXiv:1712.05526.

  • Cole, S. A. (2004). Fingerprint Identification and the Criminal Justice System. In D. Lazer (Ed.), DNA and the Criminal Justice System. The Technology of Justice (pp. 63–89). MIT Press.

  • Cooke, D. J., & Michie, C. (2013). Violence risk assessment: From prediction to understanding—or from what? To why? In C. Logan & L. Johnstone (Eds.), Managing Clinical Risk (pp. 22–44). Routledge.

  • Cummings, M. L. (2014). Automation bias in intelligent time critical decision support systems. American Institute of Aeronautics and Astronautics.

  • Dantcheva, A., Velardo, C., D’Angelo, A., & Dugelay, J.-L. (2011). Bag of soft biometrics for person identification. New trends and challenges. Multimedia Tools and Applications, 51, 739–777. https://doi.org/10.1007/s11042-010-0635-7

  • Dilek, S., Çakır, H., & Aydın, M. (2015). Applications of Artificial Intelligence Techniques to Combating Cyber Crimes: A Review. International Journal of Artificial Intelligence and Applications, 6(1), 21–39.

  • Esposito, A. (2012). Debunking some myths about biometric authentication. arXiv abs/1203.03333.

  • Eubanks, V. (2018). Automating Inequality. How high-tech tools profile, police, and punish the poor. St. Martin’s Press.

  • Fish, J. T., Miller, L. S., & Braswell, M. C. (2013). Crime Scene Investigation. Routledge.

  • FRA European Union Agency for Fundamental Rights. (2019). Facial recognition technology: Fundamental rights considerations in the context of law enforcement. Available at https://fra.europa.eu/sites/default/files/fra_uploads/fra-2019-facial-recognition-technology-focus-paper-1_en.pdf

  • Freeman, K. (2016). Algorithmic injustice: How the Wisconsin Supreme Court failed to protect due process rights in State v. Loomis. North Carolina Journal of Law and Technology, 18(5), 75–106.

  • Galbally, J., Ferrara, P., Haraksim, R., Psyllos, A. I., & Beslay, L. (2019). Study on Face Identification Technology for its Implementation in the Schengen Information System. Publications Office of the European Union.

  • Garrett, B., & Mitchell, G. (2013). How Jurors Evaluate Fingerprint Evidence: The Relative Importance of Match Language, Method Information, and Error Acknowledgment. Journal of Empirical Legal Studies, 10(3), 484–511.

  • Howard, J. J., & Etter, D. (2013). The Effect of Ethnicity, Gender, Eye Color and Wavelength on the Biometric Menagerie. 2013 IEEE International Conference on Technologies for Homeland Security (HST), IEEE.

  • Jacquet, M., & Champod, C. (2020). Automated face recognition in forensic science: Review and perspectives. Forensic Science International, 307, 110124. https://doi.org/10.1016/j.forsciint.2019.110124

  • Keenan, T. P. (2015). Hidden Risks of Biometric Identifiers and How to Avoid Them. In Canadian Global Affairs Institute, Black Hat USA 2015 (pp. 1–13). University of Calgary.

  • Kindt, E. J. (2013). Privacy and Data Protection Issues of Biometric Applications. Springer.

  • Kindt, E. J. (2018). Having yes, using no? About the new legal regime for biometric data. Computer Law & Security Review, 34, 523–538. https://doi.org/10.1016/j.clsr.2017.11.004

  • Kotsoglou, K. N., & Oswald, M. (2020). The long arm of the algorithm? Automated Facial Recognition as evidence and trigger for police intervention. Forensic Science International: Synergy, 2, 86–89. https://doi.org/10.1016/j.fsisyn.2020.01.002

  • Lieberman, J. D., Carrell, C. A., Miethe, T. D., & Krauss, D. A. (2008). Gold versus platinum: Do jurors recognize the superiority and limitations of DNA evidence compared to other types of forensic evidence? Psychology, Public Policy, and Law, 14(1), 27–62. https://doi.org/10.1037/1076-8971.14.1.27

  • Maeder, E. M., Ewanation, L. A., & Monnink, J. (2017). Jurors’ Perceptions of Evidence: The Relative Influence of DNA and Eyewitness Testimony when Presented by Opposing Parties. Journal of Police and Criminal Psychology, 32, 33–42. https://doi.org/10.1007/s11896-016-9194-9

  • Magnet, S. (2011). When Biometrics Fail: Gender, Race, and the Technology of Identity, Duke University Press.

  • Mordini, E., & Massari, S. (2008). Body, Biometrics and Identity. Bioethics, 22(9), 488–498.

  • Morrison, G. S., Sahito, F. H., Jardine, G., Djokic, D., Clavet, S., Berghs, S., & Goemans Dorny, C. (2016). INTERPOL Survey of the Use of Speaker Identification by Law Enforcement Agencies. Forensic Science International, 263, 92–100.

  • Murphy, E. (2007). The New Forensics: Criminal Justice, False Certainty, and the Second Generation of Scientific Evidence. California Law Review, 95(3), 721–797. https://doi.org/10.15779/Z38R404

  • New York City Bar Association (2020). Power, Pervasiveness and Potential: The Brave New World of Facial Recognition Through a Criminal Law Lens (and Beyond). Available at http://documents.nycbar.org.s3.amazonaws.com/files/2020662-BiometricsWhitePaper.pdf

  • National Research Council (2010). Biometric Recognition: Challenges and Opportunities. The National Academies Press. https://doi.org/10.17226/12720

  • Nigam, I., Vatsa, M., & Singh, R. (2015). Ocular biometrics: A survey of modalities and fusion approaches. Information Fusion, 26, 1–35. https://doi.org/10.1016/j.inffus.2015.03.005

  • Papernot, N., McDaniel, P., Sinha, A., & Wellman, M. P. (2018). SoK: Security and Privacy in Machine Learning. 2018 IEEE European Symposium on Security and Privacy (EuroS&P) (pp. 399–414). Institute of Electrical and Electronics Engineers.

  • Riggan, B. S., Short, N. J., & Hu, S. (2018). Thermal to Visible Synthesis of Face Images using Multiple Regions. arXiv:1803.07599 [cs.CV].

  • Ross, A. A., Nandakumar, K., & Jain, A. K. (2006). Handbook of Multibiometrics. Springer.

  • Saini, M., & Kapoor, A. K. (2016). Biometrics in Forensic Identification: Applications and Challenges. Journal of Forensic Medicine, 1(2), 1–6. https://doi.org/10.4172/2472-1026.1000108

  • Sarangi, P. P., Mishra, B. S. P., & Dehuri, S. (2018). Fusion of PHOG and LDP local descriptors for kernel-based ear biometric recognition. Multimedia Tools and Applications, 78, 9595–9623. https://doi.org/10.1007/s11042-018-6489-0

  • Sharp, L. (2000). The Commodification of the Body and Its Parts. Annual Review of Anthropology, 29, 287–328.

  • Singh, S., & Prasad, S. V. A. V. (2018). Techniques and Challenges of Face Recognition: A Critical Review. Procedia Computer Science, 143, 536–543.

  • Sulner, S. (2018). Critical Issues Affecting the Reliability and Admissibility of Handwriting Identification Opinion Evidence. Seton Hall Law Review, 48(3), 631–717.

  • Sutrop, M. (2010). Ethical Issues in Governing Biometric Technologies. In Proceedings of the Third International Conference on Ethics and Policy of Biometrics and International Data Sharing, ICEB’10 (pp. 102–114). Springer. https://doi.org/10.1007/978-3-642-12595-9_14

  • TELEFI. (2021). Summary Report of the project “Towards the European Level Exchange of Facial Images”. https://www.telefi-project.eu/sites/default/files/TELEFI_SummaryReport.pdf

  • Thompson, E. (2018). Understanding the Strengths and Weaknesses of Biometrics [WWW Document]. Infosecurity Magazine. https://www.infosecurity-magazine.com:443/opinions/strengths-weaknesses-biometrics/. Accessed 26 September 2018.

  • Tistarelli, M., Grosso, E., & Meuwly, D. (2014). Biometrics in forensic science: Challenges, lessons and new technologies. In V. Cantoni, D. Dimov, & M. Tistarelli (Eds.), Proceedings of the First International Workshop on Biometric Authentication (BIOMET 2014), Sofia, Bulgaria, June 23–24 (pp. 153–164). Springer. https://doi.org/10.1007/978-3-319-13386-7_12

  • Tome, P., Vera-Rodriguez, R., Fierrez, J., & Ortega-Garcia, J. (2015). Facial soft biometric features for forensic face recognition. Forensic Science International, 257, 271–284. https://doi.org/10.1016/j.forsciint.2015.09.002

  • Tomova, S. (2009). Ethical and Legal Aspects of Biometrics. In E. Mordini & M. Green (Eds.), Identity, Security and Democracy: The Wider Social and Ethical Implications of Automated Systems for Human Identification (pp. 111–114). IOS Press.

  • Wevers, R. (2018). Unmasking Biometrics’ Biases: Facing Gender, Race, Class and Ability in Biometric Data Collection. TMG Journal for Media History, 21(2), 89–105.

  • Working Group for Human Factors in Handwriting Examination. (2020). Forensic Handwriting Examination and Human Factors: Improving the Practice Through a Systems Approach. U.S. Department of Commerce, National Institute of Standards and Technology. NISTIR 8282.

  • Završnik, A. (2020). Criminal justice, artificial intelligence systems, and human rights. ERA Forum, 20, 567–583.

  • Zeinstra, C. G., Meuwly, D., Ruifrok, A. C. C., Veldhuis, R. N. J., & Spreeuwers, L. J. (2018). Forensic face recognition as a means to determine strength of evidence: A survey. Forensic Science Review, 30(1), 21–32.

  • Zhou, S., & Xiao, S. (2018). 3D face recognition: A survey. Human-Centric Computing and Information Sciences, 8, 1–27. https://doi.org/10.1186/s13673-018-0157-2

Author information

Correspondence to Patricia Faraldo Cabana.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Cite this chapter

Faraldo Cabana, P. (2023). Technical and Legal Challenges of the Use of Automated Facial Recognition Technologies for Law Enforcement and Forensic Purposes. In: Završnik, A., Simončič, K. (eds) Artificial Intelligence, Social Harms and Human Rights. Critical Criminological Perspectives. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-031-19149-7_2

  • DOI: https://doi.org/10.1007/978-3-031-19149-7_2

  • Publisher Name: Palgrave Macmillan, Cham

  • Print ISBN: 978-3-031-19148-0

  • Online ISBN: 978-3-031-19149-7

  • eBook Packages: Law and Criminology, Law and Criminology (R0)
