Federal joint agency report on law enforcement use of biometrics released

The rapid evolution and deployment of biometric technologies have profoundly reshaped law enforcement practices in the United States. While these advancements promise significant benefits, including enhanced public safety and streamlined criminal investigations, they also raise critical concerns about privacy, civil rights, and civil liberties. A new report, prepared under the auspices of the U.S. Department of Homeland Security (DHS), the Department of Justice (DOJ), and the White House Office of Science and Technology Policy, delves into these complexities, offering an incisive examination of biometric technology’s dual-edged implications.

The 137-page report underscores the importance of balancing these competing priorities through robust legal frameworks, rigorous oversight, and a commitment to transparency and equity. While biometric technologies have proven indispensable for various law enforcement functions, their increasing ubiquity has also brought issues of equity, accuracy, and accountability to the fore, as highlighted by the report’s findings.

The report was prepared by Janice Kephart, director of DHS’s Biometric Interagency Working Group and former counsel to the 9/11 Commission. The report fulfills Section 13(e) of the May 25, 2022 Executive Order 14074, Advancing Effective, Accountable Policing and Criminal Justice Practices to Enhance Public Trust and Public Safety.

Kephart said she “was tasked with compiling and drafting a comprehensive report on the implementation of biometrics in federal law enforcement. Working from lessons learned in drafting the 9/11 Commission’s terrorist travel and biometric recommendations, I collaborated closely with key departments within DHS, DOJ, and National Institute of Standards and Technology (NIST), as well as receiving oversight and input from the White House Office of Science and Technology.”

The report delves deeply into the privacy implications of biometric systems. It highlights the extensive data collection required for these technologies and the inherent risks of storing such sensitive information. Under the Privacy Act of 1974, federal agencies are mandated to secure personal information. But despite these safeguards, there is a persistent risk of unauthorized access to databases, which could lead to identity theft, surveillance abuses, or breaches of civil liberties.

“Like any technology, biometric systems must be continually informed by the latest and best science to optimize benefits, accuracy, and efficiency for the public’s rights and privacy,” the report says, noting that “adjudication of biometric results must not discriminate based on actual or perceived race, ethnicity, national origin, religion, sex (including sexual orientation and gender identity), or disability. Accordingly, guardrails to protect civil rights, civil liberties, and privacy are vital.”

Continuing, the report says, “On the policy side, significant effort has been made to put proper guardrails in place such as using only algorithms that meet high accuracy in testing, applying finely tuned due diligence processes and procedures, and protecting biometric information both prior to and during system implementation.”

Transparency emerges as a cornerstone for mitigating these privacy risks. The report calls for agencies to make public disclosures about the types of biometric systems they use, the purposes they serve, and the policies governing their implementation. Such transparency not only holds agencies accountable, but also fosters public trust. However, the report notes that transparency practices remain uneven and, in some cases, absent, undermining public confidence in these technologies.

“Examples of a lack of transparency by vendors were brought to light during the workshop, including probabilistic genotyping software developers, commercial forensic DNA phenotyping companies, and public genealogy databases,” the report says, pointing out that “a lack of transparency was noted regarding how conclusions of these technologies are used to implicate people. Failure to communicate risks and limitations of technologies to those impacted by their use decreases transparency.”

Facial recognition technology (FRT) epitomizes the challenges of balancing utility with privacy. The report identifies FRT’s frequent use without individuals’ explicit knowledge or consent as a significant concern. Moreover, the technology’s propensity for errors, particularly among minority demographics, underscores the potential for privacy violations and discriminatory practices. The report stresses that the use of FRT must be backed by clear legal authority and subject to stringent oversight.

“As public servants, law enforcement officers should employ FRT and other biometric tools responsibly, respect the public’s rights, inspire public trust, and ensure public safety,” the report states, adding that “it is imperative to ensure that any use of FRT and other biometric technologies is done in a way that respects people’s rights, does not result in discriminatory outcomes, avoids re-enforcing historical injustice, and increases public trust in law enforcement.”

Consequently, civil rights issues are a recurring theme throughout the report. Historically, biometric systems, particularly facial recognition, have shown performance disparities based on race, gender, and other demographic factors. Such disparities can result in wrongful identifications and deepen systemic inequities. The report details efforts by federal agencies, in collaboration with NIST, to refine algorithms and minimize these biases. These initiatives, while promising, are not foolproof, and the report emphasizes the necessity for ongoing vigilance.

This report provides guidance to federal, state, local, tribal, and territorial law enforcement agencies that use, or seek to use, biometric technologies, and articulates this guidance as a set of best practices and guidelines.

“There is diverse FRT use at the state and local level,” the report says, adding that “Where some state and local legislatures have been active in attempting to restrict or implement oversight on law enforcement use of FRT, other state executive agencies have stood up biometric identity management systems that include facial recognition, drafted appropriations justifications, and more than half have signed FBI Memorandums of Understanding (MOUs) to share face images with the Federal Bureau of Investigation’s NGI.”

At the time of the report’s writing, “39 states have not restricted face recognition technologies, while eleven have limited or qualified its use in some manner. At least 11 state or local law enforcement agencies have invested in their own internal standalone identity management systems that include face recognition.”

The report says, “eleven states and the District of Columbia have restricted face recognition, although there is a wide swath of allowances and disallowance amongst those who have passed legislation addressing LE use of face recognition technologies.” The report also says, “the strongest thread running through state and local law and policies – whether technically restricting or permitting face recognition technology use – is that the technology is not to be the ‘sole basis for probable cause.’”

While no state fully bans law enforcement use of face recognition, the report notes that “some are highly restrictive.”

“Some of the most nuanced information related to FRT at the state and local level are proactive activities to provide guidance, policies, and appropriations justifications for law enforcement face recognition activities,” the report adds. “Of importance is that the issue of law enforcement use of face recognition technologies has resulted in at least two major law enforcement associations drafting lengthy and detailed reports, one providing guidance on implementing a responsible face recognition identity system, the other cataloguing use cases related to face recognition.”

Civil rights concerns extend beyond technical performance. The report highlights the potential for biometrics to be used in ways that infringe on constitutional rights, such as the freedom of assembly and protection against unlawful searches. To counter these risks, it recommends robust guardrails, including manual review processes to prevent over-reliance on automated systems. It also advocates for clear prohibitions against using unlawfully obtained data or systems trained on such data, ensuring compliance with ethical and legal standards.

Legal and policy frameworks play a pivotal role in shaping the use of biometric systems. The report highlights Executive Order 14074 as a foundational directive, emphasizing the importance of advancing fair and accountable policing practices. This order highlights the imperative that law enforcement activities respect constitutional rights and adhere to principles of equity and justice. In line with this, the report provides detailed guidance for law enforcement agencies, including maintaining meticulous records of biometric system usage and ensuring independent evaluations of these systems’ performance.

Oversight and accountability are central to the report’s recommendations. The DHS Office for Civil Rights and Civil Liberties and the Privacy Office are tasked with ensuring that biometric technologies comply with privacy and equity standards. These offices conduct initial assessments, ongoing evaluations, and impact analyses to ensure that the deployment of biometrics aligns with constitutional and statutory requirements. Despite these mechanisms, the report acknowledges the fragmented nature of oversight across jurisdictions and calls for a more integrated approach that incorporates feedback from civil organizations, technical experts, and affected communities.

The report’s vision for the future is grounded in ethical principles and a commitment to transparency. It advocates for regular audits and public reporting on biometric system usage, which would enhance accountability and public trust. Additionally, it calls for continuous research to address emerging challenges, such as the integration of artificial intelligence into biometric technologies. The report’s emphasis on equity is evident in its recommendations to design systems that perform reliably across diverse demographic groups and to establish safeguards against potential abuses.

Biometric technologies offer transformative potential for enhancing public safety and efficiency in law enforcement. However, as the report compellingly argues, their deployment must not come at the expense of individual rights and freedoms. By embedding privacy, transparency, and equity considerations into every stage of biometric system development and deployment, law enforcement agencies can navigate the complex interplay between innovation and accountability. This balanced approach is essential for building a just and trustworthy system that serves all members of society.

18 best practice recommendations

The best practices enumerated in the report are familiar to those with experience in the field of biometrics, but the extent to which they engender trust will depend largely on how thoroughly and transparently they are implemented.

U.S. law enforcement agencies should use face biometrics for investigative leads, not as a sole factor in identification or as probable cause. This recommendation aligns with common practice and law, but American police have been known to depart from it, raising the question of how the recommendations can be effectively put into practice.

Other best practice recommendations include manual review of lists of multiple match candidates, rather than simply selecting the top candidate as determined by the algorithm; a prohibition against using probe images that were illegally collected, or algorithms trained on illicitly captured images; and the retention of logs for compliance auditing. Law enforcement agencies should be specific about the legal basis for collecting biometrics, set minimum data quality criteria, establish “strict criteria to govern the acceptable use of FRT systems in investigations,” which should be made public, and require both technical and policy training for personnel using the technology.
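The report frames these practices in policy terms rather than code, but as a rough illustration of the workflow they describe, the sketch below shows how a search interface might return a full ranked candidate list for human review (rather than auto-confirming the top candidate) and append an audit log entry for every query. All names here (Candidate, run_frt_search, the log format) are hypothetical and not drawn from the report.

```python
import json
import time
from dataclasses import dataclass, asdict


@dataclass
class Candidate:
    subject_id: str
    similarity: float  # score produced by the matching algorithm


def run_frt_search(probe_image_id: str, candidates: list[Candidate],
                   audit_log_path: str, max_candidates: int = 20) -> list[Candidate]:
    """Return a ranked candidate list for manual review instead of a single 'match'."""
    ranked = sorted(candidates, key=lambda c: c.similarity, reverse=True)[:max_candidates]

    # Retain a log entry for every search so use of the system can be audited later.
    entry = {
        "timestamp": time.time(),
        "probe_image_id": probe_image_id,
        "candidates": [asdict(c) for c in ranked],
    }
    with open(audit_log_path, "a", encoding="utf-8") as log:
        log.write(json.dumps(entry) + "\n")

    return ranked  # every candidate goes to a trained examiner; none is auto-confirmed
```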

Policies to define improper uses of facial recognition and enforce consequences should be in place, federal guidance and scientific standards for AI use should be observed, and automation and confirmation biases should be minimized.

The report recommends minimum similarity thresholds for match candidates, which can “only be overridden in exigent circumstances.”
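As a purely illustrative sketch of that recommendation, the snippet below filters candidates against a minimum similarity threshold and allows the threshold to be bypassed only when an exigent-circumstances justification is supplied and recorded; the threshold value and all names are assumptions, not figures from the report.

```python
from typing import Optional

MIN_SIMILARITY = 0.90  # illustrative value; an agency would set this from vendor and NIST test results


def apply_threshold(scored_candidates: list[tuple[str, float]],
                    exigent_justification: Optional[str] = None) -> list[tuple[str, float]]:
    """Keep only candidates at or above the policy threshold unless an exigent override is documented."""
    if exigent_justification:
        # The override reason is printed here (in practice it would be logged) so auditors can review it.
        print(f"Threshold override (exigent circumstances): {exigent_justification}")
        return scored_candidates
    return [(subject_id, score) for subject_id, score in scored_candidates
            if score >= MIN_SIMILARITY]
```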

Commercial facial recognition systems should be used only if approved by the agency in question, independent assessments and benchmarks should be used and publicly disclosed, ISO biometrics standards should be implemented, and public documentation on the type and purpose of facial recognition use should be made available. Performance and testing results should be retained, and vendors should be required to provide performance information about the version being procured, not a legacy or altered version. Finally, grants for facial recognition and other biometric technologies should be dependent on following the requirements of Office of Management and Budget memorandum M-24-10.

By Anthony Kimery, with additional reporting by Chris Burt

Article Topics

biometric identification  |  biometrics  |  criminal ID  |  Department of Justice  |  DHS  |  facial recognition  |  Janice Kephart  |  law enforcement  |  NIST  |  police  |  U.S. Government
