NACDL’s Task Force on Predictive Policing
In September 2017, the National Association of Criminal Defense Lawyers (NACDL) convened a Task Force on Predictive Policing to learn about, analyze, and respond to the myriad issues that arise from data-driven policing technologies. The Task Force’s charge was twofold: first, to study the issues surrounding the use of various data collection tools and analytical techniques that purport to prospectively identify where criminal activity is likely to occur and the people likely to be involved; and second, to evaluate the impact of these techniques on privacy and other individual constitutional rights, issue recommendations for best practices to safeguard those rights, and provide legal and technical assistance to educate defense practitioners in addressing the use of these tools and techniques.
The Task Force held meetings, virtually and in Washington, D.C., Chicago, Los Angeles, San Francisco, and New York City, with community stakeholders, privacy and civil liberties advocates, technologists, law enforcement personnel, academics, and defense attorneys to learn about the different types of technologies that police departments are using in communities throughout the United States and the impact of these technologies on individuals, families, communities, and justice. The Task Force also read extensive literature on data-driven policing technologies, including books, academic articles, studies, reports, and news articles. Notably, the Task Force attempted to speak with several representatives from police departments in each city where the meetings were held. However, most affirmatively declined to meet or were otherwise not available.
To arrive at these recommendations, the Task Force focused on the stated goals of the various technologies being used, evidence of their effectiveness, and their potential shortcomings. The Task Force’s recommendations ultimately revolved around the implications of predictive policing algorithms and data-driven policing technologies on how individuals are surveilled, investigated, charged, and prosecuted, specifically addressing the impact of racial profiling, policing, and prosecution on historically overpoliced groups and communities of color.
1. Top-Line Recommendation
Police departments must not utilize data-driven policing technologies[1] because they are ineffective; lack scientific validity; create, replicate, and exacerbate “self-perpetuating cycles of bias”;[2] hyper-criminalize individuals, families, and communities of color; and divert resources and funds from communities that should instead be allocated to social services and community-led public safety initiatives.

[1] The Task Force defines “data-driven policing” as including, but not limited to, the surveillance technologies, tools, and methods employed by law enforcement to visualize crime; target “at-risk” individuals and groups; map physical locations; track digital communications; and collect data on individuals as well as the communities they patrol. “Data-driven policing” also encompasses place-based predictive models that rely on historical crime data, geographic data, and demographic data; person-based predictive models that rely on personal data and social network analysis; and any databases, lists, and systems that subject individuals to increased police surveillance and monitoring.

[2] Brief of Amici Curiae Public Justice Center, American Civil Liberties Union of Maryland, and Washington Lawyers’ Committee for Civil Rights and Urban Affairs at 9, Sizer v. Maryland (Md. Ct. App. 2017) (quoting Kelly Koss, Leveraging Predictive Policing Algorithms to Restore Fourth Amendment Protections in High-Crime Areas in a Post-Wardlow World, 90 Chi.-Kent L. Rev. 301, 312 (2015)).
While the Task Force does not believe these technologies should be used, it is clear that these technologies are being considered or have been implemented in cities and towns across the country. The following recommendations are for areas that are considering or already using these technologies.
2. Governing Use
Police departments seeking these tools must not adopt any data-driven policing technology without first meaningfully engaging the impacted communities where it would be deployed and without first securing approval for the technology from the elected governing bodies that represent the communities, such as the city or county council.
Impacted communities include the residents of the communities where the data-driven policing technologies would be deployed, community organizations, organizations focused on youth from the impacted communities, and attorneys with expertise in upholding the constitutional rights and civil liberties of residents from impacted communities.
As part of engaging impacted communities about the proposed data-driven technology, resources must be allocated to local governing bodies to host forums to present and describe the proposed law enforcement technology to the residents of the impacted communities. These forums would also provide a space for impacted communities and law enforcement to discuss the law enforcement need for the proposed technology, detailing how the policies governing the use, scope, and limitations of the technologies would be implemented within the defined law enforcement need. Resources and space should also be allocated to enable and empower community members to provide feedback about the technology, and to address community concerns about transparency, racial bias, and the impact of the proposed technology on civil liberties and constitutional rights. If there is a majority consensus by state or local governments and impacted communities that the proposed technology should not be used by law enforcement, then the technology should be prohibited from present and foreseeable use.
Prior to implementing any data-driven policing technology, law enforcement departments must adopt written policies that detail the parameters, requirements, and conditions of use of the technology. Before adopting these policies, law enforcement departments must make draft policies available to the public, provide the public with opportunities to provide comments to the draft policies orally and/or in writing, and incorporate public comments into the final policies. For any technology already in use lacking such policies, law enforcement departments should immediately implement clear public policies that detail the parameters, requirements, and conditions of use.
Tech companies and developers of data-driven policing technologies have previously asserted trade secret evidentiary privileges before trial, citing “trade secrets” as grounds to deny defense discovery requests and subpoenas.[3] To avoid the abuse of trade secrets and the exclusion of highly probative evidence,[4] companies that create and supply data-driven policing technology must waive, or otherwise not assert, claims of “trade secret privilege”[5] and must disclose the methodologies used to build the technology to law enforcement, the impacted communities where law enforcement departments intend to deploy the technology, the legislative bodies that represent the impacted communities, and the attorneys within the jurisdiction who specialize in criminal defense and civil liberties, to ensure that the technologies are scientifically sound, are employed as intended, and are limited in scope to meet the articulated law enforcement need.

[3] Rebecca Wexler, Life, Liberty, and Trade Secrets: Intellectual Property in the Criminal Justice System, 70 Stan. L. Rev. 1343, 1368 (2018).

[4] Id. at 1429.

[5] When new surveillance technologies are kept secret because of non-disclosure agreements, they cannot be challenged by criminal defendants and these challenges cannot be decided by judges, regardless of the merits of the defendants’ claims. The use of a new surveillance technology may or may not be considered a Fourth Amendment search, but a private company’s insistence on secrecy removes the legal issue from judicial review.
Any data-driven policing technologies that are used should be subjected to validation studies sufficient to support a Daubert or Frye analysis.
3. Due Process
All individuals, in accordance with the constitutional due process rights guaranteed by the Fifth and Fourteenth Amendments, must be notified of their inclusion in any data-driven databases that law enforcement departments access and utilize, including gang databases, strategic subject lists, and other data collected through social media monitoring. These individuals must also be provided the opportunity, through a private attorney or, if they cannot afford an attorney, an appointed attorney, to challenge their inclusion on such databases, the data accumulated from the databases, and law enforcement’s interpretation of the data, as well as to seek removal from the databases.
All individuals, in accordance with constitutional due process rights guaranteed by the Fifth and Fourteenth Amendments, must be notified of their removal from any data-driven databases that law enforcement departments access and utilize, including, but not limited to, gang databases and strategic subject lists.
4. Race Equity
The analysis that jurisdictions undertake when considering whether to adopt any data-driven policing technology must be conducted through a race equity lens and include a racial impact statement. The “racial equity impact assessment”[6] must be conducted by experts trained in institutional and structural racism, as well as the history of racialized policing, and those experts must work with legislators, law enforcement, and community members to examine the racialized impact of the proposed data-driven policing technology. If the racial equity impact assessment of the proposed data-driven policing technology concludes that use of the technology would harm the impacted community, the technology should be prohibited from use by law enforcement.

[6] Laura M. Moy, How Police Technology Aggravates Racial Inequality: A Taxonomy of Problems and a Path Forward.
5. Integrity and Accountability
If, through the processes detailed in Recommendations #2, #3, and #4, data-driven policing technologies have been approved, law enforcement departments must adopt and issue written protocols of integrity and accountability to ensure that the departments and the impacted communities can continuously monitor and otherwise gauge the use and effectiveness of these technologies.
Integrity and accountability measures must include data-keeping, annual departmental reports on the use and accuracy of the technology, measuring and evaluating the effectiveness of the technologies through auditing, and, based on the results of these accountability measures, determining whether the use of the technology should be modified or discontinued. All reports, evaluations, data, and accountability measures produced in relation to and in response to data-driven policing technologies implemented by law enforcement departments should be made available to the public.
6. Resources and Access for Defense Attorneys
In accordance with the constitutional rights to discovery and confrontation guaranteed by the Sixth Amendment, prosecutors must provide to defense counsel notice of and describe any data-driven policing technology that law enforcement has employed or has otherwise relied upon in the case, as well as provide any data based on the technology that the officers relied upon, assessed, or otherwise used in relation to the accused, including Brady material and any other data accumulated against the accused. Defense counsel must then be afforded time and resources to engage experts to analyze and interpret the data.
Defense lawyers must be notified of and trained on the data-driven policing technologies employed by law enforcement departments in their jurisdictions, including the federal and state constitutional rights implicated by the technologies.
Defense lawyers should collaborate with other attorneys, technologists, and experts who understand the data-driven policing technologies employed against their clients and incorporate law enforcement’s use of the relevant tool(s) against their clients through all aspects of their legal representation.
Defense lawyers must have access to data-driven technology experts who can break down the technologies and consult on defense strategies vis-à-vis the data-driven tools that law enforcement relied upon to suspect, surveil, approach, or arrest the accused, or otherwise employed against the accused.
Resources for public defenders and court-appointed counsel must be increased to respond to and otherwise incorporate data-driven policing technologies into all aspects of client representation and to meet their constitutional obligation to provide zealous representation to clients impacted by these technologies.
7. Courts and Prosecutors
Courts and prosecutors must be trained annually on the data-driven policing technologies employed by law enforcement departments, including the federal and state constitutional rights implicated by the technologies.
Judges must assess the reliability of a data-driven policing technology employed against the accused before determining whether it justified a Fourth Amendment intrusion. Data-driven technology must not form part of an officer’s calculation of reasonable suspicion unless the technology is shown to be reliable under ordinary evidentiary standards.
Law enforcement authorities must not utilize or otherwise rely upon data-driven technologies, such as gang databases, in any way that infringes upon the right to association guaranteed by the First Amendment.
8. Children and Youth
State and local jurisdictions must enact laws, policies, and protocols that protect the federal constitutional rights, state constitutional rights, and dignity interests of children and youth who are implicated or otherwise at risk of being criminalized by data-driven policing technology.
Law enforcement authorities should not include children under the age of 18 on any law enforcement database, or otherwise accumulate or access data specific to children under the age of 18 through social media monitoring or other data gathering practices.
Young people between the ages of 18 and 25 must be provided notice of their presence on any databases that law enforcement departments access and utilize, including gang databases, strategic subject lists, and other data collected through social media monitoring. Individuals must be provided the opportunity, through a private attorney or, if they cannot afford an attorney, an appointed attorney, to challenge their inclusion on such databases, the data accumulated, and law enforcement’s interpretation of the data, as well as to seek removal from the databases.
An individual’s ability to challenge their designation and inclusion on such databases, the data accumulated, and law enforcement’s interpretation of the data should be ongoing, particularly given the impact of law enforcement interactions with children and youth on their personal development, self-esteem, and educational outcomes—including school attendance, suspensions, expulsions, and matriculation—as well as the correlation between these outcomes and involvement with the juvenile and criminal legal systems.
Any data, records, or other information contained in any law enforcement database through any data-driven policing technology and/or social media monitoring should be sealed and purged when the individual reaches 25 years of age, the point at which the adolescent brain is fully developed.[7]

[7] Mariam Arain et al., Maturation of the Adolescent Brain, 9 Neuropsychiatric Disease & Treatment 449, 453 (2013).
Resolution of the Board of Directors
October 24, 2020