
In Acceptance of the SEARCH 2022 Justice Policy Leadership Award

SEARCH 2022 Annual Meeting
Denver, Colorado
NIJ Director Nancy La Vigne, Ph.D.

Thank you, Dave, and thanks to the SEARCH Board for choosing me as this year’s recipient of the Justice Policy Leadership Award. I am tremendously honored. I also want to thank the wait staff here for taking such good care of us this evening — please join me in thanking them all!

If I’m not mistaken, this is the first time that SEARCH has awarded its Justice Policy Leadership Award to a researcher, and in the same year that it honors the leader among leaders in criminal justice research, Dr. Al Blumstein. These decisions speak volumes about how far the field has come in acknowledging the crucial ways that research can contribute to improvements in criminal justice policies and practices.

And I can think of no more critical time to bring research to bear on the pressing issues of safety, justice, and equity in this country. We are experiencing violent crime waves in many U.S. cities, mass shootings have become increasingly commonplace, and demands for violence reduction efforts are competing with calls for all manner of police and criminal justice reforms. Let me be clear: we do not need to choose between public safety and criminal justice reform; the two can and should go hand in hand.

But where does research fit into all of this? And how can we ensure that it leads to the outcomes we are seeking? Much of it rests on data. As this audience knows better than most, we have many data challenges to navigate in order to promote change.

The quest for greater data access and data quality is integral to my five priorities as NIJ director. And while this may seem like a quixotic goal, it’s my belief that we can move the needle in meaningful ways by improving both data and the research process.

My first priority is to foster both rigorous and inclusive research. By “inclusive,” I mean research that takes the time to consult with and learn from the people who are closest to the issue or problem under study. They could be prosecutors, probation officers, victim service providers, families who have lost loved ones to gun violence, people who have experienced incarceration — you name it. It occurs to me that we researchers like to think we are the experts on the topics we study. That’s not the case. We may be the experts on research methodology, but the true experts are those with professional and lived experience.

In the context of data integrity, inclusive research is not just the right thing to do, it’s also an important strategy. Data are generated by people. And we can’t disentangle data from the people who help generate it. That means that data, like people, are both full of promise and inherently flawed. I believe there’s much we can do to improve flaws in data while we are collecting it, and taking an inclusive approach can help.

One thing we can do is to engage the people who help generate the data so that they are invested in the process (data collection) and the outcome (accurate data and the evidence-based policies and practices generated with it). I’m thinking back to my days of leading NIJ’s Crime Mapping Research Center when we advised agencies on how to increase the accuracy of geocoding — assigning x/y coordinates to the crime events being mapped. Some of the problems with assigning x/y coordinates started with the patrol officers. They didn’t always record the exact address when making reports, sometimes just using the nearest intersection or even using the police precinct address. I recall early maps depicting hot spots of crime and discovering that some of the biggest spots were at police headquarters. Not exactly a resounding endorsement of the PD’s crime control effectiveness! We began counseling agencies to meet with officers at roll call, show them some maps with both inaccurate and accurate data, and explain the importance of recording addresses properly. It improved the quality of the data tremendously.

We need to bring these same strategies to officers who are tasked with generating new and more detailed data, like on use of force and the nature of their interactions with members of the public. That’s not just about explaining to them why the data are important to collect — it is difficult to show how rare use of force is without documentation of all the times conflicts are resolved peacefully — but also engaging line officers and supervisors in what data to collect and how.

My second priority is to elevate studies that promote a racial equity lens. That means assessing criminal justice programs, policies, and practices for racially disparate outcomes. It means evaluating strategies to reduce racial and other disparities in the criminal justice system. And it also means recognizing that oftentimes “race” is serving as a proxy for structural inequality and other societal factors. This requires scrutinizing the data we use and distinguishing and minimizing data sources that have baked-in biases.

Unfortunately, we do a very poor job of accurately documenting people’s race and ethnicity. Criminal justice agencies should adapt their data collection tools to mirror the Census Bureau’s categories and, when possible, enable people to self-identify.

When thinking about issues of racial equity, we know that the threat of racial bias comes into play across many facets of the criminal justice system. One way is in the context of AI applications like predictive algorithms and facial recognition software. I’ve seen a lot of focus on tests of bias in the actual algorithms used. In fact, a few years ago NIJ ran a “Challenge” — a research competition that invited new risk assessment algorithms that minimized racially biased outcomes. The results were somewhat instructive, but I think we place too much emphasis on the algorithms and not enough on the underlying data.

Again, because data are generated by humans, we can’t always take their accuracy at face value. In the context of racial bias, we need to interrogate data sources. Is it the facial recognition algorithm that fails us in yielding inaccurate or biased results, or is it the humans who fed the algorithm the inaccurate data to begin with? Instead of throwing the baby out with the bathwater, we need to strive to reduce bias in the source data.

For example, many recidivism risk prediction algorithms use arrests, yet it’s well established that people are more likely to be arrested for certain types of offenses based on where they live and the color of their skin. We know this from traffic stop research and the well-established “veil of darkness” analyses: racial disparities in traffic stops diminish in nighttime stop data, when officers can’t discern the race of the driver.

My third priority is to infuse all research, and particularly that pertaining to criminal justice technologies, with a strong implementation science component. We need to support evaluations of the implementation of technology in the field. We tend to overlook issues of implementation fidelity, and yet they are essential. Take the recent transition to the 988 Suicide & Crisis Lifeline, for example. That involves technology to route the calls to the right mental health responder, but it also requires data to anticipate demand. What share of calls will continue to come into 911, and how will call takers determine which should be transferred to mental health professionals? How can we anticipate the demand, and how can we track it over time? Call takers navigate a tremendous volume of calls, most emergency call centers have dozens and dozens of codes, and when in doubt or given insufficient information, a call taker may simply check “other.”

A recent study of calls for service in nine jurisdictions found that a small fraction — under 4% — of calls that lead to dispatch are coded as mental health issues. But it’s likely much more than that. The solution is not simply to develop a new software interface. We need to connect with the people who are generating the data to give them the tools and resources they need to code calls as accurately as possible. It’s this kind of attention to implementation factors that researchers can give to help inform changes in practice.

My fourth priority as NIJ director is to promote more interdisciplinary research teams so that we can learn from the best economists, engineers, sociologists, and forensic scientists who ideally work in partnership with practitioners and bring complementary skills to the table. Far too often, these disciplines work in isolation rather than collaboration.

Take police reform, for example. Members of the public, advocates, and public officials are all demanding more data transparency — and rightly so. But, having worked with police agencies of all sizes over the years, I can assure you that lack of compliance with reporting isn’t usually some nefarious conspiracy to hide the ball. Typically, it’s a lack of capacity owing to being short staffed and not having enough resources. And let’s face it, the people we task with extracting and sharing these data are overworked, underappreciated, and often undercompensated. They are tasked with so much and need to be better supported.

We need to provide them with training and technical assistance and help them invest in systems that enable compliance. In the spirit of promoting interdisciplinary teams, perhaps that could include teaming with engineers who are embedded in law enforcement agencies for a period of time to help revamp data systems, as was done in the D.C. police department several years ago.

It is also useful to have practitioners who embrace researchers, as well as civilians who want to engage in the type of useful and applied research that practitioners need. One way to bridge the divide between researchers and police practitioners is to treat them as one and the same. In 2014, NIJ partnered with the International Association of Chiefs of Police (IACP) to establish the Law Enforcement Advancing Data and Science (LEADS) Scholars Program. LEADS was designed to support and empower the integration of evidence and data into law enforcement policy and practice. That integration may come in the form of officers partnering with researchers, independently conducting their own research, or infusing research into law enforcement policies and practices.

LEADS scholars have conducted research on reducing gun violence and traffic fatalities, identifying optimal investigator caseloads, developing predictive policing algorithms, and many other impactful projects. Some LEADS alumni have gone on to obtain Ph.D.s, while others have had an accelerated path to promotion owing to their contributions to their agencies through their research-informed practices.

By continuing to support the LEADS program, NIJ is not only promoting interdisciplinary collaboration, but it is also helping to ensure that the people who generate the data have a vested interest in its accessibility and quality.

My fifth and final priority is to ensure that research evidence results in actionable information to promote change in the field.

In order for research to promote change, it has to be credible. And data integrity is essential to that credibility. It is difficult — if not impossible — to achieve improvements and reforms without confidence in the underlying data. And you all know this better than anyone.

Yet despite best efforts, criminal justice records in many jurisdictions remain incomplete, lacking disposition data. These incomplete and inaccurate data are nonetheless used for employment and other background check purposes, and that unfairly affects people’s lives. We need to keep working to collect complete and up-to-date data, both for public safety purposes and to clear people who have never been convicted and give those with antiquated records a clean slate.

Similarly, in the context of program evaluation, our measures of recidivism are usually severely flawed. Researchers often use arrests as an easily accessed metric, but arrests don’t necessarily measure what we think they do. As I mentioned earlier, people have increased odds of arrest based on where they live and their race and ethnicity. Return to prison is not a sufficient recidivism metric either, because a substantial share of those returns are for violations of supervision conditions. Given that the average number of conditions of supervision per person ranges from 10 to 20, there are far too many ways to violate conditions of supervision without posing a threat to public safety. Studies also need to look not just at whether people reoffended but at what type of reoffending occurred. Timing matters too: if we can delay the onset of recidivism, we are serving the interests of public safety. All of these outcomes warrant better data and further inquiry.

In closing, it occurs to me that, despite justice policy becoming increasingly evidence-based over the years, there remains a disconnect in terms of how that evidence is being generated and with what data. This makes it all the more important that this group — SEARCH members, supporters, and affiliates — promote data access and data integrity. I’ve been so pleased to observe that the discussions today affirm your commitment to this work. Make no mistake: your efforts are undoubtedly serving the interests of safety, equity, and justice in jurisdictions throughout the country.

Thank you.