School Safety Implementation Challenges - Roundtable Discussion, NIJ Virtual Conference on School Safety
On February 16-18, 2021, the National Institute of Justice hosted the Virtual Conference on School Safety: Bridging Research to Practice to Safeguard Our Schools. This video presents a roundtable discussion from the conference.
>> Alright, looks like our numbers are starting to level out a little bit now.
So again, welcome everyone. My name is Dr. Caleb Hudgins, and I'm a Research Fellow at the National Institute of Justice through the American Association for the Advancement of Science, or AAAS, in their Science and Technology Policy Fellowship.
So my particular background is in behavior analysis and behavioral neuroscience, but I've also worked in the private sector translating research into practice through building partnerships and data-sharing networks to inform decision making.
And I've been really excited to work with the team on this project, because of the critically important role that education plays not just in public safety, but in public health, and in building a vibrant, more equitable society.
So to that end, I'm happy to foster a conversation around some of the challenges of translating research into practice, particularly thinking about implementing programs which impact school safety. I know we already had a session on this topic this morning.
And so I want to make sure that we still have plenty of time for discussion, so that other people can get involved in this conversation.
I do want to spend some time talking about some different implementation approaches, some of which you may already be familiar with or may already be using in your school.
So we'll briefly discuss some of the specific challenges to implementing school safety. As a potential approach to addressing some of those challenges, I'll introduce implementation science, talk about what it is, show a couple of its frameworks, and also talk about the concept of fidelity.
I also want to spend some time talking about the PBIS framework, and how overlapping components of school safety may be leveraged with other kinds of school supports used in that framework.
And then I'd like to open up the conversation to specific implementation challenges or opportunities in school safety, and how others have approached implementing school safety programs.
And so while I'm talking here, feel free to raise your hand or chime in if you have any comments at all.
This is much less formal than the...and breakout sessions, so you can unmute yourself, your video will come on, and you can go ahead and ask questions or just contribute to the conversation.
So if you have any comments along the way, feel free to raise your hand, and I can unmute you and get your video turned on.
So first, research from the Comprehensive School Safety Initiative identifies three interconnected components that help promote school safety: physical safety, student behavior, and school climate.
Interventions in one area are known to have an impact on others.
So this interconnectivity can make it difficult for school officials to develop an informed school safety implementation strategy.
This means that the solution is not just a certain program or a certain kind of support or policy; you really need a system in place to monitor the quality of interventions, as well as the intended and unintended outcomes.
...in the last session spoke to this, mentioning that we only know the intended...of a program if we have a process in place to measure that beforehand.
So does school safety face unique implementation challenges and opportunities? One potential challenge that we might see in implementing these kinds of programs is that the data system required for measuring climate is going to be quite complex.
This has been discussed in a number of different sessions, because climate is this kind of moving measurement that we're still adapting.
You need to have an adaptive data system in order to monitor climate, and to monitor how climate changes as we implement other kinds of programs.
And I think we've continuously seen this issue of balancing the needs of schools: identifying those needs, but then balancing that with the resources available, and matching that with fit -- is a certain program a good fit for the school and the individuals in need? -- and then capacity.
What capacity does this program have to actually reach all the different members of the school who are in need?
And then we've consistently been talking about the issue of buy-in being an important factor for implementing any kind of large-scale change.
We need to have these champions, these individuals who are excited about school safety and who can be advocates for the kinds of programs that we want to implement.
Sometimes, though, we are challenged to get those advocates and get that buy-in.
It might be that because we are talking about school safety here, this actually provides a different opportunity.
Perhaps it provides an opportunity to bring in different stakeholders that maybe wouldn't necessarily be involved in the conversations around how to make schools more effective places for our students.
So this might allow more individuals who maybe would not be in the school...network to get interested in trying to contribute in this area.
Another interesting potential opportunity is that we've identified a lot of programs that are needed in order to promote school safety, including supporting students' behavior, their emotional behavior, and mental health supports, as well as dealing with issues of trauma.
And those same kinds of programs overlap with some of our education needs as well.
We know that if we address these same kinds of mental health and trauma issues, we can actually promote academic achievement as well.
So as a first opportunity to kind of open up the floor, I would like to see -- is anyone else thinking about any unique challenges or opportunities that implementing school safety programs in...might produce?
Well, again, I think this idea of overlapping programs might allow us to link some of the safety interventions that are being suggested with some of the other important interventions that schools might be investing in as well, including academic supports and other kinds of support programs for their students.
So another common theme that I've heard about is how schools are going to really prioritize and implement a given evidence-based practice.
And so one way of trying to address that issue is to think about implementation science.
I think implementation science really was developed around trying to think about these very same issues.
So what is implementation science? Implementation science is really an approach to selecting, implementing, and sustaining an evidence-based program that is built on evidence-based practices.
These terms are sometimes used interchangeably, but other times people do draw a distinction, where the evidence-based practices are really the core intervention components, the core kernels of the intervention that have been well researched in their own...,
whereas the evidence-based program is a more structured, multifaceted, coordinated intervention program that is designed to meet a specific kind of goal.
And so you could have an evidence-based program, something like Response to Intervention, or RTI, and that program has multiple evidence-based practices embedded within it; an example would be direct instruction as part of RTI.
Implementation science is really about connecting research to innovation, innovation to application, and then to fidelity.
So research might suggest a given set of practices, and those practices are translated into an applied tool or program or policy, which also needs to be researched in its own setting.
So now let's say that we are part of some kind of implementation planning team at a school or school district, and we want to implement this or some other kind of evidence-based program.
Well, how do we select the right program? How do we ensure fidelity of the practices within those programs? First, there is really no one right answer, because the answer really comes from the implementation science approach to making these decisions.
Ultimately, implementation science works by helping organizations develop their own evidence-based decision-making framework, through which they can identify, implement, and sustain evidence-based programs which address their needs and goals.
So just to give a little bit of background on some of this work -- there are a couple papers I'd like to talk about real briefly.
Duncan Meyers and colleagues, in their Quality Implementation Framework paper, reviewed 25 different implementation frameworks to try to come up with a concise model, and they have some really good suggestions in that paper.
Dr. Rachel Tabak and colleagues did a similar review of 61 different implementation models, and looked at a number of common features among them to try to identify a kind of best-practice model moving forward.
So these are good resources if you're looking for a place to start understanding how some of these implementation frameworks might best serve your organization.
There are also some implementation science resources that are really active right now.
One is NIRN, the National Implementation Research Network out of the University of North Carolina at Chapel Hill, and another is AIRN, the Active Implementation Research Network.
And so I do want to share a couple of these models with you, and so maybe we can talk about them a little bit, and hopefully that will help spur some discussion and conversation around implementation.
And so, this model is from NIRN, and I really like it, first because it shows a number of integrated systems that all need to work together in order to support an implementation.
And particularly I really like these data support systems.
Because, as I think we heard in the last talk, we really are trying to develop a data-based way of assessing whether or not our programs are having the impact that they need to have.
And so we need some kind of data decision framework that's going to be at the heart of any implementation system that we put in place.
I really like this model as well because it addresses coaching, and I think coaching is another really important issue which has been brought up in a number of other implementation models: you really need to make sure that you have a good coaching team to work with teachers and staff on what the expectations are for implementing...intervention to its full fidelity.
And so again, this concept of fidelity is also important, it's something that was brought up in the previous talk.
How can we make sure that when we do select a given evidence-based practice, it actually is being implemented with fidelity?
And so fidelity is interesting, because fidelity isn't the outcome data.
Fidelity isn't, you know, did this intervention have an effect, and what was the effect? It's not about the effect.
It's not a performance review either, so it's not an assessment of how well a teacher did or whether they were actually implementing the program the way they were supposed to. And I think this is important, because if someone isn't implementing the program the way that they should, that shouldn't reflect on their performance in the school. This really has to do with understanding the evidence-based practice itself, and that's what fidelity is about: an analysis of the evidence-based practice.
And it lets us know, first, is it actually being used? Are your staff actually using the evidence-based practice? And then we can also see what kinds of supports might be needed for individuals to be able to implement the practice with the right kind of fidelity.
And then ultimately -- is it actually working? And it really can be used to guide all kinds of decisions about the impact of your intervention, and resource allocation.
Just to give some small examples of that, let's say that we have an evidence-based bullying intervention program.
This program might have multiple components, multiple evidence-based practices.
And so after training, teachers are randomly assessed to see if they are implementing all these different components.
I imagine that the most ideal outcome is high fidelity while also reaching your target outcomes.
This suggests that your evidence-based practice is having the expected impact.
However, fidelity and outcomes can change, so we do need to continuously monitor these kinds of measurements as regularly as we can...
So we could also see high fidelity, but poor outcomes.
This evidence would suggest that maybe there is a mismatch between the selected program and the desired outcome, and that there are potentially other drivers for our...of concern that we are missing.
However, this is a good indication that your training and coaching processes are potentially effective.
Even if you're not reaching your outcome, you're still showing high fidelity.
You can also have poor fidelity but still show evidence that you are reaching your target outcomes.
This again suggests a potential mismatch between the selected program and the outcome of interest.
But it also gives us a different way to look at it, because if we are still reaching the outcomes, it means something is going right. We should probably look to our teachers a little more specifically and see whether they adopted a different practice that seems to be producing that target outcome.
Or maybe they are still adopting part of the evidence-based program, but not all of the practices within it, and so we see a low fidelity score even though it's still effective.
And so that suggests that maybe your evidence-based program could be reevaluated in terms of which practices within it are most critical to reaching the outcomes that are needed.
And then of course we can also have poor fidelity and poor outcomes. This would maybe suggest that we need to improve fidelity, so we need to improve training. But if we see consistent fidelity issues across multiple staff members, that might suggest that it is not necessarily a training issue; it really might be a time or resource issue.
The intervention isn't able to be implemented, not due to any fault of the teachers, but because they literally don't have the extra bandwidth to get it done.
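Since those four fidelity-by-outcome scenarios amount to a simple decision table, here is a minimal, purely illustrative sketch in Python of that logic. The 0-to-1 scores, thresholds, and suggested next steps are hypothetical assumptions for discussion, not part of any published fidelity instrument.

```python
# Hypothetical sketch of the fidelity-by-outcome decision logic discussed above.
# Scores, thresholds, and messages are illustrative assumptions only.

def interpret(fidelity: float, outcome: float,
              fidelity_cut: float = 0.8, outcome_cut: float = 0.8) -> str:
    """Map a fidelity score and an outcome score (each 0-1) to a suggested next step."""
    high_fidelity = fidelity >= fidelity_cut
    good_outcome = outcome >= outcome_cut
    if high_fidelity and good_outcome:
        return "Practice is having the expected impact; keep monitoring regularly."
    if high_fidelity:
        return ("Possible mismatch between program and desired outcome; "
                "training and coaching appear effective.")
    if good_outcome:
        return ("Outcomes reached despite low fidelity; look at which practices "
                "staff actually adopted and which components are critical.")
    return ("Improve training first; if low fidelity persists across many staff, "
            "suspect a time or resource constraint rather than a training issue.")

# Example: teachers implement most bullying-program components, but reports haven't dropped.
print(interpret(fidelity=0.9, outcome=0.4))
```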
So here is another implementation model, another complex one. Within this one I really like the external cyclical components -- exploration, preparation, implementation, and sustainment -- because it highlights the cyclical, ongoing process of implementation.
And so, for the sake of simplicity and discussion, I made...model and merged the two together, so we can talk about some of these issues and how to think about implementing at these various stages.
And so we have an exploration phase.
During our exploration phase, we might be thinking about what programs or changes are even available, and then how we prioritize the value of those different programs based on the needs of our current community.
And then we are also trying to understand the capacity of that intervention: are we actually going to be able to provide it to all of the individuals who need it? This speaks to the need for a lot of upfront work in just this initial exploration.
There's been a lot of discussion on readiness assessments: we need to understand what resources are available, what specific needs the school has, whether a given solution fits the population we are trying to address, and whether we have the capacity to expand an intervention to all the individuals who could benefit from it. Then in preparation we have other issues, essentially establishing our data infrastructure, because we need that to inform our program's success.
We need to know what needs to be in place for this program to be successful.
What level of supports is necessary? Thinking about the competency drivers here, what kind of training or coaching is going to be required in order to maintain our intervention? And then creating implementation teams is another hugely important concept; some models say you should probably do this during the exploration phase.
During implementation itself, we can also think about how we respond to fidelity issues.
How are we going to approach coaching and supervision during implementation in order to provide effective feedback? How are we evaluating the process we have implemented, and what feedback mechanisms have we put together? And then in sustainment, we really need to be thinking about our long-term plan.
How do we deal with changes in fidelity whenever we see some kind of slippage or drift?
And this might accompany turnover.
How do we address turnover? We know that the student body is going to transition, that teachers are potentially going to transition, and that administrators will transition.
How do we build those concerns into a sustainable model? And then, specific to issues of school safety: how does implementing certain programs potentially impact others?
We've talked about this notion of wanting to address and think about school climate, because we might make some sort of intervention to address physical safety, but that might end up having a negative impact on some of the students' perceptions of their school.
Alright, so it looks like I see a couple of questions here coming in from the chat, so let me see if I can address some of these.
So here is one from a school safety specialist in a large district in Atlanta, also a retired firefighter and fire inspector:
"Is there anyone out there getting pushback on implementation and best practices? It feels like selling aspirin to someone that does not have any headache."
...that's a very interesting point.
So can anyone speak to that? Is anyone interested in answering? Again, you can unmute yourself and just go ahead and start talking if you have an opinion on this.
You don't necessarily have to wait for me to call on you.
>> Hi there, my name is Sarah.
I'll kind of chime in on that.
I really like that analogy of selling aspirin to someone that is not having a headache.
In some of the work that I've done in schools, I think it's important that they do recognize the potential presence of that headache, so to speak.
So that might be uncovering particular data sources to demonstrate a need, or in other ways, trying to talk to school leaders about ways that they could improve the operation within the school.
Because, frankly, if the school doesn't see that there's a need, I would question the degree to which we would be able to move forward with some of those larger-scale implementation efforts.
And I think that gets to some of the other issues that folks have brought up throughout the conference relating to buy-in, and the relationship between buy-in and the degree to which things are implemented with fidelity.
So I don't want it to sound like we are trying to be salespeople and convince them they need something when they don't; in my experience, it is more about illuminating a need that they do have but maybe just haven't prioritized.
>> Thank you, Sarah.
Those are very insightful comments.
Does anyone else have any comments on how to deal with pushback on best practices? Well, again, I -- so -- >> Oh.
>> I'm sorry, go ahead.
>> Caleb, this is Katie from...
and I was going to say it's all about data, and working with administrators and districts.
They're data-driven: data-driven in programs, with reading, with math, and with the other curricula that they choose to put in place.
So what data do you have to support what's needed within that district? And, to Sarah's point, what is it that you're trying to illuminate? I think that's the best word.
As Sarah was saying, what are you trying to illuminate for them? What need? And then what data do you have to show that that need is there? Because once you start to talk to the admin, and the superintendent, and everybody about the need, and you're able to show that need through data and how it affects the kids, then they will be 100% on board with what you are trying to give them.
And then it won't feel like a sales pitch.
It just feels natural: "This is what our students need." And if they see that you are putting the students first, then they'll be on board.
>> No, I really like that.
And I think that speaks again to the idea that those needs also need to come from the community; it's going to be very difficult to go in and try to tell someone what their need is if that need is not shared.
But we can bring data to the table to promote a shared understanding of a common need, and once we have that, it hopefully gives us some of that buy-in to be able to move forward.
I like that.
Those are great comments.
So I have another question here.
Any insight from the group about how to get a system, such as a school system, to step back and take a fresh look at what frameworks and programs they are using?
People can get...to particular things which may or may not be the best fit for addressing current needs.
I think that's a huge issue, because it's one thing to build a system where people are open to ideas and you're able to introduce them, but if they are already embedded within a given set of systems and programs, how do you get them to shift to a different way of doing things?
>> Again, I hate to be a broken record, but how are you evaluating the programs that you have now? How are you looking at those programs and asking, "Are they effective?" Has the implementation been effective? Are we actually helping our students? How are we helping our students? Are we lacking in areas where some other programs could do better? And is it worth looking at alternative programs that are being more effective in other districts with similar demographics, looking at that data, and asking how we can implement that here? You have to think of it from the school's perspective: when they adopt a new curriculum, they go out and research a ton of curricula to say, "Okay, this is the best history curriculum, and the reason it's the best is because it's backed by 'A', 'B', and 'C'; these other states have used it; it comes with the supplemental materials; it's shown to raise kids' test scores."
So again, you have to approach it with the mindset of an educator.
If you're looking at a safety program: where is it working? How does it benefit the students? And then looking at where our gaps and needs are, and how that new program fills the gaps and needs that we have.
And you have to be able to actually evaluate it, because if you're not evaluating your programs' effectiveness, then you're never going to be able to push new programs that could potentially be more effective.
>> No, I think that's a great point, and it makes me think about something that...said in the previous session, which is almost that the process is the product.
We're really trying to promote a process of...incorporating data into all of the decision making that schools do.
And that includes not only new decisions they are making, but also why they made the decisions they've made in the past, and whether there is a way in which data should be informing those kinds of decisions.
And so again, it almost speaks to this idea of, in some cases, beginning to promote a culture change, if that isn't the way people typically think about their programs and try to understand their impact.
>> Yeah, just to kind of piggy-back off what Katie was saying, and you too, Caleb, this is Sarah Pinkelman again.
A lot of the work that I do is implementing PBIS in schools, just to give a little context, but one thing that I've found helpful when working with administrators at either the school or district level is to talk about not only outcome data, but cost data.
Meaning, how much are you investing in program "X" and program "Y", and what are you getting out of it? What you are getting out of it comes from the outcome data, and you look at those measures that are meaningful to administrators.
But the cost can be really significant with minimal gain, and cost relates not only to the financial cost if the program requires funds, but also the cost of training, and the time it takes people to implement the program throughout their day, when they are very busy people working in schools.
So have a conversation about a cost-benefit analysis -- there are different methodologies people have proposed for doing that -- but even just talk to them about what they are getting out of how hard they're working at this program right now. Maybe this program isn't the best fit, because you're working your little tail off but you're really not getting the outcomes that you're hoping for.
>> I think that's a great point, Sarah, and it again points to the importance of having a general data decision system to really be able to look at those things -- again, not just looking at what new programs I'm going to try to implement, but whether we can evaluate what we already have going on.
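To give a concrete, hedged example of the cost side of that conversation, here is a minimal Python sketch of the kind of cost tally Sarah describes. The hourly rate, parameter names, and figures are all made-up assumptions for illustration, not a formal cost-benefit methodology.

```python
# Hypothetical cost tally for a school program; every figure here is an
# illustrative assumption, not real district data.

HOURLY_STAFF_COST = 35.0  # assumed loaded hourly rate for school staff

def total_program_cost(license_fees: float, training_hours_per_person: float,
                       weekly_hours_per_person: float, staff_count: int,
                       weeks: int) -> float:
    """Direct fees plus the staff time the program consumes, in dollars."""
    training = training_hours_per_person * staff_count * HOURLY_STAFF_COST
    delivery = weekly_hours_per_person * staff_count * weeks * HOURLY_STAFF_COST
    return license_fees + training + delivery

# Example: $5,000 in fees, 8 hours of training for 20 staff, and 2 staff-hours
# per person per week over a 36-week school year.
cost = total_program_cost(5000, 8, 2, 20, 36)
print(f"Estimated annual cost: ${cost:,.0f}")  # pair with outcome data to judge value
```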
Alright, so some great conversation there.
So there seems to be a lot of discussion around how you select the right kind of program and how you evaluate the evidence behind a program, which can be kind of tricky. The folks at NIRN have actually developed their own tool for this. Alright, here's my next slide. This is their Hexagon Tool, which I have found really conceptually interesting, because it gets at a lot of these different issues that people have been bringing up about the value of a given evidence-based program.
So first, you need to be looking at the evidence: How strong is it? How many studies have actually been done? And are the studies on the practices that make up a program, or on the actual program itself? Then that evidence needs to be thought about in terms of fit: would that evidence also be applicable to the population you're interested in implementing it in? Is it a good fit as an intervention in a different space? Then usability: how easily can we train people to use this program, or is it really complicated and convoluted? That's going to be related to capacity: what level of individuals need to be engaged with this program and trained on it for it to be effective, and do we have...capacity? And all of these are interrelated with supports.
Do you have the expertise to implement the program, and can you get assistance and training for implementing it? And then all of this needs to be conceptualized within "What is the need?" Because, speaking to some of the comments that were made, we can't come in and tell people what their need is.
You almost need that to come from the community, from the people wanting something and needing something, and then that makes it a little bit easier to come in and establish a framework to move forward.
So, I don't know, has anyone used the Hexagon Tool at all in evaluating different evidence-based programs? I know a lot of times in an academic research space you usually have a set of programs that you already have in mind to implement, but I feel like the hexagon model is for when you maybe don't have that preset idea about which programs you want to implement.
You...need to evaluate a kind of laundry list of them.
So it provides an interesting tool for doing that.
They have a number of resources on their website for how to actively use this tool in an evidence-based way.
And so it's very interesting.
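As a rough illustration of how a team might turn Hexagon Tool discussions into comparable numbers, here is a hypothetical Python sketch that averages 1-to-5 team ratings across the six indicators named above (evidence, fit, usability, capacity, supports, need). NIRN's actual tool is a structured discussion guide; the rating scheme, program names, and numbers here are invented for the example.

```python
# Hypothetical scoring sketch for the six Hexagon Tool indicators; the rating
# scale, program names, and numbers below are invented for illustration.

HEXAGON_INDICATORS = ("evidence", "fit", "usability", "capacity", "supports", "need")

def hexagon_score(ratings: dict) -> float:
    """Average a team's 1-5 ratings across the six indicators."""
    missing = [k for k in HEXAGON_INDICATORS if k not in ratings]
    if missing:
        raise ValueError(f"Missing ratings for: {missing}")
    return sum(ratings[k] for k in HEXAGON_INDICATORS) / len(HEXAGON_INDICATORS)

candidates = {
    "Bullying Program A": {"evidence": 4, "fit": 3, "usability": 4,
                           "capacity": 2, "supports": 3, "need": 5},
    "Bullying Program B": {"evidence": 3, "fit": 4, "usability": 3,
                           "capacity": 4, "supports": 4, "need": 5},
}
# Rank candidate programs for the implementation team's discussion.
for name, ratings in sorted(candidates.items(),
                            key=lambda kv: hexagon_score(kv[1]), reverse=True):
    print(f"{name}: {hexagon_score(ratings):.2f}")
```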
So I did want to lastly talk about the PBIS framework.
Because we've been talking about implementation frameworks, and this idea that implementation has multiple components and drivers to consider that play out over time: you need to explore what resources and programs are available, prepare, implement, and then sustain them.
So we have all of these different levels of concerns, and then I think when we think about programs in schools, and particularly about the PBIS model, you have to recognize that some of these programs and evidence-based practices aren't a fit for all of the students we're looking to reach with an intervention.
And so I think this is where we can kind of look to the PBIS framework potentially as a framework to think about implementing the variety of behavioral support programs that seem to promote school safety.
So does PBIS provide an implementation framework which could guide practitioners and researchers...safety interventions?
I think this is potentially valuable to think about, because PBIS is already present in over 20,000 -- getting close to 30,000 -- different schools that have adopted some kind of PBIS program.
And so, since those schools already have an implementation framework there, instead of saying, "Oh, here's another set of things you need to do for school safety, and here's a whole other framework to figure out how to implement the school safety interventions," is there a way to adapt the existing framework and integrate these other kinds of interventions in order to promote school safety?
And so I do know that there's a number of talks just at the conference today on people using PBIS for a number of the behavioral management aspects of school safety.
And then also using PBIS as a data collection framework.
We talked about needing to measure school climate in order to get a good idea about how students are getting impacted by other kinds of interventions that are going on in the schools to address school safety.
Well, the whole kind of notion of PBIS is that it is a data informed framework where students are potentially moving between these different tiers based on what their needs are.
Through that, we can incorporate those same kinds of support programs which address issues of trauma, which address mental health issues, and which help teachers with classroom behavioral management.
We can adopt those same kinds of programs into a PBIS model, and then hopefully even extract some of that climate data -- trying to understand what the new climate is -- from that same framework.
And so it's just something that I have been thinking about, and I didn't know if anyone else had been thinking about the framework within which you would like to implement school safety interventions, and whether PBIS has been a helpful framework for anybody else? >> Yeah, I think -- this is Sarah again, I'll just chime in really quickly...
>> Thank you, Sarah.
>> Yeah.
So one of the things that I think is super important that you mentioned, Caleb, is that PBIS is not, in its essence, a curriculum or program.
It is a framework for implementation and data-based decision making, and I don't feel like a lot of schools that are implementing school-wide PBIS necessarily see it that way.
Some might see it as more of a prescribed program where tier one behavior support looks exactly like this, and tier two looks like this, and tier three looks like this at all schools, which is not at all what PBIS aims to do.
It aims to be flexible and contextually relevant depending on the particular schools, students being served, and the community in which the schools are nested.
So it's a framework for being able to embed any number of different initiatives, like trauma-informed care and others that are really gaining a lot of momentum within the PBIS sphere.
And it's a way to incorporate other programs and curricula, but to do so with an implementation lens, so that you are collecting the data, doing sufficient team-based coaching, and producing the outcomes that you are meaning to produce.
I think it is definitely something to consider.
>> Good, thank you, Sarah.
And yeah, again, ...it seems to me that a PBIS tier system is basically already doing, as best as it can, that kind of continuous measurement of student performance.
And having students potentially move between different tiers of support can be a reflection of the school climate, so can we incorporate those measurements into the same PBIS framework? Maybe at certain times safety interventions won't fall into that same tiered approach in the same way,
especially if they are physical changes to the schools or emergency management programs; we wouldn't necessarily think about them in the same way. But when they are implemented, you can still monitor the data coming out of a PBIS-like framework to see whether we are actually having some kind of impact on school climate that we are able to measure through those same PBIS methods.
Well, that's why I think the implementation science frameworks, as well as other frameworks like PBIS, really can be adapted to help school safety advocates and stakeholders develop a research-informed action plan which considers the interconnectivity of school climate, student behavior, and physical safety interventions and programs.
So we just have a few minutes left here.
And so, I know we've been talking about implementation a lot during this conference, but the important part is not just to do the research and understand some of these best practices, but to really identify how to integrate these best practices into schools in an active way.
Well, if no one else has any other questions or comments, then I can go ahead and start to close out the session.
Please come back at two o'clock, when we'll have our breakout sessions; you'll be able to select from three of those, and I hope that you do join us for that.
And so that's in 15 minutes, and so you have time to get a quick snack and have a bathroom break.
So thank you all so much, and I look forward to seeing you throughout the rest of the session.
Disclaimer:
Opinions or points of view expressed in these recordings represent those of the speakers and do not necessarily represent the official position or policies of the U.S. Department of Justice. Any commercial products and manufacturers discussed in these recordings are presented for informational purposes only and do not constitute product approval or endorsement by the U.S. Department of Justice.