Developing and Implementing Evidence-Based Practices in School Safety - Plenary Discussion, NIJ Virtual Conference on School Safety
On February 16-18, 2021, the National Institute of Justice hosted the Virtual Conference on School Safety: Bridging Research to Practice to Safeguard Our Schools. This video shows the plenary session "Developing and Implementing Evidence-Based Practices in School Safety."
Though we have learned a great deal about what can improve school safety, putting that knowledge into practice can be a demanding task. This discussion will address issues related to the development and implementation of evidence-based approaches to improve school safety. Though the programs/strategies discussed in this session were not always successful in achieving desired outcomes, what panelists learned will help others seeking to improve the approaches they take to keep schools and students safe. Each panelist has a unique story to tell about their work to use research and evidence to improve school safety.
>> Good morning and welcome to Day 3 of the NIJ Virtual Conference on School Safety.
Again, my name is Mary Carlton.
I'm a social science analyst with NIJ, and I'm happy that you're joining us for the third day.
I hope that you've been finding that the sessions and the discussions, so far, have been both useful and inspiring for your work.
I expect that that's going to continue today, the last day of our conference, and we'll have a full day.
Before we move to this morning's discussion, I'd like to share some preliminary thoughts I have about the conference and review a few housekeeping items, and I'll get started with a couple of those housekeeping items.
So, on the conference website, two things I want to bring your attention to.
First is that you'll find a certificate of attendance that you're welcome to complete and print for your records.
Also, note that, starting at noon today, you'll have an opportunity to complete a conference feedback form.
Please let us know what you think about NIJ's first large-scale virtual conference.
I'd like to reflect with you for just a moment on some of the themes and issues that I've learned about over the past two days.
I've heard a lot about successes and challenges that folks have faced trying to develop and assess the impacts of school safety programs, including stories about well-intentioned programs that were ineffective.
I've heard about approaches to identifying how to best implement programs and strategies to ensure that schools are able to identify and respond to safety threats, as well as both simple and complex approaches that are being used and tested to improve school safety.
I could go on, and I will at the end of the day, at our closing session, and during that time, I will share additional thoughts with you about what I've learned, and we'll invite you to share your thoughts, including where you think we need to go from here to advance school safety.
I hope that you'll join me.
At the outset of the conference, I mentioned how this morning's plenary session, developing and implementing evidence-based practices in school safety, goes directly to our conference theme, bridging research to practice to safeguard our schools.
This morning's discussion, moderated by NIJ's Phelan Wyrick, includes panelists with diverse backgrounds whose work to advance school safety has been influenced in different ways by research evidence.
Phelan.
>> Thank you, Mary.
It's such a pleasure to be here today, and I want to also welcome all of you who have been participating in this virtual conference on school safety that we're doing here.
I am honored to be able to facilitate this discussion session that we're going forward with today, and today we're going to talk about developing and implementing evidence-based practices in school safety.
Really, evidence-based practices and bringing research and evidence to improve school safety practices is what this whole initiative is about, and, really, what this whole conference is about, and it's what all of us are focused on doing.
It's why you're here.
But it's tricky and challenging work, and we've brought four people together today who bring a range of experiences and expertise to the table to discuss their perspectives and experiences trying to do this work, helping us to identify some of the challenges and strategies for overcoming those challenges as well, based on their experiences.
So, today, we have Deborah Temkin from Child Trends.
We have Daniel Mears from Florida State University.
We have Paula Fynboh from Sandy Hook Promise, and Cheryl May from the Arkansas Center for School Safety.
Now I will mention a couple of things before we get going in our discussion, and that is that many of you are aware and directly impacted by some pretty severe weather events that are going on across the country.
Currently, as I look out my window, I'm looking out on the white ice and hail that has just fallen for about the last couple hours.
It's just been dropping ice out of the sky, so I'm very happy that we're doing a virtual conference, because this would be a travel nightmare if we were doing an in-person conference here in Washington, D.C., right now.
But this created challenges for some people in terms of connectivity, and certainly our hearts go out to those folks who are in Texas and other parts of the country that may be experiencing outages, and I know we've had some difficulty with those issues.
One of our panelists today, Cheryl May, is in Arkansas, where they are getting some extreme weather, so her connectivity has been a little bit compromised, and I think that she'll be participating primarily through voice today.
So, bear with us as we work with all those issues.
I'll also mention that we do have a question-and-answer function for this session, so you will see that you have the ability to put questions in.
This will be a discussion format, so I'll moderate the discussion and ask questions of our presenters for a period of time, and then we'll turn to questions from the audience.
When you submit your questions, they will come to me.
I will share those questions with the presenters towards the end of the discussion session.
So, with that, I think we'll go ahead and get started.
Again, welcome to the panelists, and thank you so much for being a part of this.
And let's just open up by -- I just want to bring a question to each of you to help all the people in the audience get to know you a little better and sort of where your experience is.
So, really, I'm just going to ask each of you to answer the question of, could you take a minute or two to tell us about your experience in terms of developing and implementing evidence-based practices in school safety or school safety practices more generally.
Can we begin with you, Deborah? >> Yeah, sure.
Thank you again for having me.
I'm really appreciative to NIJ, both for having this conference and also funding some of my work, as well as all of the work that I've heard about over the conference.
I'm Deb Temkin.
I am the vice president for Youth Development and Education Research at Child Trends.
And if you're not familiar with Child Trends, we are a nonprofit research organization based out of Bethesda, Maryland, focused on improving outcomes for children, youths, and their families.
I have been in that role for about the last seven years.
But before that, I was at the U.S. Department of Education, where I was leading the federal bullying prevention initiative, so I got to work with a lot of the researchers that I know are attending the conference, from the policy role as well as from a researcher role.
I transitioned back into research after I left the department and have since done a number of different studies, including one funded by the CSSI program, working in Washington, D.C., as well as doing a lot of work on policy.
So just to give a preview, we've been working with the Robert Wood Johnson Foundation and the National Association of State Boards of Education to code policies at the state level across 200 different topics related to health, which includes school safety, and that database will be launching early next week.
So, if you're interested in seeing what your states are doing around some of these really important issues, everything from threat assessment to social/emotional learning, you can find that information on that website.
>> Thank you, Deborah.
Daniel, same question.
>> Sure thing.
Thank you again for having me here.
My experience comes from an NIJ evaluation that my colleagues and I undertook of a delinquency prevention program in Palm Beach County, which is home to one of the largest school districts in the country.
The project was unique in pursuing a school-based approach to working with youth who are engaged in delinquency or who are known to be at risk of doing so.
It focused on matching sanctions and treatments to each youth's unique risks of offending, the causes of their offending, and tailoring wraparound services to them.
Interestingly, a main focus of the program was not new services; it was just more coordinated leveraging of existing resources in the school or in the community.
Excuse me.
It also focused on school-wide safety.
The idea was that the school district would try to send a message to all the students that we care, but we're also watching as well.
So, our focus was on looking at whether the intervention, in fact, reduced suspensions. Did it reduce arrests or reduce police contact? In the end, what we found were significant implementation challenges.
We found no consistent evidence of impacts at the individual level.
Qualitatively, we found some evidence that, at the system-wide level, there were improvements in how the school was approaching working with youth.
We also learned a great deal about, again, the implementation challenges, one of which is just the school district is extremely large.
I come from New Hampshire, and this school district serves a population larger than all of the school districts in the entire State of New Hampshire combined.
We learned that test mandates created pressures for a lot of the teachers and principals.
We learned that student poverty is a significant issue.
So, there are a range of issues that bubbled up that raise questions about what affected the effectiveness of the program's implementation, but also what would affect its potential effectiveness in other places.
>> Great.
Thank you, and I look forward to delving into a number of those issues as we go forward with our conversation.
So, Paula, I'd like to turn to you.
Could you tell us a little bit about your experience developing and implementing evidence-based practices.
>> Thank you, Phelan, and good morning everyone.
My name is Paula Fynboh, and I'm the vice president of Program Delivery and Sustainability with Sandy Hook Promise.
And at Sandy Hook Promise, we work to empower youth to know the signs of potential school violence, and, through that, unite all people to value the protection of children and to take meaningful actions in schools, homes, and communities to prevent gun violence and stop the tragic loss of life.
I joined Sandy Hook Promise in 2014, and at that time, facilitated conversations with educators, school administrators, school mental health workers, parents and students across the country to help inform the type of violence prevention programs schools wanted to see based on the needs of their young people and the day-to-day realities of a school environment.
We then worked with subject-matter experts to develop two evidence-informed youth violence prevention programs, our Start with Hello program and our Say Something and Say Something Anonymous Reporting System for students.
Since 2015, we've partnered with school districts across the country to implement these two programs and have collectively trained over 12 million students in our school safety programs, and we also partner with schools and school districts to help sustain these programs, both throughout the school year and year to year via our Students Against Violence Everywhere, or SAVE, Promise Clubs, and I'm delighted to be with you all today.
Thank you.
>> Thank you, Paula.
So, I'm going to turn to Cheryl now, and Cheryl is coming to us as a representative from a State School Safety Center.
Well, if we can get audio from Cheryl, I'll double back.
In the meantime, I think I'll move on to, really, talking a little bit about this really basic issue, this basic question about evidence-based practices, and that is, what do we mean by that? It's a term that a lot of people use, and I think it can be kind of tossed around a little bit, and we want to be careful not to use it improperly.
So, when we say evidence-based practice, what is it that we mean by that? I'll just sort of open it to the panel, anyone who wants to jump into to give your thoughts.
>> Yeah, I think, I mean it is a basic question, and it's important, and it is used inconsistently, I think.
Sometimes people think because one study was done, that the program's evidence-based, or two or three studies, so there's a lot of variation in views about what is evidence-based, and it's unfortunate, because sometimes, you know, one, two, or three studies, even if well done, don't necessarily tell us if the program would be effective in other places or other contexts.
So, one of the things I would argue is that evidence-based, when we're thinking about impact, should be focused on a large number of well-done studies, but there also should be attention to evidence about the need for the program, about whether it's actually addressing the causes of the problem in a particular area, evidence about implementation, and at least evidence about, well, is this the best option for us, the most cost-effective one relative to the other options that exist.
>> Yeah, I fully agree with Daniel.
And I like to ask three additional questions too: is it evidence-based for whom, for what, and in what context? And just to expand on that a little bit, when we're talking about evidence-based, typically we're talking about it in aggregate, as in an evaluation.
We saw a change in the outcome based on the logic model of whatever program, and that's usually based on a control sample versus an intervention sample.
Hopefully they're matched, whether it's experimental or non-experimental.
But we're not necessarily digging down into subgroup analyses and understanding exactly who is this working for, and if it's not working for certain groups.
I will say some studies do do this, but it often is the case that we don't have the data to even do it for some subgroups.
For example, very few studies are actually including measures of sexual orientation and gender identity, even though we know that LGBTQ youth have higher rates of many forms of violence.
The other thing I'll point out, with the "for what," is that, really, our evidence is limited to what researchers are including as their outcomes.
I'll give a very specific example.
There's a lot of debate right now about whether schools should have school resource officers.
This is an ongoing debate.
And there was just a study released by the Cybersecurity and Infrastructure Security Agency; I won't go into the full details of the study, because there are a lot of methodological issues with it.
But, basically, they did a virtual reality simulation to conclude that SROs are great and every school should have an SRO because they protect against school shootings.
The problem is, they didn't measure all the other impacts that SROs might have on a school climate, particularly for youth of color, knowing that SROs may actually increase fear among students; that SROs may actually increase a climate of hostility for youth.
So when we are thinking especially about equity issues and where current interventions might have negative impacts on students, we really have to be cognizant of those when we're thinking about what is or is not evidence-based.
>> Right.
So this idea of evidence-based isn't so clear-cut as maybe we'd like it to be; right? And one of the other things that I think we would also point out is that this idea of evidence really exists on this continuum; right? So, we don't start up with a program and just sort of hop right into, you know, being an evidence-based program, you know.
You can't just run one study and then suddenly you're in that sort of elite category or what have you.
It's really more of a process that begins with ideas that don't just come out of nowhere but that are based on research and using research to inform program development, and then using research and evaluation to sort of continue to refine, over time, our sense for how does this program work and to get to those kind of questions Deborah and Daniel were sort of talking about, you know, for what populations and so on and so forth.
So, with the way the English language works and the way we use our words, we talk about things being evidence-based as a binary, sort of is it or isn't it.
But it's really not quite so simple a phenomenon.
The dynamic is really something where we start with something that maybe is grounded in theory and it's grounded in research findings and data.
Maybe it's data driven.
It's research-informed, and we implement it and we put it in place, we test it, and we begin to develop evidence over a process that continues and can really last and carry forward for years as we better understand the effectiveness of the program.
And, Paula, I'd like to turn to you, because I really listened to what you were talking about in the way that Sandy Hook Promise developed programs, and I see that in the way that you've done your work.
>> Sure.
I think, for us at Sandy Hook Promise, one of the things that was critical in our work is understanding the day-to-day realities of a school and how they must be set up for success.
And often, it's better to do a limited number of programs well and be able to sustain them and build them into the fabric of the school district and the school culture, versus trying to take on too many programs.
And so this is where I think it's important in the design and the research to understand the competing priorities that a school may have.
What we have found to be a best practice is kind of identifying this trifecta, where it includes champions at, like, the district level, such as superintendents, people who really champion and shepherd the program throughout the year and year to year, and allow the space for it to be implemented and to take hold.
And then also working with those who are going to be facilitating and implementing the program in the day-to-day, such as educators, counselors, and mental health staff, to make sure that they have the support that they need to be able to implement it in the day-to-day.
And then the third critical point of the triangle is the students themselves, making sure that students are the ears and eyes of school districts and schools, and, when engaged, they will partner with adults to keep themselves, to keep their friends, to keep their school safe.
And then, also, within that understanding of the realities of a school district, you can have any one of those triangle points leave, with staff turnover or graduation year over year, so looking at the design of the program as something where there is opportunity to build champions, to build local capacity, and to be able to sustain the program so that schools aren't starting anew every year.
>> Yeah.
So, we're really talking about such a multi-faceted topic here, where we, you know, we start off with just a simple idea of, hey, we want to find programs and practices that work, and we want them to be effective to improve school safety.
It seems so straightforward.
But as you start to scratch and get below the surface and really think about what it takes to put these into place, you start to see that it's so much more complex as we think about not just sort of what research tells us are the types of outcomes we should go for and the way we should design our programs, but then it's how, as Paula was describing, how are we going to implement these and really put them into a real-world school setting in a way that is going to fit their way of doing business, and not just their way of doing business at the start but over time so that, as schools go through the changes that they go through, they're still able to address and respond and carry out this type of programming in a way that is still effective.
So, you know, those lessons that you're starting to learn about that triangle that you referred to of key players is, you know, the kind of issue that gets into what we talk about with regards to implementation and, you know, a lot of times, folks, when they come from sort of the research side and they're talking about evidence-based practices, they want to talk about implementation with fidelity, and they have this idea of a model and it's going to be implemented with fidelity.
It's like when you get a product at the store: are you following the instructions in terms of how you use it? And that's the concept behind implementation with fidelity.
But, again, our schools have so much variation.
There's so much variety, even just within one school district, but then, of course, you go across thousands and thousands of school districts across the country and the different dynamics they face.
So, we're always in that kind of tension between trying to say, hey, can we use something that's been tested and we know is effective, but, also, can we fit it to the dynamic? >> Yeah, so one thing to add to that, although this study is a little dated: back in 2011, the U.S. Department of Education did a study of all the different prevention programs that schools were using.
And they found that, on average, schools were using around nine different programs.
Now that doesn't mean that they were implementing nine programs with fidelity, but it means they were picking and choosing the types of lessons and things they were doing from each of those programs.
And this is really problematic.
When we test our programs, we try to control the environment as much as we possibly can, to the extent we can.
So, often, that means we're going into school districts that are a little bit more stable.
We're going into places where we're able to actually ensure that programs are being implemented as we designed them.
The problem is, when we shift from an efficacy model to an effectiveness model, things can get really complicated.
So just to give you an example, if you heard my colleague Dr. Ryberg's talk yesterday on our work in Washington, D.C., we were in a school district where I don't think we worked at a single school that didn't have a principal turnover at least once across the four years of our study, and when a principal turned over, we had to restart and make sure that we had their buy-in, and sometimes we lost their buy-in, because they wanted to go in a different direction.
So, there are a lot of challenges.
I think another challenge to this is that often when we're designing programs, we're designing the whole program, and we're not thinking about what's actually inside the black box, about which components are actually working or not, to make sure that we can make it as simple as possible for schools to adapt what they need to adapt, without changing the core ingredients that actually create the change.
>> I would echo that.
I mean, in the study that we did in Palm Beach County, you have similar kinds of issues, where it's extraordinary.
Schools are busy places.
They have test mandates.
They often don't have enough funding to meet the test mandates.
They've got large numbers of youth with significant challenges.
And I have yet to meet a teacher or principal who looked and said, gosh, I've got all this extra time, could you give me something to do.
They're already busy.
And a lot of funders -- NIJ, foundations, others -- have all these good ideas about, hey, you could do this, and we'll just plop it into your setting.
And for most people from the inside, a typical response would be, we'd love to, but we don't have the ability to do that.
A problem I think that comes from that tension is that it creates a focus on really narrowly focused programs, because that's the easier thing to implement in a study, something that's very small and narrow, touches a small number of kids, and you can do a more rigorous study, and that's great.
And it's more likely to be implemented well.
But you do have this tension where that means you're probably not dealing with the most at-risk youth or the large numbers of them.
So we certainly faced that in Palm Beach County, where there were some places where it was a little easier to implement the program in some schools.
They had a little bit more resources, a little more buy-in, some of the needs weren't quite as stark for some of the youth.
So, you could see, you know, if it was to continue over time, maybe they'd do it in that area because it's more viable.
But that's not necessarily where it's most needed either.
But, again, there's a real tension.
If you can't implement it well in a given place, you can't study it very well.
It's hard to develop that evidence.
I did want to highlight one thing you said, Phelan, that I think is important, which is a research-based approach to developing an intervention.
You know, Deborah talked about this, and Paula did, but, I mean, a lot of times an issue that schools face is they get funding and they have to hurry up and implement something, and that can leave by the wayside a focus on having research to inform every step of what's going on.
What do the teachers actually think about this, and the parents? That's all critical information for ensuring we develop something that actually has a chance, a fighting chance, of being implemented well, and, ultimately, being effective.
>> Yeah.
And it's a shame that, unfortunately, I think our connection broke with Cheryl May.
But I want to share a perspective that I think she would bring, based on my conversations with her in advance.
And, you know, she works in the State of Arkansas, at the state school safety center.
And what that means is that she ends up sort of in that nexus between state government, like the governor's office and the legislature and the school districts.
And so they're in this other kind of challenging space, where they're trying to think about legislation, state-level laws and resources, and they want to base it on research, but at the same time, you know, they're trying to pass policy that has applicability across their state.
And in thinking, of course, about that and how to do it best, they're capturing information from, you know, schools and teachers and administrators, and that's based on their perceptions and all these other things.
So they're trying to use data to try to drive towards effective strategies, but, you know, where does evidence-based fall into that, you know? And it's very challenging to think of how, at these various levels, we can support evidence-based work.
At our level at NIJ, we can focus very much sometimes on what does the most rigorous research say? But to Daniel's point, a lot of times, in order to do the most rigorous research, you've got to focus on a narrowly-focused program.
And, you know, we know that what schools are doing, often, is broad school safety efforts, and so the narrowly focused programs, maybe we can have more success in evaluating those, because we can really define what those outcomes and inputs are.
But, again, it may not be what the practitioners most need or what the policy-makers most need.
>> Phelan, I mean, I would echo that.
I just have a very brief add-on.
I think that one of the biggest opportunities to improve policy, and also one of the biggest gaps currently, nationally and state by state, is inadequate research infrastructure to feed into legislative decision-making.
You know, for those who follow legislative processes, a lot of the time it's lawyers, and lawyers are great, but they're not researchers, and so there's a need for more direct infrastructure for helping, and it's doable.
Sometimes there are impact assessments that are done.
So, the dean of my college, Tom Blomberg, led an effort to help inform the legislature's assessment of about ten different bills, where they conducted impact assessments, racial impact assessments.
It's doable, but it costs money, and you have to have infrastructure for that.
And when you don't have that, that means that legislators who are pressed to make decisions quickly have to rely more on instinct or whoever happens to be lobbying them at a given moment.
>> Sure.
>> Yeah.
And I'll just add to that.
We actually looked at the course of school safety laws at the state level from 1999, after the Columbine shooting, through 2019, so 20 years of school safety legislation.
And what we saw is that, really, there's a reactionary mode, where, right after the major school shooting incidents that happened in that period, so Virginia Tech and Sandy Hook, there is a flurry of legislation about things we have very little research about.
So, things about arming teachers, things about requiring armed security guards, requiring the hardening of schools.
These are things where there's either little evidence or a lot of mixed evidence, and some evidence that they actually make things worse, yet these are now codified laws in these states, requiring schools to do these activities, which are not evidence-based by our definition.
So, I think there is this notion that we need to make sure that we are commenting when these bills are passed about what is and is not evidence-based and what the potential implications, and particularly the equity implications, are of a lot of these new proposals.
>> I couldn't agree more.
I think it's prioritizing equity in the legislation, the rollout, and the implementation; for the programs that we are standing behind and pushing, we have a responsibility to make sure that those meet the needs of the most vulnerable students and don't have an adverse effect.
And I think, you know, in addition, is engaging those closest, students, parents, teachers, educators, you know, within lower income and racially diverse schools to help inform what programs work for them, what programs haven't, what are the day-to-day realities, bright spots, challenges and obstacles, and making sure the programs are designed with those that are the most vulnerable in mind.
>> Yeah, so we're really getting into that part of the discussion where we're talking about some of the challenges and sort of how we try to fuse evidence into this work, but we're also beginning to talk about what kind of advice do we have to offer for folks in terms of overcoming those implementation challenges and overcoming the types of things, you know, whether it's the availability of research or whether it's buy-in, and all the different conflicting priorities and challenges that we run into at the school or at the school-district level.
Can we say more? Any other thoughts about advice we can offer for overcoming those challenges? >> Collect data.
I think, of course, the researcher in me is going to say collect data and actually look at what you're doing.
But I think you can't just rely on labeling something as evidence-based to think that it's going to work in your school.
You need to actually consistently assess it.
You need to look not only at the aggregate but also at whether it's furthering inequities in the schools, furthering disparities, and really make some decisions.
The other thing I'd point out is that the population of a school turns over every year, and so the particular needs of the particular students in a school in any given year might look different than the year prior, and will maybe look different than the school down the street, and so continuing to reassess what is needed, what are we spending money on, what is working, what is not can help schools get there.
The other thing that I would also stress is that data also point out when things are having negative impacts.
A lot of things that we think are common sense, once we've actually evaluated them, turn out not to work or to cause negative impacts.
So, think about Scared Straight, for instance, which actually increased delinquency.
Think about the D.A.R.E. from the 1990s -- and this is the part where I have to say D.A.R.E. has changed their formula, so it's not the same D.A.R.E. -- but the D.A.R.E. of the 1990s actually increased drug use with that formulation.
These are things we have to know, and we only know them by collecting the data.
>> Yeah.
>> I would certainly echo that.
I know it's pretty general and vague to say, but I think there has to be a commitment to good process.
It's very difficult when you have constant principal turnover.
Paula identified a lot of context-specific issues.
I mean, if you have lots of turnover with principals, if that's going to be a real phenomenon, then somehow the commitment to good process for developing information, developing good policies, ensuring good implementation, it doesn't magically happen.
If you don't institutionalize the process for that and ensure constant monitoring of it, it's about as effective as -- I don't know -- a kindergarten teacher not going outside and watching the kids and hoping that they magically stay in the pen, in the play area.
It's not going to happen.
And I do think a commitment to research matters.
I'm biased, but I think it's a very real problem that a lot of people don't really know what research is.
I've been at so many meetings -- to me the most prominent was actually NIJ conference -- where there was about 20-or-so heads of corrections, and they would frequently talk about, well, we need research that does this.
It's kind of like they had a few buzzwords they could throw out, but they weren't really comfortable talking about research.
And that's problematic, because they should be telling researchers what to do.
I've been around a lot of researchers where you just walk in and you say, well, we should do this and you should do this and give us your data, and they're ready to go up and running right away.
But that's not in the best interest of the schools.
But I think the commitment to good process, and I think a good commitment to research.
I know that's pretty general.
There are a lot of ways to do both, and a lot of ways to get there.
But without it, I don't think you can move very far.
>> I agree.
I agree with that, and speaking as more of an implementer versus a researcher, you know, what I would add is, I think what Daniel said is right, that often the process is the product, and so how does research partner with community to really inform that best process.
And, for us, what we have seen is that it's building in sustainability at the onset of implementation, so considering and identifying how schools will build capacity to be able to maintain the program over time.
How are they building champions and skills within the district and at the school level that can help weather, again, some of the turnover from year to year? Having these champions who can implement the programs is one of the ways that we've seen best practices take shape, where communities or schools come together with research to help make that process as strong as possible.
>> Yeah.
I'm so glad you -- I really appreciate all those comments, and just a few of the things I picked up in there: first of all, I'm just, of course, completely in support of the idea of continuing to collect data, and, you know, what you measure is -- you know, if you're measuring it, then it's important to you, right, and sharing that information on a regular basis.
And, you know, Daniel's comment about commitment to process, this is, in fact -- you know, we've seen this in research.
We've seen this time and time again.
It's this commitment to, hey, we've got to make sure we're doing this the right way.
And sometimes, unfortunately, people kind of string together what they're going to do, you know.
They pull ideas together and they say, okay, well, let's just do this, and then you sort of scratch below the surface and you find, well, have you written this out? Do you have a game plan? Do you have an actual plan that you are implementing against it? And if you change your path along the way, do you change the plan? Are you updating and revisiting your plan? Are you checking in on it, and do you have sources of data that are touching bases? They might not be perfect.
In most cases, they're not.
It's not a perfect indication of is your program working or not.
But there are sources of data you can look at and just see, for example, well, wait a minute, one of our indicators is really going in a different direction than we anticipated.
You know, these types of questions.
And then, you know, to Paula's point about -- I'm glad she used the word "champions," because you see that so often in, really, the applied setting.
And we alluded to it earlier, when you have so much turnover and you have these schools that are in so much flux that, you know, it's hard to have a champion.
It's hard to be a champion in that environment, you know, because this happens whether you're talking about schools or whether you're talking about police departments.
You get a whole new police chief and they're going to say, well, you know, we're going to make some change around here.
We're going to do things.
I want to burnish my legacy, or whatever it is.
And that's just a reality, and that's the way people operate, and so you can have champions in those environments.
It's just much more challenging; right? But knowing up front that, hey, if we're going to do this and if school safety is a real priority for us, we're going to have to have a champion.
And if you've got a really well defined, tightly defined program you're trying to put in place in that environment, that's going to be real challenging, so you better package it within maybe something broader that's really tailored well to that environment.
So you've got that one set program that's maybe an evidence-based program you're trying to implement with fidelity, but you're also packaging it within a broader framework of, this is our overall school safety effort, based on all the dynamics of the specific school, and we've really done our assessment here to tailor it to what we're doing, et cetera, and we know this is a good fit for us.
Go ahead.
>> If I could just interject, I'm going to tie that to something that Deborah had said, and it ties into several comments Paula said.
But, you know, sometimes advocates of programs and champions of programs, they're really focused on their program, and many schools have lots of programs underway, so that commitment to the process involves a champion for what is best for the school or school district, and that means a way of being able to take stock of all the interventions at once: where are the gaps, what could we be doing, what would be the potential impacts of that.
You know, if we do this program, what's going to happen? And, you know, you shouldn't have to wait two years and a half-a-million-dollar study to say, yeah, there was a lot of resistance to that program because it pulled dollars away from this and that and deprioritized X, Y, Z.
So, again, I think that attention should go to a champion not just for a program but for that broader context of what the school's trying to do.
>> And I would just add to that that we have to recognize the schools have history, and the people in those schools have history, and so understanding that bringing in a new program may not actually be new for schools.
So, just to give the example of Washington D.C., for a variety of reasons, D.C. schools tend to be test cases for a lot of different programs.
So, a lot of the things that we heard in talking to schools was, oh, yeah, this will come and go.
You have a five-year grant.
This will come for five years, and it will go away, and then we will be left without this again.
And I think we have to understand and recognize that history, that schools have tried programs and they failed, and they have resistance to this, and they have resistance to data.
You know, I just advocated that you should collect data.
At the same time, schools are a little scared of data.
They're scared of data being used against them for accountability.
They're scared that if they have negative findings, they don't necessarily want to share those out, because, at least in the case of D.C., we have a school choice environment, where there's competition between the public schools and the public charter schools.
So, if data is shared that, hey, we have a bullying problem or we have an issue with students feeling not engaged in our schools, it may result in students opting to leave and go to another school.
So, we also have to recognize that there are historic and contextual issues that, in some ways, prevent us from actually doing the type of models that you're talking about as champions, that we have to actually directly address, and be explicit about those historical structural issues that are in place.
>> Great.
So, we're getting some questions from the audience.
I'd like to sort of shift to that, and then before we break, at the end, I'm going to come back to -- so keep this in mind, I'm going to come back to a question for each of you, which is going to be, what are your three most important takeaway messages for the audience.
But, first, I want to do some audience questions, and the first one is, much of the -- and this is in quotes -- "evidence" in school safety comes from commissions and active shooter after-action reports, and these reports often identify interventions as evidence-based or best practices without a lot of real support for these assertions.
How should we consider this type of evidence? >> Skeptically, with a great deal of skepticism.
I think it's understandable.
I mean, in policy research, it's very easy to point to a program and say, you can't tell me this isn't effective.
95 percent of the kids don't recidivate, and you end up saying, yeah.
But, in fact, you might have gotten that had you done nothing.
So, I think, just skeptically.
I don't think dismissing it by any stretch.
I think there's just skepticism, and try to read carefully and understand precisely what can and cannot be claimed for that study.
>> Yeah, I think it's a misuse of the evidence base, which is sort of what we've been talking about.
I also think we have to realize that it is also challenging.
You can't exactly randomize a school to have a school shooting.
That would neither be ethical nor desirable to do.
So we have to make some assumptions of what will and will not work in a very statistically rare event.
That said, a lot of things that come out of commissions, especially commissions that come out of a government, are going to be politically motivated, and they include things that have absolutely nothing to do with the events at hand.
For instance, disciplinary policies and their role in school shootings is a little bit of a stretch.
>> So, one of the ways that I've talked with people about this topic more generally around evidence-based is that, you know, we deal with a lot of questions and issues in real time, and, of course, to get to the point of having evidence-based programs, as we've discussed, it can take years of study and so on and so forth.
And, of course, in response to these types of incidents, there's usually a strong interest in trying to come up with some answers that we can kind of get behind.
And so, for me, I put that in the category of expert judgment.
And so, you can bring experts together and say, okay, the experts have come together, and this is what they feel the best path forward is.
It doesn't make it evidence-based.
It also doesn't make it wrong.
They might be absolutely right.
It's just if there's no evidence behind it, we just don't know.
And so, "skeptically," I think, is the best one-word answer in terms of how you respond to that.
You respond with skepticism.
Not necessarily cynicism, but skeptical, and we're going to, like, let's -- does that merit further testing and so on? But it falls into the category, I think, of, like I said, expert judgment, which is what we go on in many cases when we don't have evidence, and it needs to be tested further.
So, there's a related question that's come in, and it says, a recurring theme from each of the presenters is prioritizing equity concerns; however, in the initial reactions to high-profile incidents of school violence, equity has not seemed to be a high priority.
So, how can we encourage policymakers to take a longer view and consider the unintended consequences to policies they are seeking to implement, especially as it concerns equity? >> I think we just need to keep speaking up and pointing out the unintended consequences of a lot of things that people point to as common sense.
And I say that because, again, there's not a lot of evidence on a lot of these things.
And while they may lead to some of their desired impacts, we often aren't talking enough about the unintended consequences of a lot of the solutions that people come up with.
So, the more that we talk, the more that we put that evidence out there, I think the better we can focus in on equity.
>> And I think, Phelan, to your last point on experts is, it's all of our responsibility to ask the question and to raise who are we considering experts in this space, and are we engaging those most impacted by the issue, because they are experts and have that lived experience to bring to the table.
>> Yeah.
>> I would echo that.
I think that -- I used to work at the Urban Institute.
It's a non-partisan, nonprofit institute, and I value deeply the non-partisan nature.
You're looking for objectivity in the information you get.
Definitely, I think, with the boots-on-the-ground information, the cool thing about life today is, we can collect information much more cost-effectively from people with boots-on-the-ground experience.
It's very easy to do surveys with students, teachers, and principals, for example, and get real-time information on what's going on from their perspective.
I would echo what Deborah was saying, that I think as long as we keep talking out loud about these issues, eventually, more people pay attention to it.
But I'll come back to just institutionalized attention.
Policy-making is a very reactive undertaking, and either we change that or we build in a governor of sorts.
And I think, for me, that would be institutionalized research processes that are really well funded and that really create a way of getting information out there.
That's the part of me that's a little naive.
But I think more information, better information that people trust and that can be more rapidly accessed, that should help policymakers do a better job.
That should help the public keep policymakers on their toes.
It should help make sure that people with boots on the ground experience, their voices are more directly heard, and I also think that without that it's very hard to see significant progress, because, essentially, certain voices are silenced.
>> Yeah.
And so this kind of ties into a related question that's come from the audience, which is, you know, we're talking about adding programs to schools that are evidence-based, but any advice on how to facilitate transition away from programs which are not effective or conflict with other programs the schools are using? So, any thoughts on those topics? >> Well, I'm going to sound like a broken record, but I mean, I've been heartened by -- I've been in a number of settings where people believe in a program or policy, they're ardent about it, and then, you know, when they're presented with good information in an objective manner without an agenda, other than trying to improve practice, then -- I do a lot of work on solitary confinement and restrictive housing, and it's a very contentious issue.
And I've met many corrections officials who aren't necessarily wed to using that.
They're wed to doing what they think is best, and when given information, for example, that says, gosh, we're putting the wrong people in there, then they go, maybe we don't need to be doing that.
So, I think that getting good information into people's hands is one of the best cures for brushing away things that don't work and opening our eyes towards new possibilities.
>> This is a challenge whether it's a school or a business.
People fall back on the status quo.
So even if something isn't working, it's what you're used to.
It's what you're used to doing.
People struggle -- I struggle -- with change.
And I think that the key is making sure that they have a voice, they're talking about what they like, and they're actually presented with information on how they can start to shift that and really create a strategy around that.
The other piece is, make sure that if you are taking something away, you're replacing it with something, another tool in the tool box.
So, for instance, as we heard about school systems or states trying to limit the use of exclusionary discipline, some of the things that we've heard are, well, then what can I do? How can I control my classroom? I don't have any way of actually doing this now that you've taken away suspension as a consequence.
So, we have to make sure that it's not just taking away things that we think are problematic; that we're giving the school officials, teachers, et cetera, actual tools they can use to address the issues they were using that procedure for in the first place.
>> Right.
If there's an underlying issue or problem that needs to be addressed, you can't just pull something away that is identified as ineffective and then not bring something in to address it.
And, you know, this next question, you know, maybe touches on a tension that comes up in at least one area of school safety, and that is regarding school resource officers, and they were raised a little bit earlier in the presentation, so this might have prompted the question here.
The question asks if the panelists are familiar with the National Police Foundation's Averted School Violence Database. It doesn't exactly pose a question; it basically says the database contains interesting information regarding the work of SROs and other school professionals regarding averted school violence.
So, if you're not aware of it, I'll just sort of quickly say, for the benefit of the audience, that the Averted School Violence database captures information about incidents where school violence was averted; right? So, somehow there was a tip or some kind of information surfaced that helped people sort of figure out, like, something was being planned, and so they moved in to take some sort of steps to address that, some sort of intervention.
And I actually facilitated a session earlier in the conference where we talked about threat assessment, we talked about tip lines, these kinds of things, and I know that that's work that, for example, Sandy Hook Promise has been supportive of in various ways.
So, any thoughts or comments on that work? >> Again, I would just point to the question of for what? So, it's true that some of interventions, whether it's school resource officers or other things, can lead to positive outcomes, but we also have to look at the full picture and understand, you know, are they also leading to things that we want to directly address? Are they leading to more arrests in schools? Are they leading to increased feelings of disconnection or hostile climate, particularly for students of color? We need to address those.
Those are not things where we can just say, well, that doesn't matter because they're increasing perceived school safety.
>> Definitely, and there's an emerging literature trying to understand SROs and their role.
My colleagues have been working on a paper recently, identifying some of the issues.
You know, one of the issues is role confusion.
The police on the street have a slightly different role.
But on a school campus, you could have an officer who views their job as, really, a traditional officer role, or they could be more of a social worker, a friend to the students.
It creates lots of opportunities to do good and, also, opportunities to create harm.
So I think there's a lot that needs to be understood nationally in school by school about how SROs actually do their job, what they're actually doing, what impacts they have on different groups.
It's not the Wild West, but, definitely, there's a lot still to be learned about what's going on with SROs.
>> All right.
So, thank you for addressing that.
We have other sessions as well at the conference, both sort of yesterday and the day before, I think, that address SROs, and we may have some additional ones today -- I don't recall the schedule.
But we are running close to the end of our time, so I just want to kind of go around to the three of you, and maybe I'll turn it to Paula.
What are the three most important takeaways you would like to share with the audience? >> So, in terms of the most important takeaways, I would say consider your audience, know your audience, when you're trying to either implement or build a research project, so as to really understand the day-to-day realities of the school district and of the students where you're trying to have an intervention.
I think also engaging decision-makers within each level, as Daniel said, the process, the process is the product, so considering not just how it gets handed to a school district but what does the engagement look like before, during, and after the program or intervention takes place.
And consider sustainability at the onset so that we are setting our schools, our educators, our students, and our administrators up for ultimate success.
>> Great.
Thank you, Paula.
Daniel, same question, three most important items you'd like to leave listeners with? >> Sure.
Well, the first is research, we need credible research.
We need good research on a wide array of programs and policies, not just one or two kinds of efforts but a spectrum.
We talked about a continuum earlier in the session.
We need a spectrum, a medley of things that can fit different schools and their particular needs.
I think more broadly, we definitely need more investment in research, and I think everybody plays a role in that, in demanding it.
There is no free lunch.
If you're not going to spend money on research and have a real research division with a commitment to collecting data and a commitment to how to analyze it, a commitment to how to present that information in an accessible manner, because researchers often are terrible at that, there's no free lunch.
So, I think the emphasis on the infrastructure for research.
The third, to me, would be empowering the schools to really think about systems.
They get pushed, sometimes, into adopting programs, narrowly focused policies, and I would encourage much more of a focus on systems thinking, and that means that you may not have experimental evidence, because when entire systems make a lot of different changes, you can't do an experimental design study with that.
But systems kinds of thinking.
In the Palm Beach County project, it was clear -- I think it's clear from today -- there's lots of ways that schools are these really complicated places, and in some ways, adding a program or policy is equivalent to letting an elephant go into a china shop.
It just messes things up.
So, I think systems thinking that includes lots of stakeholders.
And I know that's fuzzy, but it has to have a champion like a superintendent or principal.
It's definitely doable to help ensure that the totality of efforts really give us the best bang for the buck.
>> Thank you.
And so now I'll turn over to Deborah.
What are the three most important things you would like to share with folks? >> Yeah, I think, number one, there's not any sort of magic solution to school safety.
And I think we always have to answer the questions of: Why is this the program that we need? What is the fit? What are we fully capable of doing? Who does this program target? What are the potential negative effects? We need to think through all of those in order to select what is actually going to be put in place.
The other thing that I'll point out, one of the big findings that we had from our study in D.C., is that some of the schools that needed the work the most were prevented from doing the work precisely because they needed the work the most.
In other words, we were doing an organizational capacity-building process to help them build their capacity to select programs and review data and do things of that nature.
But it was really those that didn't even have a baseline of organizational capacity to accommodate the process and move from being reactive, putting out fires, to thinking proactively that could not engage in the process.
So we need to be thinking, how do we, given what we know about the history of schools, the history of the people in the schools, how do we actually build something that can reach the most needy of schools and the schools that are struggling the most with school violence and safety issues? >> Thank you.
So, actually, I note that we actually do have some additional time we can work with, and I appreciate -- first of all, I do want to thank all of our speakers, but I also want to say that we can actually have additional Q&A if there's further Q&A that comes in.
Pardon me.
I'm just letting you know that we have a little more time to work with, but it looks like we don't have any further questions.
I apologize that we weren't able to have Cheryl May join us for this session.
I think her perspective would have also really contributed to this discussion.
And I'll say, for those who don't know, that school safety centers don't necessarily exist -- the state-level school safety centers don't necessarily exist in every state, but an increasing number of states have them.
They each have some level of variety in terms of how they're constructed and what their expectations are.
But these centers, I think, sit in this very interesting sort of pivotal space, between sort of the state-level government and the school districts around the country -- I'm sorry -- around the state.
So they do important work that I think would also, you know, speak to these questions of how to implement evidence-based practices.
So, I think with that, again, I'm going to thank all of our presenters.
It's been a great discussion.
We'll go ahead and close.
We don't have further questions now, but I appreciate everyone's participation.
And if we were live, I'm sure we would be getting a round of applause for everything that you've been able to share.
But for now, I'll just go ahead and give you a round of applause and thank you very much as we close this session.
I hope everyone in the audience will enjoy the rest of the conference for the remainder of the day.
Thank you.
>> Thank you, Phelan.
>> Thank you.
Disclaimer:
Opinions or points of view expressed in these recordings represent those of the speakers and do not necessarily represent the official position or policies of the U.S. Department of Justice. Any commercial products and manufacturers discussed in these recordings are presented for informational purposes only and do not constitute product approval or endorsement by the U.S. Department of Justice.