Overcoming School Safety Intervention Implementation Challenges - Breakout Session, NIJ Virtual Conference on School Safety
On February 16-18, 2021, the National Institute of Justice hosted the Virtual Conference on School Safety: Bridging Research to Practice to Safeguard Our Schools. This video includes the following presentations:
Implementation of a Trauma-Informed Approach to PBIS, Ryan Fink
This presentation will share the rationale, design, and implementation of a project currently underway in the School District of Philadelphia entitled "PBIS in Challenging Contexts: Evaluating a Replicable Implementation Approach in Philadelphia". This project has developed a trauma-informed approach to implementing Positive Behavioral Interventions and Supports (PBIS) which is currently being implemented and evaluated in a small set of schools. The presentation will focus on describing both the implementation supports developed during this project and the research design being used to evaluate its potential impact.
Studying Implementation of PBIS in Rural Schools During Challenging Times, Lindsey Turner
Schools in rural areas often have limited access to resources for implementation of evidence-based programs. Positive Behavioral Interventions & Supports (PBIS) is an evidence-based framework for universal and targeted prevention of problem behaviors. Many rural schools lack capacity for implementation of new initiatives (e.g., funding, staffing expertise), and have fewer external resources for implementation support. The RK-12 Rural Schools Research project is a cluster-randomized trial of strategies to support scale-up of PBIS specifically in rural settings. Among the 40 schools participating in this trial, 20 were randomized to receive basic PBIS training, and 20 were randomized to receive training plus advanced implementation support strategies (external coaching, online resources, a virtual learning community). The project was completing Year 1 of the trial when the COVID-19 school closures occurred in the spring of the 2019-2020 school year. Because a key goal of the project was to test remote strategies for implementation support, we have been able to continue with the trial despite COVID-19 disruptions to the K-12 school system. This presentation will describe the study's original design and adaptations that were made due to school transitions to virtual or hybrid virtual/in-person instruction in the 2020-2021 school year, as well as adaptations to our data collection plans.
Integrated Supports for Vulnerable Middle School Students: Importance of Implementation and Context in Randomized Controlled Trials, Lauren Decker-Woodrow
Randomized controlled trials (RCTs) are widely viewed as the gold standard in evaluation research, but are sometimes challenging to conduct. Too often these challenges are not clearly documented, making it difficult to assess outcomes and derive lessons. This presentation will focus on implementation issues, because they are critically important, yet often not discussed. The context of the presentation is a student-level RCT of a truancy intervention program provided within Communities In Schools® (CIS). The study included more than 1,800 middle school students across three academic years who were randomly assigned to receive either the typical CIS services or a more intensive version of those services. Discussion will focus on some of the key challenges that appeared and recommendations to inform future evaluation efforts.
National Institute of Justice’s Comprehensive School Safety Framework, Mary Carlton, Nadine Frederique, and Caleb Hudgins
Congress tasked the National Institute of Justice (NIJ) with developing a comprehensive school safety framework. The framework has three components: school climate, student behavior, and physical safety. This aligns with the other frameworks found in the literature and is underpinned by findings from CSSI research projects. Schools that employ comprehensive approaches to addressing violence are not immune from school violence. However, implementing a comprehensive approach minimizes incidents of serious violence and prepares schools to recover from them when they occur. The goal of this presentation is to discuss the major components of NIJ's framework, the relationships between them, and the research that is the basis of the framework.
>> Hello, all.
Welcome to breakout session number six: Overcoming School Safety Intervention Implementation Challenges.
There are four talks scheduled for the session.
We'll have Dr. Ryan Fink, Dr. Lindsey Turner, Dr. Lauren Decker-Woodrow, and then a presentation with NIJ scientists Dr. Mary Carlton, Dr. Nadine Frederique, and myself.
So during this session, you're more than welcome to use the chat option if you have any questions about technical issues, but then use the Q and A to submit questions for specific speakers.
Questions will be answered only after all speakers have presented, so if you have a question for a specific speaker, go ahead and try and indicate that in your question when you submit it to Q and A.
And so for more information on the various speakers, please see the speaker bios on the conference website.
So first, we'll hear from Dr. Ryan Fink on implementation of a trauma-informed approach to PBIS.
>> Great, thanks.
Everybody hear me okay? Everybody see my slides okay? Wonderful.
So, yeah, thanks for the introduction. I'll be talking about a project we've been working on for the last 3 years now, which is to design, implement, and evaluate a trauma-informed approach to PBIS.
I'm at the Consortium for Policy Research in Education, CPRE, based out of University of Pennsylvania.
Now I will find out how to advance my slides.
There we go.
So the official title of our project is "PBIS In Challenging Contexts: Evaluating a Replicable Implementation Approach in Philadelphia." It does build off of a previous NIJ-funded study which I'm going to talk, too, a little bit about because the findings and our experiences in that first project sort of set us up and provided us direction with the current project.
So the first study, again, it was a partnership with the School District of Philadelphia, and really what it was looking at was, they had recently revised their code of conduct to really discourage schools from using out-of-school suspension, and so we sort of looked at the impacts of that policy change at the school level with a particular focus on PBIS because it was such a large initiative within the school district.
They had really invested in supporting many schools and implementing PBIS, so we wanted to...
We sort of brought that extra lens to it to say, "Are the responses of schools that are implementing PBIS any different from schools that are not?" And a couple of things that we uncovered in that first study sort of led us to reapplying for a second project that has directed our work in it.
The first one, what we saw in PBIS schools is that there was a need to be more comprehensive and to include not just classroom teachers in PBIS supports and professional development but to include what we call sort of noninstructional staff, and that includes everyone from office staff to counselors to what's referred to in the School District of Philadelphia as sort of school climate staff so folks that are spending time in hallways, in cafeterias, unstructured places, playgrounds, so that was one that we uncovered.
The second was PBIS as a school-wide intervention: we saw lots of evidence of PBIS as a school-wide intervention.
What we saw less evidence of was what PBIS looked like inside of classrooms, so you could walk the hallways.
You could see PBIS posters up in the hallways.
You could talk to teachers and students about the school-wide expectations, and they could respond and talk to about that.
However, what we saw was when teachers would go into their individual classrooms, as is the case with many things, they would close their classroom door, and sometimes they would continue to adhere to the PBIS principles, but other times they kind of made it their own and were doing their own things, and so we really wanted to penetrate in there and do some classroom coaching.
And the last one that we heard a lot from teachers was, "Yes, PBIS is good for the most part.
You know, we think it has a positive impact on our school climate.
However, you know, this is an urban inner city with lots of poverty, and the kids are coming to school with more severe needs than what PBIS can address, more severe needs than what we as a school can address," and so some of it just felt sort of really overwhelming for them, and so we wanted to work with schools to figure out ways to address these more severe needs from their students.
And so that sort of, again, sort of provided us direction for the current study, and the current study is sort of divided up into two parts.
We have a large-scale RCT going on in the school district, which was supposed to be designed around sort of a light-touch trauma training for school staff in addition to training of school safety officers, and all K-8 schools in the district, other than the nine in the QED that's described below, were included in randomized treatment and control conditions. I know that part of the focus of this session here is around implementation, and most of the rest of my talk will focus on the QED because that's where most of our time and efforts have gone, and I think it is sort of the more interesting piece of this, but I will say, in terms of implementation and working with large school districts, one of the things that came out of this obviously is...
I don't know if you want to call it sort of competing priorities or sort of competing initiatives, but again the idea in the first year of the study was to give this large group of schools who were assigned to the treatment condition this sort of trauma training, and it was sort of like a trauma 101, you know.
It wasn't sort of real in-depth, but it was just sort of giving school staff sort of the basics of this and trying to get them to start to think about the ways that these types of things may be impacting their students and their student behavior.
Well, in the course of us planning that, trying to get it started, and working on this portion of the RCT, the school district came in and said, "As a matter of fact, we want to give this to all of our schools now, so you don't have that portion of your RCT anymore because we don't want to withhold it from anybody," right, which makes perfect sense.
It's really important.
It's so timely but kind of threw a wrench in what we were doing, and so we had to sort of pivot on that and really focus on the school safety officer training, which does have some overlap with the QED, so I can talk a bit more about it, but the QED is really taking those different components that we had identified in that first study and trying to build.
This is why I said at the beginning we're not just evaluating this.
We are developing it.
We are implementing it, and we are trying to evaluate it with a ton of partners that I'll get to but trying to figure out how we create sort of a trauma-informed approach to PBIS that sort of addresses the needs of the schools, and so this was a pretty intensive intervention, and the schools needed to be...
In partnership with the district, we realized that the schools needed to be at a certain level of proficiency with their PBIS implementation, so we worked with them to identify nine schools that we deemed to be eligible, which means that they were in a place where they had strong tier-one fidelity, they had a tier-two team in place, and they had strong leadership, so that these nine schools could potentially receive and sort of thrive off of the supports that we were proposing to bring into them.
So we chose, randomly chose four of those schools which became what we call our demonstration schools, so they are receiving all of these intensive supports, and then we continue to observe and collect data from five comparison schools.
And the outcomes are proposed to be classroom-level analyses, so we're looking at sort of changes in attendance, office discipline referrals, those types of things from the classroom level, but what has become more interesting over time of course and what we've all experienced in the last couple of school years is sort of this implementation piece, and so we've developed a really strong implementation science-informed study of the intervention as well, and so again for the most part I'm going to be talking about this QED portion which is the work really within these four demonstration schools that we've been working with over the last 3 years.
So the core components of the intervention that we wanted to make sure were included were, again, sort of the trauma and PBIS supports together, and I don't know where else in the presentation I get to say this, but one of the big stumbling blocks from the beginning was, and for those of you that are sort of versed in this world, hopefully this rings somewhat true, that people have been talking for years about trauma-informed PBIS, and the more that I look into it and read, and the more that we sort of talk to other folks, it still felt too much like they were siloed, right, so that we are a school that does both trauma and PBIS. It was very hard for us to get to a place where they are speaking to each other. Are they mutually facilitative? Are they strengthening one another? Do they feel to the teachers that they are one thing, or if you ask the teachers, are they going to say, "Yes, we do trauma, and yes, we do PBIS"? We wanted to really get to a place where it really felt like one intervention or one approach. In terms of implementation, I mean, I'm not exaggerating when I say, and I'll get to our partners in a minute, that we had sort of these trauma experts, right, trauma-informed teaching practices, trauma-informed school experts, on one side of the table literally, and then we had our sort of PBIS experts on another side of the table, and it was many meetings, many months of them just kind of talking at each other and really wrestling with what it will mean for us to sort of make this into one. And so that's been a really interesting and rewarding part of this work but again something that we didn't necessarily foresee at the very beginning about this.
In terms of tier two, and sort of going back to the needs of the students, schools were coming in and saying, "PBIS is mostly a tier-one intervention; we have students with more severe needs.
How do we go about addressing those?" We have partners that have a validated universal screener tool that helps identify students with different types of needs, academic, social and behavioral needs within the schools, and then we sort of worked with the schools in the district to outline a process to use those data in combination with other school-level data to appropriately place students in tier-two interventions based on what the screener and their other data revealed, and so that's been a process which has certainly had its fair share of obstacles as well over the last few years.
And then the last part is the school safety officers which until this year has gone relatively smoothly, and we can talk a little bit more about that, but the idea here was, again, thinking, too, this isn't just for teachers, right.
This is a school-wide intervention.
Everybody that comes in contact and has meaningful interactions with students should be versed in these ideas, and so aligned with the trauma PBIS supports, we had partners from Drexel University who designed this officer manual specifically speaking to the role of school safety officers in schools.
These are all the people that have to come together to do this.
We have our own research team at the University of Pennsylvania that I work with.
The school district has been an amazing partner, but this has all taken place within them and in many offices within their central office.
I have a co-PI from the Children's Hospital of Philadelphia.
Devereux Center for Effective Schools is a main supporter of Philadelphia schools and their PBIS implementation.
Thomas Jefferson University was bringing sort of the trauma lens from the behavioral health folks.
University of South Florida brought in the universal screener, and then I mentioned Drexel with the school safety officers, so it's been a lot of management.
I mean, it's been a lot to continue to wrangle all these partners.
Fortunately they're all wonderful and easy to work with, and it's really been great.
Just to speak to this, because I thought maybe people would be somewhat interested in how we sort of have been able to organize this team and keep the project working together: we have our coaches, who are school district employees that are working in the demonstration schools.
We meet with them regularly along with sort of our tier-one team so the Devereux team.
We have a separate working group that does the tier-two stuff so thinking through the universal screener, what tier-two intervention schools should have made available to them and sort of how that whole process is going to go.
And then I have to make sure that I cordon off myself to think solely about the research and the evaluation of this because the implementation and the management of all of these partners can be overwhelming at times, but thankfully I have wonderful research partners at the university and at CHOP who sort of keep us focused on the research, so that's just a little bit in terms of how we've been able to manage this.
My brief time line of implementation, you'll get my little theme here, so we've had three school years that we've been implementing this, and everybody knows what happened during the course of this, but the first year, you know, as I was talking, like, we had this idea.
We knew the importance of the intervention and the different pieces, but it didn't exist yet, and so we really sort of had to build the plane as we were flying it.
Our coaches were in schools.
We were supporting them, but in terms of figuring out what this model and intervention was going to really look like, it took us some time during that first year to really get our heads around what it was going to be, so we were all excited to get into the next school year in '19, '20, and we really started making some headway in implementing the universal screener, getting students placed into tier-two interventions.
We developed the video module series around the PBIS, the trauma PBIS stuff, and we had some classroom coaching that was really starting to take shape, and we were seeing how powerful that could be, and then March 2020 happened, and we all know, so the rest of the school year, because no one was really prepared for what that was going to look like, was sort of a wash.
Of course, we came into this year much more sort of attuned to what that might look like and how that might work, but obviously the intervention itself and what we've been able to do in schools have been totally different.
I mean, as of now, no student or teacher has been in person in the School District of Philadelphia, so it's been 100 percent virtual so far.
So real quick, I just want to just talk about a lot of the successes.
I think I've touched on some of them, but we've developed this intervention.
It is a real thing that exists.
These video modules are getting a ton of attention across the district.
We developed a six-part video series that really sort of tangibly describes and shows teachers what we mean by a trauma-informed approach to PBIS, which is something that they were really asking for. They would get the sort of overall light-touch trauma trainings, and their questions often were, "Okay, but what do I do differently now as a result of me knowing about this, right? How do I interact differently with my students? How does this change what I'm doing?" and so we really tried to, again, sort of solidify that in their minds through these video modules, and then the officer manual that Drexel led the development of to coach school safety officers as well.
I mentioned all the partners and the partnerships.
It's been great.
The team works on different projects together and sort of siphons itself off, and it's been really great, and I can't speak highly enough of our partners at the school district, who, again, across offices have been so supportive.
The implementation science study, again: I think last spring, but even before that, we sort of took the opportunity of saying, you know, this intervention and this implementation are not going exactly as we road mapped from the beginning, but there are really good opportunities here to learn as much about implementation of these different components as possible, and so I think we're going to have some really exciting findings around that.
I'm really proud of how we've handled this past school year.
It has obviously been a challenge for all school districts, but the team has just sort of kept moving and kept going forward and figuring out any ways that we can implement any of these components to the schools and really support them as best as we can and building for the future which I'll save at the very end for just 1 minute and then, you know, challenges, always challenges here.
Aside from the third bullet down, which we all know was a huge challenge for everybody with the pandemic and the closure of schools, you know, I think the thing that comes to my mind at the top is the school-level buy-in and finding PD time for teachers.
I mean, I would get face time with principals, and they were totally on board, and they were ready to engage and to take hold of the supports, and it just felt like by the next week, you know, they had moved onto 12 other things, and we were constantly working with our coaches to sort of get in their ears, but that sort of put them in a weird spot.
I mean, technically the principals were their bosses, and they had us as sort of this project team trying to push them in certain directions, so I think that whole sort of dynamic was just difficult to navigate.
At times, we felt like we were making inroads, and people were really being responsive, and then other times, especially since last March, other things just sort of overwhelmed them, and as much as they say they want to prioritize these things and that they do truly prioritize them, it just sort of all gets swamped up, and, aside from the coaches, many of us being from outside the district, we had sort of less sway and time with them.
I mentioned the competing initiatives, other things coming in and taking people's time, and then the last thing I'll mention here is just sort of identifying ways to collect data reflecting actual practice in classrooms.
We spent a lot of time and resources going in and doing actual observations of teachers.
It never sort of...
We had one good sort of administration of that which we were holding onto as hopefully some baseline data for our QED analyses, and we were supposed to do follow-up last spring, and of course that didn't happen, and of course now it can't happen in this school year, and so we have teacher beliefs.
We have teacher knowledge.
We have administrative data we can look at, but I think one thing that we thought was a real strength of the project that I'm not sure we're going to be able to salvage in a meaningful way is actually looking and seeing what changes in practice were actually happening inside of classrooms.
And then I'll just end by saying we've taken this opportunity also to...
The district has really latched onto a lot of the work that we're doing, and so we're helping them build for sustainability of some of these things, adapting the manual for non-instructional staff other than officers.
They really want the universal screener to become common practice in their schools, so we're helping them think about sort of what the infrastructure and supports need to look like around that. And then the video module series they are holding onto as sort of a centerpiece that they want to have for schools if and when, I guess, students and teachers start returning, because they think it's that valuable, so we're helping them, again, sort of make that a permanent part of the way that they do their work.
So thank you for listening.
I hope some of that rang true and made sense.
I look forward to the other presentations and conversations and questions.
>> Thank you so much, Dr. Fink.
Next, we will be hearing from Dr. Lindsey Turner on Studying Implementation of PBIS in Rural Schools During Challenging Times.
>> Hi.
Okay, can you all hear me? Great.
I forgot to unmute myself before I started sharing my slides, so I'm glad that that worked.
So our project is focusing on PBIS, like Dr. Fink's project, in rural schools, and so to set the stage, I will talk a little bit about what rural areas look like in the US.
Many states have rural areas, but we have this whole enormous area of the Intermountain West kind of from the middle of the country to the west that is incredibly sparsely populated with the exception of a few areas of Salt Lake, Denver, Phoenix, and I live and work up here in Boise, Idaho, which is an extremely rural state, so it's also a mountainous state which poses a lot of trouble for travel and getting around particularly in the winter and kind of weather that we're having right now, and this is our density and our school locations, so those blue dots each are schools.
We have 700 schools, and the majority of those are in rural areas.
And of about 300,000 K-12 students, half live in rural areas, and when I say "rural," I'm using the NCES designation for rural, and then we also include small townships as well.
And this is another characterization of rurality which I find really interesting because I think it just reflects the diversity of different types of rural, so all rural communities are not the same, and while in our project we're looking to study implementation in rural settings, I think in terms of generalizability, it's important to recognize that what we're learning here may or may not necessarily relate in other areas but really hoping to test a strategy that should be able to be scaled up elsewhere, and those blue areas are the types of rural communities that we have here in Idaho.
These are a couple of the schools that are working in our project located in mountain areas, beautiful areas, and right now they look sort of like this, like much of the rest of the country unfortunately.
We had a foot of snow here in Boise over the weekend which is unusual, but in these communities that we're working with, they often get snowed in, so from about December to March, it can be pretty hard to get out there.
And with the diversity of our rural communities, we also have some in desert areas, and then we have some in farmlands as well so a fairly diverse set of rural schools that we're working with.
We also know that rural communities have many risk factors for adverse outcomes and then fewer community protective factors, a couple of those listed here, and particularly relevant to our work is the school capacity to implement evidence-based practices so with the one that we're discussing being PBIS, and these pertain to things like location so distance from urban areas where they can access expertise and professional development, transportation challenges and resource constraints and then also some of these cultural issues around distrust of outsiders and really wanting their strategies to be locally grown and come from the communities and also the student characteristics, so our poverty levels are higher in those rural settings.
And this is a description of our project.
We call it RK-12, Rural K-12, and it's a cluster randomized project, and in implementation science language, it's a type-three hybrid so meaning that what we're really focusing on here is understanding how to implement an EBP.
The evidence-based practice is PBIS, which is really commonly used and uses that tiered implementation approach, and the intervention that we're testing is designed to figure out if we can help schools to more rapidly implement with fidelity through kind of a basket of implementation strategies that are specifically designed to address some of these barriers that we see in rural schools. PBIS, I should note, uses a team-based implementation strategy at the school, and so in our 40 schools, what we had them do was identify a tier-one team which included a coach, and that coach is typically a counselor if they have one, if not someone in another role, and the principal and then three or four other people depending on interest and availability. In some of our smaller schools, there just isn't the staff capacity, so their teams tend to be a little bit smaller, and for the purposes of statistical power, we did set an enrollment criterion that schools had to serve at least 100 students.
Now we have a lot of rural schools that serve very few students, but those are not participating in the study.
And here is what we planned starting out pre-COVID. We are using a mixed-methods approach to gathering data, standardized climate surveys rolled out through well-validated tools for students, staff, and parents, and in our staff survey, we've added in a lot of those things that I as an implementation scientist am particularly interested in understanding: readiness for implementation, things like psychological safety, perceived transformational leadership, and those other things that could affect the implementation of an evidence-based practice in the school.
And then that team at each school, we do more frequent surveys with them around process, how they're meeting, about their team dynamics and what they're doing.
We have a couple different standard implementation fidelity measures that are used in PBIS world and then an on-site school safety environment tool.
This was developed by Sarah Lindstrom Johnson and her colleagues, and this assesses aspects of the built environment that contribute to students' perceptions of safety and safety outcomes.
And at the student level, we have standard ODRs, disciplinary referrals and academic achievement, and most of that data collection happened in the spring of each year, so in the intervening summers is when we started our training, so our grant started in 2018.
We used that to develop our intervention of those implementation boosters, and then in 2019 we started rolling with data collection at baseline in the spring of that 2019 year followed by our first training for all schools in the summer, and then the goal was to continue that pattern for the next couple years.
And here is what our intervention schools are getting: virtual learning sessions and an online portal, again, developed with the goal of being able to reach those distanced schools that are in remote areas; some leadership and coaching skills, which unfortunately had all been developed and delivered pre-COVID; and then technical assistance in coaching. Our plan, actually, recognizing the weather was about to get bad in December, was to do those on-site in the fall, so our coaches or implementation specialists could go out to the schools, work with those teams, get to know them, get to know the kids, and then we pivoted those to virtual, so even pre-COVID we had pivoted that technical assistance support to a virtual approach.
So here was the ideal plan for data collection, and here is what actually happened.
I've crossed them all out in red what we did not manage to get.
Our schools were actually late to close in Idaho.
We had a shutdown on March 26th, among the latest in the nation to do so, but by early March, I was starting to get worried, and I pulled our team out of the field, so we only got on-site visits from about 10 schools, and we didn't get very many of those other things, obviously no academic achievement data because schools couldn't do that.
We did however manage to get staff surveys.
We just scrambled day and night to deploy those surveys early and catch them from staff, so at least we have a follow-up year of staff surveys and perceptions.
And then down here in terms of the tier-two training, we went virtual, so in March I made the call that we were likely to not get back to an in-person approach for the summer, and we pivoted those all virtual which I'll talk about in a minute.
And so here is our hope that through a no-cost extension, we can push out another year of data collection, and then this coming summer we can help schools with a reboot.
It looks like things, hopefully with vaccine rollout, will get back a little bit more to normal in the fall, and we're already starting to hear from our schools that when they're back in person, they want to kind of get going again on this, so we'll help them with the reboot.
We have added on interviews.
This was part of our plan for the final year, to do some retrospective interviews, but we accelerated those, recognizing that things were just really unusual in the spring, and wanted to get a sense of what schools were doing to adapt and what they had done thus far, and I'll talk a little bit about some of the themes that we found from these interviews.
This targeted the coach and the principals, and we got most of them.
And then we are currently in the field right now with some brief interviews just to figure out how it's going now.
We're about a year into the pandemic, figuring out how they're keeping up from what they're doing, and these were all consented, recorded by Zoom and then transcribed, and we're coding them right now.
So themes from that first set of interviews very early on in the pandemic: we found that administrators tended to be a little bit more positive than their PBIS coaches at the same schools, and the intervention schools perceived things to be going a little bit more positively.
And then in those intervention schools only, that in-person support in those first couple of months in the fall of the 2019 school year was incredibly highly rated.
Comments like face-to-face is more powerful.
So even before everyone was forced to go virtual because of the pandemic, we were getting a little bit of not push back, but just comments that we really miss the coaches and wish they were here.
The feedback showed very good acceptability for the types of strategies that were being used.
These were really tailored and customized to what each school needed.
Then also, real appreciation for the way in which those coaches did what they did; they are just wonderful, supportive, positive.
And here are our coaches.
Their role is implementation specialists, but we call them RK-12 coaches.
Tate and Nate are our tier two trainers, and that's on the right a photo of them with Rob Horner, one of the developers of PBIS.
Rob is not part of our study, but Teri Lewis, who is our Advanced Tiers Trainer, did her dissertation with Rob Horner and George Sugai, and she's been working in PBIS for two decades and knows it inside and out.
So our team is just wonderful and as we can tell from those comments, the way that they go about doing their work is really appreciated by these schools.
We did that rapid pivot to online for summer institutes, and so here is our scope and sequence.
We kept the schools in the same groupings and sequence that they had been in when they were in person because they had already started to make some connections with other schools.
That was one of our goals, too, was some sort of regional connections, and we sent them some tip sheets ahead of time about just sort of how this would work, getting used to the technology.
We were all sort of on that Zoom learning curve at this point in June of last year or maybe now, we've all had a little too much of it, but just helping people to prepare for the meeting.
And then the first day of that online meeting, just really kind of talking logistics and helping people figure out what we would do.
Our team, Tate, Nate, and Teri, made really effective use of breakout rooms.
So each school's team could meet together and work on an action plan, and then come back and discuss as a big group, and we found that to be very effective.
In terms of the adjustments to our intervention, really not many because as I mentioned, the whole goal was to figure out a set of implementation support strategies that we could do with these remote schools that would be feasible to do.
So not a lot of changes here for the virtual learning sessions, and just a quick sneak peek on our very first year of fidelity on implementation.
So recognizing that this was just a couple of months into the pandemic.
We got these in May and June of 2020, but really remarkable strides, considering everything that our schools went through last year.
And we took a look at that number of schools that reached 70 percent as sort of an indicator of, really, "Are they sort of implementing it with fidelity or close to fidelity?" and found that 16 of the schools in the intervention condition and 11 of the schools in the control condition were implementing at that 70 percent level.
So I think we're starting to see some evidence that these implementation support strategies are helping.
Just a quick update on what we're finding right now from these emerging interviews, kind of what's expected.
This year has been really hard.
Everyone is stretched very thin, and PBIS has been put on the back burner.
So there are adaptations in terms of how they're using PBIS, with the main one being that they're not using behavior tracking or other strategies from PBIS as much, but they seem to think that they're really not needing it: with the social distancing, and then of course the hybrid or remote learning, they're not needing it.
They're reporting far fewer behavior corrections, and interestingly, we heard that parent engagement looks different and there is more of it this year.
I suppose that that really makes sense as parents are concerned about the physical safety of their kids around COVID, so they're more in touch with teachers, and teachers are not necessarily as much in contact with their colleagues at their schools.
Our next steps are to keep providing these virtual implementation supports, to do whatever we can to be a good partner to our schools as they navigate this really challenging time, to gather as much data as we can, and then lots of analyses. Even though we missed a lot of data last spring, we will have a lot from baseline and then some still from 2020.
So I'm really excited to dig in and look at that.
Huge gratitude to our team, it's been a rough year, and they have all been troopers and have just been so adaptable and flexible and positive about all of it.
So thank you, and I will be happy to take questions at the end.
>> Thank you so much, Dr. Turner.
Next, we'll hear from Dr. Lauren Decker-Woodrow on "Integrated Supports for Vulnerable Middle School Students: Importance of Implementation and Context in Randomized Controlled Trials."
>> Apologies for the very long title.
I'm not sure...
If somebody else can stop the sharing, and then I'll be able to share mine.
There we go.
Thanks.
Okay.
Well, I loved the little bit of variability that we have across our three presentations because we were actually able to complete our study prior to COVID, but nonetheless, we have lots of challenges to still discuss and talk about that actually did not have to do with COVID.
Here, as you see, are some other authors and partners we had in this work, both from Westat, where I'm at, but also our Communities in Schools partner, who we worked very closely with on this work.
So I want to spend the bulk of the time talking about some of the interesting things we learned about the challenges in the schools, but to give you a quick overview first, we were looking at a randomized controlled trial of Communities in Schools, but this was not Communities in Schools versus no Communities in Schools as the treatment and control conditions.
Rather, it was a more intensive version of Communities in Schools services that our partner was interested in testing, compared to the typical services that they provide.
So to give everybody a sense, this is an in-school wraparound services support program that's in more than two dozen states across the country, and they offer both schoolwide supports and case-managed individual student supports.
And where our randomized controlled trial was looking was at these case-managed students, and as I said, this wasn't Communities in Schools compared to not having it.
This was their typical services compared to a more intensive version that one of the Communities in Schools partners was really attempting to create on their own in one of their schools prior to our NIJ-funded study.
And in the traditional model of the case management for students in our Communities in Schools partner, the ratio of coordinators to students...
so a coordinator is the Communities in Schools person who is on-site at the school; it could be one coordinator to 100 students.
And they were aiming for between 22 to 48 hours of targeted services to those students every year.
Well, as you can imagine with a ratio of one to 100, that tends to be challenging and may not get as deep as they'd like.
So our partner was trying this more intensive version where they cut that ratio in half.
They were greatly increasing their targeted number of service hours, in hopes that they could get much more familiar with and understanding of the needs of the individual students and, in essence, become proactive in the services and supports they were providing.
So rather than having a situation arise, whether it's attendance, behavior, course challenges and difficulties, they were hoping to get a little more proactive in identifying the needs and supports of the student by being able to spend more time with them and in hopes of driving outcomes higher for those students.
They also, for this more intensive version known as ACT, included a separate parent liaison.
So in typical Communities in Schools support, it was up to that one coordinator at the school to reach out to parents or guardians when any additional service or conversations might be warranted, but in the more intensive version, this ACT version, there was a separate parent liaison per school who was able to support the outreach as well as those conversations.
So our study was a randomized controlled trial, and I'm happy to talk later in the questions if anybody needs to know a little bit more.
We have this larger study.
Not every single student was randomly assigned, so we focused our analyses on the randomly assigned sample.
These were sixth through eighth graders across five middle schools in an urban district, and we collected not only the outcome data and baseline data, but as much fidelity information as we could as well.
You'll see how that became really important for us to understand some of the challenges that we were facing and how that was resulting in the findings that we had.
We were able to do this study over three academic years, and as you can see, our last year was right before the COVID challenges.
So we were very fortunate to be able to complete it at that time.
To cut right to the chase for the impact results, we didn't find any difference.
So one hypothesis would be that this more intensive attention to case managing students wasn't really doing anything different than their typical model, and their typical model is much more cost-effective.
So that might seem to suggest, well great, just keep doing what you're doing as Communities in Schools.
However, we had a question of, is that really the case? What did we actually measure? Was implementation fidelity present to warrant that conclusion? And to cut the story short, no.
So in reality, what you're looking at here between these two sets of bars...
the far left was that planned minimum hours of services, and then each of the bars on the right of that are where the average actually lay.
When you even go into the student level, there were only a handful of students in the more intensive condition and only a handful of students in the business-as-usual condition that even reached those minimum thresholds.
So on average, the amount of services anticipated and aimed for weren't even coming close to happening.
Now, it's important to keep in mind, we did see a significant difference in the amount of services in the more intensive model. Students were receiving significantly more hours of services, but it was still not coming anywhere close to the expected minimums that they were supposed to be targeting.
So you could stop there and say, well, we didn't actually have the opportunity to evaluate the intended dosage.
However, we wanted to know why because these numbers are so drastically different from where they should have been targeting.
What was happening? What was the reasoning behind some of this? And fortunately, our Communities in Schools partner from the beginning was collecting a lot of really rich data from each of the campuses that we were involved in.
We were able to utilize that information and start coming up with some explanations, and those explanations pulled out some of the challenges that these Communities in Schools coordinators were facing, living within the schools.
So keeping in mind the way this program operates is the site coordinator and any of the additional staff are actually sitting within the schools.
They work within the schools every day because that's where the students are.
To support their before school, their after school...
really trying to support students and their families as much as possible.
But we found some interesting things when we looked deeper into this type of contextual information that Communities in Schools was gathering.
First, the amount of turnover in the coordinator and liaison positions, and there were also some tutors they were bringing in to support students: we saw between 33 and 200 percent turnover.
Now, where this is important to keep in mind in terms of potential effectiveness of the Communities in Schools services is the site coordinator is someone in a position who is seen as being a stable, responsible adult mentor in a student's life.
But if the turnover is this great over the course of the year, really a true mentoring relationship can't actually take hold from a reasonable perspective.
So that mentoring connection that...
was striving for may not have been happening.
That could be impacting motivation for students and a variety of outcomes.
But even beyond that, something that was interesting was the Communities in Schools staff were asking these site coordinators every month throughout the entire implementation, what other responsibilities were school staff and administration asking of them? So keeping in mind, the schools that we were in were targeted for Communities in Schools support for particular reasons, because they had a vast majority of students who could benefit from the support and would be eligible to receive the support, and I'll talk in a minute about some of the schools' challenges as well.
So the administrators and staff in the school all of a sudden have one or two other adults within their midst.
They're already struggling with a lot of things they have to do, a lot of tasks they're responsible for, and all of a sudden, there are other people here.
So what was happening is these site coordinators, the parent liaisons, were being asked several times a day, every day to take on other responsibilities to support the administration of the school.
And what that inherently did was take their time away from being able to do the services that they were actually being hired and placed in the school to do.
So while the staff and administration most likely were trying to put out fires, were reacting to something and they needed other responsible adults to support, they, in essence, were maybe potentially hurting themselves by not allowing the coordinators time to focus on those services for students.
So at one point, we could see a range between 300 and more than 5,000 additional asks over the course of a year, and when we think about that, that's several times a day, every day, that you're being pulled away from the direct services to your case managed students.
From the school level perspective, we were in five middle schools in a highly concentrated area where there were several additional demands that were happening at the school level.
We did have a district reorganization in the middle of our study between year two and year three.
One of our middle schools was shut down, and those students were subsumed into a feeder-pattern high school.
We had some administrative turnover, so our Communities in Schools partners had to go back in, in the second year of the study and in the third year of the study, and talk to the new administration, explain what we were doing and how we were doing it.
So there was also this revolving buy-in challenge that we were facing.
There were also district improvement pressures.
So there were a lot of issues.
For some of these schools, they had been identified for district improvement efforts and strategies and focus.
So this really led to competing priorities for administration.
So oftentimes, they were making some choices.
As I said, redirecting some site coordinators to do other things in a variety of ways.
So we just wanted to offer, based on this experience over a number of years, some things that we saw that we'd love to consider for next time, as well as some things that worked really well that we were already collecting to provide some of this context.
First, wherever we're going to be going forward, or if anybody is planning a study: thinking deeply, as much as we can, about the potential risks that may already exist in some of the schools and districts that we're going into, and making some decisions ahead of time to say, are these potential risks that will really change the course of the study or potentially disrupt the study? And if not, if there are risks we believe we can handle and be flexible around, such as the way we do random assignments or something like that, then let's just think about what our potential backup plans might be and how we might move forward with that to make sure.
One of the other things...
as I was saying, the fact that we had as much contextual information as we did, to not only understand the fidelity and dosage but then to start understanding potential contextual stressors that could explain our lack of fidelity, became really critical for us for talking about the context that these students and the Communities in Schools staff were working within.
So thinking right from the beginning of a study, what information, contextual and fidelity, can you be collecting, so that if you run into any of these challenges, you may have more information to share, to discuss, to think about as the end result of the study? Because really, without that information, we feel like we really weren't sure what was going on, and it really helped us tell more of the story of the study and of these schools as well.
And considering the continuum of research...
and what I mean here is thinking carefully, when you're doing randomized controlled trials, about whether you're putting forth an efficacy trial, where you are deeply, deeply hands-on with how everything is going and making sure that fidelity is happening amazingly, and whether your context is really supportive of that, or whether you're in more of an effectiveness sort of trial where it comes to that more real-world context, where we know we'll have massive variability in fidelity. And where is the treatment in its continuum of research understanding? Do you already have some data that say really strongly you know there will be significant effects, if you're in efficacy? Then you can be ready to move to these more real-world contexts.
So that was just a little picture from our example, and we would love to hear everyone's thoughts.
Thank you.
>> Thank you so much, Dr. Decker-Woodrow.
Last, we will hear from NIJ scientists Dr. Mary Carlton, Dr. Nadine Frederique and myself on the National Institute of Justice's Comprehensive School Safety Framework.
>> Thanks, Caleb.
Great.
So I'm going to talk to you real briefly about this framework, and I should say, so many of our panelists this afternoon have talked with a lot of detail about work that they have done to implement various programs and challenges they faced.
And our framework discussion this afternoon is taking what we've learned from research on school safety drawing very heavily on NIJ-funded research through the Comprehensive School Safety Initiative and trying to develop a framework that schools will be able to use to take a comprehensive approach.
So we're at the very earliest of the implementation phase here, probably implementation planning phase based on research.
Nadine? There we go.
Okay.
So real quickly, I'm going to talk about what we'd like you to draw from the presentation, what we're hoping that you're going to understand in terms of what we mean by our comprehensive framework, which has three elements.
So we're really going to spend most of the time focusing on what those three elements are and how we think they can work together or should work together.
Then we'll just kind of bring it all together.
So broadly, sort of the key message, if you will, from the comprehensive approach that we're talking about is the idea that school safety is multifaceted.
It's not just one thing. Moving to improve school safety is complex, and it requires key players coming together from different disciplines, different systems to use research and help all of the stakeholders in the school community focus on improving school safety.
So I talked about the three elements real quickly, and really what I want you to draw from this slide is sort of the movement here and the interconnected nature of it: we would argue that, for the elements of physical safety, where we're thinking about issues like access control and security, the climate, which we heard a little bit about from our presenters just a few moments ago, and this notion of student behavior, we can think about approaches and tactics in each of these three core areas.
We think that attention in each of these three areas will help schools move in the direction of improving school safety in the ways that they desire, and that this represents a comprehensive approach.
So we've already heard about school climates from our other presenters, so I won't spend too much time here, but a couple of key points to make when thinking about school climate is that measures of school climate vary.
So some folks looking at how they're doing in terms of school climate might emphasize things like the support structures that students have, how connected folks feel to the school, both parents and students and teachers, the physical disorder of the schools, and sense of fairness.
Not all measures are necessarily the same, and that might account for some of what appears to be...
what looks like disagreement, if you will, within the literature.
Sort of understanding what school climate is, or "Am I doing a good job or not?": we're starting to really learn from the research, as folks are really digging deep into school climate, what concepts may be more important than others. I won't get to talking about the measures themselves, but I'll say at the end that we provide some direction on where to get a little bit more information on what those measures are, should you be interested in pursuing that.
So why should you care about school climate when it comes to school safety? The evidence is pretty strong that a positive school climate can help keep students safe and promote other outcomes that are desirable for both students and schools, and these are just a few examples of where the research says that school climate, positive school climate, is important.
We see improvements in individual student behavior, such as less engagement in bullying or acting as a bully.
We see improvements in students seeking help and students wanting to come to school.
Then, in terms of kind of other behavior, not just one's individual behavior or misbehavior: if students hear about a threat of violence, a positive school climate can help increase the likelihood that they will report it to a trusted adult at the school.
So a little bit about measuring school climate.
So just some really high-level points to make here: the research suggests that periodic measurement is important to get a baseline of where things are now, particularly if you're interested in making improvements.
And when looking at the results from school climate measures, really two things to draw from, and one is to expect that perceptions will vary on school climate.
You've got lots of different folks in the school who come with lots of different experiences.
We can expect that students, depending on their characteristics or how well they're performing in school, will have different perspectives on school climate.
We can expect that those who have been victimized at the school will have different perspectives on school climate.
So it's important to make sure that different folks are being asked about school climate, and to look and see where those variations exist. It's also important to dig deep and not just look at an overall measure of climate, but to try to understand which areas your school is doing really well on and which areas it isn't doing as well on, because that, of course, points you to where improvements should be made if you're looking to make them.
And we're learning a lot from research about what we can actually do to improve school climate, and a really high level picture here...
There isn't one recipe to getting to a desirable school climate, but we are starting to learn and see that there's strong evidence for certain approaches that can help improve school climate.
And these can overlap with other things that schools are doing to improve school safety.
So for example, we'll talk in a moment about discipline, but having discipline limits that are appropriate, understood, consistent and fair can improve school climate and have other desirable outcomes.
Making sure that there are interventions to meet the social-emotional needs of students can help improve the overall school climate.
A final note on the notion of measurement: another good time to take a step back and look at what your school climate is, is after an intervention has been introduced and has had enough time to really take hold.
So we heard a lot about implementation challenges earlier.
Sometimes, it's hard to get these interventions to take hold, but once that has occurred, or you feel like it's sort of necessary to take the temperature, it's a good time to do that measure.
We still have lots to learn about how to improve school climate and why some of these approaches are effective, but we really are starting to move the needle and advance our understanding.
>> Thank you, Mary.
Effective management of student behavior is yet another interconnected component of improving and maintaining school safety.
NIJ sees management of student behavior as the policies and practices schools have in place for addressing students' mental health and trauma, their behavioral health, as well as the approach to discipline.
A school's climate and physical safety are impacted by its approach to managing behavior, and when trauma and students' mental and behavioral health needs are addressed, students are going to be less prone to disciplinary issues.
So by providing teachers and administrators with evidence-based practices which address student needs, we can reduce the reliance on exclusionary disciplinary practices like suspension and expulsion, which suffer from well-documented racial disparities and can reinforce students' perceptions of unfair disciplinary practices which will adversely impact school climate and can make schools less safe.
To counteract these influences, school policies should prioritize individual-level student needs as well as schoolwide needs to more effectively manage student behavior, so over the next few slides, we'll provide some general guidelines for managing student behavior based on CSSI research.
In order to appropriately address student mental health and trauma, research suggests supporting students through implementing social and emotional learning programs and other mental-health monitoring and support systems.
Addressing mental health and trauma, again, is critical to managing student behavior because students who have experienced trauma have an increased likelihood of acting out at school.
Implementing social emotional learning programs can help students learn to manage their emotions and behavior.
CSSI-funded research also suggests adopting universal mental-health screening so that students in need can be identified for the right kinds of services, though such screening can be problematic because many schools do not necessarily have the resources to address these mental-health needs once identified.
This suggests that adopting mental-health screening really needs to be accompanied with the necessary counselors and support staff to address students' various mental-health needs.
Providing these supports is critical, as mental-health problems and traumatic, adverse childhood experiences that are not addressed have been linked to negative outcomes for students, such as poor academic achievement, behavioral problems, dropping out of school and delinquency.
These negative outcomes, again, can manifest themselves in disruptions in the general school climate and in physical safety.
Behavioral health is a little more broadly defined and can refer to support services for both students and teachers which are not necessarily specific to trauma and mental health, so this includes classroom-management approaches, approaches to bullying and other programs which may impact important student outcomes like academic achievement.
We need, again, to support teachers and provide them with the resources and policies to meet their students' needs and to manage behavior in the classroom.
And so while there are a number of independent kinds of programs and practices which address some of these issues specifically, many of which are highlighted at this conference, most practices can really be adopted within a multi-tiered system of support, or MTSS.
Based on public health models, MTSS provides both broad and individualized interventions, divides supports into different tiers, and provides a framework for implementing both district- and schoolwide changes to programs and policies.
A common example of MTSS used in schools is PBIS, which you have been hearing about in the last couple talks.
Interventions and programs at different tiers may require the involvement of more than just school officials but also parents, community stakeholders and other members of the students' support network.
While implementing programs to support the needs of students and teachers is critical, schools do have a responsibility to respond to undesirable behavior.
Although addressing student trauma and mental and behavioral health prevents a lot of this undesirable behavior, and really the need for a disciplinary response, sometimes there might still be a need. When a disciplinary response is deemed necessary, CSSI research suggests adopting a disciplinary policy which is strict but fair, clearly defined and widely known by both students and parents, so as to have a positive influence on both student behavior and school climate, in part by actually reducing the reliance on exclusionary discipline practices.
CSSI has also funded research on restorative practices or restorative justice practices, which have been demonstrated as a promising alternative to traditional school discipline as restorative practices promote problem-solving between students.
Restorative practices have also been shown to improve school climate and result in a reduction in overall suspension rates, as well as a reduction in the disparities in suspension rates between African-American and white students and between low-income and high-income students, but more research is needed.
>> All right.
Thank you, Caleb.
This part of the presentation, we're going to talk about creating a physically safe school, and when you think about creating a physically safe school with a comprehensive approach, you really need to think about the strategies, interventions and policies that would ensure physical safety.
So the goals of this part of the presentation are to look at how we maintain the physical features of a school, how we manage who enters and exits the school, and how we keep students, staff and their property safe from hazards inside and outside of the school.
To do this, I'm going to focus on four areas.
Though there are many others that we could focus on, we're just going to focus on these four, but I want to emphasize the point that each of these plays a different role depending on the type of school, the location of the school and the different hazards that face different school districts, so even in this session, we've got an urban school district.
We've got rural school districts.
We've got a wide range, and each of them will face their own different hazards.
So technology is important to improving school safety.
There are many technologies that are currently employed.
Here are some examples.
I will point you to some of the resources that we have on the NIJ website. NIJ, through the School Safety Initiative, did work with RAND and a Johns Hopkins institute to do a thorough review of the different technologies that are being employed to improve physical safety, so for more information, I refer you to those reports. But one thing to keep in mind is that technology may have unintended consequences on perceptions of fear and victimization and on school climate.
A perfect example is metal detectors: when metal detectors are used in different jurisdictions, they may increase the perception that the school is not safe, that it's not a welcoming environment, and that really has an impact on school climate, as Mary mentioned before.
Another important factor to consider when thinking about technology is, what are the barriers that may prevent adoption of technology solutions? So for example, in the plenary panel this morning, we heard Kevin Bethel talk about cost, you know, whether or not you want to weigh the trade-off of $1 million for cameras versus implementing other programs that might impact school climate, and those are real trade-offs that a school district has to consider when thinking about implementing technology solutions.
Another really important aspect of maintaining physical safety is tips and tip lines and reporting threats. At this conference, we have the privilege of having Dewey Cornell and some other folks who are doing really innovative work in tips and threat assessment and also tip lines, so I'd also point you to those presentations and those resources on the resource page, but just keep in mind that tip lines are not just a phone line.
They also include texts, e-mails, apps, live interaction, websites.
Safe2Tell, which we've heard about a little bit today so far, is what I would call the OG, or the pioneering tip line, in school safety.
We have several rigorous evaluations of tip lines that are happening all across the country, and my colleague who presented earlier, Mary Carlton, has a forthcoming NIJ Journal article that really talks about tip lines, so I'd also point you to those resources, but I did want to point out that there are some things that we know about successful tip lines, and those are that they demonstrate a coordinated way to receive threats and to respond to the threats in a systematic way.
The successful tip lines that we've seen provide investments in technology.
They provide training for the people who are going to be reporting tips and also for the people who are going to be responding to them, and they also really emphasize student engagement, and I think that's something we really want to focus on in terms of understanding the responsible use of tip lines.
So you have a tip line.
What do you do with the tips? Threat assessment. As Dewey Cornell talked about earlier in this session, it's really important to consider having a really thorough way to respond to threats.
So what is threat assessment? It's a violence-prevention strategy that involves reviewing the threats of acts of violence, determining the seriousness of the threats and developing intervention plans to protect potential victims and address the underlying problems or conflicts that might have stimulated that threatening behavior.
Threat assessment should be used in concert with a considered approach to collecting information; tip lines are one means to report information or assess safety. Tip lines are a helpful way to collect the information, but threat assessment is also a very effective strategy because it can prevent overreactions and underreactions to the threats that are reported.
As Dewey Cornell mentioned in his presentation earlier, there are no disparities among Black, Hispanic or white students in the outcomes of a reported threat; they are not disproportionately likely to receive out-of-school suspensions, school transfers or legal actions as a result of threats. So that's a positive outcome from the threat-assessment literature, one that demonstrates that this considered approach to assessing threats of violence is really important to reducing the sort of disparities in school discipline that we see.
We also know from the threat-assessment literature that there is a no-snitching culture, and it's really important for implementation of threat assessment to consider how you overcome students' resistance to reporting threats. So we funded a project called SOARS, which developed the Advocatr app and paired it with an educational program that normed social behavior: it talked about how reporting a threat is about saving lives, not snitching. Dewey Cornell talked about that in his presentation as well, so I'd refer you to his presentation after this to get some more information on that.
Emergency operations plans are also important guidelines to improving the physical safety of a school.
These guidelines have been developed at the federal level, and we're hoping that they have increased uptake by school districts.
We have pioneered a lot of research on assessing emergency operations plans, and some of those researchers are going to be, again, talking at this conference today, so I won't go into a lot of detail about those projects; I'll just point you to their presentations. But consider that there's a big difference between a good emergency operations plan and a bad one.
I think, from some of the research, we've learned that comprehension of emergency operations plans is not consistent across the school community. People who are on the response team for emergencies might have greater comprehension than, say, other important school staff like the cafeteria staff or custodial staff, but every member of the school community needs to be able to comprehend the plan so they know when to do a lockdown versus when to evacuate the school building, and that needs to be communicated.
I think another thing that we've learned from that research is that having a small plan is better than a large plan, so emergency operations plans can be really complicated, really complex and cumbersome, and that really decreases the ability of staff to comprehend what's happening.
So for more information, again, I'll point you to the resource page.
We've got several videos that talk about our research from emergency operations plans and also the presentations from Josh Hendrix and Suyapa Silvia from RTI who have done a lot of work on this area.
School resource officers, we know, have been a controversial topic for the last 6 months or so when you think about school safety. For the purposes of this presentation, we know that there are calls to remove them from schools and a lot of things like that, but we're not going to tackle that issue here, whether they should be there or not.
Here we'll say that if you do choose to use school resource officers to help improve physical safety in a school, there are some things that you should consider.
The CSSI research suggests that school resource officers are more likely to be involved in formal responses to crime on school property, but they're also really helpful in coordinating the development of good emergency operations plans, conducting threat assessments and helping to respond to issues in schools. But we know there's a consensus that SROs should not be involved in routine school discipline matters. This is a really key point, because when SROs are pulled into routine discipline matters, that's when we see a lot of those really bad videos on TV of school officers doing horrible things, and so one recommendation, one consensus, is that SROs should not be used in these routine school discipline matters.
We also know that training and selection of the right officers is critical.
NIJ has been doing a lot of work looking at the roles, training courses and selection of officers, and we're finding that it's really important to have the right people there and to afford them opportunities for the right training.
So as Kevin Bethel talked about this morning in his work with the school safety officers in Philadelphia, kind of changing their orientation, allowing them to have access to trainings like the NASRO training, the National Association of School Resource Officers training, and looking at things like positive youth development or adolescent brain-development research, and helping school resource officers understand adolescents and respond in different ways than they're normally trained to do as police officers, is really important.
So what are some recommendations when thinking about school resource officers? We really have to note that there is no implementation model that tells schools or districts how to implement school resource officers. So one area of recommendation is that we probably need to move to a place where we can start learning from the research that we funded, develop a model for implementing school-based police officers, and evaluate it rigorously in the field so we can improve the model's implementation, right? What we have now, similar to education generally, where you have thousands of districts across the country, is probably thousands of different models of how school resource officers or school-based police are implemented, so we need to move to a place where we can start to recommend a more evidence-based, informed model of implementation.
So tying it all together, thinking about this framework that we're talking about, I think through this presentation that you'll see that improving school safety is highly complex and involves multiple interrelated concepts, so between Caleb, Mary and myself, we've talked about many different things that we would include in a comprehensive framework to improve school safety.
It's also really important, as our graphic demonstrated, that changes and challenges in one area may impact another area, so understand that discipline practices may have an impact on the school climate if they're perceived as arbitrary or unfair or exclusionary.
It's really also important to consider that stakeholders involved in ensuring school safety all need to work together in a coordinated way.
I think Mary talked about all the different people who are involved in ensuring the safety of our students in our schools, right, from the principal, from school-district leadership to the lunch lady and the custodian.
Everybody is going to have a role to play, and we all need to be working together and talking about whose role is what and how we improve the safety of our students. We also want to make the point that schools that employ comprehensive approaches to addressing violence are not immune from school violence.
However, implementing a comprehensive approach minimizes the chance of serious violence, and it also prepares schools to recover from the effects of violence when it does happen. So that's the conclusion of our presentation on NIJ's school-safety framework, and I'll turn it back over to Caleb.
>> Well, thank you to all of our speakers, and so we have our question and answer section, and so if you have questions for any specific speakers, you can go ahead and submit them there.
It does look like we do have one question already for Ryan.
>> Yeah.
I see that one in there, and Nadine just sort of echoed some of this, but I'm not sure if this is what the questioner is getting at, but those to us are names for the same thing.
Kevin Bethel, who Nadine just mentioned, who leads the school police in the school district of Philadelphia, recently renamed the position to become school safety officers.
They were school resource officers before that, but we're talking about the same role in schools if that makes sense and if that answers the question.
>> Caleb, can I ask a question? >> Please.
>> This is for Lauren.
I was kind of wondering if you could tell us a little bit more about the Communities In Schools program.
Kind of what were the components of that? How was that sort of implemented? Just a little bit more description about that.
>> Sure, absolutely.
So there are different affiliates across, as you saw in the slide, 25 states, and then there's also a national Communities In Schools organization, so while they all do very similar things, there are slight variations across states.
Some affiliates operate within a state.
In some states, for example, the state education agency sits over the affiliates within the state, so there are some different mechanisms and slightly varying restrictions or cutoffs or things like that.
But in essence, what Communities In Schools is doing is trying to in a case-managed way support students and families with a wide range of needs, so often what a Communities In Schools organization will do is, they'll partner with varying school districts, be positioned in various schools, and this can be elementary, middle, high school.
They stretch across age ranges, and then they also identify community partners to pull in to support students and families in a way that doesn't bring an additional cost burden to families. These services look different depending on what's available in the respective communities, but in general, they will partner with a food bank, for example, or with any therapy, mental-health or behavioral-health supports that might be available.
The coordinators themselves will offer a listening ear.
They'll have some training, usually a social-work background, so they can assess and identify needs.
There is some tutoring that's sometimes available, depending on capacity, or Communities In Schools might partner with some other tutoring entities or pull folks in to provide those sorts of supports.
So really what it is, is more of a wraparound service model, and they believe that, by being placed within the school, they're also building a trusting relationship with the student community as well as the administration, so if they have a space in school and a child can easily pop in to get food, for example, it's real easy for them to pop in and out to see the coordinator, so it's not necessarily a known thing to their peers what types of supports or requests they may be getting.
They'll also partner with other entities and do drives for material things like school supplies and backpacks.
Some kids will come in and grab some of the food, and they'll just give them a backpack so they can transport it home, so other friends don't see that they're actually trying to supply food for the family at home.
It's very tailored: there's usually an initial assessment when a child is deemed to qualify for the more case-managed services, if Communities In Schools is present within the school.
They do ongoing assessments as well, so obviously if another need or issue arises, whether it's a disciplinary issue or something going on at home that Communities In Schools is made aware of, they'll try to target services there as well, so the services don't just stop with the child.
They are mainly focused on the student, but if they learn that there's another need at home, for example that multi-tiered counseling supports might be helpful not only for the student but for additional family members, they work to pair the family with additional services that may be outside of Communities In Schools' expertise.
>> It does look like there's another question that came up in the chat, in the Q and A: "So there is a lot of discussion about community school models and other models that talk about the importance of community factors.
How do community factors fit into the model?" Well, I would say that, across the framework, bringing in the community seems to be a consistent recommendation, not only for helping try to manage student behavior but for a number of the initiatives that school safety requires.
>> Yeah, I think, too, when you're a student within a school, within a district, within a community, there's a lot of bidirectionality there as well, so it's definitely important to consider.
>> Caleb, could I ask a question of the group? >> Please.
>> So...
And this is really for everyone and is based on a lot of the thinking that we've done in developing the framework, and it's sort of on this theme of implementation, and so one of the questions that I have heard periodically from folks in schools is, you know, "We want to make improvements on school safety.
Like, where should we begin? You know, what first step should we take?" I'm sort of wondering, for the panelists that talked about the development of the programs and the implementation and the evaluation, knowing what you know now, would you make a particular recommendation for how a school or a school district should start if they want to begin down the path that you have been working on? Knowing what you know now...
What do they say, Monday-morning quarterback? Would you recommend that they do anything differently? >> So I'll jump in with my thoughts.
Obviously, I'm a fan of evidence-based practice.
I think an absolutely crucial aspect is supporting schools in selecting an evidence-based practice, something that has a good fit with their needs and resources and desires, and I think a big emerging area of research in implementation science is understanding this concept of readiness to implement new changes.
I mean, it's not really even new.
I mean, community psychologists like Abe Wandersman have been, you know, talking about that for years, but if a setting is not ready and if they don't have sufficient resources, it's going to be really hard to implement a new program, and then of course that selection of a project or a program that matches well with their community values and ability to implement it is crucial, so, you know, no answer, just I think these are things that we need to be thinking through in helping people select a program or an approach that's appropriate.
>> It seems that a consistent theme is the partnerships that are really kind of required in order to do a lot of this work, and so, again, kind of going off Mary's question, is there, like, a bare-bones kind of partnership network that you would need to have in order to get some of this work going? It kind of seems like you need to, you know, involve the police in some way, and then there are mentions of, you know, social-work help.
Do universities need to be involved? Are there ways to do this without actually having to bring in individuals from higher education? Not that that's necessarily a bad thing, but I'm just trying to think about, you know, implementing this even in those rural neighborhoods as well.
>> Yeah, Mary, if I could jump in as well, I think one thing to consider is, what are the resources that you already have in your community? Even some of these projects, when I talked about implementation challenges, so, like, Ryan, for example, or Lauren, for example, talked about the competing priorities that may come up. So thinking about choosing an evidence-based program, like Lindsey mentioned, but also, what are the resources that you already have to bring to bear, that you can pull in from the community, and how are those going to work together to promote a comprehensive plan, I think, is really important, recognizing where the risks from competing priorities and competing stakeholders are and helping them to get on the same page.
>> Yeah, I think that's important, Nadine, and to sort of tie in a little bit of what Lindsey said and bring some of these implementation-science outcomes into that, which I'm doing in some slightly related work off of this: schools in some cases will, say, have lots of resources and have programs and interventions.
It's obviously an important question to interrogate those in terms of their evidence base, but then there are those implementation-science outcomes that come in that say, "Okay.
These things have existed here, right? People have been trained in them.
They've sort of been circulating around in your building.
Have we taken the time to really figure out where the sort of implementation origin lies and starting at the very beginning, which is, is this an appropriate fit for this building and for your main concerns that you are trying to address?" And really, that is a new way of thinking in my experience and coming into schools and working with administrators who want to say, "We need to do this.
It's important for our students.
Help me think about and help me come up with implementation strategies to get the people in my building to do this," and to try to sort of walk them back a little bit to say, "Let's look at what you have here, and let's think about the context that you're in, the issues that you're trying to address, and then let's really sort of forefront the implementers, the teachers, right, the people that you're asking to do this.
Are they in line with your approach here? Are they in line and on board with these interventions? Because if the buck is stopping right there, then that's where we need to start, right, is helping people understand or at least have input into, what are the interventions? What are the issues that we want to address before we jump to, why aren't they doing it, right, or how do we get them to do it with more consistency?" And so I just think that helping schools and school districts sort of back up to that level of interrogation into how things are going in their buildings kind of goes a long way in sort of figuring out ways forward.
>> Well, we're coming right up on 5 o'clock now.
Thank you so much to all of the presenters for taking the time to have this wonderful conversation.
And if no one has anything else, I can go ahead and let everybody go and invite everyone to come back tomorrow for day two of the conference, starting at 11 a.m.
>> Thank you.
Disclaimer:
Opinions or points of view expressed in these recordings represent those of the speakers and do not necessarily represent the official position or policies of the U.S. Department of Justice. Any commercial products and manufacturers discussed in these recordings are presented for informational purposes only and do not constitute product approval or endorsement by the U.S. Department of Justice.