Addressing Student Mental Health Concerns - Breakout Session, NIJ Virtual Conference on School Safety
On February 16-18, 2021, the National Institute of Justice hosted the Virtual Conference on School Safety: Bridging Research to Practice to Safeguard Our Schools. This video includes the following presentations:
Interconnecting PBIS and School Mental Health to Improve School Safety: A Randomized Trial, Mark Weist and Joni Splett
There is a pressing need for school safety efforts to move beyond physical hardening and reactive disciplinary strategies to advance prevention-oriented, comprehensive strategies that systematically address the underlying causes of common behavioral problems and infrequent incidents of violence. The Interconnected Systems Framework (ISF) is a structure and process for promoting all students' positive social, emotional, behavioral, and academic development while collaborating within schools to provide early access to a continuum of interconnected services when more supports are needed. In a randomized controlled trial funded by the National Institute of Justice, the ISF was implemented in elementary schools in two southern states, and outcomes were compared to those of schools implementing Positive Behavioral Interventions and Supports (PBIS) with and without co-located, but not interconnected, mental health services. Preliminary results suggest promising outcomes, with more students in ISF schools being proactively referred for Tier 2 and 3 interventions, more interventions provided, and decreases in in-school suspensions and office discipline referrals, including evidence for reducing inequities. In addition, student and teacher surveys documented higher respect, engagement, and perceptions of safety among students in ISF schools, as well as lower ratings of student acting-out behavior. These results demonstrate the promise of the ISF for improving student well-being and reducing threats to school safety.
Project SECURE: A Multi-tiered Approach to Supporting Students Exposed to Trauma, Carl Sumi
In 2016 SRI International received a grant from NIJ to implement Project SECURE. The overarching goal of Project SECURE is to evaluate the impact of a multi-tiered evidence-based framework to strengthen the resilience of students who are the most vulnerable to disciplinary exclusion, gang involvement, and trauma. In this presentation authors will provide a brief overview of Project SECURE, the two interventions being studied (Second Step and Bounce Back), the screening process, and preliminary results from the study.
School Safety and School-Based Mental Health Services in a Large Metropolitan School District, Anna Yaros and James Trudeau
Evidence-based secondary and tertiary mental health programs in schools have the potential to impact an entire school population by reducing aggression and victimization and improving overall climate for students and staff (Ballard et al., 2014). RTI International partnered with Charlotte Mecklenburg Schools (CMS) to study school safety using a school-randomized controlled trial (RCT) of three types of school-based mental health (SBMH) services and a quasi-experimental study that compared each of the three SBMH arms to a set of propensity score-matched, nonrandomized, non-SBMH comparison schools (n = 34 schools). Findings from staff surveys, student surveys, and administrative data did not show reliably improved school safety between treatment arms. Examination of implementation levels suggested that variability within treatment arm in levels of SBMH received by students predicted staff report, and to a lesser degree student report, of increased school safety. Specifically, the percentages of students seen for services by a school psychologist, school counselor, or SBMH therapist were related to increased feelings of safety and fewer unsafe incidents. A cost effectiveness analysis revealed that two levels of increased SBMH services were both more costly and more effective than SBMH treatment as usual. Implications for SBMH service provision will be discussed.
Promoting School Safety: A Comprehensive Emotional and Behavioral Health Model, Jill Bohnenkamp and Cindy Schaeffer
Schools across the nation are working to formulate comprehensive, evidence-based crisis response and prevention interventions to address student emotional and behavioral health crises. This presentation will describe the implementation of a multi-tiered comprehensive preventive intervention to promote school safety and research findings from a two-year randomized controlled trial of the intervention with 40 schools spanning elementary, middle and high school levels funded through the National Institute of Justice’s Comprehensive School Safety Initiative. The study evaluated the impact of the Emotional and Behavioral Health Crisis Response and Prevention (EBH-CRP) intervention on school safety outcomes, emotional and behavioral health service utilization and quality outcomes, educator knowledge and preparedness to address emotional and behavioral health concerns, and a cost-benefit analysis. Overall study findings will be presented including key findings highlighting that schools implementing the EBH-CRP model are minimizing the use of disciplinary procedures including suspensions, office discipline referrals and juvenile justice referrals as well as being more likely to respond with a therapeutic approach.
>> Good afternoon.
Welcome to the Comprehensive School Safety Initiative Conference.
I'm so glad you're with us here today.
Welcome to today's breakout session on addressing student mental health concerns.
My name is Barbara Tatem Kelley, and I serve as the social science research analyst at the National Institute of Justice, and my primary interests are in adolescent development and its relationship to victimization and trauma exposure, delinquency prevention, juvenile justice system improvement, mental health, youth gangs and education.
I have some background in clinical psychology and previous experience teaching a self-contained classroom of delightful sixth grade students with emotional disabilities.
I'm very interested in hearing more about the approaches that you've studied, looking at issues of violence, both outward- and inward-directed aggression and impulsivity, among students with mental and behavioral health concerns.
In this session, we will have four presentations, and then with the remaining time at the end of the session, we will address questions raised from the attendees.
Please remember to enter your questions into the Q and A section of the screen so that we can view them at the end of the session for answers.
Now I'm going to introduce each of our presenters.
Our first presentation is by Mark Weist and Joni Splett, who will share findings from implementation of the interconnected systems framework, which shows promise for improving student well-being and reducing threats to school safety by offering students both school-based strategies and access to a continuum of interconnected services and supports.
With that, I turn it over to Mark and Joni.
>> Thank you, Barbara.
It's a pleasure to be with you.
Good morning or afternoon depending on where you are in the US.
I'll open with a brief story.
Carl Sumi, one of our panelists, had prior experience with these grants and told me that if we don't hear in September, we don't get the grant, so I was flying to Connecticut, 5 P.M.
September 30th, no word.
We didn't get it.
My plane landed.
We got the word.
I called Joni.
We were freaking out.
So we got the word 5 1/2 years ago, and 5 1/2 years later, here we are, and what a privilege.
Thank you so much to NIJ and to the CSSI initiative.
It's really great to be a part of it.
So Joni and I will kind of go back and forth in presenting today, and Colleen Halliday, who is a coinvestigator and key person on this project, is not able to be with us but is also one of the authors.
So for at least about 12 years, we have these two major national initiatives.
One is Positive Behavioral Interventions and Supports, which reflects well-done multitiered systems of support in schools.
Tier I, promotion and prevention; Tier II, early intervention; Tier III, more intensive intervention, with a focus on data-based decision making, teams, installing and refining evidence-based practices, and doing the best possible work with data, systems and practices in schools to promote students' positive social, emotional, behavioral and academic functioning.
Then we also have this movement around bringing the mental health system into schools, because youth and families tend not to connect to specialty mental health, and what we've found over those past 12 years is that these two initiatives are highly complementary.
For example, in PBIS, it's a great system for implementation but often lacks depth at Tiers II, early intervention, and III, more intensive intervention.
School mental health brings a more comprehensive approach, augmenting the work of the school-employed staff: the counselors, psychologists and social workers.
Now we also have clinicians from the mental health system joining and adding strength, but typically at Tier III, and the downside to school mental health has been that it's been a colocated model, so that clinicians come into schools and people don't even really know what they're doing.
So the two models are complementary, and so for about 12 years, national centers for PBIS and for school mental health have been working on this complementary framework called the Interconnected Systems Framework for School Mental Health and PBIS.
Over to Joni.
>> So in our project testing the effects of the ISF, these are the ways in which the ISF enhances the core features of a multitiered system, some of which Mark already mentioned: having effective teams that include community mental health providers, augmenting the mental health resources of the school by including them on school teams; expanding data-based decision making to go beyond discipline referrals; formalizing a process for selecting and implementing evidence-based practices that are inclusive of that behavioral lens all the way through the full mental health continuum; ensuring that students have early access to those practices and interventions through screening and rigorous progress monitoring; and, critically important, ongoing coaching in both systems, educational and mental health, with a shared district-community leadership team that supports that ongoing dual-system coaching practice model.
In I want to say 2014, Mark, 2013? >> 2013.
>> 2013, yeah, this was when it was all first really formalized, in a monograph available on pbis.org as well as the Center for School Mental Health's website, really operationalizing the conceptualization of what an interconnected systems framework would look like, and then most recently, volume two of this was released.
It's a monograph as well or an e-book, about six chapters.
It really covers what it looks like and what it feels like to implement the ISF, all the way from initial considerations, exploration and adoption through installation, initial implementation and sustainability, and a lot of that description of what it looks like and feels like to implement the ISF comes from the project we're sharing today.
So there are examples and guidance coming directly out of this project.
>> So, again, here's the formal title of the study and information on when it was awarded, along with the disclaimer, and we used the informal name Project About School Safety, PASS, on our website, et cetera, which really helped with the interface with students, families and schools, and if you could go to the next one.
So this was a randomized control trial.
You can see the partners: University of South Carolina, University of Florida, Medical University of South Carolina and other partners.
We had 24 schools total, 12 elementary schools in Charleston, 12 elementary schools in Marion, which is near Gainesville, and three conditions.
So within each site, four schools were randomly assigned -- all 24 schools, by the way, were implementing PBIS at some level of fidelity at the start of the study -- and then at each site, four schools were randomly assigned to just stay with PBIS, to add clinicians from the mental health system in the typical colocated fashion, which we'll call business as usual, or to the Interconnected Systems Framework.
So across the two districts, we had eight, eight and eight in each of the three conditions, and we had two years of intervention versus comparison.
We were also able to have an opt-out procedure on this, which was phenomenal.
Really reduced the need for infrastructure, and this is essentially a no-risk study, and it fell under the education exemption as a systems intervention.
So we ended up with academic record data for about 31,000 students, which is phenomenal.
And again, we're integrating a more comprehensive approach into the core of PBIS data, systems and practices, and we're looking at the effect of the ISF compared to PBIS, or PBIS plus school mental health, on discipline, perceptions of school safety, student behavioral functioning, and then, on a more proximal level, the functioning of teams, students' access to treatment, quality of services and cost effectiveness.
>> Mark is going to talk about some of our outcomes that I think are hot off the presses, but first I'll describe how we measured fidelity. We talked about what we implemented; we measured fidelity via the ISF Implementation Inventory, which is a tool to support schools and their community partners in installing and implementing the ISF.
It can give you a baseline measure across the three tiers, can be used to monitor progress, and has associated forms to support action planning.
In a separate study, Mark and I have developed and validated the ISF II, the ISF Implementation Inventory, so you can see just some of the preliminary validation results there, and there's a newly released version three that is available by contacting us.
In this study specifically, I think an important strength is that most schools in both districts were able to reach the generally accepted 80 percent cutoff score in all three tiers by the end of the study, so you can see South Carolina, the Charleston district, on top.
The Florida district, Marion County, is on the bottom, and across all three tiers you see growth in implementation and progress toward meeting that 80 percent cutoff.
Of course you do also see when you break it down by schools, that we did have, not surprisingly, some schools implementing better than others.
So when you break it down by schools within each site, we see that three of the four ISF schools in South Carolina met 80 percent or above as their final fidelity implementation score by the end of the study.
You see a similar thing happening in Florida, where three of the four schools met that implementation cutoff, and so moving forward, some of our analyses will use both the full intent-to-treat sample and a smaller subsample of schools that actually implemented with fidelity.
>> So of course our final report was due on New Year's Eve, and of course we were working on it on New Year's Eve.
It was 1:30 I think when we finally finished it.
These are the authors.
This is a complicated study, you know, with probably more than 50 people involved.
These are the authors on the report.
I'll go to the next one.
And then acknowledgements with particular thanks to Cathy Gerard, our project officer, and Mary Carlton, our social science analyst, and also the youth and families and numerous advisors from the school mental health world but also from the PBIS world along with our coaches who deserve some thanks.
So we were excited to see these findings.
So in terms of proximal findings in ISF schools, we see school administration more involved with the MTSS team, and again, strengthening that MTSS team is a critical quality indicator.
Jeremy has written extensively about that.
We see more school administrators and clinical staff, such as school psychologists, the clinician from the mental health center actually participating in team meetings, and we saw more discussion of school wide issues and planning to improve Tier I programming in ISF schools, so interestingly in these schools, we actually had strengthening of the core PBIS elements over time.
We saw better utilization of screening data, so more students proactively identified with needs and referred for Tier II and Tier III interventions, more Tier II and III interventions actually provided to students, and for youth of color, we saw a trend of more interventions for them over time, so we know the inequities are not just around youth of color receiving more discipline, but they're also about youth of color also not receiving the needed interventions for them, so we were pleased to see that.
We're also very pleased to see these outcome findings.
In ISF schools, we saw reduced in-school suspensions for the overall sample, reduced office discipline referrals for the overall sample and reduced out-of-school suspensions and ODRs for students of color, so more evidence of the ISF reducing inequities.
We also had some trends in terms of student and teacher surveys: more respect conveyed by students in the ISF schools towards other students, greater student engagement, decreased externalizing behavior and increased perceptions that schools were safe, but these were trends approaching significance, not as strong as the prior findings.
Then we did not see any difference in terms of satisfaction with services, exposure to violence and bullying or teacher reports of aggressive behavior.
>> More recently we've leveraged some of these results and tried to extend them into a next-step study, thinking about how the ISF reduces some of the inequities in the school environment, reduces over-referral for discipline and externalizing behavior and the disproportionate representation of African American students, and so Project RISE is funded by the National Institutes of Health.
Colleen Halliday, who Mark mentioned is a coinvestigator on our NIJ project, is the PI of Project RISE, Reducing Inequities in the School Environment.
Project RISE is happening in Florida, where we're taking the ISF framework and enhancing some of the Tier I practices for reducing unintentional biases and prejudice among teachers and students, and then creating accountability and progress monitoring systems for that using data disaggregation, a discipline data strategy and more diverse teaming practices.
So through that, we hypothesize further reduction of disproportionality and improvement of the school climate.
So that project is in its second year, and probably like most studies happening now, has some interesting challenges in a COVID-riddled environment, so we'll start with a new cohort in August and hopefully be able to overcome some of the first cohort's challenges and see how the ISF continues to impact that kind of important indicator of equitable outcomes.
We've also put a lot of the resources that have emerged from this project and from our study under the mental health/social-emotional well-being tab at pbis.org and on the Midwest PBIS site, so if you go to Midwest PBIS or pbis.org, you can access technical assistance briefs, recorded webinars and presentations, all on the ISF and many of them coming from this project, including one I'm supposed to be editing right now.
And there's some websites for further information to learn more about ISF.
I think with that, too, we give our thanks and appreciation to everyone involved and pass it over.
>> Thank you.
>> Thank you very much, Joni and Mark.
I really appreciated hearing from you.
I'm sure everyone viewing did as well.
Our second presentation is by Carl Sumi, and he's going to discuss Project SECURE, which is a multi-tiered approach to strengthen the resilience of students exposed to trauma and considered most vulnerable to disciplinary exclusion and gang involvement.
So with that, Carl, I turn it over to you.
>> Thanks, Barbara, and thank you, Joni and Mark.
Mark did tell me he was going to tell a little story about me, and I got a little nervous, and I did threaten him just slightly, so hopefully that story was changed for one that didn't really matter that much, but again, my name is Carl Sumi, and I'll be talking about Project SECURE, which was also funded, I believe, in 2016, with my colleague Michelle Woodbridge as co-principal investigator.
This is a very nice progression of presentations, because our Project SECURE is actually a study of several programs that could be implemented within an ISF and PBIS framework, so we have a closer examination of a couple of programs that I'll talk about if I can get my screen moving here.
So I'm just going to outline the presentation.
I'll just be providing a short overview of Project SECURE, talk a little bit about screening elementary students for traumatic stress and then some preliminary results of the study.
So, very similar to most other projects in the NIJ Comprehensive School Safety Initiative, the purpose of our program was to really take a look at what would happen if we investigated different interventions at the different tiers.
So we wanted to investigate interventions delivered at Tier I, Tier II and Tier III and see how they would interact with each other to help support students most vulnerable to disciplinary exclusion and affected by trauma.
We implemented Project SECURE in elementary schools in an urban district, and what we did was have the schools implement a universal, or Tier I, social-emotional learning program called Second Step, as well as a targeted intervention for children exposed to trauma, a program called Bounce Back, and we also had a tertiary level program that we investigated, which I'll talk about in a little bit.
But the goal here was to do a rigorous randomized controlled trial of the implementation of these interventions at their different tiers and see how the results could possibly interact.
The majority of the program, very similar to other Comprehensive School Safety Initiative projects, was implemented by school staff.
In this case, the majority of the program was implemented by school social workers; in this district, we were very lucky to have a dedicated school social worker at each school. At the Tier I level, teachers were implementing; at the Tier II level, school social workers were working with students, and at the Tier III level as well.
So looking at the tiered approach, as I mentioned, Second Step was the basic, Tier I intervention used for all students in the elementary school.
People might be familiar with the program; all students in grades K-5 would be receiving the Second Step social-emotional learning program.
And Bounce Back was the secondary, or Tier II, intervention for children who were exposed to trauma, and at the top of the triangle, for students with the most needs, we investigated a program called the City-wide Student Assistance Program.
So, again, Second Step is a universal prevention program for children in grades K-5.
People might be very familiar with the Second Step social-emotional learning program.
It has a pretty good evidence base for its effectiveness.
It has 22 lessons implemented in the classroom by the teacher.
There's also a bully prevention unit that has lessons for recognizing and reporting bullying.
This was implemented in the teachers' classrooms in the schools that were implementing Second Step, and I'll get to the research design in just a minute, but what's a little bit unique in this project is that the teachers also received support, coaching and consultation from a district-level staff person called a teacher on special assignment. These teachers on special assignment would support about five schools at a time, and they would be doing one-to-one work or modeling Second Step lessons, problem solving with teachers if they had any issues, and also presenting the whole Second Step curriculum to the school staff and providing any support that the teachers might need.
The secondary or Tier II level intervention is called Bounce Back.
It's an intervention or a program for children exposed to trauma or experiencing significant traumatic stress.
This program was implemented by the school social workers, who are licensed social workers, often with education or background in delivering evidence-based programs, including programs with a cognitive behavioral therapy structure similar to Bounce Back.
It's a group CBT intervention for young children.
It's designed for up to seven student participants, but what we learned is that about four to six was the ideal number of students.
So there are 10 group sessions, about once a week, and then about two to three individual sessions with the students as well, plus parent groups, and it's a very typical cognitive behavioral therapy approach using relaxation, CBT techniques to reduce negative thoughts, exposure, and social problem solving.
The social workers who implemented it also participated in a weekly supervision group and received guidance on their implementation from what they call mentor school social workers at the district level.
I'm not sure if people are familiar with the Cognitive Behavioral Intervention for Trauma in Schools, or CBITS.
That was developed for children in the middle school level.
It's also been used at high school.
The Bounce Back program was developed by the same folks who developed CBITS, and they modified the curriculum to be appropriate for students in elementary schools.
Just a little bit more information about Bounce Back.
You can go to bouncebackprogram.org.
You can register on there for free.
They do ask for an e-mail address, but it's really just to track usage.
It's not to send you crazy e-mails down the line.
This Bounce Back website is extremely useful for people who are implementing.
There's a lot of materials and forms on there.
There are video clips that presenters or clinicians or implementers can use, and then what's really great is this online forum for experts and colleagues to get together and problem-solve their implementation.
And then at the tertiary level, the top of the triangle, the school district that we worked in had a program that has gone through several name changes over time but really was for the students who had the highest level of need, and what they were doing was pulling together people from across the district and across the city in different types of programs: school health programs, the department of health, pupil services within the school district, and also probation.
They had a lot of people at the table because they recognize that these students had the highest level of need and were often involved in lots of different agencies, and so they would come together once a week to talk about how they could help these students and pool their resources.
So our research goal was really to run a randomized controlled trial to evaluate the efficacy of Second Step and Bounce Back, and the central question was: if you're implementing an evidence-based Tier I program like Second Step, does that provide additional support to Tier II interventions like Bounce Back? We wanted to see if there were interaction effects between these two interventions, and so we developed a randomized controlled trial to try to test that out.
So the study design -- this is actually where we get a little bit of a hard time from our staff, because we got really excited about this study design, because technically it's what would be called a four-arm randomized controlled trial.
At its conclusion, we'll have had 36 elementary schools over the course of 4 years, and each year, the process is the same.
Each year, if you just think about 10 schools, half of those schools will be randomized to implement Second Step and half will do what they normally do, so five schools implementing Second Step and five schools doing business as usual.
But within each of the schools, whether Second Step schools or business-as-usual schools, we're actually running a randomized controlled trial of the Bounce Back intervention.
So in every participating school, they're actually screening elementary school students in fourth and fifth grade for exposure to trauma, consenting them, and randomizing students to an immediate treatment group or a waitlist group for Bounce Back.
And what's happening here is that what you're going to end up getting are some students who are receiving Second Step in their school and also randomized to the Bounce Back condition, some students receiving Second Step but randomized to a waitlist condition, some students not receiving Second Step but receiving Bounce Back, and then some students not receiving Second Step and in a waitlist group for Bounce Back.
So kind of a pretty exciting design that a lot of researchers, evaluators get a little excited about.
As you can see, I did definitely for sure.
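The school-level randomization just described, where each cohort of 10 schools is split evenly between Second Step and business as usual, can be sketched in a few lines. This is an illustrative sketch only; the function name, labels, and use of a seeded random generator are assumptions, not the study's actual assignment procedure.

```python
import random

def randomize_schools(schools, seed=None):
    """Randomly assign half of the schools to Second Step and the rest
    to business as usual. Hypothetical sketch, not the study's code."""
    rng = random.Random(seed)
    # Shuffle all schools, then split the list down the middle.
    shuffled = rng.sample(schools, k=len(schools))
    half = len(schools) // 2
    return {
        "second_step": shuffled[:half],
        "business_as_usual": shuffled[half:],
    }

# Hypothetical cohort of 10 schools, as in the presenter's example.
schools = [f"School {i}" for i in range(1, 11)]
arms = randomize_schools(schools, seed=1)
assert len(arms["second_step"]) == 5
assert len(arms["business_as_usual"]) == 5
```

Crossing this school-level split with the student-level Bounce Back randomization inside every school is what yields the four arms of the design.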
Some of the data we're collecting for Second Step: we're doing implementation surveys with teachers about how much time they spend implementing, whether their lessons are delivered, and their perceptions of climate and students' social-emotional skills.
That's just for the implementation of Second Step.
The school district that we're working in actually conducts a school culture and climate survey every spring, so we have access to those climate surveys in every school that is participating.
It's completed by teachers, parents and students in grades four through five, and it's a typical school climate survey asking questions about safety, engagement and interpersonal relationships, and then we're also collecting school record data on absences, office discipline referrals and suspensions, and we're trying to obtain academic state test scores as well.
So those are the outcome evaluation data for the Second Step intervention.
For Bounce Back, we're also collecting implementation fidelity data, obviously.
We're also collecting some surveys from students.
We're only doing Bounce Back in grades four through five even though it's been validated for K-5.
We wanted to focus only on fourth and fifth grade students because those are the students we would also have the Tier I data from.
We're collecting the Screen for Child Anxiety Related Emotional Disorders.
Yes, it's actually called SCARED, and a social adjustment scale that was completed by the students, and we're also collecting some parent surveys as well.
We're collecting the data at baseline before randomization and then again at post-test for all the students involved in the Bounce Back.
So one of the questions that we get often when we present on Project SECURE is, how do you even screen students in elementary school for exposure to trauma? And a short response is, it's really, really hard.
We've done hour-long presentations just on this process of screening elementary school students, and I'm going to get it to you guys in two slides today. So basically, the traditional screening would be a large-scale screening where you have students complete a self-report checklist or survey and then follow up with them, or, what's happened most often in elementary schools before, a really comprehensive one-on-one interview with the student and the parent.
Now, you can imagine that can take a really, really long time.
So what we developed through this project, with the systems of the school district, is what they started calling the classroom SST screening -- classroom student support team screening -- and through this process, they would identify students who have academic or behavioral needs or are affected by traumatic stress.
And the process is facilitated by a school social worker who would take a fourth-grade classroom and work with the teacher to review the data and the programs, services or needs of all the students within that teacher's classroom. They review the cumulative files, test scores, IEP status, attendance, health needs and discipline referrals -- anything that the teacher or social worker knew about each student.
They could also invite other people to these meetings -- an administrator, other school support staff who know the student -- but, basically, it's a records review where you have all the information about each student within a classroom. We did that the first year, and we started to recognize that, as was to be expected, students with internalizing behaviors were being left out.
Teachers were not identifying them, or they were overidentifying students with externalizing types of behaviors, so in the second year we implemented a process to help teachers identify students with internalizing behaviors by including some steps from the Systematic Screening for Behavior Disorders: teachers would nominate five students with internalizing behavior problems and five students with externalizing behavior problems, and then complete, for those 10 students, modified screening items from the Adverse Childhood Experiences (ACEs) to indicate any known traumatic experiences.
And so what we ended up doing was, if there was at least one symptom from the checklist around traumatic stress, at least one event from the modified ACEs and at least one academic, attendance or behavioral concern, the student would be considered eligible for the Bounce Back group, and that was the screening process.
Again, it was a lot more complicated and involved than that, but that was the way we figured that, with the classroom SST, the review of the data and the identification of student needs, we could determine whether students were affected by trauma severely enough to be eligible for the Bounce Back intervention.
This would take about 60 minutes per class. In previous research, we were finding that it was taking 30 to 45 minutes per student when we were doing individual interviews, and that obviously takes way too long. So that's the screening, and as a reminder of the intervention, we have Second Step at tier one, Bounce Back at tier two and then the citywide student assistance program at tier three.
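The three-part eligibility rule Carl describes can be sketched as a simple conjunctive check. This is purely an illustrative sketch; the function and parameter names are our own, not part of Project SECURE's actual materials:

```python
# Illustrative sketch of the eligibility rule described above: a student is
# considered eligible for the Bounce Back group only if all three criteria
# are met -- at least one traumatic stress symptom from the checklist, at
# least one event from the modified ACEs items, and at least one academic,
# attendance, or behavioral concern.

def bounce_back_eligible(symptoms: int, ace_events: int, concerns: int) -> bool:
    """Return True if the student meets all three screening criteria."""
    return symptoms >= 1 and ace_events >= 1 and concerns >= 1

# One trauma symptom and two ACE events, but no school concern: not eligible.
print(bounce_back_eligible(symptoms=1, ace_events=2, concerns=0))  # False
print(bounce_back_eligible(symptoms=1, ace_events=1, concerns=1))  # True
```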
Some preliminary results for the screening, cohorts one, two and three, we just started cohort four during COVID.
We'll see how that works out, but our overall eligibility rate for students is around 22 percent, with very wide ranges, and then the level of students consenting to participate in the Bounce Back intervention was pretty high.
Jumping back to the model again, just to remind you why we're so excited about it: one of the things we'll be able to do is run a trial of Bounce Back and see if there are interaction effects with the implementation of Second Step at the universal or tier one level. We don't have the data for Second Step yet because we are pulling the data from the district. Last year, when we went to pull the data in March of 2020 -- I can't remember if anything else was going on at that time -- the districts were really overwhelmed with something happening in the world, so we're still going back to the district now to get some of those data. The data I'm going to show you now is just from the Bounce Back trial.
This is the screen for child anxiety-related emotional disorders.
This shows the treatment group, the intervention group, and on the left, you can see the pretest and the posttest.
The red line across is the clinical cutoff; scores above that line would indicate clinical concern.
It looks like there's a little bit of a change here, but that was not significant on the full SCARED scale, with a main effect at a 0.12 p value. When we look at the subscales, though, we did have a significant effect on separation anxiety at the 0.01 level with an effect size of 0.33, which was pretty strong.
The interaction with the social-emotional learning Second Step program was at a p of 0.29.
That was just for the first two cohorts, so we don't really have anything going on there just yet.
Social adjustment scale, we didn't have any significant differences either at the full scale or in the subscales.
Although it was around a 0.08, we still weren't significant there, so just a quick summary of our findings.
The students in the Bounce Back group did better on the total score and the separation anxiety subscale than the students in the comparison group.
They rated themselves basically as exhibiting less overall anxiety and separation anxiety than their peers who received typical services, and these findings are consistent with previous studies of Bounce Back.
We didn't have any significant outcomes on the social adjustment scale, and we didn't have any interaction effects just yet, which means outcomes didn't depend on whether students were in a school that was implementing Second Step or not. As for final steps, right now we're in our final cohort year.
We'll be pulling in the climate survey data, which we don't have just yet, and additional extant data. Finally, before I conclude, this is the student behavior blog, a website where we post preliminary results from a lot of our studies and also information about evidence-based approaches to supporting students' positive behavior, mental health and well-being.
I encourage folks to go to the student behavior blog website, studentbehaviorblog.org, to learn more, and there's my e-mail address for folks who want to contact me individually. I'm sorry for going a minute or two over, and I'll turn it over to the next presenter.
>> Thank you so much, Carl.
I really appreciated learning about Bounce Back and all the work you've been doing there.
Our next presentation is from Anna Yaros and James Trudeau, who are going to discuss the complexity of their findings from the implementation of three types of school-based mental health services, so welcome to Anna and Jim.
>> Thanks so much, Barbara.
You were right in the intro about complexities, so we're going to try to make it quick and move through everything that we did.
I don't have a story about Carl, but I will start with a bit of connection to Mark.
We also were expecting to hear about this grant on September 31st -- I'm sorry, 30th -- and we really didn't get the information until October 1st, so we thought there was no chance. So we were really excited, and as you can see, we have engaged quite a number of folks at RTI International, where Jim and I work.
So this project, as I mentioned, was funded in 2015 as part of the Comprehensive School Safety Initiative and began in 2016. If you were following the solicitation at that time, it involved categories, and this category was developing knowledge, so it really was designed to be a research project to generate information and knowledge about what works.
And we partnered with Charlotte-Mecklenburg Schools, and the partnership really started from the beginning with trying to understand what they needed and wanted for their students and their safety and then going from there, and certainly what they turned to was mental health and support for mental health.
Sorry, just one second.
So I'm just going to briefly go over the link between school-based mental health and school safety.
I'm sure we've already heard something about that, and I'm sure you feel well versed in that so far.
Then the study design, which is complex -- not as cool as Carl's, but an interesting, complex design; the service level results, where I'll talk a little bit about our implementation; our staff and student survey results in our outcome evaluation; the cost-effectiveness evaluation, an element we thought was particularly interesting and important in its implications for real schools interested in this type of work; and then a mixed methods process evaluation to unpack some of what we found, ending with discussion and implications.
So school-based mental health is defined in our study very similarly to what some folks have already mentioned and discussed: in Charlotte, there were contract providers who were really embedded in schools and provided therapy services, billing Medicaid and private insurance and sometimes funded by state and local district funds.
But those folks were part of a larger system that was supported by CMS clinicians, so school psychologists, school social workers and school counselors in North Carolina have a specific license to practice in schools, and each of these three types of providers were also included in our study, and I'll explain how in the next slide.
Well, I'll explain how in a few slides, apparently, so just wanted to give you a little bit of background and some of the underlying theory about why we chose to do what we did.
We really were trying to make an impact on broad-based school safety starting with a small number of students -- starting at the top of the pyramid with the tier three and then some of the tier two kids. We based our idea on the concept that some of the worst school violence or most disruptive school safety issues are perpetrated by a small number of students, and those students typically would fall into your tier three or tier two categories in the MTSS or PBIS structure, and many of them may receive special education services.
They may receive mental health services, or their needs may not be met, and so we went with the premise that addressing their needs can improve their behavior and thereby also improve the climate for everyone, and that's based on some research to that effect.
We know that school-based mental health -- embedding therapy inside the school -- can significantly improve behavioral outcomes for those students, can also lead to overall school safety and school climate improvement and can reduce suspensions.
We also know academic outcomes, which are incredibly important to us but also to our school partners, are empirically related to providing these mental health supports in schools. So our study really engaged all of the middle schools to a degree in -- Whoops, sorry -- in Charlotte-Mecklenburg Schools, so we had both a nonexperimental and an experimental arm, and I'll start on the right, actually, with our experimental arm.
So what we didn't want to do was pull services out of any school that was already receiving school-based mental health services, and so we really benefited from the infrastructure that was already there, in 25 schools, both six through eight and K through eight schools. Schools that already had this school-based mental health program, which was just the embedded therapists in the school, were randomly assigned.
And we ended up with three treatment groups, down there at the bottom on the right: eight enhanced treatment schools, eight expanded treatment schools and nine treatment as usual schools, and I'll talk about what that means in a second. What I also wanted to mention is that, with a clear understanding that we didn't want to randomize away or pull away services that were already in place for many of these students or schools, we wanted to create as good a control as we could.
So we have a nonexperimental arm on the left, which is our comparison arm, whereby we used the 25 schools that did not have school-based mental health and therefore were not as in need of or as affected by potential need for mental health services, and we matched them as best we could using propensity scores to our three other conditions.
So we ended up with nine of those schools, and, again, they're matched based on propensity scores because they are necessarily by design not starting at the same place of mental health need as our experimental arm on the right.
So the intervention arms ended up being a menu of services, similar to what Mark and Joni and Carl presented.
There are a number of different types of services you find in schools, and it's important to say that these here on the screen were embedded within a much larger school system that also included PBIS in some schools, other potential supports, so we knew that all schools here would have this top level which was school counselors, school psychologists and social workers.
We also knew that we had this subset which was the experimental arm that was already receiving school-based mental health through therapists embedded in the school.
What we did was actually use the grant funds to add pro bono slots -- slots for students who wouldn't otherwise have services because of lack of insurance, undocumented status or a variety of other reasons -- so we were able to increase the overall number of students seen by a therapist through school-based mental health services.
Our expanded condition was differentiated from the normal school-based mental health process by including two additional supports: a student services facilitator and additional school psychologist coverage. The student services facilitator was an administrative support designed to provide 504 compliance support, testing support and a variety of other administrative capabilities to the school, in an effort to free up that time for counselors.
The number one thing that we hear from counselors is, they're not counseling.
They don't have time to do that kind of thing.
They're doing testing.
They're doing 504 compliance.
They're doing IEPs, and all of that is important work, but we wanted to try to increase the amount of time that counselors spent on counseling, both at the tier two and tier three levels, and I'll discuss that in a second.
And then the last and most intensive treatment arm was the schools randomized to enhanced treatment, and that included all of the things I've just discussed as well as specific training in evidence-based practice. The idea was, if we try to give these schools all the support we can and add training in really intensive evidence-based treatment, how is that going to go? So I'll tell you how it went. The evidence-based practices that we added in the enhanced group were in tiers two and three.
Tier two received -- In other words, students who qualified for tier two services received the SPARCS program.
SPARCS is a structured psychotherapy group, so it's a group-based therapy really designed for kids who have experienced trauma but that can also be really beneficial in general for emotion regulation. Then our tier three kids who really needed intensive mental health services, when appropriate, were assigned and received dialectical behavior therapy, which is a pretty intensive but very evidence-based practice in outpatient clinical psychotherapy, increasingly so for adolescents.
And so it's specifically driven to improve emotion regulation, reduce anger and aggression and behaviors related to that, and then reduce behaviors related to suicide and self-injury, so the DBT practice was really intensive, and it was designed to be administered to kids whose behaviors were really interfering with their place in the classroom.
So just quickly, research questions, we wanted to see how our study groups differ.
We wanted to look at those levels of school-based mental health, so how did the implementation actually look? We wanted to try to learn about barriers and supports, what worked and what didn't for implementation.
As I mentioned before, we had those cost questions, and then a question that we really came to understand was one of the most important: did the outcomes change as a function of the service levels or the implementation that were provided? And then, lastly, our cost-effectiveness analysis.
So the data sources that we used, just really quickly, we had a staff survey, and we had a student survey.
Both included many questions about school climate and safety.
We did look at discipline data, but we won't present that today, and then we had a provider survey asking therapists, psychologists, counselors and social workers about their experiences.
As I mentioned, we used a mixed methods process evaluation to really look into the service logs and the interviews with the providers to try to unpack our implementation and how they were providing services, and then our cost evaluation really looked at the startup which was sort of the costs to get things going and train staff and then ongoing costs to implement this level of services.
So I want to start the results section off by giving you sort of the wah-wah moment, which is that service levels did not vary by treatment condition.
In other words, when we looked at psychologists' and therapists' work across treatment conditions, we didn't find the systematic change that we expected. Going back here, we would expect, in enhanced treatment and expanded treatment in the right two columns, that both psychologist time and therapy time would increase above treatment as usual and certainly above the comparison group.
Well, and to be clear, the comparison group does not have therapists.
However, we didn't find that systematic change. On the left, you can see the therapist change over time, and we're particularly interested in the red line, which is the enhanced or most intensive treatment, versus the expanded line, which is the yellow one, and treatment as usual, which is the green. You can see that treatment as usual and enhanced increased, while expanded was tracking along and then went down. Similarly, over on the right is the psychologist time, and this is coverage.
This is number of students seen per enrolled student over time.
The psychologist time also didn't increase as we expected.
We would've expected the enhanced line, which is the red one, as well as the expanded line, which is the yellow one, to have increased or at least been higher than treatment as usual and comparison, the green and blue, but they weren't. So we took that challenge and tried to unpack it as best we could when looking at our results, and I'm going to let Jim take it from there.
>> All right.
There you go.
Yeah, and I will try to keep this real brief.
Obviously, as Anna alluded to, we looked at a lot of outcomes.
I'm just going to touch on a few today that I think kind of tell the story fairly concisely.
Let me just mention that on these slides, you see time one, two, three and four.
On the slides Anna just showed, they only reflected time two, three and four.
Time two, three and four are data points in the spring at the end of the year.
Time one is the fall of the first year, which is why there's suddenly an added data point.
As Anna was just talking about, the service levels did not align with the treatment groups as we had expected, and so it's probably not a huge surprise that to a large extent, we were not seeing consistent changes in outcomes between the different treatment groups.
We did see some.
I mean, we saw some differences between the various SBMH groups and the non-SBMH schools -- that was the quasiexperimental part of the study -- but in terms of the experimental part, looking at the randomized schools that had SBMH, there were some differences, and on some things the enhanced group did look better, you know, had better trends.
But a lot of the differences were inconsistent and certainly not at all pervasive or really compelling, and in particular, at time four, some of the promising trends that we had seen kind of blew up on us.
This is just one sort of example outcome that we could look at.
This is the staff survey items on, you know, how often do you feel unsafe before school, after school, during school in three different locations.
We collapsed some of that because, you know, the location-specific stuff wasn't particularly interesting, but what you're mostly seeing here is not a real clear pattern of group differences on this one outcome, and that was sort of the main story across the board.
Next slide, Anna.
So I'm loathe to put up a table like this, you know, in a presentation.
It works okay in a report, but bear with me one second, and I'll tell you why we're looking at this.
Okay, so Anna had talked about, you know, the psychologist, the therapist, the counselor service levels.
We looked at it in terms of both the minutes per student, you know, basically pro rata for students in the school and the percentages of students.
Where you see concurrent, that means the services in one year and the staff survey outcome towards the end of that year; the lagged is services one year and the outcomes the following year, okay? So that's a pretty ambitious thing to look at, but we thought, "Well, we have the data," and if there are effects there, it would be important to see.
What you're seeing in the cells -- Well, first, let me say, the outcomes to the left, these were all staff surveys.
You'll see that some at the top, the feeling unsafe and the personal experiences, those are about the staff's personal experiences or feelings, whereas the bottom half are more what they're observing in student behavior: bullying, fighting, disruptive behaviors, things like that.
The values that you're seeing in the table are coefficients between the levels of service and the various outcomes, and negative values are good here, okay, because more service means less of the various problems that we looked at, and I'll save you from doing the math.
You only see a couple positive, i.e. unfavorable, associations, and you see quite a few negative, i.e. favorable, associations.
In fact, of all the possible combinations that could have resulted, 17 out of 45 of the staff survey analyses show favorable relationships -- more services, better outcomes -- which we think is pretty noteworthy, including some in the lagged columns: behavioral health services one year improving staff survey outcomes the following school year.
All right? And also to note, a lot of the significant findings are in the observations, and we think that's actually important because, you know, those are staff broad observations of what's going on in the school, so anyway, there's a lot to cover.
I just wanted to give you -- I think that if one slide could convey a large chunk of a study, I think that one is it.
I think there's a lot of important stuff there.
Anna, next slide, please.
I'm just going to run through this cost study real quick.
There will be a paper coming out on it.
Credit to my colleagues Alan Barnosky and Alex Cowell who actually led the cost study, but we decided we'd just keep it to the two of us today.
Next slide, please.
As Anna said, we measured various, you know, the start-up costs, the ongoing costs, the start-up costs in particular.
Anna, just go ahead and jump to the next one.
The start-up costs in particular, you see, for enhanced were very large.
That's all the training that was going on.
Those are, you know, basically staff costs of being trained in DBT and SPARCS.
You see the expanded, not hugely different from the treatment as usual, and this is focused on the SBMH schools.
Okay, Anna, next slide, please.
And I'm not going to delve into these details, just to say, the expanded and the enhanced were both more effective.
The effectiveness measure in the cost-effectiveness analysis is focused on student reports of victimization, okay, so the expanded group and the enhanced group were both more effective at reducing victimization but also more expensive than the treatment as usual group, and down below, this ICER that you see is the incremental cost-effectiveness ratio.
What this is saying is that expanded was really optimal for reducing victimization at, you know, more cost but not hugely more, and so in a hypothetical school of 1,000 students, spending $6,800 on the expanded programming that we were seeing would result in nearly 300 fewer victimization events in that year.
You know, so $6,800 for 300 fewer student victimizations, you know, we think that that's probably a good buy.
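The arithmetic behind that "good buy" claim is the incremental cost-effectiveness ratio: extra dollars spent divided by extra effect gained. A back-of-the-envelope sketch using the figures from the talk ($6,800 and roughly 300 fewer victimization events per 1,000 students); the function is our illustration, not the study's actual analysis code:

```python
# Incremental cost-effectiveness ratio (ICER): incremental cost divided by
# incremental effect -- here, dollars per victimization event averted.

def icer(incremental_cost: float, incremental_effect: float) -> float:
    """Extra dollars spent per additional unit of effect gained."""
    return incremental_cost / incremental_effect

# Figures from the talk: about $6,800 buys roughly 300 fewer victimization
# events in a hypothetical 1,000-student school.
cost_per_event = icer(6_800, 300)
print(f"${cost_per_event:.2f} per victimization averted")  # about $22.67
```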
Anna, I will turn it back over to you, and I apologize for barreling through everything.
Obviously, there's a lot to cover.
There will be papers coming out, et cetera.
Thank you all.
>> Thanks, Jim, and in the interest of time, I will move very quickly through these slides as well. I just wanted to say we did do some qualitative analysis of implementation frequency -- how often folks were implementing -- so high implementation versus low, and high implementation really meant, you know, psychologists were seeing 15 percent or more of the students in a building, therapists about five percent or more.
Kids were getting -- and this is per student in the entire building, so again, we use these metrics knowing that not every kid saw the psychologist but averaging it across the entire enrollment -- about 50 minutes or more from the psychologist and 30 minutes or more from the therapist. Those were really our high implementers, and our high implementers tended to show quite a bit of support.
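Anna's high-versus-low implementation split can be sketched as a simple threshold rule. This is a hypothetical illustration: the thresholds come from the talk, but the assumption that all four criteria must hold jointly, and all the names, are ours:

```python
# Hypothetical sketch of the high-implementation classification: psychologists
# seeing about 15 percent or more of a building's students (roughly 50+
# minutes per enrolled student) and therapists about 5 percent or more
# (roughly 30+ minutes per enrolled student).

def is_high_implementer(psych_pct: float, psych_minutes: float,
                        therapist_pct: float, therapist_minutes: float) -> bool:
    """Classify a school as a high implementer; minutes are per enrolled student."""
    return (psych_pct >= 15 and psych_minutes >= 50
            and therapist_pct >= 5 and therapist_minutes >= 30)

print(is_high_implementer(18, 55, 6, 32))  # True
print(is_high_implementer(18, 55, 3, 32))  # False: therapist coverage too low
```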
So you can see that that included collaboration.
It also included an increase in capacity.
I will just point out that something regularly expressed to us was the high burden of DBT and SPARCS, and we are also working on a paper specifically on that. It may potentially explain some of our results and why enhanced may have had a little bit less bang for the buck: these clinicians or providers were really taxed by learning this and doing consultation groups and all sorts of really intensive work that may not have had the bang for the buck of just regular treatment as usual by those staff.
There's a few quotes.
One that I'd just like to point out is this comment about the inverted triangle. We expect the triangle to have 15 percent or a small number at the top, but actually, in many of these schools, many, many of those students are having education-interfering behaviors and mental illness.
So I think that's a common theme, unfortunately, across the country.
So just moving quickly to our discussion: in general, we saw that levels of student services were impactful on school safety, that enhanced and expanded were effective but also costly, and that many of the barriers and the level of student need sometimes make it challenging to implement especially intensive evidence-based practices.
So we obviously had limitations, and I'll also just make a last-minute plea for more research.
I'll mention we have a presentation tomorrow at 2:30 on whole-school interventions, specifically for equity, but certainly more increased research in this Tier 2-Tier 3 space is needed.
So thank you to all of our collaborators, especially Charlotte Mecklenburg and then our NIJ folks who were incredibly helpful, as well.
>> Thank you so much, Anna and Jim.
I know it's not easy to condense such complicated findings into meaningful feedback for the audience, but I think you did a great job.
We're now moving on to the fourth presentation with Jill Bohnenkamp and Cindy Schaeffer, who are going to address the impact of the Emotional and Behavioral Health Crisis Response and Prevention Intervention on school safety.
Take it away, Jill and Cindy.
>> Thank you so much.
Hi, everyone, great to be with you today and great to hear these different models of promoting school safety via emotional and behavioral health supports.
We are thrilled to be able to share this model that we tested via a randomized controlled trial with Baltimore County Public Schools.
I'm Jill Bohnenkamp, and I'm joined by my colleague and co-investigator, Cindy Schaeffer.
So we'll dive right in.
So this project was funded in 2014 by the Comprehensive School Safety Initiative, and we are thrilled to be able to share final findings and not preliminary findings, and especially to share more details about some hot off the press findings that have just been published in "Prevention Science," and we'll be going through those today.
So the goal of our presentation is to describe this multi-component preventive intervention to address emotional and behavioral health crises and to present results from the 2-year randomized controlled trial.
So to give you an overview: this trial looked at the impact of the Emotional and Behavioral Health Crisis Response and Prevention Intervention -- a mouthful, EBH-CRP for short -- specifically on school safety outcomes.
In the 2014 funding mechanism, this was awarded directly to the school systems with university research partners.
So this was awarded to Baltimore County Public Schools, and our National Center for School Mental Health team served as the research and evaluation [audio drop] >> Jill? Excuse me.
I'm so sorry to cut in.
We're having a little trouble hearing you.
I don't know if you can get a little closer to microphone or speak louder, please.
>> I will.
I'll see if I turn it up and talk louder.
>> Yes, that's better already.
>> Okay, great.
Thanks for that callout.
And we were thrilled to partner with April Lewis, who is the Executive Director of the Department of School Safety.
So to give you some context, Baltimore County Public Schools is one of the largest school districts in the nation, with 123 schools and programs, and it covers a huge geographic region that spans urban, suburban and rural areas. Really, the focus of this program, when we were approached to partner with the school system, was this concern over the growing number of student emotional and behavioral health crises that couldn't be quickly defused, modified or resolved and were presenting a significant risk to school safety.
So these were coming into the school safety office frequently, and they felt like they did not have a streamlined process for how to best resolve them, and they were a significant risk to school safety.
And so the school system and our team worked on developing this comprehensive training, organization and support protocol for school and community stakeholders.
I think a common theme that we're seeing is this importance of a multi-tiered comprehensive approach. This approach was aiming to increase broad stakeholder competence in responding to and preventing student emotional and behavioral health crises, combining evidence-based, culturally competent and school-informed strategies, and, importantly, to be able to address emotional and behavioral health crises across the continuum, not just when they reached the crisis level.
So specifically, we looked at these research questions.
So if we implement this comprehensive model -- and I'll go into much more detail about what was in that model -- will we see an impact on school safety outcomes? And secondarily, will we see an impact on mental health service utilization outcomes and also their quality? We also looked at stakeholder knowledge and preparedness and conducted a cost-benefit analysis.
For the purpose of this presentation, we'll primarily be presenting on the school safety outcomes, but just like folks talked about, we have forthcoming manuscripts on these other outcomes, as well.
So our study design was looking at outcomes at the school level, and so we -- >> Jill? Jill, I'm so sorry.
Your volume dropped off again.
>> Oh, gosh, okay.
I'll -- >> I don't know why it's varying.
>> Thanks for letting me know.
Can you hear me better now, Cindy? >> No.
Let's see here.
Let's see if I get closer, and I'm -- Is that better? Okay, great.
So, Cindy, I wonder if you might want to talk about our study design, and I'll grab my ear pods and see if that helps.
>> Sure, yes.
So our study did -- Can everyone hear me okay? Okay, I'll assume yes.
So our study was not district-wide.
It was with a select group of schools that were most struggling in their safety outcomes, selected by the school system.
So the selection of schools was organized by feeder patterns in the district, because we were working with elementary, middle and high school levels. Since students would cross over between schools during the study period, we needed schools to be within the same feeder patterns. We had 40 schools total, and they were randomly assigned based on feeder patterns.
So the randomization was not at the school level.
It was at the feeder level.
The baseline year was 2014-2015, and the intervention started and reached full throttle in year 2.
And, Jill, are you able to talk about the components of the intervention? Not quite yet? >> If you all can hear me better now.
>> It might be a little better but -- >> Okay, I'll continue to stay close to the mic and perhaps stop my video and see if that helps with sounds.
Okay, so we'll move on to talk more specifically about the intervention component.
So as I mentioned, this is a comprehensive model, and so we worked with the school district to think about what supports they already had in place and how we could bolster some of those services and also make sure that we were thinking about universal prevention, early identification, thinking about assessment and service linkage, and unique to this project and what they really felt like they needed additional supports on were crisis response and post-crisis relapse prevention.
Cindy, are you able to hear me better now? >> Yes, it's quite a bit better.
And so I'll dive in and share more about the interventions at each level of this model.
So at the universal prevention level, we had the Safe School Ambassadors Program, a student-centered intervention that identifies and trains students to intervene in, prevent and stop bullying in the schools. Schools were also implementing PBIS, and we had some additional supports to help bolster this.
It was really important that we had a student component here, and we were really glad that the school system was thinking about having the Safe School Ambassador Program to really think about multiple stakeholders and, very importantly, student stakeholders.
In terms of early identification, the state was already implementing the Kognito program which is a 30 to 60-minute online suicide prevention and mental health intervention training.
This was not being utilized, though, in many schools across the state, and so via this grant, we were able to increase folks who were receiving this training, and across intervention schools, 80 percent of all school staff in the building were trained in that model.
Schools also had varying degrees of mental health services and supports, and so part of this intervention was to assess the current strengths and gaps for each school and to make sure that we were streamlining programs and resources across school mental health services and EBH crisis supports.
And so as I mentioned, this crisis response was a key area where the school district felt like they needed additional supports and across the country that we hear school districts talking about the need for this.
And so a main part of this model was the implementation of a crisis facilitator, a mobile mental health staff member dedicated to the school district. Each feeder pattern within the intervention condition was assigned a crisis facilitator whom the schools could call when a student was in crisis, and that person would come and work directly with the school teams. This allowed school staff to avoid taking large chunks of their day managing students in crisis, and it made follow-up more streamlined, which I'll talk about in just a moment, as well.
Also, both the crisis facilitators, as well as school mental health staff, school psychologists and school counselors, were trained in the Life Space Crisis Intervention model to help to improve crisis de-escalation.
And I'll show just a couple of tools that came out of this.
The Emotional Behavior Health Incident Report, this was a way for schools to be able to track and better understand what type of EBH crises were happening in a more standardized way, and we have a manuscript in preparation that hopefully will be coming out soon with that.
And then another critical piece: after a student crisis happened, they did not have a system in place to follow up with that student and ensure relapse prevention, and we hear from many school districts across the country that such a system is lacking.
And so this process helps the crisis facilitator to plan post-crisis response, and here's just an example of what that looks like, also manuscript in preparation sharing more about this tool.
Okay, and so as I mentioned, there were a number of outcomes that we looked at, but specifically today we'll be talking about these school-wide primary outcomes of suspensions, office referrals, juvenile justice referrals and threat and risk assessments.
And I'll turn it over to Cindy to share those school-wide safety and discipline results.
>> Hi, thank you.
So we have more outcomes that we will be examining, but as Jill pointed out, we're focusing today on our primary outcomes, which were school-level safety and discipline indicators. We are excited to report that these results were just accepted and pretty rapidly published online in "Prevention Science," so if you want to drill down more deeply into the details, you can access that manuscript. I think Jim mentioned earlier the hesitancy to show, you know, a bear of a table like this.
So apologies again, but this does sort of summarize most of our outcomes here.
So for our analyses, again, these were school-system-level indicators drawn from their own data systems, and we used three-time-point latent growth models to look at change in key outcomes over time, from the baseline year across the 2 intervention years.
So all of these outcomes that you see on the left column, suspensions, office referrals, juvenile justice referrals and bullying reports, like reporting to the offices of each of the schools, all of those are count data which are highly non-normal.
So we used appropriate link functions for those growth models, including Poisson and negative binomial, whichever fit those non-normal data best.
And what you'll see in the columns is that these models provide an estimate for the intercept, but of more interest is the slope, which is the average change over time for all schools, and, of most interest to our study, the slope-on-condition column, which shows the effect of the intervention package on those outcomes by school.
So we would hope to see, in that slope-on-condition column, lots of negative values, all this bad stuff going down, and for that rate of change to be significant.
And so Jill has highlighted here the significant findings.
Again, this is with an N of 40.
So this is, you know, in some ways a small sample size for these analyses, but nevertheless, we looked at everything at the overall level of 20 versus 20 schools essentially and then drilling down on some subgroup differences also with the effect of condition on the subgroups.
So as you can see, we did not have an overall effect of the intervention on school suspensions over time.
At the average level, those were kind of staying steady during the period of the study.
However, for the multi-racial students in our sample and also among secondary schools, we did see significant declines in the number of suspensions for those two key subgroups.
Similarly, for office referrals, there were general declines overall across the years, but in the secondary schools that received the intervention package, that decline was significantly greater.
So we had an intervention effect on discipline referrals and suspensions for the secondary schools only.
The juvenile justice analyses were just for secondary schools, and again the intervention was advantageous there: although referrals were increasing over time on average, the rate of increase was lower for our intervention schools.
So they were not keeping pace with the overall trend of referring more kids to juvenile justice, and the same held for bullying.
Bullying was trending upwards at the overall level, but less so for intervention schools overall, and the effect is driven largely by the primary schools there.
Okay, next slide, Jill.
So we also -- So we felt good that we had some overall effects on some key indicators, but we wanted to understand whether the intervention was impactful upon disproportionality itself.
So first, we had to define disproportionality which this is a standard ratio formula in the field.
So it's basically the rate of the negative outcome in the subgroup of interest divided by the rate in the majority group, and if the two rates are the same, you would expect the ratio to be a value of 1, meaning equal proportions of both groups experience the negative outcome.
And when we -- I don't think -- Do we have the actual disproportionality ratios? We didn't make a slide for that, but we did examine disproportionality for three subgroups, and what these results -- These are not the disproportionality ratios.
These are the regression models using that ratio as the dependent variable, looking at whether a condition, the intervention condition affected disproportionality, and the answer is, it did not.
So taken together with our overall findings, the intervention was very successful in bringing suspensions and office referrals down for secondary schools, bullying reports down for primary schools, but it did not have an effect on disproportionality itself.
So even though all groups were benefiting from the intervention, disproportionality still persisted across time for those groups.
Okay, and then we also examined threat assessments, another school-level indicator of when a student makes a statement or exhibits a behavior that indicates risk of harm to self or others: self-injurious behavior, a statement of suicidal intent, or an aggressive threat towards another student or teacher.
The formal threat assessment process is done by school emotional behavioral health staff, and if we're going to be reducing disciplinary approaches, we might expect these to go up, and in fact that's what we found, at least at the secondary school level.
So looking at that slope column, overall and for secondary schools generally, schools were doing fewer of those, or they were staying pretty stable over time, since those slopes aren't significant; but in our secondary intervention schools, those were actually going up.
So we were showing some increased, we think, hopefully appropriate threat assessments happening in lieu of the disciplinary stuff, and then the follow-ups to those threat assessments also increased for overall and for both primary and secondary schools.
So we feel that the intervention was sort of socializing or conditioning school staff to shift from a view of a disciplinary response there to more of a therapeutic view, and really, as indicated by the follow-up effects, not just assessing those, but continuing on with some intervention for those kind of concerns at a higher rate than our comparison schools did.
>> All right.
So to summarize: as many folks have talked about, implementing a multi-component model across many schools is something we could spend a whole hour, or perhaps a whole conference, talking about, and we were glad to see that happening.
We also saw this uniform crisis response and post-crisis relapse prevention happening which was really something that the school system was lacking and felt like they needed and saw this as a key piece here.
And as Cindy went through, we saw that result in a lower incidence of discipline responses, especially in our secondary schools, where rates were higher to begin with; in our primary schools those rates were lower to begin with, so it was really in the secondary schools, where it mattered most, that we saw the lower incidence.
And this was supported by a different but related school safety indicator: bullying reports were lower in primary schools, which we see as perhaps one more sign that this whole model is supporting the move towards school safety.
Unfortunately, we did not see discipline processes become more proportionate, and while each of the interventions in this model was selected to be culturally responsive, we really see an increased need to make sure these interventions are anti-racist and that we're targeting disproportionality very specifically in order to see changes there.
And we were really thrilled to see more use of proactive threat assessment and follow-up procedures, coupled with the reduced discipline response. When this multi-tiered model trains multiple stakeholders -- students, teachers, school mental health staff -- and adds crisis supports, we really see that shift towards school safety.
And encourage you, if you want to get into more of the details, to check out our publication in "Prevention Science." The citation is there.
Thank you so much.
>> I really want to thank Jill and Cindy, and I'll kind of invite everyone that presented back on to their virtual screens.
We haven't received any questions entered into the Q and A box, and we have all of about 5 minutes left, maybe 4 minutes.
So I had a question for the group.
Basically all of you are funded under CSSI, but this work has such a broad range of applications above and beyond what happens in terms of school discipline and school safety: issues around what Nicole Hockley described as mental wellness, which then permeate the life course of the kid in terms of avoiding suicidality, having better academic involvement, avoiding bullying experiences, and lessening juvenile justice system involvement.
And so as you look at this work, there are so many things: when you help a kid get on a positive pathway, overcome traumatization and reach more mental wellness, the benefits permeate so much of their lives.
So I know we have all of about 3 minutes.
But if anybody would like to jump in on that topic, go for it.
>> I'm probably the oldest here.
So I'll take a shot.
You know, I think your comment, Barbara, reflects the need to work across initiatives, across Federal agencies, across systems, so that this becomes a shared agenda relevant to all the Federal agencies -- obviously NIJ, SAMHSA, NIH, the Institute of Education Sciences, CDC -- but also a shared agenda in communities: education, mental health, child welfare, justice, disabilities, family and youth leadership. These studies really suggest a platform for us to build the work through collaborative cross-system initiatives driven by the stakeholders.
I'll hand it off...
>> Thank you very much, Mark.
>> ...has a comment.
Maybe he doesn't.
>> Well, I think you -- >> Yes? >> "Maybe he doesn't?" I think you summed that up really nicely.
One of the things that I work on is a Federal Interagency Coordinating Council on youth programs, and this might be a topic that we'd like to present to them in the future so that more agencies could be actively engaged in the constructs you've developed and the measurement techniques, as well as your exciting research approach.
And with that, I think I want to thank Tina, who has been our tech support for this entire meeting and has kept us so ably on task.
And, everyone, you did such a lovely job of presenting.
Thank you so much for joining this conversation today, and to our attendees and, of course, to all of our research participants, very, very welcome.
So thank you.
>> Thank you.
>> Thank you, all.
>> Thank you.
Opinions or points of view expressed in these recordings represent those of the speakers and do not necessarily represent the official position or policies of the U.S. Department of Justice. Any commercial products and manufacturers discussed in these recordings are presented for informational purposes only and do not constitute product approval or endorsement by the U.S. Department of Justice.