CSSI Comprehensive School Safety Projects - Breakout Session, NIJ Virtual Conference on School Safety
On February 16-18, 2021, the National Institute of Justice hosted the Virtual Conference on School Safety: Bridging Research to Practice to Safeguard Our Schools. This video includes the following presentations:
Assessing a Comprehensive Framework to Reduce School Violence, Beverly Kingston and Allison Dymnicki
Researchers from the University of Colorado Boulder's Center for the Study and Prevention of Violence (CSPV) partnered with educators in 46 middle schools to implement Safe Communities Safe Schools (SCSS). SCSS seeks to address behavioral incidents and mental and behavioral health concerns and to increase prosocial behavior in schools through three core program components: developing a functioning multidisciplinary school team, building capacity around data use, and implementing an action plan using evidence-based programs. The study explored research questions in three areas: readiness (whether schools met baseline criteria and experienced readiness changes over time), implementation (whether the SCSS model was implemented as intended), and associated outcomes (effects on school climate, safety, related behavioral and mental health indicators, and academic outcomes). CSPV and external evaluators from the American Institutes for Research conducted a mixed-methods randomized controlled trial with a staggered implementation design using qualitative and quantitative data (focus group data, staff and student school climate surveys, and school records). The study found that (1) participating schools met the pre-developed readiness criteria and reported some improvements in readiness constructs over time; (2) some components of the model were implemented as intended; and (3) there were mixed impacts on school climate, safety, behavioral and mental health indicators, and academic outcomes.
Project SOARS (Student Ownership, Accountability and Responsibility for School Safety): Successes and Challenges, Claudia Vincent and Dorothy Espelage
Project SOARS focused on the development, usability testing, field testing, and pilot testing of a student-centered and technology-driven comprehensive school safety framework for high schools. We first provide an overview of the framework components and their theoretical grounding in extant research. We then summarize the findings of each phase of the project, with particular emphasis on the findings from the pilot test. All student outcomes from the pilot test were in the desired direction. Intervention effects on student-reported school connection, sense of personal safety, and level of disruption reached statistical significance. Effect sizes ranged from small to medium, with the largest effect size in students’ perceptions of personal safety. We contextualize these findings in existing school safety policy recommendations, discuss persisting challenges with integrating student voice and leveraging reporting technology in school safety decisions, and provide suggestions for further research.
School-based Coordination & Integration Efforts as Part of a Comprehensive School Safety Initiative, Joseph McCrary and Christopher Henrich
A coordination & integration (C&I) process was used with a sample of middle and high schools in Atlanta. School teams used a data-driven process to identify goals and objectives related to school safety, map resources and coordinate activities to achieve those goals, and choose measurable outcomes to gauge their success. Technical assistance and $40,000 per school to support activities were provided. A mixed-methods approach is being used to evaluate the process and its outcomes. Focus groups, observations of school team meetings, and document review are being used to address research questions about the content and implementation of schools' C&I plans, teams' capacities and the challenges they faced, the role of the technical assistance provided, and sustainability. Findings to date document challenges school teams faced in overcoming administrative and other barriers to C&I planning, and ways the technical assistance team adapted the process to help school teams develop and start implementing C&I plans. Quantitative analyses of state School Health Survey data are also being used to identify school-level climate and safety outcome measures. Using these outcomes, analyses will examine changes in schools' safety and climate over time compared to other urban schools to assess the effectiveness of the comprehensive school safety initiative.
Developing a Longitudinal Dataset to Study the Relations Between Community and School Context and Student Outcomes, Maury Nation and Caroline Christopher
The Nashville Longitudinal Study of Youth Safety and Wellbeing is a CSSI-funded partnership between Vanderbilt University and Metropolitan Nashville Public Schools (MNPS) to develop a multi-level longitudinal dataset to study youth safety and wellbeing. The resulting dataset includes longitudinal, geocoded data for more than 15,000 students in 144 schools. Also, the dataset includes measures of school characteristics and neighborhood context. This presentation will describe the process of developing the dataset and how we are leveraging this dataset to investigate research topics including the neighborhood, school, and student factors that influence students' social and emotional competence and behavioral outcomes.
>> Hi.
Welcome, everyone, to this breakout session on comprehensive school safety initiative projects.
My name is Nadine Frederique.
I am a senior social science analyst with the National Institute of Justice, and I've been working on school safety projects for quite a few years, and I am happy to be your moderator for this breakout session.
We are going to be joined by quite a group of experts here and talking about comprehensive initiatives.
I should note that one of our projects is a longitudinal study, but it's so longitudinal, it has so much data, that we're going to call it comprehensive because it's quite impressive, and so you'll hear about that project as well.
We're going to...I'm going to introduce the presentations one-by-one just to give us time to transition in the middle.
A couple of reminders: we have a Q and A function here on Zoom, so if you have any questions, please post them in the Q and A function.
Also, we're going to be actively using the chat, so if you have any comments, please put those in the chat as well, and a reminder for everyone that this session is being recorded to be posted on the website later.
So without further ado, let me introduce our first presentation.
The title is, Assessing a Comprehensive Framework to Reduce School Violence, and our presenters are Dr. Beverly Kingston, who's the director of the Center for the Study and Prevention of Violence at the University of Colorado Boulder, and Dr. Allison Dymnicki, who is a principal researcher at the American Institutes for Research.
So without further ado, I will turn it over to Beverly.
>> Hi, everyone.
I'm Beverly Kingston, and I'm really happy to be here with all of you today.
So the project that I'm going to be talking about, I was the principal investigator on it; it was our Comprehensive School Safety Initiative project that ran from 2015 to 2020, and our title today is, Assessing a Comprehensive Framework to Reduce School Violence.
Next slide, Allison.
Here is our NIJ disclaimer, which we're very happy to put up, and this slide lists our project team members; we had a really awesome multidisciplinary research team to develop, implement and evaluate this project, and our team includes two national research centers, American Institutes for Research and Arizona State University's REACH Institute, along with PassageWorks Institute, which is a national organization focused on whole-school transformation.
Next slide.
So I'll present today on the project overview.
Then I'm going to turn it over to Allison Dymnicki, our principal researcher and co-investigator on the project, and Allison will present the key evaluation findings and then thoughts for discussion.
Next slide.
One more, Allison.
Okay, so our project purpose was to partner with 46 Colorado front-range middle schools to evaluate the feasibility and impact of the Safe Communities Safe Schools model using a staggered-entry randomized controlled trial.
The Safe Communities Safe Schools model was first developed right after Columbine, so quite a while ago, by Dr. Del Elliott, the founder of our center at CU Boulder. The Safe Communities Safe Schools model is a comprehensive, actionable, individualized approach to school safety and violence prevention that focuses on the key malleable risk and protective factors that schools can influence, like school climate, for example. It supports schools to understand and effectively implement evidence-based programs and strategies that are matched to their unique needs, and we often talk about the Safe Communities Safe Schools model as a marathon rather than a sprint.
It's really focused on giving schools the capacity to do their school safety work and to sustain the work, and our project had three overarching research questions.
The first question focused on readiness, and that question asked, "Do schools increase their readiness to implement the Safe Communities Safe Schools model over time?" The second research question focused on implementation, so here is where we asked, "Is the Safe Communities Safe Schools model implemented as intended in participating schools?" And the third research question focused on outcomes: what's the effect on school climate, safety, behavioral and mental health, and academic outcomes of implementing the Safe Communities Safe Schools model in our treatment schools? Allison, you can go to the next slide.
Okay, so our project had three core components.
Our first core component focused on engaging schools, and we primarily did that by developing and working in very close partnership with a multidisciplinary school team that served as the collective champions to implement the model.
The team participated in monthly meetings that were facilitated by the Safe Communities Safe Schools project staff, and the team also solicited feedback and promoted the buy-in of the Safe Communities Safe Schools model with their broader school community, and we really thought about the school team as the on-the-ground implementation team at the school level.
So the second core component was all about gathering data. The researchers on our project collected the data in partnership with the school teams, and the data we collected included readiness data, student and staff school climate surveys, and what we called the action plan questionnaire, which was really about assessing current school safety systems. The data was used by the schools to set their priorities and to monitor their progress, and it was also used by the evaluation team for our research study, so all the data we collected benefited both the school directly and the research.
For the third core component, schools developed and implemented an individualized school action plan, and they used their data to prioritize their needs and to guide their evidence-based program selection. Primarily they used their action plan questionnaire data and the climate data, but they also looked at the readiness data to think about their capacity and what they could actually take on. Then each school developed an implementation plan that integrated the principles of implementation science so they could address implementation barriers as much as possible and stay on top of that.
In addition to selecting the evidence-based program, the schools also determined strategies for addressing other data-identified needs in the foundational areas of school safety, which include mental health, bystander reporting and response, information sharing, and threat assessment.
So I'm going to turn it over to Allison Dymnicki now to cover the research findings.
>> Thanks so much, Beverly.
As always, all these presentations are jam-packed with lots of information about the actual overview of what's being done as well as the evaluation, so I'm going to present a lot of the data sources we used today and probably not dive too deep into some of the findings, but I really welcome more questions and comments afterwards about them.
So to start with describing some of the data sources for this study, Beverly mentioned the readiness data that we have, and we compared annual scores on a readiness monitoring tool that was developed by Abe Wandersman and Jonathan Scaccia and the Wandersman Center, and it asked about motivation, general capacity and innovation-specific capacity.
We also assessed implementation data through a lot of different ways.
Two are up on the screen right here: a facilitator feedback survey as well as a meatier implementation survey for program facilitators.
We also looked at classroom observation data done by external observers that were hired by CSPV and arrived unannounced in the classrooms.
We looked at implementation ratings that were made by the CSPV staff that worked directly with the individual schools, and at focus-group data.
And finally, we looked at school climate surveys and school record data to understand more about whether SCSS students and staff reported more positive attitudes and behaviors versus a comparison group of students in nonparticipating schools. We looked at attendance, truancy and suspension rates and at academic success as measured by standardized test scores, and in the school climate surveys, we looked at both student and staff responses.
So, lots here as always. To start with some findings related to readiness: we used a definition where readiness equals motivation times innovation-specific capacity times general capacity, and we measured schools' motivation to implement SCSS as well as each of the other two constructs.
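For reference, this is the readiness heuristic being described, often written as R = MC² in the Wandersman Center's work; the notation below is a standard rendering of that heuristic, not a formula taken from the slides:

$$R = M \times C_{\text{innovation-specific}} \times C_{\text{general}}$$

Because the components are multiplicative, a low score on any one of them pulls overall readiness down, which is one reason to track motivation and both capacities separately.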
We saw six out of 21 scales increase significantly, and most of the increases were between, again, year one and year two for the SCSS school teams that were part of this initiative.
Most of the increases were on the innovation-specific capacity subcomponents.
As you can see on the screen, knowledge and skills, innovation-specific leadership, and supportive climate increased, so this didn't surprise us.
These are the kinds of things you would expect to target directly with an intervention like this, and some of the other things, the general capacity components like leadership, were not theorized or intended to be impacted by SCSS, so we didn't expect as many changes on those components.
So when I talk about the implementation findings, we saw that some components were implemented as intended, but other components, schools struggled to implement. We had the CSPV implementation managers and technical assistance specialists rate schools on four components, and Beverly mentioned the three core components of the study, so we're looking at a healthy school team, which relates to that first core component of developing a functioning school team; then collecting school climate data and using data to guide programming, which are about collecting and using data; and then implementing a Blueprints program, which is about the third component.
Schools didn't do that in year one; they weren't expected to implement a Blueprints program until year two, so you see here in year one that schools implemented three of the three applicable components with fidelity, and we're using some guidance from the field, based on some other large, comprehensive efforts like PBIS, about having 70 to 80 percent of the model implemented.
In year two, you see that two of the four components were implemented at those rates, and we asked, and obviously probed with lots of different measures and questions, about why. Some of the reported challenges had to do with school leader turnover; four of the 21 schools had school leader turnover, as well as school-based coordinator turnover.
We also heard about the relevance of the evidence-based programs for students, specifically the tier-one programs, which staff felt they had to do a lot to translate and adapt at their school, and we heard and saw from the classroom observation data some challenges with program fit.
We also heard a bit about the perceived value of the program from the educators' perspective and how some of these evidence-based programs didn't seem to really meet the mark from their own perspective.
Okay, so when I turn to the outcome findings, I'm going to start...
And this is a busy slide, not expecting you to be able to read all of it or to see all of it, but the idea here is, I wanted to give you a sense about what some of these outcomes were.
This is the slide for the student school climate data in years one and year two, and this actually is just showing...
The red rows are showing you year-two effects.
Out of the 21 outcomes, there were only four significant findings across both years for the student climate survey data and no effects for the staff school climate data, so you see four significant outcomes in year two in the unexpected direction, two on peer norms.
Peer acceptability of aggression increased.
Peer encouragement of prosocial behavior decreased, and two on two types of behavior.
Perpetration of aggression increased, and truancy increased.
The other kinds of questions that were on here had to do with things like respect for authority, respectful climate, trusting relationships, positive feelings, attitudes towards school, school and staff capacity to address mental health needs.
We didn't see significant effects in any of those outcomes.
So when we look at the attendance and truancy rates, we also don't see significant findings, so here, you see both year one and year two data, school-wide data, so not at the individual level, and you don't see significant findings for either attendance or truancy in either year.
So when we look at the in-school, out-of-school and total suspension rates (and somehow this slide glitched, so there's a typo in the title here), we see that we did find positive effects on total suspensions and in-school suspensions in year one and did not see any significant findings on total, in-school or out-of-school suspensions in year two.
And finally, we look at the reading and math achievement, academic achievement test scores in both years.
Here, we thankfully had access to individual-level data; all of the school record data came from a data-sharing agreement with the Colorado Department of Education, and we had access to individual-level academic achievement data and to school-level attendance, truancy and suspension data. With the individual-level data, we were able to control for an individual's prior-year scores in these analyses, so we see positive effects on grade-seven and grade-eight reading scores in year one and then also positive effects on grade-eight reading scores in year two.
Because of the analytic approach, we can't look at grade-seven test scores in year two, and we did not see significant findings on grade-seven or grade-eight math scores in year one or on grade-eight math scores in year two.
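As a rough illustration of the analytic point, a covariate-adjusted model for the individual-level achievement outcomes might look something like the sketch below; this is a generic two-level specification under assumed notation, not the study's actual model:

$$Y_{ist} = \beta_0 + \beta_1\,\text{SCSS}_s + \beta_2\,Y_{is,t-1} + u_s + \epsilon_{ist}$$

Here $Y_{ist}$ is student $i$'s test score in school $s$ in year $t$, $\text{SCSS}_s$ indicates treatment assignment, $Y_{is,t-1}$ is the student's prior-year score, and $u_s$ is a school-level random effect. The school-level attendance, truancy and suspension analyses have no prior-score term to absorb pre-existing differences, a measurement difference that comes up again in the discussion of the findings.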
So where do we go from here? So we had lots of conclusions and lots of discussions about this.
We were excited to see the readiness for and adoption of the data-driven approach by the school-based teams.
Specifically, we've heard in lots of other work, like some of the Communities That Care work, that schools' successful adoption of a scientific approach, where they collect and use data, is very critical to the success of evidence-based programs, so while we did not see changes on the school climate or safety indicators, we did see some significant positive changes, albeit small, on the suspension and achievement data.
I think we have a clear story here about how the effects are tied to implementation which is probably not surprising to all the implementation science researchers in the audience.
It's been found in lots of other studies, and we found most of our positive effects in year one where we had fidelity of implementation of all the core components versus year two where there was only evidence of two of the four core components being implemented with fidelity.
Lots to talk about here, and Nadine and Beverly and I have chatted about this before, about the time horizon of the study and how, you know, some of these effects could be expected to materialize only after 1 or 2 years, which is all we had the ability to look at during this study.
It's not as clear why we find no or potentially negative effects on the school climate measures and positive effects on the more distal outcomes, but I think there are potentially measurement differences here, in terms of the individual-level academic data accounting for the individuals' prior test scores while the school-level student survey data and the school-level attendance and truancy data are not able to account for that.
There were also significant challenges related to implementing the programs, so we heard about poor fit of the particular programs.
We heard about leadership and staff turnover in year two.
We heard about limited student response to some of the tier-one evidence-based programs that schools had chosen, and about the relevance of the program for the student population.
So we obviously processed some of these implications and findings as a team, and we thought about them both for our own research, for the next set of studies that we might consider doing, as well as for the field, and one of the takeaways we highlighted is that some components of readiness to implement a model like Safe Communities Safe Schools can be improved over time, and others might not be as malleable.
We also think, and probably no surprise to the audience members, that implementation data can improve schools' ability to effectively bring models like this to scale school-wide, and that continuing to check in to understand what that implementation data says and how things should be tweaked, modified or sustained at the level they're being done is important, and I think this study also really highlighted what schools can feasibly implement within a 2-year period and in what sequence.
I think we saw that schools can get help from, you know, research experts and develop a functioning school-based team, conduct resource mapping and use results from that to create a tailored school action plan, and select and train staff and begin an evidence-based program, but maybe not take that program to full implementation and maybe not really have the ability to have school-wide change occur.
You know, maybe really the change occurs within that school-based team and doesn't translate, and I think it has some important implications for others implementing models like this.
I encourage all of you in the last 30 seconds or so, I know Nadine is looking for me to wrap up, to just think about what some of these findings might mean for you and if this resonates with some of the work that you're doing, if you're seeing similar findings and how you might use some of these findings in your own work.
And with that, I'm going to pass it back to Nadine.
Thanks so much.
>> Thanks.
Thanks, Allison and Beverly, for that presentation. I think that your research is going to have a lot of implications for implementation science, and I agree with some of the recommendations and the things that we need to think about in terms of how it translates from a team to the whole school, so great presentation.
Thank you for that.
All right.
I am going to introduce our next project, which is Project SOARS (Student Ownership, Accountability and Responsibility for School Safety): Successes and Challenges.
Our presenters are Dr. Dorothy Espelage, who is the William C. Friday Distinguished Professor of Education at the University of North Carolina, and Dr. Claudia Vincent, who's a research associate at the Center for Equity Promotion in the University of Oregon's College of Education.
So, Dorothy, take it away.
>> Thank you very much, Nadine.
I'm looking forward to presenting 5 years of work with a much larger team than just Claudia and me.
Claudia was our fearless leader, however, so we'll hear from her after I set the PowerPoint and the talk up.
We have colleagues that are not represented here, from undergrads to graduate students in the states of Illinois and Oregon, as well as co-PIs including Hill Walker and others, and so thank you for inviting us to share the results across this 5-year study.
And so as I go through this, we're going to talk about SOARS, and SOARS is the Student Ownership, Accountability and Responsibility for School Safety.
When we talk about Project SOARS, we're talking about a really multicomponent approach to school safety and a framework, and I'm going to try to walk you through all the various components in this very short time.
But really, the premise of this funded proposal was based on a number of things that we knew from the field, and we needed to figure out how to put those together in a framework to promote school safety. We did know that many acts of school violence are associated with long-term student victimization.
Meta-analyses have shown that school violence involvement is associated with adverse outcomes, both academic and psychosocial.
At the high-school level, students tend to be more aware of victimization and things that are happening among peers than the adults are.
We also know that during adolescence, adolescents heavily rely on digital media to communicate. I would say that that's even more so now, but adolescents are also more likely to communicate with adults that are trusted and that engage in a nonpunitive, restorative school climate, and so the idea was to build a framework with youth voice, with input from multiple stakeholders, in a methodical way.
So what did it look like over the 5 years? So in 2016, we were funded and worked on the conceptual development of the framework components, which I'll lay those out for you as well.
We started working with a tech company there in Eugene, Oregon in 2017 to think about production of the components of the framework and user acceptability of our ADVOCATR app, which I'm going to talk about as well, all the while going back and forth with the stakeholders, the students, the teachers, the principals.
All along, they're giving us input to make modifications and adjustments to the parts of the framework.
In 2018, we field tested the ADVOCATR and instructional components and continued to do that through 2019, an iterative process of making changes as we received feedback.
In 2020, we launched the pilot testing that Claudia will talk to you about and pivoted very quickly to complete that pilot testing during the shutdown in both Illinois and Oregon.
So what is SOARS? What are the SOARS framework components? These are available at the website, Advocatr.org, so I encourage you to check that out. We have components for each kind of stakeholder.
For students, the framework really centers around the reporting app called ADVOCATR, which was named and developed by students and other stakeholders; youth had real voice, even in the word ADVOCATR.
They named it.
They helped with our logo, and with this ADVOCATR app, they can report something wrong, but we heard in focus groups that they also wanted to report positive things that were happening in the schools, and then they could also check on the status of the report.
It's critical to know that this is not anonymous, but it is confidential.
There's also instructional videos so that we could talk to the kids about when to use ADVOCATR.
What problems should ADVOCATR be used for? Which ones should they not be used for? Really talking to them about restorative practices, social-emotional competencies.
A big thing that came out of the focus groups was this idea that there are differences between physical and emotional safety, so we have did-you-knows and instructional videos around that, as well as guidance for safety campaigns, which, again, the students really helped us develop, I would almost say, right alongside us.
For teachers, we have implementation guides; the classroom curriculum is implemented by teachers, so there's a guide for them.
They also had companion did-you-knows on the same topics, and then they implemented the everyday restorative practices curriculum in addition to that.
For administrators, we worked alongside administrators to think about how to handle the reports that were coming in; the workflow for that was in the framework's implementation guide. And then for parents, we have the did-you-knows on the topics that I had mentioned before, which they had access to.
And so the conceptual development really started in 2016.
We conducted two waves of focus groups with students, school personnel, parents in Oregon and Illinois and identified a number of themes.
We have several papers under review on that qualitative analysis of those focus groups.
One big one was this idea that there was some hesitancy to consider using the ADVOCATR app just because of the anti-snitching culture, and so we learned a lot about what would have to be in place, in the climate and in trust among students and between students and the teachers and adults in the building, for the ADVOCATR to even be considered as a tool to inform the adults in the building of something that might be happening.
Again, they talked a lot about wanting confidentiality.
They also didn't want it to be anonymous.
They wanted some kind of tracking of this and accountability for adults to respond appropriately and to make decisions and to share as much as they can about the results of that particular report.
School personnel talked a lot about really understanding social awareness and relationship management and how students interact and how that can lead to conflicts that may be a cause for concern.
They also felt as if there needed to be more clarity around school-wide policies related to school safety and the reporting of such events, and we really enjoyed speaking with parents too in these focus groups.
They were concerned about staff capacity to address the concerns that kids were having in 2016: their own mental health challenges, their friends' mental health challenges. And the parents, as well as the students, were very, very concerned about the inequity and the overt racism in the structural dynamics and the structure of those high schools, and so we considered that as we pushed forward in the development of the framework.
Then we had a mock-up.
We had the ADVOCATR ready and functional by 2017 and '18, and then we went into our usability and acceptability testing (UAT) for the framework.
There were three waves, and we did the UATs with students, personnel and parents, and we tried to have continuity, so some of the kids that were in the focus groups in 2016 were also in the UAT, which was nice for them, seeing the project flow. In the UAT, we really just wanted to know whether or not they saw the various components as acceptable on a four-point scale, and by wave three, through the iterations and the feedback that we had, students rated all components above 3.0, school personnel very similarly, and parents as well, so we felt that this really suggested that, with the modifications and revisions they suggested, we were onto something and the framework was developing nicely.
Then it was time to do field testing, and we did the field testing of the ADVOCATR and the curriculum in 2018, '19.
Ten teachers in both states; again, just a field test for feasibility, just to see if we could get students to use the ADVOCATR, so we worked with 10 teachers and trained them up on the curriculum.
One hundred and twenty-one students received the curriculum, the framework, and access to the ADVOCATR app, and 20 parents participated in data collection as well.
In terms of ADVOCATR usage, the kids in that particular semester submitted 24 "something right" reports and 20 "something wrong" reports, and if you compare that to the percentages of kids that use tip lines, our numbers are higher than that, and Claudia will talk more about that in the pilot study description.
We had no significant change in teacher perceptions of the school environment or parent perceptions of child behavior, but we really hadn't hypothesized that simply doing field testing with 10 teachers in a high school would impact the school environment, and certainly, we didn't expect that parents' perceptions of their child's behavior would change.
However, we did have an expected significant change in students' perceptions of school climate and peer victimization, which in some ways shows feasibility in that there's a signal there: kids that were being exposed to the curriculum and the ADVOCATR were seeing the adults paying attention to school safety. We heard in the focus groups that adults did a good job around physical safety but were really clueless about the emotional safety concerns the kids had for themselves and their friends.
So, quantitative data: we had students report on various behaviors, and remember, this is a pre-, post-difference. We had a decrease in reports of peer victimization, a trend toward a decrease in bullying perpetration, an increase in personal safety, and decreases in disruption and delinquency, which is somewhat amazing given the short feasibility field-testing time that we had.
The teacher report on the problem behavior inventory actually showed a trend of decreasing, but again, you know, this is exactly what we wanted to see, whether or not this was a feasible approach, and I'm going to now hand it over to Claudia.
>> Yeah.
Thank you, Dorothy.
So I'm going to talk about the last phase of Project SOARS, which was our pilot test of all framework components. The feasibility test Dorothy just described focused primarily on the reporting app called ADVOCATR that we developed in the first few years of the project, which allowed students to report things that they felt concerned about as well as things that they liked in their school environment, and also on the curriculum.
So the pilot test focused on testing the framework in its entirety, and that meant the ADVOCATR app, the curriculum delivery, and the student-based and student-driven school-wide safety campaign as well.
So it is important to state that this test occurred in the 2019-2020 school year, which was, of course, impacted by the pandemic and the school closures, and those school closures most likely affected peer interactions since students weren't allowed in classrooms, and bullying and harassment might have taken on different forms.
It might have migrated to online environments.
It might have decreased for some students.
It might have increased for others, so it's pretty difficult to assess that at this point.
It might have also impacted reporting of negative and positive behaviors because students were simply not involved in all these daily peer interactions as they occur in brick-and-mortar schools, and then of course, perceptions of peer relationships might also have been impacted by the school closures.
However, even though we can theorize about how school closures impacted the study implementation and our findings, the results from the pilot test were very, very similar to those of the field test, so that really strengthened our confidence in the initial benefits that the SOARS framework seems to have.
Four high schools participated in the pilot test, two in Oregon, two in Illinois, and in each site, we had one control school and one intervention school that received access to all of the intervention components.
The control schools only collected data along the way.
Here are the intervention schools and control schools' demographics.
You can see that the overall enrollment, minority enrollment and free and reduced-price lunch eligibility were fairly even across those four schools.
Next slide, please.
So first we looked at ADVOCATR use, and we really focused on the fall semester of 2019 because that's when kids were still in school, and in Oregon, 0.93 percent of all the students who had access to the ADVOCATR reported safety concerns, and a little bit over three percent of students reported positive behaviors.
The Illinois percentage is fairly comparable: 0.91 percent of students reported safety concerns, and a little less than that reported positive behavior.
Dorothy mentioned the use of tip lines; our app is different in many important ways from a state-wide tip line.
Most importantly, it's confidential.
It allows confidential reporting and not anonymous reporting.
Many state tip lines allow anonymous reporting, and then also our app allows positive reporting, based on student feedback.
That's something that students really wanted to see.
And if you look at the approximate usage rates of tip lines, they vary quite a bit, but for newly established tip lines, it's about 0.2 percent of all students who avail themselves of those tip lines to report problems.
And when tip lines become more established, then that percentage goes up to two percent.
So given that our ADVOCATR app was brand-new in its development, it fared fairly well compared to state-wide tip lines.
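As rough arithmetic on the figures just quoted, ADVOCATR's safety-concern reporting rates sit well above the rate for newly established tip lines and below that of mature ones:

$$0.2\% \;<\; 0.91\%~\text{(IL)},\ 0.93\%~\text{(OR)} \;<\; 2\%$$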
Now, student survey outcomes were really something that we looked at in great detail, since this entire framework is very student-centered and student-driven; the whole development was student-driven.
We were hoping to find significant outcomes on our student surveys, and indeed, we did find positive outcomes on those surveys.
So compared to the control schools, students in the intervention schools reported decreased bullying perpetration, so they were less likely to engage in bullying.
That change was not significant, but it was in the desired direction.
There was also a larger decrease in victimization, peer victimization in the intervention schools.
Again, not significant but in the desired direction.
Students in the intervention schools reported a greater connection with staff and with their peers, and that was statistically significant.
They also reported a greater sense of personal safety, again, at the statistically significant level, and they reported lower levels of disruption, again, statistically significant.
There was also in the intervention schools greater commitment to promoting school safety.
Again, that was not significant, but the change was in the desired direction.
In both intervention and control schools, we found that there was a slight increase in levels of delinquency.
That was really very minimal.
It's important to note that because of the school closures and the pandemic, we had a fair amount of attrition, so it was difficult to capture students and to ask them to complete surveys after schools had closed, but our schools really did everything they could to recruit students to complete those surveys and those measures, and we had a total of 200 students who provided pre- and post-data.
We did not find any significant findings on the staff measures or on the parent measures, and that was not surprising because we only intervened very minimally with staff and teachers or with parents.
This was a very student-centered project and student-centered approach, so we fully anticipated that our findings would be limited to student outcomes.
Now, we also did a quick inventory of current school safety policies and practices put forth by the Department of Justice and the Department of Education and how our framework aligns or compares with those recommendations.
So federal recommendations for school safety include establishing a reliable and easily accessible reporting system, and ADVOCATR really showed that it has a potential to be just that.
It had good student use, especially compared to state-wide tip lines.
The recommendations also focus on establishing relationships of trust among students, staff and families, and Dorothy talked about how important it is for students to have a trusted adult that they can share concerns with.
Federal recommendations focus on making sure that students who file a report know that something will happen to address that report, so ADVOCATR allows students to check the status of their report.
First, they can see that it's been submitted.
Then, they can see that it's in review, and then they can see that it's been closed out, and since it's confidential, school adults can follow up with reporting students and let them know, you know, that the report has been addressed.
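To make that lifecycle concrete, here is a minimal sketch in Python of the three-stage status tracking being described (submitted, in review, closed out); the class and method names are hypothetical illustrations, not the actual ADVOCATR implementation:

from enum import Enum

class ReportStatus(Enum):
    SUBMITTED = "submitted"
    IN_REVIEW = "in review"
    CLOSED = "closed out"

class SafetyReport:
    """Hypothetical model of a confidential (not anonymous) student report."""

    # Allowed forward transitions in the lifecycle students can track.
    _TRANSITIONS = {
        ReportStatus.SUBMITTED: ReportStatus.IN_REVIEW,
        ReportStatus.IN_REVIEW: ReportStatus.CLOSED,
    }

    def __init__(self, student_id: str, description: str):
        # Identity is known to school adults (confidential, not anonymous),
        # which is what lets them follow up with the reporting student.
        self.student_id = student_id
        self.description = description
        self.status = ReportStatus.SUBMITTED

    def advance(self) -> ReportStatus:
        """Move the report to the next stage; error once it is closed out."""
        nxt = self._TRANSITIONS.get(self.status)
        if nxt is None:
            raise ValueError("Report is already closed out.")
        self.status = nxt
        return self.status

# A student files a report and can later check its status:
report = SafetyReport("student-123", "Concern about hallway bullying")
print(report.status.value)   # "submitted"
report.advance()
print(report.status.value)   # "in review"
report.advance()
print(report.status.value)   # "closed out"

In a real deployment the status would be persisted and surfaced back to the reporting student, which is the accountability piece students asked for in the focus groups.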
Of course, there is emphasis on creating positive school climates, and we try to do that through our Everyday Restorative Practices curriculum, which really focuses on relationship building, on advocacy and self-advocacy, on emotional safety and how to be aware of one's own and others' emotional safety and also on behavioral accountability.
And then there is emphasis on ownership, student ownership of their school's safety, especially at the high school level, and we tried to do that through our Student-led Safety Campaign, and at both of our intervention schools, at the school in Illinois, at the school in Oregon, students mounted a really nice, school-wide campaign that allowed them to really reinforce the messages, the core messages and principles that ADVOCATR was built on.
There's also recommendation to address the code of silence and snitching, and that's something that we really learned a lot about in this project.
Can you advance to the next slide? Yes, so in terms of challenges to be addressed, snitching emerged as one of the really important problems that needs to be addressed.
Students defined snitching as reporting concerns to adults when doing so is deemed unnecessary; students often felt that they were in a better position to handle the problem.
They don't want to get adults involved, or if they got adults involved, there was sometimes fear of retaliation from peers or fear of, you know, being perceived as untrustworthy or labeled as an outcast and so on, and then again, that would of course feed further peer victimization.
So, really, developing an approach to teaching students the difference between reporting and snitching seems to be really important.
We tried to do that in our framework components, but obviously, it wasn't enough to really address that issue in the time that we had.
On to the lessons learned and also looking beyond SOARS towards the next steps: staff really appreciated having access to the Everyday Restorative Practices curriculum.
They really liked it, and especially after schools closed, we heard from many staff members that they felt that this curriculum gave them a conversation starter.
They could use it to keep students engaged even in this remote environment that they found themselves in at the end of the year.
So pairing the student-centered SOARS framework with training in restorative practices for school staff might really be a useful combination to both prepare students for working towards safer schools, making their voices heard and reducing peer victimization, and also give teachers and school staff the tools to support students in that way.
We also found out that students tend to be inundated by apps.
So there are a lot of apps out there, of course: social media apps but also school apps.
Many schools have their own app.
Many departments in high schools have their own app where students are supposed to go and check information or submit information or, you know, post information, so there's a lot of competition out there when it comes to apps that high school students are expected to interact with, and to keep students' attention focused on the ADVOCATR in that very crowded landscape was also sometimes challenging.
So that was...
Yeah, that was our presentation, and I just read that I was over time, so sorry about that, and I'll turn it back to Nadine.
>> Thank you, Claudia.
No problem.
We'll just move right on to our next presentation, so this is going to be Dr. Chris Henrich who is going to present on School-based Coordination and Integration Efforts as Part of a Comprehensive School Safety Initiative.
Just as a reminder for our panelists, you each have about 17 minutes, and keep an eye on the chat because I will ping you.
So with that, I'll turn it over to Chris.
>> All right.
Thank you.
I can't see the chat now that I'm presenting, but I will try to stay on time.
So I'm going to talk about one part of the comprehensive school safety initiative in the Atlanta Public Schools that Joe McCrary at WestEd is PI of, and it is a collaboration with GSU and with the Atlanta Public Schools.
It's a multifaceted, multilevel initiative that has lots of moving parts to it, trying to promote school safety from multiple angles, and as the title shows, what I'm going to focus on today is the school-based coordination and integration efforts that were part of this initiative.
Here's my disclaimer slide.
Just to briefly provide you a history of what the Atlanta CSSI has been doing, we started off in 2016 with a conceptual white paper about school safety.
We moved from there to engage in work groups with members of the district and also with other community partners.
We conducted a needs assessment that involved doing surveys with administrators and with school police.
It involved accessing a lot of extant state data and it also involved doing interviews with a number of district offices.
I'm going to show here a piece of the white paper where we really focused on school climate as being key to school safety initiatives, and really thought about climate as being multifaceted and so the different pieces of climate that we talked about were norms and policy, feelings of safety, connectedness and physical environment.
Family engagement also became an important aspect of climate as we worked through the project.
And I also want to highlight our approach, which was really to balance top-down approaches with bottom-up approaches, so we spent a lot of time listening to district personnel and involving district personnel in decisions, trying to figure out what their needs were and how we could facilitate them being successful in addressing school safety issues.
So as I mentioned, we did a multifaceted needs assessment, and following the needs assessment we came up with several different intervention approaches.
District-wide, we started implementing a visitor management system, and we also contracted out to do school safety reviews of all of the schools within the district and developed a data dashboard for school police.
And following the needs assessment, we also facilitated a meeting with teams from different district offices that really focused on school safety and brought in a WestEd facilitation team to work with these district teams in a half-day-long meeting to identify several North Star goals that the district wanted the district offices and the schools to focus their efforts on.
And here are the three goals that the district came up with: Enhancing education and awareness of mental health, trauma-informed care, social-emotional health, behavioral health and school safety was the first one.
Building and strengthening internal and external relationships within the APS school community was the second one.
And developing and improving environments conducive to learning and teaching was the final one.
Following the articulation of these goals, we used this coordination and integration approach to help realize these goals.
This is an approach that WestEd had previously used as part of an evaluation of Project AWARE and that was adapted for APS.
And the approach was specifically designed to address some of the needs assessment findings, which were that there was not much coordination and integration, both in terms of who was responsible for safety and security functions and in terms of data systems being pretty fragmented.
So not everybody who was involved in safety and security functions had access to all of the relevant data.
And schools had very different needs.
APS is a very large district and it serves a diverse set of communities.
So different schools had many different needs as well.
So in terms of what the C&I process looks like, it's again focused on being a bottom-up approach, and I'm going to talk about it in two phases.
A planning phase, where last year we worked with schools to develop coordination and integration plans, and then an implementation phase.
This year, which is the final year of the project, is the implementation year.
As you can imagine, it's not a typical implementation year because of the pandemic.
So the steps of the coordination and integration planning process started with data.
The first activity that WestEd worked with schools on was called data equity walks.
These were essentially poster sessions where we took state data as well as data that the schools shared, created posters and had an interactive session with team members within each school where they could look at all of the different data, interact with it and try to identify needs and gaps for their school based on these data.
After talking about the data, the next step in the process is to create SMART goals and objectives that are tied to the needs and gaps in the data and have measurable outcomes and processes associated with them.
The C&I process next involved resource mapping, where school teams figured out what resources they had available to integrate together in the schools, as well as identifying activities and partners outside of the school, whether in other parts of the district or among community partners, to work towards meeting their goals.
And then to come up with a plan for measuring the progress including how they were going to assess outcomes.
So this was a year-long process where school teams worked on developing these plans and then after their plans were developed, they requested up to $40,000 for the second year in order to help implement those plans.
I'm going to show you an example of what a C&I plan looks like.
This is not a real C&I plan from one of our schools.
This is an exemplar C&I plan that shows all of the steps of it put together essentially in a large Excel spreadsheet.
This example is a school that would have identified bullying as one of the major safety issues that they wanted to try to address.
So it starts off with an identification of needs and gaps again using data to identify those gaps, coming up with goals and objectives to address them, determining what activities the school is going to engage in to help meet those goals.
Identifying partners, for example, an outside mental health agency here to help them achieve their goals.
And then articulating what kinds of process measures they're going to look at and what kind of outcomes they're expecting and how they're going to measure those outcomes.
So we recruited 10 schools from the district to take part in this C&I process.
Drawing on the needs assessment, we used the state and school district data to create profiles of schools' safety and security needs.
And we used those profiles to recruit a sample of middle and high schools that were all experiencing some issues around safety and security but had a diverse set of issues.
So for example, some schools were in dangerous neighborhoods and the safety issues that they were really concerned about had to do with violence from the community spilling over into the schools.
There were other schools that were in more highly resourced and safer neighborhoods where kids were still reporting bullying and high amounts of psychological distress associated with bullying.
So we tried to pull from different profiles of safety issues to recruit schools into this process.
After doing that, we met with the superintendent and discussed with her which schools to include.
She additionally wanted us to make sure that we represented the geography of the district as well, and so we made some changes based on that.
And then at the end of that first year we started doing focus groups with school teams.
Now, one thing I want to point out here is that we recruited 10 schools but only five made it through the first year of the process.
You can see a couple of them dropped out almost immediately.
That tended to be because they thought $40,000 wasn't worth the amount of work that the process entailed.
Then there were other schools that dropped out along the way.
So at the end of last year, five of the 10 initial schools remained in the process.
These are the research questions that we addressed in terms of the process.
What were school teams' capacities in the processes of developing and implementing plans? What challenges did they face, and how is the technical assistance helpful? How did school teams collaborate to generate shared visions and goals? What did plans and activities entail? How were they implemented and how did school teams address sustainability? We used qualitative design where we had triangulation of several data sources.
As I mentioned, we did focus groups with the five schools that completed the first year of the planning process.
We did a focus group with the technical assistance team from WestEd.
Then we also had graduate students from Georgia State at all C&I team meetings.
They conducted observational protocols and took meeting minutes, and at this point I think we have over 60 meetings that we've observed.
Then lastly we reviewed the schools' C&I plans that they completed.
And I'm just going to go very quickly through this slide.
This is the coding frame that we developed from the focus groups and then applied to the minutes of the meetings.
It doesn't exactly match onto the research questions that we talked about because although research questions guided this process, we also allowed codes to emerge from the data.
I'm going to briefly summarize the results by research question.
The first question asked about teams' capacities and challenges and the roles of technical assistance.
Lots of barriers were reported; not surprisingly, the biggest one was time and bandwidth of school team members.
There were also some issues with some school leaders, staff culture, a lot of turnover in team members and then some issues with communication with district departments and offices.
In terms of the roles of technical assistance, school teams reported finding some aspects of the C&I process confusing and aspects of requesting the $40,000 frustrating, but they found the technical assistance team to be very helpful, and the TA team was very nimble on their feet in adapting the process as it hit up against schools' capacities and pre-existing plans. One thing that gets highlighted in the data over and over is the role of the TA team in guiding the school teams through data-driven decision-making.
Again, school teams, in talking about the collaborative approach that they took, really highlighted the role of WestEd in helping them use data both to identify needs and to come to consensus about goals.
Of the five schools that completed the first year, four of them were two pairs of high schools and feeder middle schools, so that pairing up of a high school and its feeder middle school to develop shared visions and goals is a strength.
And those are the four schools that we know at this point are well along in implementing their plans as well. As for school teams, we tried to get a broad range of stakeholders within each school, and over time those teams got smaller, but even though they got smaller, the people who stayed involved were very engaged, and key to this seemed to be assistant principals who were highly engaged.
And then this is an example of what schools were actually doing, so the goals that they designed were focused on reducing problem behaviors, promoting alternative discipline, improving school climate, meeting mental health needs and increasing student attendance rates, and some of the activities that they're doing this year include hiring an attendance officer, working with external partners on wraparound services and professional development trainings, especially around trauma-informed care and restorative practices.
As for sustainability, it's hard to tell, because the pandemic really put a question mark over this. When schools talked about sustainability, they talked about specific activities, how well they thought those were working, and whether they'd be able to continue them, whereas the WestEd technical assistance team talked about the process hopefully sowing the seeds of a cultural shift in the way that school leaders thought about safety and security challenges.
Each team came up with its own outcome measures, and the pandemic threw a monkey wrench into the assessment of those measures, so even though teams are implementing the things that I showed on the previous slide this year, it's very unclear to what extent we're going to be able to get outcome measures at the end of this year.
And then to finish up, I want to highlight school climate down here, which is something that was part of our conceptual framework.
It was also something that was important to school teams and that we worked hard on trying to measure using extant state data.
There's a survey called the Georgia School Health Survey that's done anonymously and annually to all students in Georgia, and from that state data, we were able to access 6 years of data, and there were 23 items that connected to our facets of school climate.
And so one of the things that we developed is a multilevel factor structure of school climate, where the facets of climate that we talk about as being important were observed in the variance within schools, but when we look at the variance between schools, it really looks like a one-factor climate structure, and so we've developed an abbreviated version of these items to assess climate at the school level.
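To make the within-school versus between-school distinction concrete, here is a minimal sketch using simulated data (not the actual Georgia School Health Survey responses or the study's analysis code); the decomposition into a pooled within-school covariance and a between-school covariance of school means is the core idea behind a multilevel factor structure, and all names and numbers besides the 23 items are illustrative assumptions:

```python
# Minimal sketch (simulated data, illustrative names; not the study's actual
# analysis): decompose item covariance into a pooled within-school part and a
# between-school part, then compare how dominant the first factor is in each.
import numpy as np

rng = np.random.default_rng(0)
n_schools, n_students, n_items = 100, 50, 23   # 23 climate items, as in the survey

# Assumed data-generating process: one shared school-level climate factor,
# four distinct student-level facets.
school_factor = rng.normal(size=(n_schools, 1))
between_loadings = rng.uniform(0.5, 1.0, size=(1, n_items))
facet_scores = rng.normal(size=(n_schools * n_students, 4))
within_loadings = np.kron(np.eye(4), np.ones((1, 6)))[:, :n_items]  # 4 facets x 23 items

school_ids = np.repeat(np.arange(n_schools), n_students)
X = ((school_factor @ between_loadings)[school_ids]
     + facet_scores @ within_loadings
     + rng.normal(scale=0.5, size=(n_schools * n_students, n_items)))

# Between-school covariance uses school means; within-school covariance uses
# each student's deviation from their school's mean.
school_means = np.vstack([X[school_ids == s].mean(axis=0) for s in range(n_schools)])
S_between = np.cov(school_means, rowvar=False)
S_within = np.cov(X - school_means[school_ids], rowvar=False)

for label, S in (("between schools", S_between), ("within schools", S_within)):
    eigvals = np.linalg.eigvalsh(S)[::-1]          # descending eigenvalues
    print(f"{label}: first factor explains {eigvals[0] / eigvals.sum():.0%} of variance")
```

Under this assumed setup, a single eigenvalue dominates the between-school spectrum while the within-school variance spreads across the four facets, which is the pattern described in the talk.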
So just a quick summary of the coordination and integration process so far: It was successful for about half of the schools.
One thing that the schools and the data really highlighted was the role of the TA team in guiding data-driven decision-making. There's a big question mark about what comes next given the pandemic: how well implementation will go this year and what types of outcome data we'll be able to have at the end of this year. And we have a huge team, so thanks to all of them and to the school district and to community partners.
>> All right.
Thank you, Chris, for that presentation.
We'll look forward to asking some questions at the end, but we're going to go to our last presentation, last but not least.
This is "Developing a Longitudinal Dataset to Study the Relations Between Community and School Context and Student Outcomes." Dr. Maury Nation is the Robert Innis Professor of Human and Organizational Development at Vanderbilt, and Dr. Carolina Christopher is a research-assistant professional in the Department of Human and Organizational Development at Vanderbilt University, so take it away, Maury.
>> Well, good afternoon, and I am very happy to be with this panel talking about this study.
As Nadine mentioned, we're a little different because we are in the process of creating a longitudinal data set, but also at the same time looking at a variety of research questions that we'll try to summarize as we talk about our study.
Before I hop into this, I do want to acknowledge just some fantastic partners, particularly Metro Nashville Public Schools, who's been in this 100 percent from the beginning.
Also, a number of university partners, including University of Oregon and University of Louisville, but also many of our community stakeholders, and I'll talk a bit more about them as we get into describing the study.
And finally, and of course not least, the National Institute of Justice for helping us to do this work.
Now, what I'd like to do is to actually take you back to 2014, where this story really started with convening of a set of stakeholders.
The Annenberg Institute for School Reform actually came and helped us to convene about 40 or 45 stakeholders.
These included school district staff, youth-serving nonprofits, and parents, people who were concerned about racial-ethnic disparities in exclusionary discipline in particular.
And the question was, "What is the problem, and what are we going to do about it?" And as I reflect back on that time, it was really extraordinary because it had such an array of stakeholders, and we had two mottoes.
One was that we were not going to focus on blame, that is, trying to figure out how we got to where we are.
The second was that we were going to tell the truth, that we were going to look at data and face it, and what actually came out of that process was about 8 months of some really difficult conversations involving school and community and, you know, university stakeholders, including the police department, juvenile courts, the public defender's office, our state department of ed as well as teachers and administrators at the Metro schools.
What we came up with out of those conversations was what I think is...
It ended up being a pretty complex conceptualization of what people thought was going on.
So we had certainly talked about some of the school contributors, and just to talk a little bit about those, of course there was discussion of disciplinary policies and practices and examining what teachers were doing and administrators were doing with young people and how they were making decisions.
We also had questions about explicit and implicit bias and the degree to which there was open racism as well as, you know, teachers behaving in ways that reflected bias that they were unaware of.
Now, there were a couple of other things, though, where I think we were a bit surprised at how much time we spent on them.
One was just the fact that there were insufficient supports across our schools, and there were real differentials in the types of supports that were available across schools, and that was often associated with the types of inequities in rates that we were seeing across the schools.
And then the last thing that I'll mention here is that, you know, a lot of the discussion was predicated under the assumption that the inequities were primarily driven by within-school differences.
That is, that teachers and administrators were responding to young people in the same classroom differently.
And what we found is that yes, that was happening, but that the major driver of the inequities was between-school differences.
That is, we found that our schools are largely segregated, much more so than most of us would wish, and that the predominantly white schools were suspending all students at much lower rates than the schools that were predominantly serving students of color.
So, you know, one of the things that was interesting about this process is that the district, I believe, was able to hear that, and we can talk a little bit later about some of the responses to each of these pieces of feedback. But the second part of the conversation, and this was one of the things that excited our team, was to really start to talk about the things that were happening outside of school.
We were hearing reports of, you know, young people bringing things, both their traumas and their experiences outside of school, into school, and that was also driving some of the issues that schools were facing. There were three things that particularly stood out.
One was access to resources, and, you know, Nashville is a district that has really transformed over the last 20 years, and what you see now is what I think is more typical of urban districts.
That is, there are a significant number of students who are struggling to have access to basic resources: food and housing security, things like the Internet, and being able to move around the city to access the resources that are available.
And again, there was really no acknowledgment of those types of inequities that were happening across the geography of Nashville.
Second, exposure to traumas: One of the things that we were well aware of was that violence, both youth violence and violent crime in general, was not randomly distributed.
In fact, it was heavily concentrated around some of the schools that were also having high rates of disciplinary referrals, so we wanted to begin to look at that type of context. And then lastly, we wanted to capture, to the degree that we could, some of the kind of systematic marginalization that we know has happened for some of our neighborhoods in Nashville.
A couple of years ago, the Brookings Institution did a study on incarceration rates and found that one of our north Nashville zip codes had the highest rate of incarceration in the country for people born between 1980 and 1986, and if you extrapolate that out, those are people who are in the age range to be parents of our current middle and high school students.
So we know that there is a variety of types of disruptions that have been longstanding within Nashville, both around policing and housing, that are also being reflected here. And I think one of the things that came from this conversation was, you know, that the school district somehow has been assigned to make all of this disappear, like, in terms of academic achievement and other types of outcomes, that the disparity somehow gets assigned only to the district.
And I think one of the things that came out of this was really to begin to look at, "How do we get an accurate picture and begin to think about safety and well-being both within the school certainly, but also factoring in these contextual pieces?" So all of this was happening right at the time that the opportunity to apply for funding came around, so we were really excited, and what you see here is some of our objectives in pulling all of this together.
One was that we certainly wanted to pull together a database, and part of this was we wanted a database that told the complexity of the story, the story that we all had come to share.
How do we find the data sources that allow us to tell this story more fully? The second thing, though, that we wanted to do was to be able to support the types of actions and the youth-serving initiatives that were happening in Nashville.
So everything from My Brother's Keeper, to the PASSAGE Group that is still working on suspensions and racial inequities and a variety of different outcomes, to Alignment Nashville, which works to bring resources that are aligned with the needs of the schools they serve.
So what we did, then, was to start to pull together, "What do we need?" And what you'll see here is the list of data sources that we began the process of accessing.
So we are looking at Metro Nashville administrative data, and my colleague, Caroline, will talk in more detail about this data, but this is classic data in the sense that we're talking about attendance, achievement, and discipline, along with a variety of other characteristics of the young people themselves.
And what I should also say is that for all of these data sources, they are being...
so that we can place them geographically and follow them over time.
We're also looking at survey data, so there is a...
You know, our team really worked with all of the different stakeholders to figure out what pieces of data they needed or they would like to collect to understand their piece of the issue.
And so within the survey data, we have school-climate data, but we also added something that I will talk a little bit about here: a neighborhood safety and well-being survey that the district, I think, has become really invested in, which looks at what young people are doing outside of the context of school.
So, you know, what kinds of after-school activities do they have access to? Certainly their perceptions of their neighborhood, but also their perceptions of their own competencies and their participation in extracurricular activities.
As you go around the circle, you'll see that we look at just a tremendous variety of different characteristics. We are looking at police data: incidents, calls for service, and arrests.
We're looking at the Gun Violence Archive, which gives us a little bit different type of data than the police data does.
The police data is much more comprehensive, but the Gun Violence Archive actually gives us some of the characteristics of the events that the police data doesn't provide.
We're looking at land use, so how resources are located around young people.
Do they have access to parks and libraries, to grocery stores? Part of what we were seeing is that there are real differences in public investment across these neighborhoods. And then the last thing I'll mention on this particular slide is a youth-mapping project that we took on with a couple of our youth-serving organizations, where they are literally working with young people to talk about where they go and to map that, so that we can see where they are accessing resources across the city, and it's something that we're really...
We'll definitely talk more about that over the course of our presentation.
Now, quickly, just to give you a sense of the structure of this, what you'll see is we have a four-cohort design, with the youngest cohort starting at grade four and followed over the course of 3 years, with the third year being this current academic year.
What you see going backwards from year 1 is the administrative data will actually go back for as long as they have data within the district.
So even though we're just looking at survey data from the few years that we're collecting it through this project, we'll have longer-term trend data when we include the administrative data.
So our second cohort is starting at grade six.
Third is at grade eight, and the fourth is at 10.
What we wanted to do is to capture the transitions that these students will experience across each of the tiers, so from fourth to fifth is the elementary to middle school transition.
From eighth to ninth is the high school transition, and then, of course, all the way through to their graduation.
As for the population, we essentially wanted to look at the whole population, so we anticipated a sample of about 15,000 kids that we would be able to follow over the course of this study.
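As a quick illustration of this accelerated longitudinal design, the sketch below lays out the grade each cohort occupies in each study year; the starting grades and 3-year window come from the talk, while the printed table itself is illustrative:

```python
# Illustrative layout of the four-cohort design: starting grades (4, 6, 8, 10)
# and the 3-year window come from the presentation; the table is just a sketch.
start_grades = [4, 6, 8, 10]
n_years = 3

print("cohort  " + "  ".join(f"year {y}" for y in range(1, n_years + 1)))
for cohort, start in enumerate(start_grades, start=1):
    row = "  ".join(f"grade {start + y:>2}" for y in range(n_years))
    print(f"   {cohort}    {row}")
```

Overlapping the cohorts this way spans grades 4 through 12 in just 3 calendar years while keeping the key school transitions (fourth to fifth, eighth to ninth) inside single cohorts.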
Now, key milestones: you know, a lot of this time has been spent trying to work through relationships and how we access data, and then making sure that we put it in a format that allows us to look across these various forms of data.
So with this, this will give you an overview of how we're approaching it, and I will ask Caroline to talk in a little more detail about some of the pieces of this.
>> Thanks, Maury.
So I would love to, but Nadine has let me know that we're very short on time, so if you don't mind fast-forwarding to where we are in the process, the slide that talks about developing a plan for the structure of the data.
So that should be slide 14 or 15.
I'm not sure if you added a slide in there.
Just want to...There we go.
Okay, so, you know, one of our big deliverables is this longitudinal data set, which I think of as amazing, with so many different contributing data sources.
One thing that we needed to grapple with is the anonymization process of making sure the data are de-identified, but also thinking about how to make the data talk to each other.
So we want to make it maximally useful, because ultimately other researchers will be able to access this from the National Archive of Criminal Justice Data, but we want to make sure the data are protected following our Vanderbilt-MNPS Data Use Agreement.
So if you'll go to the next slide, I wanted to just give you a glimpse of the data structure that we've come up with. You know, this has involved so many different conversations about linking keys and different groupings that would, again, allow researchers this amazing, rich data set to answer all sorts of questions while minimizing the risk of re-identification.
And so we've gone through various decision points about what data to keep at various levels, so neighborhood, school, individual, et cetera. And would you go on to the next one? Our group has also come up with decision trees for classifying variables as we go through this anonymization process.
It is by no means, you know, an easy process, and we have a great team who's really, you know, learning a whole field of study through this process. I just wanted to take you through that to give you an idea of this journey through the weeds, which I probably could have taken you far deeper into. But before we end, Maury, if you wouldn't mind taking us back to the big picture quickly and closing us out, that would be great.
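One common technique consistent with the de-identification and linking goals described here is a keyed-hash pseudonym, sketched below; the key handling, field names, and records are hypothetical assumptions for illustration, not the project's actual pipeline:

```python
# Minimal sketch of one common de-identification approach: replace a direct
# identifier with a keyed-hash pseudonym so de-identified records from
# different sources can still be linked. Everything here (key, field names,
# records) is an illustrative assumption, not the project's actual pipeline.
import hmac
import hashlib

SECRET_KEY = b"held-by-data-steward-not-researchers"  # kept out of the archive

def pseudonym(student_id: str) -> str:
    """Deterministic, non-reversible linking key for one raw student ID."""
    return hmac.new(SECRET_KEY, student_id.encode(), hashlib.sha256).hexdigest()[:16]

# The same raw ID from two different data sources maps to the same pseudonym,
# so longitudinal linkage survives even though the raw ID is dropped.
school_record = {"student_id": "MNPS-001234", "attendance_rate": 0.94}
survey_record = {"student_id": "MNPS-001234", "neighborhood_safety": 3.8}

for rec in (school_record, survey_record):
    rec["link_id"] = pseudonym(rec.pop("student_id"))

assert school_record["link_id"] == survey_record["link_id"]
print(school_record, survey_record, sep="\n")
```

The design point is that the secret key stays with a data steward rather than in the archive, so the pseudonyms link records across sources and over time without being reversible by downstream researchers.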
>> Great, so one of the things that we're in the process...
We're probably about 6 weeks from being able to talk about sort of the sum of our...
but we are in the process of working with local partners, including media partners, and one of the exciting developments out of this has been a more formal kind of research-practice partnership, where we're looking at problems of practice and some of the initiatives that the district has engaged in, and at how we can look at data to see the levels of effectiveness and how they might improve those initiatives over the course of time.
The only other thing I would add is that while we started with questions about discipline disparities, those have only been kind of scratching the surface of the types of questions that have emerged from this process.
So we're looking at things related to social-emotional competencies and a variety of restorative practices.
There is a variety of initiatives going on within the district that we believe we'll be able to speak to over the course of time.
Thank you.
>> All right.
Thank you, Maury.
Thank you, Caroline.
Clearly we could go on and on and on about all this important work and the research and how complex it is.
So we only have a few minutes, about 5 minutes, for Q and A, and remember, we are using the chat function for the Q and A, so you can go ahead and put your questions there.
We do have one question in the chat.
"I have a question about the evidence-based programs offered in Safe Communities Safe Schools.
If the schools found the selection of Blueprints programs to be limited and they adapted the programs to better meet their needs, does this affect your confidence in and expectations of the Blueprints programs' effectiveness? If it does, did you or would you consider expanding where you source EBPs, for example, from repositories such as IES' What Works Clearinghouse or OJJDP's Model Programs Guide?" >> So I can answer that.
My dog, of course, is barking, so you might have seen me yelling at him while I was muted.
Okay, so I won't do that right now.
Yeah, that's a great question, Jen.
Thank you for asking it.
It's from Jen Ropeter, and she used to work with us at the center, so I'm not surprised by it, and it's a great question.
So I would say what we are thinking about is, in the future, working really closely with program developers to see how flexible they will be in terms of making, like, small changes.
Like, what we've found is it didn't need...
The whole program didn't need to be revamped.
Oftentimes it was more of, like, the context of stories or, you know, things like that that needed to be changed, and so some program developers are more flexible than others, and I think what we would do is up front, you know, try to make...
We'd sift out who would work with us, and in terms of considering programs on other lists, you know, I think...
I certainly would look at those, especially if the blueprints' list doesn't have a program that meets a specific need from a district.
So in that situation, what we would say is, you know, we would want to do...
We would want to make sure we're using it not everywhere, but only using it, you know, in a couple of places and evaluating the impact.
I think that you could run into the same issues on any list, so it really is something that the developers...
you know, we do want to work with the developers on, and it's also something that I'm going to talk to...
well, we have brought it up to Karl Hill, who's the PI, the principal investigator, on Blueprints right now, and we've...
You know, we're going to talk about how Blueprints can even adjust for these kinds of issues.
So I don't have set answers, but it's something we have been thinking about very, very carefully, and I'd love to hear other ideas if other people have thoughts on that.
>> Thanks, Beverly.
I had one question that I thought of, and this goes to Dorothy and Claudia.
You know, one of the interesting things I noted was that when the students used the Advocator app, they wanted to... They reported wanting to make positive reports of things going well, and your data seemed to indicate, correct me if I'm wrong, that there were more positive reports than negative reports, and I thought that was really kind of an interesting finding.
So can you talk a little bit more about why...
wanted to include that, and why do you think there were more positive things than negative things? >> Yeah.
So the positive-reporting function was originally not in our design; we focused only on negative reporting. Then during our focus groups, students really pushed back on that and said, "Oh, you know, you're always so focused on negatives and school shootings and all those strategies, and we really want to focus on the positive." And I think both schools, both in Illinois and in Oregon, were really heavily invested in Positive Behavioral Interventions and Supports, so they really focused on building and norming positive, prosocial behaviors, and that really showed in students' responses, and the school-safety campaign in Oregon was focused on positive behaviors as well.
So students erected this big tree in the lobby of their school, and people could, you know, post leaves with positive messages on it, and so there was a lot of emphasis on really norming prosocial and positive behaviors.
That, of course, also resonated with empowering students to make their voices heard about potentially, you know, problematic behaviors, but then finding the support, both in their peers and in their teachers, to address those negative behaviors in a restorative fashion. And the curriculum went alongside that as well: the emphasis was not on consequenting negative behaviors but on resolving them in a restorative fashion and bringing people together in conversation, moving beyond the focus on the negative and on punishment, essentially, or consequences, for a less drastic term.
>> Mm-hmm.
Thank you.
You know what? That really reminds us that schools are not blank slates; there are activities and things going on, and they're going to influence how we address school safety moving forward. So it's just a reminder that what the students learned from PBIS was transferring to how they're going to use these threat assessments and reporting.
That's really important for us to remember as researchers and people working in schools.
We're at the end of our time for today, so I want to thank all of our panelists.
If we were in person, I'd give you all a big round of applause, but thank you all for your contributions, your presentations and the important work that you're doing, and everyone else who participated in this, thank you so much for being here.
With that, we'll close out our panel.
Have a great rest of your day.
>> Thank you, everyone.
>> Bye-bye.
>> Bye.
Disclaimer:
Opinions or points of view expressed in these recordings represent those of the speakers and do not necessarily represent the official position or policies of the U.S. Department of Justice. Any commercial products and manufacturers discussed in these recordings are presented for informational purposes only and do not constitute product approval or endorsement by the U.S. Department of Justice.