Preparing for and Responding to Threats and Violence - Breakout Session, NIJ Virtual Conference on School Safety
On February 16-18, 2021, the National Institute of Justice hosted the Virtual Conference on School Safety: Bridging Research to Practice to Safeguard Our Schools. This video includes the following presentations:
Preliminary Data from a Statewide Anonymous Tip Line and Multidisciplinary Teams in Nevada, Al Stein-Seroussi
In 2018, the State of Nevada launched SafeVoice, a statewide anonymous tip line for students to report events that might be harmful to them, their peers, or their school community. Harmful events are far ranging and include suicide threats, bullying, harassment, depression, and planned school attacks. The goal of SafeVoice is to provide a mechanism for students to inform responsible adults who can then prevent harmful events before they happen or to stop them from continuing. Although often referred to simply as a "tip line," SafeVoice also requires each school to have a multidisciplinary team (MDT) available around the clock to respond to tips. The Nevada Department of Education administers and oversees SafeVoice; the Nevada Department of Public Safety operates a 24/7 communication center that receives tips from students and then disseminates them to the appropriate local school district or law enforcement agency. SafeVoice is funded primarily by a grant from NIJ (2016-CK-BX-0007) to Pacific Institute for Research and Evaluation which is responsible for all research components. During this session, we will present program data about the volume and type of tips and preliminary data from our MDT survey about the experiences of those who receive and respond to tips.
Student Threat Assessment as a Safe and Supportive Prevention Strategy, Dewey Cornell
In 2013, Virginia became the first state to mandate student threat assessment in its public schools and in recent years many other states have required or encouraged its use. This project examined the statewide use of threat assessment in Virginia and identified ways to improve training and implementation. In this presentation we explain why student threat assessment must be distinguished from other kinds of threat assessment. We report some of the difficulties in statewide implementation of threat assessment and describe our development and testing of an online educational program for students, parents, and staff to encourage support for threat assessment. Next, we present outcomes for a sample of 1,865 cases assessed in 785 schools. As expected, threat assessment produced low rates of disciplinary and legal outcomes. Furthermore, there were no statistically significant differences for Black, Hispanic, and White students. These findings reflect the potential for threat assessment to provide an alternative to zero tolerance that is less punitive and more equitable. Finally, we describe next steps for future research on this rapidly growing violence prevention strategy.
Evaluation of the Say Something Anonymous Reporting System to Improve School Safety, Justin Heinze and Hsing-Fang Hsieh
Background. Anonymous reporting systems (ARS) have the potential to improve school safety by facilitating reporting and improving school climate. Yet they have not been evaluated with experimental designs for the effects they have on either student reporting behavior and attitudes or school violence.
Method. We seek to understand the effectiveness of the Say Something Anonymous Reporting System (SS-ARS) program in improving school climate and preventing school violence by examining underlying psychosocial factors in a cluster randomized control trial among students in 19 middle schools in the Miami-Dade County Public Schools. Using repeated survey responses, we compared students' self-efficacy and intention to report warning signs, perceptions of school climate, and exposure to violence at school in treatment versus control student populations.
Results. Results indicate that SS-ARS improved both short-term (3-month) and longer-term (9-month) outcomes related to students' reporting of warning signs. The intervention had positive effects on students' perceptions of school climate and reduced students' reports of violence exposure at school.
Conclusion. Our findings suggest that anonymous reporting systems can be effective when ARS training is integrated into a more comprehensive approach to improving school climate.
This is Not a Drill: Student and Staff Comprehension of Emergency Operations Protocols for School Violence, Josh Hendrix
School shootings in the past few decades have raised questions around how schools prepare for active shooter situations and the extent to which they are ready to respond to an emergency. We reviewed safety plans from 10 middle and high schools, assessed variation in lockdown protocols, examined staff and student comprehension of procedures, identified areas of strong and weak mastery, and highlighted characteristics associated with comprehension.
>> Good morning.
My name is Phelan Wyrick.
I am Division Director with the National Institute of Justice for our research and evaluation division.
Welcome to this first breakout session of the Comprehensive School Safety Initiative Virtual Conference.
We're happy to have you here today.
We've got a number of presentations for you in this breakout session.
The overall topic for this session is preparing for and responding to threats of violence.
We have four presentations that you will hear today.
You won't hear much from me.
I won't begin with extensive introductions, but I will encourage you to look at the conference website to see the biographical information for our presenters here today.
Our first presentation will be Preliminary Data from a Statewide Anonymous Tip Line and Multidisciplinary Teams in Nevada by Al Stein-Seroussi.
Our second presentation will be Student Threat Assessment as a Safe and Supportive Prevention Strategy by Dewey Cornell.
Our third presentation will be Evaluation of the Say Something Anonymous Reporting System to Improve School Safety by Justin Heinze and Hsing-Fang Hsieh.
And our fourth presentation will be This is Not A Drill: Student and Staff Comprehension of Emergency Operations Protocols for School Violence by Josh Hendrix.
I will ask that you submit questions using the question and answer function.
You'll see the Q and A function at the bottom of your screen, and we will save questions and answers for the end of all of the presentations.
So with that, I will turn it over to Al Stein-Seroussi, who will begin with our first presentation.
>> All right.
This is Al Stein-Seroussi.
I'm with Pacific Institute for Research and Evaluation, and as Phelan said, I'm going to be talking about some data from a statewide anonymous tip line and multidisciplinary teams in the state of Nevada.
We'll move right along.
If I can get my...
there we go.
So this project takes place in the state of Nevada, as we've said.
So just a little background for folks.
Nevada is a big state geographically.
There are almost 500,000 students enrolled in schools in the state of Nevada in public schools.
There are 17 school districts in Nevada, and it is a very diverse state in terms of urbanicity, with the presence of many rural and frontier communities.
So just to give you an example, Clark County, which is in the lower right-hand corner of your screen in the map there, is a single school district comprising 325,000 students, so about two-thirds of all of the students in the state of Nevada are in the lower right-hand corner of the state, and the Clark County School District is the fifth largest school district in the country.
That is obviously where Las Vegas is located.
In contrast, you've got counties like Nye County, which is just to the left of Clark County.
That is, I believe, the third largest county in the country geographically, but there are only 5,000 students living in that county, so Nevada is an interesting place in terms of the wide variety of locations and landscapes, and this mix of rural and urban areas is obviously an issue when it comes to student services and law enforcement services.
Let me give you a little bit of background about the tip line itself.
This is a statewide tip line, anonymous tip line throughout the state of Nevada.
It was mandated by the Nevada General Assembly a number of years ago.
The mandate and the legislation say that there has to be an anonymous tip line available to all students across the state and that there have to be multidisciplinary teams, or MDTs, in each school that respond to tips.
So all students K-12 and all public schools, including charter schools, are involved in or have access to this tip line.
Each school district and school building must have three-person multidisciplinary teams that are responsive to tips, and these teams must include a school administrator and a school counselor, psychologist or social worker.
So two of those three people on those teams have to be an administrator or counselor of some sort.
The main goal here is really to prevent widespread harm, very tragic events like school shootings or people bringing weapons into school, but there's really also a very strong emphasis on social and emotional issues, mental health issues, prevention of bullying, prevention of suicide, identification of crisis situations, so it's not just about trying to prevent these really large-scale events but also to try to enhance opportunities for student wellness.
A few other features of this particular tip line.
I know there are tip lines all across the country, and they all have a lot of similarities and some differences.
So in Nevada, the tip line is administered by the Department of Education.
It's within the Office of Safe and Respectful Learning Environment, OSRLE.
So it's administered by the Department of Education, but it is operated by the Department of Public Safety, and it's co-located within the State Fusion Center and the Nevada Threat Assessment Center.
So it follows the Colorado Safe2Tell model.
For any of you who know the Colorado model, it's a very...
It's sort of law-enforcement centric, but it has strong partners in the school system as well as with mental health partners.
The project in Nevada is funded largely by an NIJ research grant, which we received in 2016; the first full year of the grant was 2017.
The Nevada tip line is called SafeVoice, by the way.
I don't think I mentioned that.
It uses P3 campus technology for communication between the tipster and the Department of Public Safety, in between the Department of Public Safety and local school and law enforcement jurisdictions, so some of you may be familiar with P3.
The important thing here is that it's a unified system of communication.
It allows auditing of communication and tracking of communication, and from the tipster's perspective, you can report tips via a mobile app, via website or via phone call.
And finally, again, just to give a little bit of background here, half of the state began operating SafeVoice first.
They had full access to SafeVoice and training in January of 2018, and the other half began full access in August of 2018. This allowed for the establishment of a research design, which allows us to track half of the state versus the other half of the state over a period of time, and it also really helped with standing up the program, because as you can imagine, getting a statewide tip line up and running all at once starting in January would have been a real challenge, so this facilitated both the research design and the implementation.
Just so you can see, there's obviously...
Maybe it's not obvious, but there is a flowchart of how tips are meant to be responded to and how tips are disseminated.
So the important thing is there that the Department of Public Safety determines the priority level of the tips that come in.
There are four priority levels.
All tips go to schools.
All tips at priority levels one through four have to go to the school buildings for people to respond at the school level, and tips at priority level one or two, which are those that present more imminent danger or more direct harm to individuals, also go to law enforcement agencies throughout the state. So again, both local school systems and local law enforcement agencies participate in this.
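The routing rule just described, where every tip goes to the school's multidisciplinary team and only priority one and two tips also go to law enforcement, can be sketched in a few lines of code. This is purely an illustrative sketch of the dissemination logic, not the actual P3 Campus or Department of Public Safety software; the function and recipient names here are hypothetical.

```python
def route_tip(priority_level: int) -> list[str]:
    """Return the recipients for a SafeVoice tip based on its priority level (1-4).

    All tips go to the school's multidisciplinary team (MDT); priority 1 and 2
    tips (imminent danger or direct harm) also go to law enforcement.
    Illustrative only -- names are hypothetical, not from the real system.
    """
    if priority_level not in (1, 2, 3, 4):
        raise ValueError("priority level must be 1-4")
    recipients = ["school_mdt"]  # every tip is disseminated to the school
    if priority_level <= 2:      # imminent-danger tips also go to law enforcement
        recipients.append("law_enforcement")
    return recipients
```

For example, a priority 2 tip would be routed to both the school MDT and law enforcement, while a priority 4 tip would go only to the school.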
So there are four basic research questions from the grant.
How is it implemented? To what extent do the SafeVoice reports prompt appropriate initial responses from local school district and law enforcement? To what extent do SafeVoice reports prompt follow-up services for students of concern, that is, beyond the initial response of the tip or students getting the kinds of help that they need after the tips are reported? And then, to what extent does the presence of SafeVoice in the state of Nevada contribute to changes in school climate, problem behaviors and discipline? And you can see on the right-hand side of my slide, there are lot of data sources.
Those that are in italics are primary data sources that we have implemented and developed for this project itself.
Other data sources that are not italicized are things like using school climate survey data and school discipline data, but we're also...
As part of this project, we collect programmatic data through P3 Campus.
We conduct annual surveys of the MDTs, the multidisciplinary teams.
We conduct annual site visits and so forth, and today, I'm going to present a little bit of data just to give you a flavor for the program and the kinds of things we're seeing from the P3 Campus program data and from the annual surveys of the multidisciplinary teams, and that's what I'll be talking about for the rest of my time here.
So on this slide, you can see these are outputs of the reports that have been generated or that have come in to the districts or throughout the state from the inception of the program on January 1, 2018 to December 31st, 2020.
Obviously, the information on the slide, there's a lot of information there on the left-hand side.
That's basically to show you that there is a wide variety of tips that come in, and a wide range in the numbers of tips, with things like bullying at the top, at almost 4,000 reports, all the way down to the very bottom, which you probably can't see very well, with things like eating disorders, kidnapping and human trafficking, and domestic violence or dating violence.
A wide variety of tips come in really across the board.
The portion on the right shows you the top 10 tips that have been received since the inception of the program, and you can see that bullying far outstrips the others and is clearly the most prevalent tip that comes in on this system, with, again, almost 4,000 tips coming in about bullying.
That's followed by things like suicide threats, school employment complaints.
Interestingly, this is the number three kind of tip that comes in.
It's an interesting side effect of this system.
The system was not really created for students or anyone to be providing complaints about school staff, or for employees to provide complaints, but they do use it, so for folks out there who are interested in setting up a tip line, be prepared for all sorts of tips coming in that may or may not be directly related to what the system was designed for. But as you can see, there have also been almost 1,000 reports of planned school attacks, almost 1,000 reports of self-harm, harassment, cyberbullying, 500 tips for fighting, and some for smoking and assault and battery, so, again, a wide variety of tips have come in over the course of this project.
You can see here just the flow of tips that have come in over the years, and I want to point this out.
This is looking at 2 full years' worth of data, from when all of the schools were up and running, and you can see, not surprisingly, that during the summer months of July and August, the reporting dips dramatically.
Students aren't in school, so most of these tips really are about campus-based events, and you can see that in the summer, that really gets eliminated.
One thing to point out here, though, is you can see the effects of COVID, right? So the green line is the 2020 data, and you can see that the tips started really dipping in March/April of 2020, and they really haven't picked up to the same level as in previous years.
So COVID, with students out of school or maybe in a hybrid model, has really reduced the number of tips that have come in over the past year.
You can see the effects on the kinds of tips.
These are four half-year periods.
You can see on the left-hand side, these are before COVID.
That third cluster is when COVID really began in the middle of that period, and the period on the right-hand is after COVID started.
A couple of really key things to note here.
One is that, of course, bullying has died down.
The bullying reports decreased dramatically.
Again, kids aren't in school, so you see those bullying reports dip, but one thing that has stayed pretty constant across this are the suicide threats, so that's kind of important to notice.
Even with kids not being in school, suicide threats have stayed stable over time, as has cyberbullying.
So again, something to keep in mind, as even in the absence of being in school, these are the kinds of events that seem to be stable over time, and these reports keep on coming in.
I'm going to skip over this.
Well, actually, let me go back to that.
You can see the different kinds of actions that have taken place over the years with the tips that have come in, in terms of law enforcement actions.
There have been about 3,000 law enforcement investigations initiated as a result of tips coming in, 550 wellness checks that have been done by law enforcement officers, and then, much less frequently but just as important, 69 threat assessments.
Sixty-five suicide protocols have been instituted or initiated by local law enforcement.
We've got 31 people transferred to the hospital by law enforcement and 13 placed on emergency hold. Another thing to keep in mind is that there are a lot of unfounded cases, where law enforcement does an investigation and finds that these are events that maybe were reported but ultimately there was no evidence they were taking place, and then a lot of cases where there's just insufficient information to act. And again, with these tip lines, you get a lot of that, so a lot of good actionable information, but also a lot of information where there's not much that can be done.
In terms of school actions, actions at the school level, there have been 4,600 student check-ins as a result of tips that have come in, almost 4,000 cases of school-based supports being provided to students, 2,000 cases of a bullying protocol being implemented, 605 suicide protocols, and then smaller numbers of threat assessments and students being placed on emergency hold and transferred to a hospital, so again, a lot of activity at the school level as a result of tips coming in about potentially harmful events.
And again, in the schools, there have been 2,000 to 3,000 cases of unfounded or unsubstantiated reports.
That doesn't mean they were abusive reports or reports that came in from people being negligent with the system, but there was no evidence that an event was going to take place, and then a lot of cases have insufficient information coming in.
In the last few minutes, I'm going to talk a little bit about the data we're collecting from the multidisciplinary teams, and I've got a few minutes here.
We collect these data in annual waves, so we have three waves of data.
I'm going to show you a little bit of data today from wave one, where we had 1,300 respondents representing 92 percent of the schools that are participating, and from wave three, where we had about 1,400 respondents, with almost 90 percent of the schools represented in these data. We ask things about MDT functioning, training, administrative support, the respondents' self-efficacy, school climate, the burden on their workload, and the effects of COVID.
So let me...
There's a lot of information here, but I just want to pull up a couple of points here.
So on all of the dimensions of MDT functioning that we've asked about, the responses were pretty high.
That is, the MDT members felt that the teams function well.
Members were accessible to one another.
They respond to tips at an acceptable level.
All of those reports were quite high both in wave one and wave two.
There's been a little bit of growth by wave three, even despite what kind of looks like ceiling effects.
There's a sense that the MDTs are functioning even just a little bit better, and kind of interestingly, if you look at the far right two bars, the MDTs are meeting more regularly, so now they meet at least monthly.
About 40 percent of them reported meeting at least monthly, and that was a little bit lower during wave one, so a little bit more engagement by the MDT members.
I'm going to skip over this. This is showing that the MDT members feel that there is a lot of support in their schools from their administrators in terms of how they respond to tips, which is a really good thing.
You want to have the administrative support for these systems.
Interestingly, though, if you look on the far right side, in terms of responding that the school has a champion to promote implementation of SafeVoice, about half of the respondents felt so.
So to the extent that the folks in Nevada, the Department of Education and Public Safety, want to continue to enhance the program, one area where there could be substantial growth is finding a champion to really push the program forward, to create more opportunities for tips and more opportunities for follow-up.
In terms of self-efficacy...
I know I just have a few seconds here.
Typically, respondents felt that they had a great degree of ability to respond to tips, that they felt they know how to connect to students, that they felt that they do a good job with responding.
One issue that was raised early on, and this is actually where I'll end my presentation because we're coming up on the end of our time, is the concern that teachers or staff might feel reluctant to share information about students. A critical part of this project is being able to share information with those who need it, and about 40 percent do feel reluctance to share information because of FERPA or HIPAA concerns, so that's something that school districts can continue to work on.
About 20 percent, not as high as we might have expected, feel overwhelmed by the need to watch for students' behaviors of concern.
So I'll end it there.
I know I'm at my 17 minutes.
I hope this has been helpful for folks, and I'd be happy to answer questions at the end of the presentation.
>> Thank you very much, Al.
While you switch over, I will go ahead and introduce again Dewey Cornell who is going to talk to us about student threat assessment as a safe and supportive prevention strategy.
Dewey, go ahead when you're ready.
>> A few clicks here, and I think I will be ready to roll this thing.
Let me go ahead and get started.
It's a pleasure to be here with you all and tell you about our research project.
Copies of these slides will be available on our website and certainly, you can e-mail me with any questions that you have.
Let me just say all of this was a team effort.
I've had the good fortune and privilege to work with a number of graduate students and faculty colleagues over the course of this project.
I want to say a little bit about the context for school threat assessment, describe a little bit about project findings and then talk about our next steps.
Certainly, threat assessment is a violence prevention strategy.
It has really three phases.
An identification phase, where we need to have threats reported in some way, like through the tip line in the presentation we just heard, which sounds like an excellent way to facilitate threat reporting, but threat reports may come in a number of different ways.
And then there needs to be a team in place to evaluate the seriousness of the threat and then finally, even though the term is threat assessment, intervention is very much a part of the threat-assessment protocol.
Let me just say that threat assessment is not such a new idea.
It's new to many in education, but law enforcement authorities actually promoted the idea of threat assessment and particularly threat assessment in schools 20 years ago.
Both the FBI and Secret Service recommended the use of threat assessment in schools after the school shootings that we saw in the '90s.
Let me just say that all threat assessment is not the same.
Very early on, we were concerned with the idea that threat assessment in schools should be differentiated in some important ways.
I want to highlight those.
I want to say that schools are really unique settings because they're filled with young people, and students are a lot more likely to make threatening statements because of their immaturity and often engage in fights, particularly our upper elementary and middle school young people.
We also have to recognize that threat assessment teams could overreact or schools could overreact if they don't have threat assessment teams and that overreaction can have really serious consequences for a student's education and future.
Of course, the school has an overarching mission to educate all students, and that's really important and needs to be incorporated in a threat assessment program in a school.
So of course, the primary purpose of threat assessment is to prevent violence, but we feel that threat assessment has to also recognize the need to avoid overreactions to student misbehavior that maybe isn't serious and to overall help troubled students.
Of course, helping troubled students will also serve the purpose of preventing violence.
We developed a threat assessment model with the University of Virginia back in 2001.
This is not the only model used in Virginia, but it's one that we've done the most research on, and it generally follows the idea of having a school-based team that follows a decision tree when a threat is reported.
Then it takes differentiated actions that could include protective action in the most serious cases.
Now in 2013, the Virginia legislature, in response to the Sandy Hook shooting, actually mandated that all schools in Virginia have a threat assessment team.
Our colleges already had threat assessment teams following the Virginia Tech shooting and legislation that followed that.
Many schools, of course, already were using threat assessment in Virginia using either the model that we developed or other models.
Fortunately, also at this time is when the National Institute of Justice, Department of Justice initiated this program and funded us to do a 4-year project to look at the implementation of threat assessment in Virginia schools.
We collected a lot of data.
Much of this data was through records collected by the Virginia Department of Criminal Justice Services.
We thank them very much for their partnership in that, but we wanted to examine how threat assessment was being implemented statewide.
We wanted to determine what the outcomes were for students and schools, and then we also wanted to look and provide training and technical assistance to see if we can improve the threat assessment process in Virginia.
Now, we've done a lot of work over the last 4 or 5 years on this.
Here is a list of journal articles that I'm not going to summarize, of course, but they're all available for you.
I'd be happy to send you copies of them and lots of book chapters.
It turns out a lot of handbooks about mental health services in schools wanted to add a chapter on threat assessment, and we were happy to contribute to that, and we've done a lot of speaking at national conferences as well.
So let me just say that...
Let me give kind of an elevator summary of our main and most important findings before I go into some of the nitty-gritty details.
First of all, Virginia schools have been resolving thousands of threats of violence without serious violence occurring.
We've had no shootings, no stabbings.
None of the threat assessments have led to any person being seriously injured.
We have had some fist fights and some arguments and so forth but nothing of serious consequence.
We also know that the schools are making differentiated assessments and recognizing that most threats are not serious.
Very few of the kids are expelled or arrested.
A very low rate of that, so they've clearly moved away from a zero-tolerance approach, and also very importantly, when we look at the outcomes for black students, Hispanic students and white students, we have not found statistically significant differences in their legal consequences or in their use of exclusionary discipline.
This is a very fortunate, happy finding that we have replicated in a couple of analyses.
Let me just say that implementation, however, is a big task, and it's gone slowly.
We know that more and more schools...
All the schools report having threat assessment teams.
It's mandated by law, but they're not mandated to do threat assessments or have threats to evaluate, and we've seen a slow and steady increase in the number of schools reporting that they are conducting threat assessments.
We've also seen an increase in the number of threats that are reported.
Now, you'll notice that we have both threats to harm others and threats to harm self.
The Virginia law actually added the direction that threat assessment teams look at suicide threats.
This was kind of an unexpected development.
It's a policy that we've had some questions about because the process for evaluating a student who has threatened other people is not the same as the process for a student who has threatened to harm themselves.
There's only a small amount of overlap.
I know there's a public perception that individuals who want to shoot up the school are also suicidal, and some of them are, but as you can see in this graph, we've had increases in threats against others.
That's the blue part, and we've had large increases in threats of suicide, and I think that's mostly a reporting effect rather than an increase in suicidality; that is, the schools had to learn that their threat assessment teams were also going to address suicidal threats, and so we've seen a big increase there but a fairly small overlap.
Let me just say that we also have a big job in getting our school staff to understand that there is a threat assessment process to follow in their school.
In a statewide school climate survey, we asked teachers and other school staff whether their school has a formal threat assessment process.
As you can see, the percentage saying I don't know is pretty substantial, and even now, seven years into this state mandate, a substantial number of our school staff are not really sure whether the school has a formal threat assessment process.
So there's a lot of work to be done there, and that's an area that I'm going to say a little bit more about in a minute.
In terms of our needs assessment, there were two key needs, as you might infer from the charts I just showed you: first of all, a need to educate our students, parents and staff about what threat assessment is and about the importance of reporting threats, and second, our needs assessment revealed that a lot of schools wanted additional training for their teams.
So one of the responses that we did to that needs assessment is that we developed some online educational programs for students.
It's 15 minutes that educates them about threat assessment.
I'm going to show you an excerpt from that in just a minute.
A program for parents, a program for teachers, these all educated stakeholders about what threat assessment was and how it works, and then we had some additional education and training for threat assessment team members.
I'm going to try to play for you a really short video that illustrates a key section of our educational training program, and this is a video in which a student reveals a weapon and says that he's planning to use it in school.
Hopefully, you'll be able to hear it.
It's a little soft at the beginning, but let me play this, and you'll see what we mean.
This is part of the training program.
>> You've got to catch him in the cafeteria.
Take him out.
>> If a student brings a gun to school or talks seriously about killing someone, it should be reported immediately.
Remember, school shootings have been prevented because a student reported a threat.
>> If a student is being picked on, bullied or harassed, you should report that too.
No one should stand by and let another person be abused.
A threat assessment team will investigate and stop the situation before it gets worse.
>> Look, I'm worried that he really needs it, but I don't want to be a snitch.
>> A snitch is somebody who tells on you to get something for themself.
It's not snitching if you're trying to keep somebody from getting hurt.
>> Sometimes, students don't want to report a threat because they think it's snitching, but there's a big difference between snitching and seeking help.
Snitching is something you do for your own benefit, like if you were trying to get out of trouble, so if you report a threat, you're not snitching.
You're trying to prevent somebody from being hurt.
There are students who have prevented a shooting because they reported a threat, and they're not snitches.
>> How will you feel if he actually does it? What if he kills someone, and you could have done something to stop it? >> There have been many situations where a shooting was prevented, but in schools where shootings did occur, there were often students who decided not to say anything.
They kept quiet and let it happen.
Don't let that happen in your school.
>> I guess you're right.
I don't want anybody getting killed if I can prevent it.
Who do I talk to about this? >> If you hear a threat and believe it can be serious, do the right thing.
Talk to someone you trust, like a teacher, counselor, coach, principal, police officer or your parents.
>> I don't know how to say this, but I'm kind of worried about Conner.
>> Oh, well, if you're worried about him, I want to know.
>> Well, see, he said he has a gun, and he was going to shoot Evan in the cafeteria.
>> Oh, wow.
You did the right thing by letting me know.
I want to make sure I understand the situation.
Tell me more about this.
>> All right.
So that's an excerpt from the educational program that we show to our students and teachers, and parents to help them understand the importance of reporting threats, and then there's additional information about how the threat assessment process works and so forth.
In testing this with staff and parents, we got very positive evaluations about the program.
We also measured their knowledge about threat assessment before and after the program, and prior to watching the program, they got 45 percent of the questions correct, and after the program, 78 percent of the questions were answered correctly.
We found similarly that parents increased their knowledge of threat assessment and their support for threat assessment in the schools, and most importantly, I think we saw that students also showed increased knowledge and an increased willingness to report a serious threat of violence.
These programs are available, let me just say, around the country.
Due to our federal funding, we were able to make them available to all schools at no charge, and hundreds of schools in more than a dozen states across the country have signed up to use these educational programs.
Now, we are also interested in the impact of threat assessment on students and what the outcomes were, so let me say a little bit about that.
During one particular year, we were able to collect data from the schools on their threat cases.
Seven hundred eighty-five schools reported detailed data on 1,865 threat cases.
And, as you can see, threats occur at all grade levels.
We even had a couple of pre-K and kindergarten students who received a threat assessment for a threat that was concerning enough that their teachers wanted to investigate it, but threats tend to peak in the middle grades, when kids are, perhaps, most rambunctious and aggressive, and then tend to taper off thereafter.
Let me just say something about types of threats.
We've got lots of different types of threats, but we have hundreds of threats to kill and shoot, which can be very concerning and worrisome, and I'm very pleased to say the schools did not take a zero-tolerance approach.
They did not automatically arrest or expel these students.
Most of the students made threats against other students.
A smaller percentage made threats against teachers and other staff.
Phelan, I'm going to ask you how much time I have...
How many minutes do I have left? Just go ahead and tell me.
>> You have about a minute left.
>> A minute left.
Oh, my gosh! I thought I had a little bit more time than I do.
Well, let me just say...
>> We'll grant you a couple.
>> Just say that most threats were not attempted.
A few threats were attempted and averted, and less than 1 percent were actually carried out.
In terms of disciplinary outcomes, there was a wide range, but only 1 percent were arrested or expelled.
In terms of school placement, almost all of these students were able to continue in school, and in terms of disciplinary outcomes, as I mentioned before, there were no statistically significant differences in how Black, White and Hispanic students were treated.
Our next steps are, of course, we want to do more to have training for all team members, to educate our stakeholders and to encourage schools to use evidence-based practices.
We certainly want to measure fidelity of implementation.
I will tell you there's a wide range in fidelity of implementation of threat assessment, and, of course, we want our schools to look at fairness and equity.
I've got just a couple more slides, Phelan.
He's got that big hook.
He's going to reach out and grab me and pull me off the stage here.
Let me just say we're very pleased that we have now started on a new project with NIJ funding to look at threat assessment in Florida.
We're looking at stakeholder reactions to training.
We're looking at characteristics of threat assessment, and we're looking at academic and disciplinary outcomes, and Florida has done things a little differently than Virginia.
We want to see what impact that makes, and we want to be able to show that the good results we've had in Virginia can be found in other states.
I also want to mention the National Center for School Safety, another NIJ-funded project, housed at the University of Michigan. It's a multisite project, and our part at the University of Virginia is to lead the threat assessment work, and we are working on developing standards and guidelines for threat assessment training and implementation.
So let me conclude by saying, I think the future for school threat assessment is very bright.
We've gotten some very positive findings that we want to build upon and extend, and we're very grateful to the National Institute of Justice and the US Department of Justice for supporting this valuable work.
>> All right.
Thank you very much, Dewey, and as we transition to our next presentation, I'll remind all the audience members that there is a Q and A button at the bottom of your screen, and you can use that function, and we will address questions and answers at the end of all of our presentations.
Our next presentation is focused on an evaluation of the Say Something Anonymous Reporting System to improve school safety.
Our presenters are Justin Heinze and Hsing-Fang Hsieh, and I will turn it over to them.
>> Thank you.
My name is Hsing-Fang Hsieh.
I am an assistant research scientist at the Department of Health Behavior and Health Education at the University of Michigan.
I'm really happy to be here today and very excited to present our study evaluating the Say Something Anonymous Reporting System in improving school safety in Miami-Dade.
I'm going to hand over to my Co-PI, Dr. Justin Heinze, to start our conversation.
>> Good afternoon, everybody.
Thanks so much for joining us.
Can everyone hear me? >> Mm-hmm.
>> All right.
That's good, and this is apropos because I'm going to begin this presentation with an appeal to please bear with me.
This is a view from my back deck this morning.
We had about a foot-and-a-half of snow dumped on us, which meant that my car is stuck in the driveway, and I'm in my house rather than at my office where the Internet works.
So if I cut out here and there, I apologize, but we'll try to get through as best we can.
So, folks, many of us are here today because of horrific events like Parkland or the Santa Fe High School shooting or the Sandy Hook Elementary school shooting, and those events have become so...
that they not only changed the way that we think of schools as safe spaces, but they've been the impetus behind a whole range of interventions, from metal detectors and automatic blinds to lockdown drills and even arming teachers, all in an effort to ensure that school shootings never happen again.
Now, as laudable as those intentions are, I'm here today because I think many of us are very concerned that schools are using policies and procedures that might have little to no evidence that they're effective at preventing shootings, and they can even be harmful to the very students they're intended to protect.
Hsing-Fang, would you move to the next slide? So in addition to that potential harm, we need to consider the cost to the school's educational mission, and a little story as an example, about active-shooter drills in particular.
A few years ago, I was having a conversation with the principal of a school about 7 miles away from Sandy Hook Elementary, and I asked how things had changed since the shooting, and Principal Thomas made a comment that struck me, about the amount of time and energy they were diverting away from educating and putting into drills.
In her words, it was, "learning how to hide." Now, as an educational psychologist, I'm thinking that we need to be devoting as much time as possible to reading, writing and arithmetic, but also recognizing that safety is one of those foundational components to helping students self-actualize as learners.
What can we do? How do we strike a balance so that we're not spending so much time, energy and attention on the safety piece that we undermine the learning we want to facilitate, particularly for some of those strategies that we're not even positive are effective?
Next slide, please.
What are we seeing here? Are we seeing Say Something and SSARS? Okay, there we go.
So another approach that's received increasing attention, but still has limited empirical support or clear best practices, is the use of an anonymous or confidential reporting system.
About half of the states in the US mandate some form of reporting system, and the goal of these is to create avenues for students to speak up if they think someone is going to be a danger to themselves or others.
These systems arose, as I think has been mentioned already on this conversation, after previous work showed that, in many cases, school shooters told someone of their plan before perpetrating the event.
Dewey, I'm going to need to copy that video because I think it really does help to set up the necessity of having an avenue for students to speak out when they feel that there could be a problem.
The argument is, by creating awareness and encouraging reporting, there's a possibility of averting that serious violence.
And so we're going to be presenting some of the findings from Sandy Hook Promise Foundation Say Something Anonymous Reporting System.
Now, if you're not familiar, Sandy Hook Promise was founded in the aftermath of the Sandy Hook Elementary school shooting by parents and other community members in Newtown, Connecticut.
The foundation developed a system to support schools across the country in promoting safety, but a key feature of their anonymous reporting system is that they pair training to recognize signs of violence with the technology.
So they provide the technology, via smartphones or computers or phones, but then they're also pairing that with interacting with kids and engaging them to help them learn to recognize signs and know when to report.
Next slide, please.
Here is how we think the program should operate, at least theoretically. The argument, again, is that by implementing an anonymous reporting system paired with training about saying something, you're going to increase risk recognition, as well as more positive attitudes and subjective norms around reporting.
So, for example, combating the so-called code of silence, or as Dewey's video said, snitching.
All right? So we're trying to counteract those tendencies and enable and facilitate reporting, but ultimately, if we can change those subjective norms and help students recognize signs, we should see increased reporting, and through that increased reporting we would expect to see changes in school climate, as well as violence outcomes like perpetration, victimization and, perhaps, even justice system involvement.
And we believe there are a couple of different pathways operating here: reporting can have a direct effect on violence outcomes, as well as a mediated effect through school climate.
Next slide, please.
And so to test this theory, with support from National Institute of Justice, we enrolled a sample of middle schools in Miami, Florida, in an RCT.
The schools were matched on sociodemographics, their size, disciplinary histories and then they were randomized to either receive the ARS intervention or serve as a control school.
We followed those schools for 2 years with measurements roughly in fall and spring semesters.
Although all schools could, in theory, utilize the ARS technology, since it was available through the web to anybody who wanted to use it, it was advertised only in treatment schools, and only the treatment schools received the training from our program partner, Sandy Hook Promise.
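The matched randomization described above can be sketched in code. This is a minimal illustration, not the study's actual procedure: the school records, the covariates (enrollment, a poverty proxy, prior incidents) and the greedy nearest-neighbor pairing are all hypothetical stand-ins for the sociodemographic, size and disciplinary matching the presenters describe.

```python
import random

# Hypothetical school records: (id, enrollment, poverty_rate, prior_incidents).
schools = [
    ("S01", 900, 0.62, 14), ("S02", 950, 0.60, 15),
    ("S03", 400, 0.30, 4),  ("S04", 420, 0.33, 5),
    ("S05", 700, 0.50, 9),  ("S06", 680, 0.48, 8),
]

def standardize(values):
    mean = sum(values) / len(values)
    sd = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5 or 1.0
    return [(v - mean) / sd for v in values]

# Standardize each covariate so no single one dominates the distance metric.
cols = list(zip(*[s[1:] for s in schools]))
z = list(zip(*[standardize(list(c)) for c in cols]))

def dist(i, j):
    return sum((a - b) ** 2 for a, b in zip(z[i], z[j])) ** 0.5

# Greedy pairing: repeatedly match the next school to its nearest unpaired peer.
unpaired = list(range(len(schools)))
pairs = []
while unpaired:
    i = unpaired.pop(0)
    j = min(unpaired, key=lambda k: dist(i, k))
    unpaired.remove(j)
    pairs.append((i, j))

# Randomize within each matched pair: one treatment school, one control.
rng = random.Random(42)
assignment = {}
for i, j in pairs:
    t, c = rng.sample([i, j], 2)
    assignment[schools[t][0]] = "treatment"
    assignment[schools[c][0]] = "control"
```

Pairing before randomizing keeps the two arms balanced on the matching covariates even with a small number of schools, which is why matched-pair designs are common in school-level RCTs.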
So the results that you'll hear today are an initial test of some of the programmatic effects that we're discovering.
>> Okay, so let's go to slide 10.
Very quickly, I want to mention that we do have multiple data sources, but today, we're going to focus on the student survey.
Other data sources we have include administrative records, program records for implementation, anonymous reporting data and police incident data.
We're in the process of gathering, cleaning and analyzing the rest of the data, but today, we're presenting the student survey perspective.
We used three-level multilevel models because we had three time points, so we could account for time, individuals and schools, and the clustering of students within schools.
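For intuition about why a multilevel model is needed here, the sketch below simulates repeated survey waves for students nested in schools and shows how much of the total variance sits at the school level. The simulated data and the crude, moment-based decomposition are illustrative only; the actual analysis would use maximum-likelihood or REML estimation of a mixed model, not these back-of-the-envelope variance calculations.

```python
import random
from statistics import mean, pvariance

rng = random.Random(0)

# Simulate three survey waves per student, students nested in schools, with
# invented variance at each level (these numbers are not the study's data).
records = []  # (school, student, score)
for school in range(20):
    u_school = rng.gauss(0, 1.0)          # school-level effect
    for student in range(30):
        u_student = rng.gauss(0, 1.5)     # student-level effect
        for wave in range(3):             # occasion-level noise
            records.append((school, student,
                            50 + u_school + u_student + rng.gauss(0, 2.0)))

scores = [y for _, _, y in records]
total_var = pvariance(scores)

# Between-school variance: spread of school means around the grand mean.
by_school = {}
for s, _, y in records:
    by_school.setdefault(s, []).append(y)
between_school = pvariance([mean(v) for v in by_school.values()])

# Share of variance sitting at the school level (an ICC-style quantity).
icc_school = between_school / total_var
```

When this school-level share is non-trivial, treating the 1,800 observations as independent would understate standard errors, which is the basic motivation for the three-level model.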
Very quickly, for the survey, like Justin mentioned, we assessed students' report of their program-specific knowledge of reporting and recognizing signs.
We asked them about their attitudes and subjective norms around taking action to prevent violence, and most importantly, their self-efficacy and intention to report.
And then, we also asked them about the long-term outcomes, which we see as school climate.
For school climate, we have three key components.
We assessed their perceived school safety.
We also asked them about their perceived trust and emotional safety within their school.
We asked them about their school connectedness, and then finally, we asked them about the violent behavior they have and also their exposure to violence at school.
So this slide offers a summary of our preliminary results.
Like Justin just mentioned, we had hypothesized knowledge, self-efficacy and intention as the more immediate, short-term results we would expect from program participation.
From the survey, we see that treatment school students had improved knowledge, self-efficacy and intention to report.
We also noticed a spike in tips within the initial 48 hours after the anonymous reporting system became available in the treatment schools, but interestingly, the spike did not last, and we received only about 128 tips in total across all the treatment schools, with a wide range, from zero to 48 tips per school, across the 16-month period.
In terms of the long-term outcomes, we found a pretty good result for school climate.
We call it a buffering effect, and I'm going to go over in detail why we call it a buffering program effect for school climate, but beyond that, we also found a positive effect on exposure to violence at school.
And I'm going to talk about it in the following slides.
So this slide shows the 3-month follow-up for the two most important predictors of students' reporting behavior, self-efficacy and intention, and we tested those in two ways.
On the left-hand side, the two figures show self-efficacy and behavioral intention to report across multiple channels.
When we talk about multiple channels, we asked them specifically how confident they were in their intention to report through the anonymous reporting system, to parents, to police and to a trusted adult.
What we're seeing here is the mean score of the four items; the blue lines represent the treatment school outcomes and the change over time, and the red lines represent the control schools.
And on the right-hand side, you can see that we tested specifically the self-efficacy and intention to report through the anonymous reporting system.
What's encouraging is that, across all four figures, the treatment students really demonstrated better self-efficacy and intention to report at the 3-month follow-up.
I should note that we also tested the longer-term results, because we were curious how self-efficacy and intention would hold up over time.
At the 9-month follow-up, we see a pretty consistent pattern, like what you're seeing here, with one exception: self-efficacy to report through multiple channels, the upper-left figure, was not significant at the 9-month follow-up, but otherwise, all the other results stayed consistent.
And on this second figure slide, we're showing the longer-term outcomes at 9 months.
As we just described in the conceptual model, we expected school climate, violence exposure and violent behavior as longer-term outcomes; it takes a long time to see the program effect.
The two figures on the left, figures E and G, represent the school climate results, and the blue lines, again, are the treatment school results.
They remain stable over time.
I would say that for the treatment schools, even though you see a slope, the change over time wasn't significant, while for the control school students, we see a decline in school climate and their perception of safety.
I should say that this is partially because the Parkland shooting happened right at the end of the baseline data collection.
So we would expect a drop of the perceived safety because of that tragedy, so that helped explain what we're seeing here, but again, like I said earlier, we see a buffering effect of the program because the students in the treatment school were able to maintain pretty stable perception of their school climate and safety over time, even at the 5-month follow-up.
And on the right-hand side is where we see exposure to violence reported by the student.
We see a decrease in the treatment school compared to an increase in the control school.
That is really encouraging, but I should also note that violence exposure is a relatively low-incidence outcome.
This is on a scale of zero to 30, where each point represents one event, so what you're seeing here is that treatment schools dropped by almost one event at the 9-month follow-up.
I'm going to stop here and hand it over back to Justin for our discussion of the next steps.
>> Thanks Hsing-Fang.
So, in the last minute or two here, I just want to talk about some next steps for us and some of the challenges, and I'd love to hear from you if you have an opportunity to send feedback about them.
So for our next steps, a big question here is whether an upstream intervention like this can result in sustained changes over time.
So we are cautiously optimistic that we've seen some of these initial changes, and we want to see whether, once the program is offered, those impacts are sustained across a longer period of time.
Also, these are all self-report survey data: will the objective data that we might get from the police department or from school disciplinary records also pan out, and will we see some of those changes? We're also testing sort of a pseudo-community-level intervention on individual-level outcomes.
So there's an interesting sort of cross-level interaction that I think we can start to incorporate into our analysis as well, and then finally, that mediational model.
So we're coming from, you know, behavioral theory.
We have an idea of how this should work, and it would be exciting for us to test whether or not that pans out the way that we would expect.
For some of the challenges, I mentioned earlier, I was kind of picking on active shooter drills and some of the other ways that folks are trying to prevent violence in their schools as potentially being harmful, but we have to consider whether an anonymous reporting system could do the same thing.
So are we increasing the salience of safety concerns or violence in schools as we're training our students, and if so, what are some of the things that we can do to mitigate any potential harm from that exposure? Another challenge that we're really struggling with, Hsing-Fang just mentioned, rare outcomes are difficult for us.
Extremely rare outcomes like school shootings are almost impossible to detect.
So we can't assume that a null finding here means our program is working, and so we continue to grapple with that challenge and are considering simulation data or other ways to think about how and whether we are averting some of those serious but very rare events that happen within schools.
And then finally, we're also very interested in how time and space interact in order to create the conditions for some of this violence to occur.
So what's happening on a day where there's a big fight, or there is a gun brought to school, and what are some of the intersecting contributing factors there that we might be able to address, either through the reporting system or in future work.
So some of those things that we're still thinking about, but again, very excited about what we've found thus far, and we're hoping to see something similar in the future.
So I think that is it for us.
We just want to acknowledge some of the other members of our team for all of their work and contributions, and a big thank-you on the last slide here to our partners in this effort, as well as the National Institute of Justice.
>> All right.
Thank you very much for that presentation, and now, we're going to move to our fourth and final presentation for this breakout session.
This will be delivered by Josh Hendrix.
The title of the presentation is "This Is Not a Drill: Student and Staff Comprehension of Emergency Operations Protocols for School Violence," and turning it over to you now, Josh.
>> Hi, everyone.
Can you hear me, okay? Great.
Thank you so much for the opportunity to present.
I am presenting today on a 3-year study that was funded under CSSI, the overarching goal of which is to look at the comprehension of emergency procedures in schools among students and staff.
This is a multi-method, multiphase study that involves document reviews, surveys, interviews, focus groups and even some observations when we visit our schools.
Today, I'm going to focus the presentation on staff comprehension of their school's lockdown procedures.
As many of you know, schools are implementing lockdowns for a variety of events in order to prepare for and prevent active shooters, hostage-type scenarios and many other types of incidents.
Just for a brief background, as we all know, there have been numerous shootings in the past 20 years, and a lot of these have really highlighted how important it is that the school community has documented procedures in place to handle a potentially violent threat, and we've seen some pretty vivid examples of schools being able to prevent or mitigate violence when they have a very coordinated and efficient response.
So school safety experts, as well as many members of the law enforcement community, have been telling us for years that mastery and comprehension of these emergency protocols is absolutely essential, because, when stressful situations happen, that muscle memory is really critical for ensuring that people know what to do.
But to date, we have not seen very much research looking at comprehension and the extent to which students and staff are ready to respond.
Do they know the procedures that are in their safety plan or their emergency operation plan? And are there key gaps in knowledge that we could help to guide schools in becoming better prepared for these types of incidents? So the research questions that I'm going to focus on today are, how much do staff know about their school's emergency procedures for lockdown, and what are the main gaps in knowledge? What characteristics of staff are associated with higher or lower levels of comprehension, and then what are some of the challenges and lessons learned from schools for preparing for these types of emergencies? Quick overview of our study phases.
In phase one, we recruited 10 schools to essentially allow us to collect and analyze their safety plans.
In phase two, we created and administered surveys based on those safety plans that would allow us to assess comprehension levels of those protocols.
In phase three, we're conducting site visits, and then finally, we'll be synthesizing all the results and reporting back to the schools.
This slide shows a bit about the schools we ended up recruiting, so our goal was to recruit 10.
We ultimately wanted to get some diversity in terms of the size and location and urbanicity of those we were working with, and we ended up getting five middle schools and five high schools.
We have four rural, three town and three suburban schools, and then you can see we also have a range in terms of size, with four of them having at least 1,000 students.
So that first step, as I mentioned, was getting access to each school's safety plan, which was actually a lot more of a complex process than you might imagine.
Just invoking the words "safety plan" or "emergency operations plan," a lot of times schools don't know exactly, even in their own materials, what falls under that plan and what they should send.
I will say we saw a ton of variety in the plans: while a few of them were several hundred pages long, with many, many documents, and very comprehensive, others were only a few documents and a few pages, and one of them was more of a PowerPoint presentation than anything else.
We then assessed each of those plans with an assessment tool we created based on numerous federal and state resources, and I will just mention that while the plans were very different from one another, they all had strengths, but in general most were missing some core sections.
Many of them needed to be more customized to the building level and go beyond generic protocols, having more specific protocols for specific types of events, and almost all of them had some significant issues with organization.
In terms of their lockdown plan specifically, looking across these plans we identified 18 unique actions for the 10 different schools.
On average, the lockdown protocols included seven steps, the most common of which were locking doors, which all 10 schools had, thankfully, but then things like hiding, waiting for an all-clear signal, sweeping hallways, turning off lights and covering windows, whereas there's some other actions that were much less common, such as giving staff the option to prepare to counter, you know, making makeshift weapons, barricading the door and those types of things.
So after we had assessed those plans, we then created a staff survey uniquely for each school based on the protocols that their plan covered.
Each one had about 50 questions.
We used a variety of question formats, and we covered areas such as lockdown, evacuation and shelter-in-place, but as I mentioned, I'm going to focus on lockdown today, and our dependent variable that I'll be talking about comes from an open-ended question in which we asked staff to type in all the procedures that are required for a lockdown event for their school.
For each action they identified correctly, we gave them one point, and then created a comprehension grade by dividing the number of points they received by the total number of points possible.
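The grading rule just described can be sketched in a few lines. The action list below is invented (the real plans averaged about seven steps), and the exact-string matching is a simplification; in the study, free-text answers were presumably matched to plan actions by coders rather than by literal string comparison.

```python
# Hypothetical lockdown actions pulled from one school's safety plan.
PLAN_ACTIONS = {
    "lock doors", "turn off lights", "cover windows",
    "move out of sight", "stay silent", "wait for all-clear",
}

def comprehension_grade(reported_actions):
    """One point per plan action correctly recalled, over total possible."""
    recalled = {a.strip().lower() for a in reported_actions}
    return len(recalled & PLAN_ACTIONS) / len(PLAN_ACTIONS)

# A staff member who recalls two of the six plan actions scores 2/6; an
# action not in the plan ("hide under desks" here) earns no credit.
grade = comprehension_grade(["Lock doors", "Turn off lights", "hide under desks"])
```

Dividing by the plan's own action count means staff at schools with longer protocols need to recall more steps to earn the same grade, which is worth keeping in mind when comparing grades across schools.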
Then, to look at which characteristics predict these comprehension grades, we examined a variety of staff characteristics: whether they had read their school's entire safety plan; the number of safety teams or committees they are involved with at the school; whether they feel their school has really prepared them for a violent event; their sex and the years they've worked at the school; and their position, whether they are teaching staff or work in the front office or that sort of thing. In terms of school characteristics, we also wanted to look at whether the size of the plan has an impact on comprehension grades, as well as the number of days since the last lockdown exercise.
So our final staff sample is 585 staff members, and I'll just point out on this slide that about 27 percent served on one safety team.
The majority had read part of their plan, whereas about 41 percent said they had read their entire plan.
Fortunately, only 5 percent in the whole sample said they had never seen the plan at all.
The majority reported that they felt mostly or completely prepared to handle a violent event at their school, which we thought was pretty positive based on, you know, what our expectations were going into it, and then each school had an average of 60 days since their last lockdown exercise.
So on this slide, you'll see the average comprehension grade by lockdown action, with the more common actions highlighted in yellow, and you can see that staff did pretty well at noting that the doors need to be locked: about 70 percent said that they would lock the doors first.
A little more than half mentioned things like hiding or turning off the lights.
About 40 percent correctly identified covering the windows and staying silent, and then about a third were able to mention things like sweeping the hallways and waiting for an all-clear signal.
So we've been running a number of random intercepts models to predict these comprehension grades and get a sense of the staff- and school-level predictors of higher or lower grades. Real quickly, we're predicting that lockdown comprehension grade, and our predictors are whether or not they have read the entire plan; the number of safety teams they serve on; whether they've been employed at the school for at least 5 years; their position at the school, so relative to teachers, whether they are a teaching assistant or paraprofessional, a front office staff member, or a support staff member, meaning they work in custodial or food service or some other kind of secondary role; their sex; whether they feel mostly or completely prepared for a violent event; the number of days since their last lockdown drill; and whether we grouped them as having a small or large safety plan.
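A random-intercepts model of this general shape can be sketched with statsmodels on simulated data. The predictors, effect sizes, sample sizes and data below are all invented for illustration; this is not the study's actual data or model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)

# Simulate 10 schools x 58 staff (illustrative numbers only).
n_schools, per_school = 10, 58
school = np.repeat(np.arange(n_schools), per_school)
school_effect = rng.normal(0, 5, n_schools)[school]  # school random intercept
read_plan = rng.integers(0, 2, school.size)          # read entire plan (0/1)
n_teams = rng.poisson(0.5, school.size)              # safety teams served on

# Build in effects of +5 points for reading the plan and +3 per safety team.
grade = (40 + 5 * read_plan + 3 * n_teams
         + school_effect + rng.normal(0, 10, school.size))

df = pd.DataFrame({"grade": grade, "read_plan": read_plan,
                   "n_teams": n_teams, "school": school})

# Random-intercepts model: staff nested within schools.
result = smf.mixedlm("grade ~ read_plan + n_teams",
                     df, groups=df["school"]).fit()
# result.params["read_plan"] should land near the simulated effect of 5.
```

The school-level random intercept absorbs the fact that staff at the same school share a plan, training history and drill schedule, so the fixed-effect coefficients aren't distorted by that clustering.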
So you can see we have a number of significant effects, and really no matter how we run these models, we tend to see the same things, which is very good.
Those staff who have read all of their plan scored almost five percentage points higher on average than those who have not. Even though we might have expected a somewhat larger effect, it's still important because it leads to a very straightforward recommendation: get staff members to simply read their plan.
Those who serve on safety teams are scoring higher than those who don't.
Interestingly, being employed at the school for more years does not have an impact on comprehension.
We see these very major effects amongst the positions at the school, so compared to teachers, teaching assistants, administrators and support staff are all scoring about 10 percentage points lower.
We have this effect of sex where males are scoring lower than female staff.
Another interesting non-finding is, staff who feel very prepared by their school are not scoring any higher than those who do not.
We don't see any effects of days since their last lockdown drill, but we do see this very large impact of having a small plan, so those schools we categorize as having a small safety plan scored about 18 percentage points on average higher than those with larger plans.
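The random-intercepts setup described above can be sketched as follows. This is an illustrative sketch only, fit on synthetic data standing in for the study's staff survey; the variable names (`school`, `read_plan`, `on_team`, `grade`) are hypothetical and not the project's actual variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data: staff nested within schools (hypothetical variables,
# not the study's real survey data)
rng = np.random.default_rng(0)
n_schools, staff_per_school = 20, 30
school = np.repeat(np.arange(n_schools), staff_per_school)
read_plan = rng.integers(0, 2, size=school.size)          # read the entire plan?
on_team = rng.integers(0, 2, size=school.size)            # serves on a safety team?
school_effect = rng.normal(0, 5, size=n_schools)[school]  # school-level random intercept
grade = (40 + 5 * read_plan + 3 * on_team
         + school_effect + rng.normal(0, 10, size=school.size))

df = pd.DataFrame({"school": school, "read_plan": read_plan,
                   "on_team": on_team, "grade": grade})

# Random-intercepts (mixed-effects) model: comprehension grade regressed
# on staff-level predictors, with a random intercept per school
model = smf.mixedlm("grade ~ read_plan + on_team", df, groups=df["school"])
result = model.fit()
print(result.summary())
```

In a model like this, the fixed-effect coefficients play the role of the "percentage points higher" effects described in the talk, while the group variance captures how much schools differ from one another after accounting for staff characteristics.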
The overall score was a 42 percent, so obviously this tells us there is a lot of room for improvement.
Fifteen percent of those sampled tried to answer the question about what to do during a lockdown but received zero points.
They just couldn't identify any of those core actions, and we actually ran some supplementary analyses where we dropped those who did not enter anything into the answer box, just in case that was skewing the results, but it did not.
It's also interesting that in every single school, we have a subset of staff who would, like clockwork, describe steps that are not described in their safety plan.
They will tell us that the terms we use, which we only pulled from the safety plans directly, were inaccurate, so if we called it a level one lockdown, we would always have a subset that would tell us they had never heard of that or that that was not an accurate term at their school, and then we would always have that subset who would provide very generic answers such as, "I'll do whatever it takes to keep the kids safe," and wouldn't give us any sort of specific, concrete actions.
Phelan, do I have about 4 or 5 minutes? Is that accurate? >> Yeah, that's about right, about 5 minutes.
>> So we've only been able to complete four site visits so far because right in the middle of it all, COVID happened, and so we've had to kind of put those on hold, but these have been absolutely tremendous for learning about how schools prepare for emergencies and just the insight that staff have about how to make that process better.
They just have had a ton to say in addition to the students.
Today I'm going to focus on staff, but at another time you'll get to hear about all the great things students have to say.
So in our focus groups with teachers, it's very interesting that they are...
It's very common for them to talk about not knowing exactly what their school's safety plan is, where to find it, what exactly it consists of, and even if they did know where it was, they wouldn't have time to read it.
They've talked a lot about the main challenge being getting people to take their drills seriously, especially students.
They have been concerned that the traditional lockdown model that their school follows is not enough, and many of them want more discretion.
They want more permission to be able to evacuate if needed during an active shooter event or to use weapons that they've come up with in the classroom.
They have a lot of confidence that because their students have been doing these lockdown drills since elementary school, the students could do them even without the help of adults. But they also feel that, no matter how much they prepare, these events can't really be prepared for, and the panic that would set in would undermine any efforts they made to get ready for these types of events.
Real importantly, they have identified that their school has a lot of gaps in their level of preparedness.
They are not preparing well enough for incidents at lunch or something that happened in between classes or after hours, and they recognize that this is a critical area.
The custodial and food service staff have been just absolutely excellent.
They have so much insight, and they're excited to talk about this issue, because a major theme from their focus groups is that they feel left out of the process, yet every day they're overseeing hundreds of kids eating lunch without the help of administrators or teachers. They recognize the cafeteria as a major vulnerable spot for schools, and they say schools need to do a better job of preparing for the possibility that something could happen at lunch.
Counselors, nurses, and mental health services staff worry a lot about staff turnover, especially among assistant principals, who tend to lead the emergency preparedness efforts; whenever one leaves and someone comes in with new ideas, there can be conflicting philosophies or directions about what the school is supposed to do.
They have also worried a lot about the impact of unannounced drills, and this has been an emerging theme across students and different types of staff: they recognize the drills are valuable for simulating a real experience, but the damage they can cause to students is something that really needs to be considered.
The front office staff are really interesting in that they have been almost unable to be critical of their school in any way.
They really suggested that everything is going great and nothing really needs to change.
And then of course assistant principal and emergency planning staff, they've been very insightful as well.
They talked about the importance of soliciting feedback after drills, then debriefing, and unlike other staff, they think that comprehension is huge and that people knowing what to do when an event occurs is really paramount, but they are frustrated by the amount of information that is out there, at times conflicting, about what schools should do when a violent person enters the campus.
So just overall takeaways, and I'll be super quick here, Phelan: because there is confusion about what the plan is and where it is, schools need to better communicate where that plan sits and ensure that staff are regularly reviewing it.
These plans need to be customized to move beyond generic instructions.
They need to make sure that when they are updating training, they are updating their plans so that these things are in harmony, and of course we want to see more and better training for everyone to get those comprehension scores up, especially for non-instructional staff, so we want that consistency between plans.
We want widespread dissemination of the plans, and we want ongoing feedback to be incorporated into how schools are training, and of course we want everyone to understand the importance that mastery of those protocols has and that impact it could have on their ability to respond to an emergency.
Here is my contact information and looking forward to answering any questions you all have.
Thank you so much, Josh, and to all of our presenters.
So we did get some questions that came in, and what we'll do at this section is, I will read off some of the questions that came in and turn it over to you all.
We have about, oh, 13 minutes or so, so we'll try to be fairly brief in our responses, so the first one is for Al, and it has to do with the number of tips regarding threats that come into schools, and the question was, "Well, you know, how many of these tips are duplicated? Do we know if there's, you know, numerous tips coming in for specific incidents?" Is there a way you can speak a little more to that question? >> Yeah, that's a great question, and I probably won't have a satisfactory answer at the current time, but basically, right, these are...
the counts I presented were the number of tips that came in, and we could have multiple tips for any given event, and we have to do kind of a lot of unpacking of the data, start to get into a tip audit or a series of tip audits to see where those tips went and to see how many of those really represent duplicate versus unduplicated cases, so I don't have those data now.
I'll be working with the Department of Public Safety and the Nevada Department of Education to obtain those data to get a better sense of how many unique events those reflect, to get a better sense of how many of those were substantiated versus unsubstantiated, and to gather a better sense of what the actual responses were, and so all of those data are forthcoming, but I don't have those numbers for you now, but those are great questions.
When you have a system like this, you're getting a lot of information, a lot of information at the front end about the tips themselves, but you have to do a lot of digging and drilling to really find out more about what those tips really are about and what the resulting actions have been.
>> Sure, thank you, and a couple questions for Dewey Cornell.
First of all, I think people really are interested in the video that you showed.
Can you say anything about how people can access that video? >> So the slide itself actually has contact information.
It has the website where the videos are available, and so if you get a copy of the presentation, you can do that.
You can also e-mail me, and I will send you the instructions.
The way this works is, they're all on one site, and the school can get a password that gives them access to it, and then all of the data generated will be coded by that password to that school district.
I just double-checked, and we have 116 school districts in 29 states that are using the program, have access to it, and we'd be happy to add more to that, so I can send you the instructions.
They can be downloaded from our website, the Youth Violence Project at University of Virginia.
We'll get it to you one way or another.
>> All right.
Thank you, and another follow-up for you, Dewey, in terms of, "There's a recent emphasis on automated threat assessments, and can you say something more about how important it is that the information from threat assessments be automated?" >> Tell me what automated threat assessment is.
>> Unfortunately, I'm not able to translate that, and I don't think we have an open dialogue with the questioner, so...
>> So there are systems that will search for threatening language or threatening images and raise a flag about that.
That's a part of what you might do in threat assessment, but it is by no means a threat assessment itself.
Threat assessment requires human beings, preferably a multidisciplinary team that are going to gather and assess information and respond, and if you aren't gathering information, assessing it and responding, it's not a school threat assessment.
>> All right.
Let me make sure I'm scanning the questions and getting to everyone who has asked, so this is a question for Josh Hendrix: whether you've seen any schools that have a centralized system for reporting drills within an emergency operations plan.
>> Yeah, I'm sitting here kind of thinking about that and a way to answer it sufficiently.
I have yet to see schools that are doing a really great job of documenting what happens at their drills and then incorporating that into their EOP and feeding it back into any updates or changes they make, but a lot of our schools do a good job of at least talking with their staff and identifying some of the problems that happened during drills.
Unfortunately, that is often in a very open door sort of informal, "Come to the office if you have things to say," rather than doing something in, like, a centralized, systematic way, which I would love to see.
Not to say that there aren't districts out there that have some really great guidance or forms that schools could use to document their drills or also at the state and federal levels as well.
The extent to which schools are using those I think is a question that is open still.
>> Great, thanks, and I have a question also for Justin and Hsing-Fang.
The question is about the 48-hour duration you reported in terms of a spike in reporting to the system, and I'm curious what thoughts you have about why that is and how schools could go about addressing that kind of spike to make it, you know, a longer duration.
>> Want to go for it, Justin? Or I can.
Well, let me start.
I think that speaks to accessibility and maybe to the reminders we will need for students to download the app and to remember there's such a program available over the longer term, so in order to maintain that reporting system, I think constant reminders would help.
Justin, you have anything to add? >> No, other than I almost see the reverse challenge.
I mean, I think it's expected that when something's novel and it's accessible, people are going to use it.
The question is, we want to see that use maintained over time, but then, from a measurement standpoint, at what point do we see whether or not this is helping us reduce incidents, or are we reflecting increases in incidents? Right? So it's trying to disentangle: is more reporting a good thing, and for how long is it a good thing? How do we determine success, by rate of reporting, or is there a proxy we should be considering in addition to the number of reports we're getting, such as the severity of the reports or kind of the post-report audits that Al mentioned? >> Yeah, I also want to just comment, I mean, the work that all of you are doing right now is really on the front lines.
It gets a lot of spotlight in this domain because of course it's the things that we often turn to first when we think of the most horrific events.
We think of the most violent events, the ones most prominent in the public consciousness, and this issue of frequency came up in a couple of the presentations, right? These are rare events, and there are real challenges to doing this research, so first, I want to commend you all for really going after these topics, because they're very hard areas to do research in.
When I think of emergency operations planning, you know, there's very little research on that, right? And Dewey has carried out so much of the work on student threat assessment, and so these types of topics have gotten increasing attention in part because of the investments we've been able to make through comprehensive school safety initiatives, but I just throw it open to the group, if any of you have thoughts you'd like to share on the challenges of doing rigorous research on these really rare incidents.
>> I would just comment that, right, you sort of have two things going on, in my view, that we see in Nevada.
You've got these big, horrific, tragic events that don't happen that frequently, and it becomes very challenging to determine whether this kind of system is making a difference in those. Then you have all those other kinds of events that I think are just as important, that the system can help with, such as bullying, harassment, and cyberbullying, but those are also challenging to measure because I don't know that we have great measures of when those events actually take place.
We may have measures of those events being reported. So you've got events that don't happen very often, which you may be able to track more easily but which are rare, and you have events that happen much more frequently but are equally challenging to document because of the kinds of indicators available for them, so yeah, we sort of have it at both ends, with the different kinds of events that this kind of system can influence.
>> Okay, we got a couple more minutes.
Any other thoughts before we close? All right.
>> You know...
>> Oh, go ahead, Dewey.
>> You know, Phelan, every forest fire starts with a campfire or a spark or a little brushfire, and I was heartened to see all the bullying reports that Al reported because those are important.
They're certainly important for the kids who are involved in them, but from a prevention standpoint, the more that we can do to reduce problems like bullying, the more we'll be able to reduce these rare huge events, and so I think we need to identify these more prevalent events that are more measurable, but recognize that they're valuable and worthwhile to study as well.
>> Think that's a great comment there, Dewey, and I think it's an excellent way to wrap up.
It's certainly been something at my end in Washington that we've really talked to a lot of people in sort of the policy domain about, which is that, you know, you don't just start by focusing on shootings.
You start by looking at the whole context, and that's why when we support research in this space, when people talk about school safety, it's everything from school climate to these security measures and all the rest that's part of it, so thank you very much.
I'm sure the whole group would be giving a round of applause.
In this session, we've had between 50 and 60 participants, so thank you all so much.
This is my virtual round of applause to you, and I hope everybody enjoys the rest of the conference.
>> Thank you.
>> Thanks, everybody.
Have a great afternoon.
>> Thanks, everyone.
Opinions or points of view expressed in these recordings represent those of the speakers and do not necessarily represent the official position or policies of the U.S. Department of Justice. Any commercial products and manufacturers discussed in these recordings are presented for informational purposes only and do not constitute product approval or endorsement by the U.S. Department of Justice.