Physical Safety and Preparedness - Breakout Session, NIJ Virtual Conference on School Safety
On February 16-18, 2021, the National Institute of Justice hosted the Virtual Conference on School Safety: Bridging Research to Practice to Safeguard Our Schools. This video includes the following presentations:
Findings and Lessons from the School Emergency Preparedness Study, Suyapa Silvia
The purpose of this study was to describe the state of preparedness for violent emergencies such as active shooter events, in K-12 school communities in the U.S. Findings will be summarized from each phase of the study, including: 1) a content analysis of State Department of Education websites to identify guidelines and recommendations provided by states; 2) a national web-based survey of school districts, to learn about strategies used by districts with different characteristics; and 3) school-level telephone interviews in 36 well-prepared schools, to learn about best practices in emergency preparedness. The presentation will conclude with lessons learned and recommendations for future research.
Perception Versus Reality - Physical Security at K-12 Schools in Arizona, Thomas Foley
In 2015 the Sandy Hook Commission Report recommended schools use an emergency response time approach to security design, which involves delaying an attacker long enough for police to arrive. Unfortunately, there is no information available to schools about how long certain physical security measures will delay an attacker. A research team at Embry-Riddle Aeronautical University is currently conducting research designed to help schools adopt an emergency response time based approach to physical security design.
This presentation will give an overview of this research and what the research team has learned during this ongoing project.
Securing Schools? School Violence and Engagement with Security Measures in One Urban School District, Matthew Cuellar
This presentation examines the relationship between school security and school violence through two competing theoretical lenses: (a) opportunity theories of crime that suggest higher engagement with security should predict less school violence, and (b) a school criminalization perspective that suggests greater engagement with security should be unrelated to or even increase school violence. Findings highlight various associations between engagement with school security and non-serious violent crime and weapons-related crime. Further, it is suggested that this relationship is different for Hispanic and African-American students. Implications for practice, education, and research will be discussed.
>> Can you see my screen okay? >> Yes.
We can see your screen.
Well, thank you for having me.
This is from a project that was funded by NIJ's CSSI in 2016.
We completed the research project in 2019, and the topic was, "School Emergency Preparedness." I want to acknowledge my colleagues at RTI, Josh Hendrix and Terri Dempsey.
So we were motivated to study this particular topic.
At the time, there was a lot in the news about school shootings, and also we had been looking at the history of school violence in this country, and also realizing that violent emergencies in schools are unpredictable, and they can occur anywhere in any type of school setting, and therefore all schools must be prepared.
We defined violent emergencies to include active shooters, hostage scenarios, bomb-terrorist incidents, serious fights and weapons-related incidents.
The purpose of our study was, first of all, to describe the state of preparedness in general across the country for violent school emergencies, and we did this at three different levels: the state, the district and the school.
And we also wanted to see how this varied, perhaps based on certain characteristics.
A second purpose was to assess the interrelationship between the guidelines and the recommendations that were provided at the federal, state and district levels and how those hopefully translated to guidelines and recommendations that the schools were receiving and have access to.
And finally, we wanted to describe best practices and challenges with development and implementation of emergency-operations plans, and for that we wanted to tap into schools that were particularly well prepared, and I'll share how we identified those schools or a sample of those schools.
A number of research questions guided our research, but today I want to just focus on the ones that I've highlighted in red at each of the three different levels.
At the state level, we wanted to know, "How comprehensive are state guidelines and mandates for emergency preparedness, and is there any variation across states?" At the district level, we wanted to study the extent to which the district guidelines were derived from state and federal guidelines and if there was any association between certain district characteristics and the state of preparedness, and finally, at the school level, we wanted to learn from those well-prepared schools about their challenges and also best practices.
So our study design was multileveled and multiphased.
Beginning with the state level, we developed a readiness rubric, and I'll share a little bit about that in the next slide.
We used that rubric to conduct a content analysis of all the materials related to safety planning that we could find in state-education websites as well as some other state-agency websites, and we did this for all states.
Around the same time, we conducted a national-district survey between January and June of 2018 and received nearly 2,700 completed surveys, and this was stratified across a number of different characteristics, region of the country, size, urbanicity, et cetera, so we had an excellent representation in terms of these characteristics.
Next, we wanted to take a closer look at districts and schools that seemed to be more advanced in terms of their preparedness.
To do that at the district level, we developed a preparedness measure, and then used the survey responses to identify 36 model districts.
To further confirm that these were model districts, we went to their district websites and conducted a similar readiness assessment using a similar rubric as we had used for the states, and out of that, we ended up with the top 15 districts to move on to phase four.
And in phase four, we conducted school-level interviews in 37 well-prepared schools recommended by those districts that were in those top 15 districts, and these, again, reflected a diversity of size, urbanicity, enrollment, et cetera.
Here are the elements of the readiness rubric that we used to conduct the state content analysis, the state document reviews on websites. These are critical components of any emergency planning for schools, and they include elements that are essential to good planning, and they come from the federal guidelines for schools.
And we scored all the different materials that we could find on the websites on each of these elements, using the scoring rubric, as a zero, one or two.
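To make the 0/1/2 rubric concrete, here is a minimal sketch of how such element-level scores could be summed into a readiness score. The element names and the example values are hypothetical placeholders for illustration, not the study's actual rubric.

```python
# Hypothetical sketch of a 0/1/2 readiness-rubric scoring scheme.
# Element names below are illustrative, not the study's actual instrument.

RUBRIC_ELEMENTS = [
    "emergency_operations_plan",
    "planning_team",
    "threat_hazard_identification",
    "training_and_drills",
]

def readiness_score(scores: dict) -> int:
    """Sum per-element scores (0 = absent, 1 = partial, 2 = fully addressed)."""
    for element, value in scores.items():
        if element not in RUBRIC_ELEMENTS:
            raise ValueError(f"unknown rubric element: {element}")
        if value not in (0, 1, 2):
            raise ValueError(f"score must be 0, 1, or 2, got {value}")
    return sum(scores.values())

# Example: a state whose materials fully cover an EOP and threat identification,
# partially cover the planning team, and do not address training.
example_state = {
    "emergency_operations_plan": 2,
    "planning_team": 1,
    "threat_hazard_identification": 2,
    "training_and_drills": 0,
}
print(readiness_score(example_state))  # 5
```

A summed score like this makes it straightforward to rank states or districts on how comprehensively their published materials cover the critical elements.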
I just want to go over a couple of terms that I'll be using throughout.
Threat and hazard identification and risk assessment is a process whereby the threats and hazards that are more likely for a given school are identified, based on prior threats, prior events that have happened at the school and any other information about the likelihood of that threat or hazard occurring. Each of these is assigned a risk value, which is then used to prioritize the planning. This is distinct from a threat assessment, which is conducted in response to an actual threat.
And a threat assessment is intended, basically, to prevent the violent event from happening.
An emergency-operations plan, or EOP, is a reference document that details the procedures and actions in the event of a violent emergency.
So what should emergency planning in schools look like? First of all, there should be an emergency-planning team, and this can vary widely in terms of the composition, but in general, it should be very comprehensive of different kinds of school personnel, representatives from the student body, parents, the community partners, any organizations that work with the school, law enforcement, et cetera.
This team identifies the threats and hazards, you know, that protocol that I just mentioned, assesses the risk and then begins to prioritize the planning.
Now, emergencies can be of many different kinds: they can be natural, such as tornadoes, or human-caused, such as violent events.
Generally, a school will have an emergency-operations plan that covers a number of different kinds of emergencies, but ideally that plan should include not just the basic courses of action for general kinds of threats and hazards, but it should include a separate section for specific threats, such as active shooters.
This plan ideally should be at the building level, and it should be tailored to the physical environment of the school.
So if a school is large, if it's a rural school, if it's a sprawling campus with multiple buildings, all of that should be taken into consideration when developing the EOP.
Also, the student population: How will you handle small children? What is the plan for students with special needs, et cetera? And it should incorporate all the technologies that are in place at the school.
Lastly, there should be training, drills, exercises conducted based on that plan, and that is very key, which brings me to this slide.
So the key is to have this detailed plan in place that covers all of the relevant threats and hazards and to actively prepare the school staff, the students, the first responders to implement that plan, and this is ongoing.
It is not a one-time-and-done kind of activity.
It is ever changing.
It should be updated, reviewed and then practiced.
So what kind of technical assistance and guidance is provided to schools to do their planning? At the federal level, there is a very valuable resource, the Readiness and Emergency Management for Schools Technical Assistance Center that is administered by the US Department of Education.
They provide a number of different types of technical assistance, from webinars to guides, documentation, and there is a specific guide for K-12 schools, "The Guide For Developing High-Quality Emergency Operations Plans," that takes a team through all of the different steps for creating an excellent EOP.
At the state level, many state-education agencies have information available.
There may be other kinds of guides in other state-emergency agencies and management agencies, and then there are some school-safety centers in certain states as well as regional or state task forces.
Districts are often the first place that a school looks to for specific guidelines for that district, and they may offer templates and even develop an EOP that covers all of the schools in that particular district.
So I'm going to just highlight some of our findings.
We had extensive data that we received that we analyzed through this project, but today I thought I would just hit the highlights on each of the three levels that we studied.
So in terms of the states, we found that the majority do cover the basics.
Most of them expect schools to have an EOP, to have an emergency-planning team and even to conduct a threat and hazard identification assessment.
However, one in five states provided very little information.
They didn't have much on their websites, especially in terms of the critical elements, and the guidance varied with state characteristics. So for example, states with more resources and those with more urban areas often provided more guidance, whereas less guidance was available in rural or Midwestern states.
We also saw that not all of those federal recommendations were making their way down to the state level.
The district survey provided information about the experiences of the districts in the sample in terms of violent incidents in the past 10 years.
Thirty-two percent indicated that a bomb threat or incident had occurred in the past 10 years.
Fifty-eight percent indicated no violent emergencies had occurred in the past 10 years.
Conversely, 42 percent had experienced at least one of these emergencies.
These violent incidents tended to be more likely in the Northeast and for larger districts and less likely in the Midwest and in smaller districts.
We also saw in terms of planning that most districts expected schools to have an EOP and to conduct threat and hazard-identification assessment, and these were fairly comparable to what we saw at the state level.
We also asked what materials they used, that is the districts, to provide their guidance to schools, and 60 percent pointed to state-education agency materials and very few, 25 percent, to that federal guide that I pointed out from the REMS TA site.
School-level interviews were very interesting, provided a lot of information about safety planning at the local level, at the school level.
This was a very valuable activity, and we are still combing through for different dissemination pieces because there was quite a bit there.
These interviews were conducted in 2018, and at the time there had been some high-profile school shootings, and all of that national focus on school shootings came together to add even more pressure on schools to do something and to prepare, and it almost changed how schools began to think about emergency planning.
So for example, trainings and simulations became more realistic.
Some included gunshot sounds so that the staff could hear what that would sound like.
There was an increased reliance on technology.
Even now, much of it remains untested.
There was a perceived need to overprepare.
You know, school administrators were constantly looking for those gaps and those weaknesses and feeling like they were not ever going to be as prepared as they needed to be.
There was this idea of empowerment to allow staff and students to make their own decisions in the face of a school shooter, for example.
That one is still controversial.
There are individuals who think that there is no room for that, and there are others who would like to see that used more often.
And lastly, an increased expectation on the part of students and staff to speak out when they hear about a threat, and that's where we have those tip lines available to report that information.
We asked these well-prepared schools to tell us about their best practices.
Oftentimes, as I indicated earlier, the EOPs are developed at the district level, and they tend to be generic or more general to that district.
So one best practice is to customize those district EOPs, make them customized for the school according to the characteristics of that school.
Another is to solicit feedback from students and staff on protocols and procedures.
They have a lot to say, and they can help improve or spot those gaps.
Another is to develop strong relationships with community partners, the law enforcement, first responders, mental-health professionals.
All of those partners are critical to the success of planning, and those relationships will go a long way.
Getting everybody onboard at the school for these drills and trainings, making sure everyone understands the importance and the need and that everybody takes these drills seriously and maintaining open communication with the entire student body and the partners.
There were also some challenges that we learned about.
The first was the lack of comprehension of the protocols by the staff.
"It's all in the execution," they said, but that's the hardest part.
My colleague, Josh Hendrix, yesterday presented on this very topic, so if you missed it, I'm sure we will all have the recordings afterwards, and you can watch that, but that was the focus of his research project.
Another challenge is the staff turnover.
You can imagine how that impacts safety planning and making sure that everyone has been trained.
The other is this balancing act of priorities in time and resources and funding, so what are the academic priorities versus the safety-planning priorities? Another challenge that sometimes comes up is disagreements over how to best implement certain protocols and/or communication breakdowns between partners, and lastly, getting buy-in from staff when changes are needed to those protocols.
So they might want to do things the way they've always done them and be reluctant to change any of the protocols.
In conclusion, in terms of everything that we learned, we put together some thoughts about what schools need the most in order to be best prepared or, you know, to prepare plans that are most effective.
First of all, schools could use some more direction and some guidance, especially for developing those EOPs.
They don't often know what resources are available at the federal, state and district levels.
Certain protocols, like family reunification and accounting for all persons, are difficult to implement, and so those are in need of a standard way of doing them.
There are various opinions as to how those should be conducted.
As for the threat and hazard identification and assessment process, many schools and districts were not aware of it or did not know how to do it properly, so again, that looks like a need in terms of technical assistance for schools and districts.
And then there's more research that schools could use for some of the gray areas.
How much should students be involved? When is that a security risk? How realistic should drills be, and what about that empowerment model? And lastly, technology: Lately, especially with the STOP Act, there have been lots of funds flowing through to schools for hardening measures in technology, but oftentimes they don't really know, schools and districts don't really know, how to spend those funds effectively.
So this research needs to be brought to them in order for them to understand what is most effective in terms of that technology, and that's it, and here is my contact information.
>> Thank you, Dr. Silvia.
We appreciate your presentation.
Reminding the attendees, if you do have questions for Dr. Silvia, please put them into the Q and A, and we will answer them at the end, and we will now turn the time over to Dr. Tom Foley.
>> Thank you, Michael.
Let me -- Can you see that? >> Yep.
We can see it.
Okay, so my presentation today is going to be about a research project funded by NIJ that we at Embry-Riddle Aeronautical University are working on. It looks at physical security, the technologies deployed in schools, and we've done surveys of K through 12 schools in Arizona, and so I put this quote up there.
"We are all going to be okay.
There are bad guys out there now, and now we have to wait for the good guys." And this was a quote from one of the first grade teachers at Sandy Hook, and the reason I used this quote is it best summarizes what we are trying to accomplish with our research.
With respect to physical security, there are what we call the Five Ds of Security.
There's deter, detect, deny, delay and defend.
What we're focusing on are the deny and delay components of the Five Ds and simply put, how do we slow the bad guys down and speed the good guys up? So just for background, the Sandy Hook shooting took 6 minutes.
That's a very short amount of time, but from the standpoint of physical security and developing barriers and delay, 6 minutes is an achievable amount of time to delay an attacker from accessing the school and then accessing the classrooms.
One of the recommendations that the Sandy Hook Commission made following the Sandy Hook event was that schools use what they call an Emergency Response Time Analysis to guide their physical-security design.
What an Emergency Response Time Analysis does is, working with local law enforcement, you determine how long it is going to take for law enforcement to get on scene and deal with an active-shooter situation, and then you design your physical-security measures to delay the attacker's access to targets for at least that amount of time, so the police can get there and deal with the attacker with minimal casualties.
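The core comparison behind that analysis can be sketched in a few lines: the cumulative delay imposed by layered barriers should meet or exceed the law enforcement response time. The barrier names and delay values below are hypothetical placeholders, not measured data from this study.

```python
# Illustrative sketch of the emergency-response-time comparison described above.
# Delay times here are made-up placeholders, not measured penetration data.

def total_delay(barrier_delays_seconds):
    """Cumulative delay an attacker faces crossing each barrier in sequence."""
    return sum(barrier_delays_seconds)

def design_is_adequate(barrier_delays_seconds, response_time_seconds):
    """True if the barriers delay an attacker at least as long as police response."""
    return total_delay(barrier_delays_seconds) >= response_time_seconds

# Example: locked exterior door, locked corridor door, locked classroom door.
barriers = [120, 90, 150]    # seconds each barrier holds (placeholder values)
police_response = 6 * 60     # e.g., a 6-minute response time, in seconds

print(design_is_adequate(barriers, police_response))  # True: 360 >= 360
```

The same comparison also shows why the Navajo Nation schools mentioned later, with 45-to-60-minute response times, face a fundamentally harder design problem than a suburban school with a 5-minute response time.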
But one of the things we realized is that there's a lack of data out there. How long, for example, does a solid-wood door prevent an attacker from accessing a classroom? So that's one of the things we want to look at. Also, there's often a "one size fits all" approach recommended to schools, generally focused on high schools with a kind of trickle-down to elementary schools, but each school is different.
There are things that elementary schools can do, like keeping classroom doors locked all the time, that are relatively easy that are more difficult in a high school where students are moving between classrooms every period.
In elementary school, the students are generally in the same classroom all day, and we want to take those differences into account.
So for our research, we actually had four parts.
One, surveying stakeholder perceptions.
How did they perceive the level and quality of security at their schools? For this part, we surveyed parents and teachers in participating schools.
Second, we are determining how those schools are being secured today.
And then what we're going to do is, we're going to compare those perceptions with the data we've collected during our physical surveys of the school.
The purpose for this is to see, is there a disconnect between what parents and teachers perceive the school's security posture to be and what trained security professionals observe in their schools? This can be helpful in a couple different ways.
One, if parents and teachers are overestimating the quality of the security in their schools, educational measures can be taken to help them better understand the level of the threat and the quality of their existing security program, and if they're underestimating it, that can also be useful for educating them as to what is an appropriate level of security and to reassure them as to the level of security at their school.
And then finally we plan to test commonly used barriers that exist in schools to establish delay times, and I'll talk about that more in a minute.
So for our preliminary data from the parent survey, we collected information from parents at 45 schools across nine school districts.
We had 619 parents start the survey.
Five hundred and twenty-nine completed most of the questions.
Females responded more often than males by a four-to-one margin, and the respondents' children's grade levels ranged from preschool through seniors in high school.
What we found with this preliminary analysis is what parents perceive as important is access control in general.
They feel that having features on the playground such as fencing around the school as well as building features to control access as being the most important for the safety of their school, and they also frequently identified a system for requiring visitors to check in before they're allowed access to the rest of the school grounds or the school building and then providing visitors with badges so staff and students can recognize that they are not a part of the school, but they are authorized to be in the facility.
Parents also identified the monitoring of school grounds as being important.
This could be video surveillance of the grounds or having some sort of guard patrol to identify an intruder as early as possible.
And then some respondents mentioned having armed guards and law enforcement, although that was much further down the list than the general access control responses.
For the determining how the schools are secured, we had a team of security experts physically survey 62 schools.
These schools were located in 15 school districts.
Two of the districts were located on the Navajo Nation, which we were really excited about getting, and the NCES school district classifications of the schools that we surveyed ranged from Rural-Remote to Suburban-Large.
We tried to get a good cross section of schools, and we really wanted to get differences in emergency response times.
In one of the school districts we looked at on the Navajo Nation, two of its schools were located in a town where, if they had an active shooter event, the emergency response time would be between 45 minutes and an hour before the first law enforcement officer could reach the school.
So that differs greatly from the large suburban school where it could be 5 minutes or less.
During our survey, we collected data on more than 4,000 exterior and interior school doors.
We looked at the doors.
We gathered data on the hardware used on the door.
What kind of lock sets were they? Were the doors lockable? Were they lockable from inside the classroom? Or did the teacher have to go outside to lock them? We looked at the hinges.
We looked at the makeup of the door, and we also looked at any windows immediately adjacent to the door, specifically sidelights or viewer lights in the door, and the reason for this is that fire code requires free egress out of classrooms, meaning the interior side of the door, even when locked, has to be able to be opened with one motion.
If there is a pane of glass within 36 inches of that door handle, it's possible for an attacker to break out that window, reach in and unlock the door, so we gathered that data.
Our personnel who conducted these surveys, all the survey teams were overseen by Certified Protection Professionals.
This is the highest professional certification in the security industry, and one of our team members is a retired navy SEAL and a retired SWAT team leader, and he really brought a great perspective to our survey because the CPPs like myself, we were looking at keeping people out, and he was looking at, how easy would it be for law enforcement to get in to get to the assailant in an active shooter? Which is another consideration when beefing up the physical security of a school.
Our preliminary observations with respect to the physical security, all of the schools we looked at had at least minimal security measures.
That being locks on all exterior and classroom doors.
We did, however, find a lot of schools where they had doors directly between classrooms, and those doors may not have had a lock or had a lock that only locked in one direction, meaning if an intruder were to access the nonlocked side classroom, they could then move into the other classroom and have free egress because of that fire code doorknob situation.
What we discovered so far is that maintenance and funding are the most common issues in the schools.
Many of the deadlatches on the doors were broken.
We found classroom doors that, while they had the ability to be locked, if you jiggled the handle a couple of times, the lock would give and you could access the classroom.
We found doors that were in bad condition.
Dr. Silvia mentioned the increased reliance on technology and that many of these technologies are untested.
If you're trying to improve security at a school and you don't have the fundamentals there, the working door handles, the door locks, doors that are in good condition, then adding technologies like bullet-resistant whiteboards is really not adding to the security of the school.
Some other things we found: a commercial-grade doorknob, which is what should be on classroom doors, had been replaced with a residential interior doorknob.
This would be the type of doorknob that you would find on your bedroom at home or your bathrooms at home.
It's really not designed for security.
It's mostly designed for privacy, to keep somebody from inadvertently walking in the room, so it was not appropriate for that particular application, and we actually found some jerry-rigged locking systems, and I want to actually share an example with you.
So at one of the schools we went to, in the band room, we noticed that these braces had been attached to the door, and they have a hollow tube area here.
And this metal bar -- And the shape of the bar you can't see it, but it goes up and then comes out and then goes across and back and down -- was put into those brackets, and then it used the mullion here between these two doors to prevent it from being opened.
This is actually a fire code violation because it eliminates that free egress I talked about earlier.
I'll give you another photo.
This is what it looks like with that metal bar removed, so the doors could then open.
So we looked at it, and we tried to figure out, why did they put this on this door? There weren't similar features on the other doors in the school, and what we discovered is, the latch on the mullion was broken.
However, all it needed was a screw to be replaced, and that would have fixed the problem.
So one of the things we've encountered seems to be -- And this is not the only example we've seen across the schools -- an issue of training of maintenance and custodial personnel on how to properly repair and maintain these basic security devices that are securing the schools.
Our barrier testing component, it was actually supposed to take place in summer 2020.
Unfortunately, due to the pandemic, we had to delay that, and we are now working on getting it scheduled for the summer of 2021.
What we're going to be doing is, we have identified a certified ballistics testing laboratory in Maryland.
We are going to be traveling there, and they will be testing commonly used doors that we found in schools: solid wood doors, hollow wood doors, steel doors and hollow steel doors, and what we're going to be doing is testing them against three of the most commonly used munitions in school shooting events.
So we're going to test it against 9mm handgun rounds and 5.56mm rifle rounds, which are the rounds used in AR-15 rifles, as well as 12-gauge shotguns.
Our plan is, we will empty the magazine of the weapon into a predefined area of the door, which, working with our SWAT team expert, we determined to be the most likely place an intruder would try to shoot the door.
We're not actually testing the door hardware yet.
Hopefully future research will look at that, but for right now we're just looking at the door, and what we're going to do is fire a magazine's worth into a defined area of the door.
And then we're going to follow that up with forced-entry, brute-force attacks to simulate an intruder trying to kick in the door or use the butt of a shotgun, things like that, to gain access to the classroom, and then record the average penetration times for each type of door.
And for a penetration, we're actually using the US Army physical security definition of a 96-square-inch hole, which would be big enough for a small person to crawl through or for somebody to reach their arm in and possibly unlock the door.
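As a rough sketch of how per-door-type averages like these might be tabulated once the trials are run, consider the following. The door types come from the talk, but all of the trial times below are made-up placeholder numbers, not results from this study.

```python
# Hedged sketch of averaging penetration times by door type.
# Door categories are from the talk; the trial times are invented placeholders.

from statistics import mean

trials = {
    "solid_wood":   [95.0, 110.0, 102.0],   # seconds to open a 96-sq-in hole
    "hollow_wood":  [30.0, 24.0, 27.0],
    "solid_steel":  [240.0, 260.0],
    "hollow_steel": [150.0, 141.0],
}

averages = {door: mean(times) for door, times in trials.items()}

# List door types from weakest (shortest delay) to strongest.
for door, avg in sorted(averages.items(), key=lambda kv: kv[1]):
    print(f"{door}: {avg:.1f} s")
```

Average delay figures in this form are exactly what an emergency response time analysis needs: they let a designer estimate how long a given sequence of doors will hold relative to the local police response time.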
But we're not only looking at penetration times.
We're also going to look at the spalling of the door that could injure people; these are the fragments that come off when the bullets go through the doors. And we're going to perform the same tests on tempered glass, laminated glass and glass with smash-resistant film applied, because those are the most common window-glazing materials in the schools.
So for our next steps, we're going to compare the parent and teacher safety perceptions with what our security experts found, and we're going to use the penetration-time data to develop a framework for an emergency response time analysis based on security design.
And ultimately we hope to design a physical security guidebook for school administrators that will help with the training and help them identify cost-effective ways to improve security at their schools.
Now I will turn it over to Dr. Matthew Cuellar, and thank you for listening.
>> All right.
Thank you so much, Tom.
I really appreciate it.
So real quick, let me go ahead and pull up my screen here.
I'll pull this up, and we will just jump right into it.
Let me maximize my screen here.
So first and foremost, I want to say thank you so much for the opportunity to present today.
I've really looked forward to presenting my research here, and I want to say thank you.
I do realize it's almost 4:45 on day two of the conference, the last slot of the day, so I'll try to be brief, but if you have any questions, feel free to jump in.
So I want to do a couple of things here with this presentation, and before I get started, I want to give credit where credit is due.
I want to give a shout-out to the students that have worked on this project for me as well as one of my dear colleagues and coinvestigator, Dr. Samantha Quill, who's helped me tremendously with recruiting and building up this study even during times of COVID, which I'll talk about here shortly.
So I want to give an overview of the study purpose just really briefly.
I'm going to talk about the methods in one slide, and then I'm going to talk about the results of three different studies, kind of smaller studies we've done within this project, and I do want to note that the project is ongoing right now.
We were a little bit delayed due to COVID.
We're about midway through our data collection so do keep that in mind.
So I'll talk about some findings.
I also want to highlight how we translate our data, so we do a very applied study here, and we try to translate our data in a way that the school administrators and the educators we're working with can use this data to make decisions, and then I'll open it up for discussion at the end of the presentation here.
And just real briefly, any of these pictures in the PowerPoint are from my school district that we're working in, so I do want to note that there.
Now for the study purpose here, just really briefly: this study began as an idea in 2016 and got under way after receiving funding out of the CSSI in 2018, and the idea behind this research was that, at the time, there was very, very little research looking at the relationship between physical school security measures and student academic performance.
Right? And understanding, you know, how do these things affect performance and behavioral outcomes? There was a lot of research that suggested these relationships exist, perceived safety and metal detectors and all this kind of good stuff, but we were really interested in understanding, you know, how school safety, the things that students experience every single day, influenced these academic and behavioral outcomes that they're in some ways intended and designed to address.
As I met with the school board of education that we worked with, we started identifying some other needs within the district, and particularly we were interested in understanding racial and socioeconomic disparities in outcomes attributable to safety as well.
You know? Thinking about things like Paul Hirschfield's criminalization hypothesis, right, and understanding, you know, what is going on around school security measures? And how are they ultimately affecting students over time? All right? So that's really the big piece there.
Another big key component for this research was to help school administrators in our district use data on this topic to inform what they're doing with the children that they serve, so that was a big piece, and I'll talk about that when I discuss how we translate our information at the end of this presentation.
And really the context here behind this study was that, at the time it was designed, there was very little research that looked at within-school variation in outcomes attributable to security measures and student interaction with security measures, and this is kind of a twofold issue.
So the first part being that a lot of the research that we were making decisions on at the policy level was really centered on school-level data.
When you think about things like the School Survey on Crime and Safety, at the time of this study, the 2015-2016 release was just coming out.
In fact, it was released just after this study started, and we were still basing decisions on the 2009-2010 release.
So data at the time, and that's just one example, was really looking at differences across schools, and we were particularly interested in looking at within-school variation.
Now the second part and the second need that this study addressed is that a lot of the research that did look at student-level variation, for example, the National Crime Victimization Survey School Crime Supplement asked about security measures, physical security measures, target-hardening authoritarian measures.
We'll talk about those in a moment.
It asked students about those, but a lot of times simply as dichotomous fields.
For example, yes or no, does your school have this? So even with the student level, we didn't have a lot of information as to student engagement with these physical security measures that were in place in the school system.
So this study really attempted to use a person-in-environment perspective to not only understand security and its effects on students but ultimately to think about different ways we measure school security, so instead of saying yes or no to the metal detector, yes or no, I walk and talk with the school police officer.
The idea behind this study was to ask students on the ground, you know: how often do you walk through a metal detector? How often on a typical school day do you see a security camera around you? How often do you see the school police officer? So these two ideas, focusing on within-school variation and using a person-in-environment perspective to measure student engagement with these security measures, were really the key behind what we're doing in this research.
So to give you an idea of kind of what this study looked like, we were in North Jersey here, and we're doing work in Newark public schools, and this is a high-needs school district here.
Right? So we think about, you know, this being one of the highest-needs school districts in the state of New Jersey.
We broke this up into kind of two different projects within one larger project.
The first being that we looked at ninth-grade students who were enrolled in 2018, 2019.
Right? And then we were ultimately going to scale up and do ninth and tenth grade and eleventh-grade students the following year.
Then COVID happened, but now we're recruiting and working with ninth, tenth and eleventh-graders across our district.
So today I'm going to talk about findings from our initial pilot cohort of students, our ninth-grade students, about 360 of them, 359 to be exact, from the 2018-2019 school year, and keep in mind, we're talking about physical security measures, so these are things that were in place in school before COVID times, not in an online environment.
But since then, we have been able to recruit almost 1,000 students into this study even during COVID times, which is really phenomenal.
Again, I have to give a shout-out to my team there.
In the sample that we'll talk about today, we have eight schools participating across our district, all of which are public schools, all of which are located within probably about a 5-mile radius, if you will.
But moving forward here, we have been able to recruit an additional six schools, and we hope to build this sample as we kind of move through the project.
So, again, remember, this project is kind of midway here.
Data collection involves kind of two different aspects.
In a lot of schools, we administered surveys; you can see an example here in a cafeteria.
We'd meet the students.
We'd give them a survey.
We would talk with them.
We would sometimes do this by homeroom.
Sometimes we'd do this by classes, and other times we would do interviews with students. So, for example, at one of our schools, you know, we'd go in, and sometimes we'd only meet a couple of students at a time.
We'd talk to them.
We'd speak to them about what's going on.
We'd get an idea of some of this qualitative information as it concerns school security and let them kind of tell the story. But in any case, we collected data from these students, and we collected survey data on exposure to school safety, so we asked them how often they engaged with different security measures.
And we took these directly from the School Survey on Crime and Safety, but instead of an administrator answering them as they do in the SSOCS and an administrator saying whether or not their school has it, we asked the students simply, hey, you know, on a typical day, how often do you engage with or interact with this security measure in the school? So we have exposure to school safety and security measures.
We also asked them about their self-reported engagement in indicators of school violence. So, again, just like the behaviors that are listed on the School Survey on Crime and Safety, you know, an administrator says, this is the number of incidents for this type of incident, that type of incident.
We did the same thing.
However, we just asked a student, you know, how often on a regular day do you engage in this type of behavior? And we broke these up based on Na and Gottfredson's 2013 framework for serious crime and violence, violence, property crime, disorder and drug use, and I'll talk about that in a moment.
We also used the Maryland Safe and Supportive Schools survey.
That was provided by Catherine Bradshaw, and we used its domains of school climate and connectedness as control variables of sorts for some of these analyses that we're doing, and on the other side of this, you know, we will be looking at those variables in more detail.
But today I want to focus on physical safety here.
You know, the district also provided us data from the students that we worked with.
They provided us with GPA, and they provided us with attendance rate, and they also provided us with PARCC scores, which are the standardized assessment scores that focused on both English and math ability for the students.
So when I think about -- When we think about performance, right, we're going to talk about GPA, but we're also going to break this down into different aspects here and more focused outcomes that we looked at.
And then the attendance rate is the percentage of days completed by the students out of 181 required days, so to give you an idea, those things were provided by the district.
They were much more objective, and they were provided, you know, at the end of the project period based on their student ID number there.
Now, for the data analysis piece, we did a couple of things here, but most importantly we used Mplus 8 to analyze our data, and we used TYPE = COMPLEX RANDOM with school as the cluster to account for shared variance within schools.
Now remember, the focus here in this study was to look at within-school variation as to how these things differ.
You know? How does exposure to school security differ based on things like race and socioeconomic status? How do these things influence outcomes? And we used COMPLEX RANDOM as opposed to multilevel modeling because we're not trying to predict any level-two variance; in this vein, we're really just looking at and wanting to control for the clustering of standard errors.
That is that similarity, right, between students within schools, so we can compare them across schools and allow those slopes to vary across schools.
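The clustering correction described here, leaving the point estimates alone but adjusting standard errors for the similarity of students within the same school, is what Mplus's TYPE = COMPLEX does. A minimal sketch of the same idea in Python with statsmodels, on simulated data with illustrative (not the study's) variable names:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_schools, per_school = 8, 45

# Simulated student-level data: 8 schools, 45 students each.
df = pd.DataFrame({
    "school": np.repeat(np.arange(n_schools), per_school),
    "security_exposure": rng.normal(size=n_schools * per_school),
})

# Build an outcome with a school-level random intercept, so that errors
# are correlated (clustered) within schools.
school_effect = rng.normal(scale=0.5, size=n_schools)
df["violence"] = (0.3 * df["security_exposure"]
                  + school_effect[df["school"]]
                  + rng.normal(size=len(df)))

# Ordinary regression fit, then cluster-robust standard errors by school:
# estimates stay the same, but the standard errors account for the
# within-school similarity the speaker describes.
model = smf.ols("violence ~ security_exposure", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["school"]})
print(model.bse)  # clustered standard errors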
Now I'm going to talk about three kind of smaller studies within this larger project here, but, you know, they all kind of fit in to this kind of overarching goal, and we're continuing to do research as we move forward.
So, again, we're about halfway through this project here.
When we think about exposure to school safety, one of the first things that we did with the board that we're working with is, they were really interested in understanding disparities attributable to race and ethnicity in engaging with school safety and school security measures.
And, again, these are things for the most part that are physical, that are in place, and me being a social worker, a school social worker by training, I did include some of the mental health aspects here because we do see them in place in our schools as preventative measures.
Right? They're not supposed to be reactionary.
You're not supposed to say, hey, a student gets in trouble.
Then they need to go to counseling.
It should be the other way around.
It should be preventative services to some extent, so I kept those in there just to kind of have some discussion here, but, you know, really the focus here is the physical piece.
So when we think about asking the question, does engagement with school security measures differ by race and ethnicity? We collected data, and just to give you an idea of some of the demographics of our sample here: this is a majority-minority school district, and I don't even really like using that word, in which, you know, only about 11 percent of the student population reports as Caucasian.
It's predominantly African-American and Hispanic, so in our sample here, we actually had a little bit of an overrepresentation of Caucasian students.
I think it was around 25 or 26 percent.
We had underrepresentation of African-American students but overrepresentation of Hispanic students, and I'm happy to provide a link to any of our publications so you can get an idea a little bit more of the nuances and details around that there.
But when we look at this here, when we enter control variables, we wanted to determine, do African-American or Hispanic students differ in their engagement with school security measures, particularly in this single urban school district? And we found some really interesting things. We used Caucasian students as the reference group, and in this case we found that Hispanic students had a higher likelihood, about 1.5 times the likelihood, of reporting walking through a locked or monitored gate.
We found that Hispanic students had a lower likelihood of engaging with school security personnel, the school security police officer, the SRO.
Depending on the school, it was different.
And then we found ultimately that African-American students were almost over three times more likely to report being searched randomly by a drug-sniffing dog, which is actually really interesting if we think about it because the inverse applied to the Hispanic students.
When we continue to think about physical strategies here, we saw African-American students were almost two times more likely to report being within eyeshot of a security camera compared to their white counterparts, when controlling for some of these other variables that would predict such an encounter.
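Results like "1.5 times the likelihood" or "over three times more likely" are odds ratios from logistic regressions of a dichotomous engagement indicator on race, with controls. A small sketch of how such a ratio is produced; the data are simulated and the effect size is chosen to mimic the reported pattern, not the study's actual estimates:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400

# Simulated indicator: 1 = student is Hispanic, 0 = reference group.
df = pd.DataFrame({"hispanic": rng.integers(0, 2, size=n)})

# Generate the outcome so Hispanic students have ~1.5x the odds of
# reporting walking through a locked or monitored gate (assumed value).
log_odds = -0.2 + np.log(1.5) * df["hispanic"]
p = 1 / (1 + np.exp(-log_odds))
df["walked_through_gate"] = rng.binomial(1, p)

# Fit the logistic regression and exponentiate coefficients to get
# odds ratios, the quantity quoted in the presentation.
fit = smf.logit("walked_through_gate ~ hispanic", data=df).fit(disp=False)
odds_ratios = np.exp(fit.params)
print(odds_ratios)
```

The real models would add the control variables (socioeconomic status, school climate, and so on) to the right-hand side of the formula.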
We also found that Hispanic students tended to be less likely to be restricted from social networking sites, which we think about this as possibly a socioeconomic issue.
However, we did try to control for that in these analyses.
Now, one thing I do want to point out here that's really fascinating, particularly from a social work standpoint, is that both the African-American and the Hispanic samples reported significantly lower likelihood of engaging with peer mediation programs, which, as we see in our school district, are in every single school.
They're designed in a way, whether they're facilitated by the anti-bullying specialist, by the school counselor, they're designed to be preventative measures in our district.
So it's a little bit interesting to see this as possibly being a reactionary approach, maybe being something else, but there is still that significant difference in engagement, as reported by race.
Now, when we take this a step further and we think about physical strategies, all right, we're looking only at the physical strategies that I had listed at the top of the analysis I just provided, and we think about school security engagement in general.
We see some really interesting things here.
Now, as I mentioned, this is a longitudinal study.
We're continuously collecting data.
So we're only looking at two waves of data here.
So with full disclosure, it could be painting a little bit of a different picture than what we might see at the end of our surveys, but here, we see some really interesting things.
So first and foremost, we see positive associations between engaging with physical security measures -- we're focusing on physical things in the school, not counseling, not peer mediation, not group work, not any of those interactional pieces I just mentioned.
When we look at this and we look at general engagement with security, we see that when we simply control for school climate, nothing more, not looking at this as a product of time, security is positively associated with things like non-serious violence, weapons possession and property crime.
Which really, we see all the way back from Miller and Schreck in 2003, we see this as existent, right? When students feel like they're being surveilled, when they feel like there are safety measures in place, these are typically things that might occur in schools where this is more likely to happen, right? Which way is the association going? But time can help us shed a little bit of light on this and help us understand it.
So when we look at change scores, right, we look at the difference over time: higher scores indicating more engagement with physical security measures between time one and time two, which is over the course of the school year, and lower scores indicating less engagement with security measures over that same school year.
And when we see this, we see negative associations, meaning with more engagement with security, we're seeing lower incidence of things like non-serious violence and lower incidence of things like weapons possession, which is an interesting finding, because we can think about this as the inverse of "oh, this might be an association between these things being in place and these things happening." And remember, we're talking about an urban, inner-city school district here.
When we think about adding lagged dependent variables, and we think about controlling for the autocorrelation between time one and time two, we see that the relationship between security change and non-serious violence and weapons possession still exists.
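The change-score model with a lagged dependent variable can be sketched as a regression of the wave-two outcome on the change in security engagement, controlling for the wave-one outcome. The data below are simulated and the variable names and coefficients are illustrative, chosen only to mimic the reported direction of the effect:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 350

# Simulated wave-one measures for n students.
df = pd.DataFrame({
    "security_t1": rng.normal(size=n),
    "violence_t1": rng.normal(size=n),
})

# Wave-two engagement drifts from wave one; the change score is the
# difference over the school year, as described in the talk.
df["security_t2"] = df["security_t1"] + rng.normal(scale=0.5, size=n)
df["security_change"] = df["security_t2"] - df["security_t1"]

# Assumed negative effect (-0.3): more engagement over the year, less
# non-serious violence at wave two, matching the reported pattern.
df["violence_t2"] = (0.5 * df["violence_t1"]
                     - 0.3 * df["security_change"]
                     + rng.normal(size=n))

# Lagged-DV model: wave-one violence on the right-hand side controls
# for the autocorrelation between time one and time two.
fit = smf.ols("violence_t2 ~ security_change + violence_t1", data=df).fit()
print(fit.params)
```

The study's actual models also add the race interaction terms and cluster the standard errors by school, as discussed earlier.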
And when we want to factor in some of those disparate outcomes that we saw in engagement with security measures that I had mentioned in the previous slide, we can look at differences by race, right? Are these relationships differing depending on whether you're Hispanic or African-American within this inner-city school district? And while we didn't necessarily find anything within the African-American group of students, we found some interesting findings among the Hispanic students, and they're illustrated here on the right in our interactions.
And what we're seeing here is that for Hispanic students, this relationship is a little bit different, right, than it is for non-Black or Hispanic students.
And this is really interesting because when we look at this, particularly for things like non-serious violent crime, right, we see that it tends to go down with engagement with security measures for the Hispanic students when controlling for these factors over time and wave-one behavior.
However, for non-Black or Hispanic students, we seem to see the opposite.
And similarly, we see the same thing with outcomes such as property crime, and we see the same thing with outcomes related to disorder, right? So this is interesting because, on a very topical level for this presentation, we're seeing differences in Hispanic students, and while a lot of the literature focuses on African-American students, particularly in inner-city school districts, it is important to think a little bit about what this might look like, whether this might be a product of Trump-era culture and that administration's efforts in immigration and policy reform.
Perhaps that could be reflected in this.
You know, maybe Hispanic students are a little bit less likely to show up, you know, due to things.
For example, with the school police officer engagement, maybe they're less likely to engage with that authority for whatever reason.
There's other things going on here, and it certainly warrants further research.
When we think about inner-city school districts, we really need to think about Hispanic students and unique effects, whether they're attributable to culture or attributable to policy action, whatever is happening.
Now, the final piece here I want to discuss, and this is just really from more of a perception standpoint, is to look at and think about how perceptions of safety influence these academic performances, again, amongst all students, right? And we can talk a little bit about differences, but a lot of these control variables allow us to look at these general effects within schools across the district.
And when we look at this here, we see some really interesting things.
In general, when we talk about lack of perceived safety -- and this is operationalized as indicated by the MDS3.
When we think about a general lack of perceived safety, when students feel like they're not safe in their school in general, we don't really see an effect on GPA and overall performance, but, interestingly enough, we see some differences in these PARCC scores, in these standardized assessments, that really warrant further investigation.
So, for example, when students generally feel less safe in their school, we see that that is associated with lower scores in English and language arts, lower scores in math, and lower attendance, so fewer days attended, right? You don't feel safe in your school? Okay.
You know, I'm not going to be as engaged.
I'm not going to be attending.
So that's what we're seeing across the board.
We see this specifically when we break this up into different domains, as outlined by the MDS3.
So, for example, when students perceive bullying and aggression as a problem, we see very similar outcomes, right, where students tend to do less well on their standardized performance scores when controlling for various factors here, and they tend to attend school less.
And then when we think about things like drug use, while they might not have an attendance factor there, we still see it associated with lower performance.
So these are unique characteristics, when we think about a high-needs school district in an inner-city setting, that warrant much, much further investigation, and we hope that over the next two waves of data collection with our study, we can shed some more light on this as it concerns growth over time.
Now, let me jump back over here to the user, right? So when we talk about research, particularly in social work, we're not just doing research for research's sake, right? We don't want to just say, "Hey, I contributed a theory, that's great," but rather, we want to figure out a way to provide this information to the educators and the administrators so they can use it, right, and I think that's really important.
So one thing we've done with this project is, we've created a dashboard to provide data to the educators and the administrators, the leadership teams within our district, so they can get an idea of what is going on in their school.
And they can get an idea of, for certain groups, for example, if it's African-American students or Hispanic students, if it's those who receive only free lunch, whatever the situation is, they can get an idea of the risk and protective factors within their school.
So this comes twofold.
So the first part of this is that we can look at students within school, and we can get an idea of percentages compared to the district or compared to any other parameter that the user would like.
So it might be, for example, in our area, parts of the city are broken up into wards, right? So maybe you want north ward versus south ward, you know, versus central.
You know, whatever that it might be, you know, the user can do that.
So they look at this, and they can get an idea of what the data -- the survey that we're providing is telling them, and they can compare it to a reference group.
And again, they set that reference group, so there are some barriers and some kinks that we're still working through, but the idea here is that they can flag indicators of risk within their school at the student level, and they can identify, you know, specific numbers of students that might say, "Hey, you know, I'm engaging in gang activity," or answer, "How often have you engaged in physical attacks or fighting?" In this example, you'll see in this one school, it's much, much higher than across the district, right? So a user can look at that and say, "Hey, that's a problem," right? "That's something that I need to address." Whether it's through a program or whether it's through, you know, parental or community outreach, they can develop that, and they can identify that specific piece there.
And we do this for all of the data in our surveys, so we provide, you know, information on behavior and arrests, exposure to school security and safety measures, school engagement as protective factors, various different things that the user can use.
Now, the second part of this is on the school level, and the user can take a look at this and get an idea of the proportion of students at risk within their school compared to whatever they want to compare.
And this is another really powerful tool in that it allows the user to kind of think hey, you know, what proportion of my school is at risk versus protected? And if you look at different schools, it's really, really fascinating because some, you know, they might be 2 or 3 miles apart from one another.
Some might even be on the same block, and one school will have a very, very, very low risk profile, and another school will have a very, very, very high risk profile.
And what that allows our educators and administrators to do is, they can take a look at this, and they can say, "Hey, you know, this domain that my school might be a little bit lower in or a little bit higher risk in," so we can target that type of behavior, or we can target those types of issues that we're experiencing in the school.
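The school-versus-district comparison at the heart of the dashboard can be sketched in a few lines. This is a toy illustration with made-up data and a hypothetical column name, not the project's actual dashboard code:

```python
import pandas as pd

# Toy student-level survey data: which school, and whether the student
# flagged a risk indicator (here, reporting fighting).
df = pd.DataFrame({
    "school": ["A", "A", "A", "B", "B", "C", "C", "C"],
    "reports_fighting": [1, 1, 0, 0, 0, 1, 0, 0],
})

# Reference rate across the whole district, as a percentage.
district_rate = df["reports_fighting"].mean() * 100

# Per-school rates for the same indicator.
by_school = df.groupby("school")["reports_fighting"].mean() * 100

# Flag schools above the district reference, the kind of signal an
# administrator would act on ("that's a problem I need to address").
flagged = by_school[by_school > district_rate]
print(f"district: {district_rate:.1f}%")
print(flagged)
```

In the real tool, the reference group is user-selectable (district, ward, or another parameter), but the underlying computation is this same group-versus-reference comparison.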
So it just gives a way to kind of translate this data, and you know what? We have this really complex survey data.
You know, the last thing we want to do is say, "Hey, you know we published the articles, and we're on our way," but rather we want to make sure that this is sustainable for the educators and administrators that are doing this and that are working with us and are collaborating with us in this study.
So real quick, just a brief discussion here on kind of these points here.
So what this research is getting at -- again, we're right in the middle of the study.
We still have two more waves of data collection.
We're in the middle of our third.
You know, COVID, of course, everyone can understand COVID kind of putting a delay on some things here and there, but we're trying to do our best in this unique time, but, you know, we're still working through our studies.
So the idea here -- Things can look very different by the time we're at wave four.
We follow our cohorts, you know, for 3 or 4 years at a time, but really, what this is doing is providing a unique within-school variance perspective on a district, right? So we think about a district as being a single body.
We think about schools sometimes, in policy research, as looking at differences across thousands of schools in a nationally representative data set.
That's absolutely terrific and super helpful, right? But at the end of the day, that within school, individual student-level interaction is really, really key to understanding school safety.
How are they engaging, right? What are they doing within their school, and how is that environment affecting them? Because as a social worker, I truly believe in that person-in-environment perspective.
That environment is influencing them, and the extent to which they engage with it is what's going to cause change, not necessarily whether or not something is in place from a policy standpoint.
So this provides a unique perspective on that within-school variance.
What we need to do is, we need to figure out a way to continue assessment of disparate outcomes within schools because it's not just happening, "Hey, this school it's in, but this school it's not." You know, it's happening at the student level.
You know, we're seeing different levels of engagement, and, you know, you can take them for whatever they're worth, but we're seeing different levels of engagement.
Those are going to have different outcomes.
You know, we might want to say, "Oh, it's okay if they don't interact or if they don't see a camera as much. That's no big deal," but it's going to lead to different outcomes down the road over time.
So it's important that we continue to assess these things and think a little bit about, hey, how do we identify these disparate outcomes, and what can we do to address them as quickly as possible to make sure we promote equality within our schools? I think two pieces here that we're seeing, that we really didn't think about at the beginning of this study, were, A, that, you know, there are differences with Hispanic students.
And of course, in hindsight, it's, "Well, duh, of course," but, you know, at the end of the day, we're seeing these unique outcomes amongst our Hispanic students, and this is something that's really, really fascinating and warrants further investigation as a focus point.
You know, even all the way up here in greater New York City area, this is something that's really important to focus on.
And then I think another piece here is that the perception of safety, however that's influenced by engaging with security measures, is really important to student performance.
And we hope to see over time, we hope that we can paint a better picture of the trends and how, you know, engaging with school security year after year after year, walking through that metal detector morning after morning after morning, seeing those cameras, engaging with police, going through and being searched.
You know, we hope that we can get an idea of how that influences student outcomes over time to the best of our ability.
Now, three other pieces I want to note before I kind of wrap up here is one, we definitely are going to continue doing this over time.
Again, we're in the third wave of our surveys.
We have scaled this up quite a bit, you know, even despite COVID, so we're really excited to see what some of our data are showing us, but we really hope that we can eventually evaluate the use of this data with our districts.
So down the road, you know, we want to see how our leadership teams and how our folks, you know, educators, teachers, principals, are using this information and what they're doing with it, right? So we can yield the information as researchers, but what do the people on the ground do with it? So we really want to evaluate that to some extent beyond this work, and we want to figure out the best ways to do that.
One thing that my team has really been working on here is, how can we do this in real time, right? We go and we collect surveys.
We talk about surveys.
We collect them.
We enter them into a computer, and we document them and we store them.
But over the course of 2 weeks or so, you know, doing this with hundreds of students, you know, and having to track down each of our students, that takes a lot of time.
So we're really trying to come up with innovative ways to be able to do this in real time, so we can share this data with educators and administrators in a way that it's meaningful and it's timely, and they can make decisions pretty quickly based on, you know, what's going on with their school and not just their students but their school climate itself.
So we're really trying to work on that and develop those tools and kind of merge innovation and technology in school safety in that regard.
So with that said, that is kind of the brief overview of 4008.
So if you have any questions, I'm happy to answer them.
I'll give it back.
>> All right.
Well, thank you all for your presentations.
They were really insightful, and we enjoyed them.
Dr. Collier, it looks like you currently have one question about the student survey.
Someone asked, "Was that student survey an active parent consent survey for PPRA standards?" >> Yes.
It was active consent by parent, and that was one of the challenges with our response rate.
So in New Jersey here, we have to get consent of the parent, and then we have to get consent of the student as well.
And with that, we can't do an opt-out, right? So we can't just send it home -- if the parent doesn't respond, then the student can't be enrolled.
The parent actually has to sign and consent for their student, and then we can enroll the student, with their own consent, into the study.
So that was certainly a challenge with the response rate, but that was one of the protocols that we had to follow.
Thank you for your question.
>> And Dr. Foley, I know you mentioned this in a chat and you might not be able to answer this directly, but participants are asking, do you have an idea of where your research will be published or, like, where they can access it or, like, the best way to find it once your studies are complete? >> So we're expecting a number of academic journal articles out of this, as well as a guidebook that's targeted to school administrators and officials.
We plan to distribute that online through the Embry-Riddle website, and it will also be available via NIJ along with our data and our final report.
>> Thank you.
We do have about 10 minutes left.
If there are other questions, feel free to enter them into the chat or the question and answer.
Dr. Silvia, I know you talked about having the 37, like, top performing schools, and you also mentioned that many states -- like you said, about one in five states -- didn't have really good breakdowns.
So I'm just curious on the 37 schools that did do really well, were these -- did it transition down? Like, were they in the states that had really good recommendations, or were these schools that just despite not having high guidance were able to, like, be well-prepared, if you know that? >> I don't remember.
I think we did do that mapping, but it would be more directly comparable to the district level standards than the state because, you know, there's a closer correspondence with whatever that state or that district that the school belongs to is doing, in terms of safety.
So the 37 schools were reflective of those districts, which were already top performing model districts.
>> All right. So currently, we don't have any other questions in the Q and A.
If there are any questions from presenters to each other, you're more than welcome to ask them as well.
Well, we'd like to thank everyone for participating.
I know it's the last session of the day, and we really enjoyed the conference and everyone's participation in it.
I encourage everyone to attend tomorrow, starting at 11:00, and I thank everyone for coming today.
>> Thank you.
>> Thank you, Michael.
Opinions or points of view expressed in these recordings represent those of the speakers and do not necessarily represent the official position or policies of the U.S. Department of Justice. Any commercial products and manufacturers discussed in these recordings are presented for informational purposes only and do not constitute product approval or endorsement by the U.S. Department of Justice.