Bureau of Justice Statistics: Survey of Law Enforcement in Public Schools - Roundtable Discussion, NIJ Virtual Conference on School Safety
On February 16-18, 2021, the National Institute of Justice hosted the Virtual Conference on School Safety: Bridging Research to Practice to Safeguard Our Schools. This video presents a roundtable discussion from that conference.
>> All right.
We'll go ahead and get started. We don't have a ton of time here today, but we are excited.
For everyone joining, I'm Michael Applegarth, a representative with the National Institute of Justice, and today we are having a roundtable discussion on the Bureau of Justice Statistics Survey of Law Enforcement in Public Schools. Kevin Scott and Elizabeth Davis will be managing this discussion, so we'll turn the time over to them, and thank you, everyone, for joining.
>> All right.
Good afternoon.
Thank you to everyone for joining our session.
I'm Beth Davis.
I am a statistician in the Law Enforcement Statistics Unit at the Bureau of Justice Statistics, and I am Program Manager for the Survey of Law Enforcement Personnel in Schools, more affectionately known as SLEPS.
Also here is Kevin Scott, who is Chief of the Law Enforcement Statistics Unit at BJS. In this session, I will be providing an overview of the SLEPS project from its inception to where it stands today.
So to begin with a brief background: SLEPS came out of the 2014 Comprehensive School Safety Initiative.
The CSSI report noted that there was a lack of national-level data on the extent of law enforcement involvement in the nation's primary and secondary schools and on officers' typical roles and responsibilities.
To address this lack of data, NIJ entered into an interagency agreement with BJS, which tasked BJS with initiating a new data collection which focused on collecting information on the activities, roles and responsibilities of law enforcement agencies and the personnel who work in public K-through-12 schools.
Two main goals were identified for this new data collection.
The first goal was to identify a national roster of active law enforcement agencies that have law enforcement personnel operating in some capacity in the nation's K-through-12 public schools.
The second goal was to generate detailed, accurate, and reliable national statistics that describe the scope, size, characteristics, and function of law enforcement personnel who work and interact in a school environment.
To accomplish all of this, BJS put out a solicitation for a data-collection agent and selected RTI International, so since 2015, BJS has been working with RTI and its partner, PERF, the Police Executive Research Forum, to develop this new data collection.
Tied to the first goal of developing a national roster was frame development, which was a challenging and lengthy task.
The project team started out with another BJS collection, the Census of State and Local Law Enforcement Agencies, also referred to as the CSLLEA, and it's a complete enumeration of all publicly funded state, local, county and tribal law enforcement agencies that operate in the United States.
The CSLLEA provides personnel counts for the approximately 18,000 law enforcement agencies in the country and asks agencies how many of their full-time officers served as school resource officers or had primary duties related to safety in K-through-12 schools.
Initially when SLEPS development began in 2015, the frame was based upon a combination of the 2008 and the 2014 Census of State and Local Law Enforcement Agencies' data collection.
RTI conducted verification calls to those agencies that had missing or inconsistent information from 2008 and 2014.
Ultimately, the 2014 CSLLEA collection was determined to suffer from irreversible data-collection issues, and at that point the 2008 data was going on 10 years old, so the SLEPS project team decided to delay SLEPS until the 2018 Census of State and Local Law Enforcement Agencies data collection could be completed and serve as the frame for SLEPS.
In the meantime, the project team convened an expert working group in April of 2015 to bring together SROs, supervisors of SROs, academics and other federal stakeholders to discuss data needs and uses.
Based upon the discussion of the expert working group, the SLEPS team determined that the best way to collect the desired information was through conducting two surveys, one at the agency level and one at the officer level.
This was a key decision because the SLEPS Officer Survey is the first ever officer survey for BJS.
BJS has extensive experience conducting agency-level surveys, but determining how to best conduct an officer survey was a new challenge, and I will talk more about that shortly.
The project team developed two survey instruments, as I mentioned, an agency and an officer survey, and then we conducted cognitive testing on these two surveys.
Between November and December of 2016, we completed 20 interviews with agencies and 18 interviews with SROs.
Based on the feedback on the SRO survey, BJS made only a few minor revisions to the officer survey, such as slight wording changes to make questions and instructions clearer.
The project team received a lot more feedback on the agency survey, which led to significant changes in order to streamline the instrument and to improve the flow and navigation through the instrument.
The original agency survey asked about three types of officers.
We asked about sworn officers with general arrest powers, sworn officers with limited arrest powers and nonsworn officers.
The survey had rather large tables asking about all three officer types at once, and, as you may guess, participants in the cognitive interviews found this overwhelming and recommended that we find a way to break it down and make it more easily digestible.
Also, based on the results, we saw very few participants actually use the limited-arrest group, so we narrowed the survey down to only the sworn and nonsworn officer types.
We also split them into two different sections, the sworn and the nonsworn, so that if an agency reported that they did not have one type, they would just be skipped through that section.
Given the extensiveness of these changes, we conducted cognitive testing on the revised agency survey, and we took the second round of testing as an opportunity to gain more insights into the process of obtaining an officer roster from agencies that we could then use to select officers for the officer survey.
In the first round of cognitive testing, agency participants were asked if they would be willing to provide an officer roster, and in the second round of testing, we elected to go further and ask if participants would actually complete a roster and provide the full list of officers to us.
So we conducted the second round of testing from May to August of 2017.
We completed 17 interviews, and 14 of the participants provided a roster.
Our next step was to conduct a pretest of our planned data-collection protocols.
The project team conducted a pretest with 250 agencies from November 2017 to January of 2018.
From the roster portion of the agency survey, we developed a frame of 639 SROs from which we sampled 475 SROs for the SRO pretest in March and April of 2018.
We achieved a 77 percent response rate in the agency portion of the pretest.
Ninety-one percent of the eligible agencies, meaning those that reported employing sworn SROs, provided a roster, and overall, 70 percent of sampled agencies responded.
Looking at the SRO response rate in the pretest, 78 percent of the sampled officers responded to the survey.
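As a rough aside, here is a minimal Python sketch of the pretest arithmetic. The counts and rates are the figures quoted above; the derived numbers are illustrative back-of-the-envelope calculations, not published BJS statistics.

```python
# Pretest figures quoted in the talk; derived values are illustrative.
frame_sros = 639      # SRO frame built from the pretest rosters
sampled_sros = 475    # SROs sampled for the SRO pretest
sro_rr = 0.78         # SRO pretest response rate

sampling_fraction = sampled_sros / frame_sros   # ~0.74
responding_sros = int(sro_rr * sampled_sros)    # ~370

print(f"SRO sampling fraction: {sampling_fraction:.0%}")  # 74%
print(f"Approx. responding SROs: {responding_sros}")      # 370
```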
Aside from the response rates, there were two other key takeaways from the pretest.
First of all, we saw that the nonsworn section was not applicable to the majority of respondents, so the project team decided to drop the nonsworn section from the survey.
The second key takeaway was related to administering the SRO survey.
In the pretest, the end of the agency survey asked if the agency would prefer to designate a point of contact who would serve as the distributor of SRO materials to each officer or if they would prefer that the project team contact the SROs directly.
For those that preferred the single point of contact, we asked the agency to provide that person and their contact information, and everything would be addressed to them to then distribute to officers.
In the case of direct contact, there was a column on our officer roster that asked the person filling it out to provide an e-mail address for each SRO so that we were able to e-mail the materials directly to the selected officers.
So upon the conclusion of the pretest, we saw that the majority of agencies opted to designate a point of contact. Beyond that, the SRO response rate was significantly higher in agencies that designated a point of contact than in those that opted for direct contact of officers, so the project team decided to remove the option for direct contact and, for the national data collection, move forward with only asking for a point of contact to manage the SRO data collection.
So when we moved on to the national data collection, it was based on the 2018 Census of State and Local Law Enforcement Agencies, as I mentioned.
That concluded in the summer of 2019, at which point the project team selected a sample of approximately 2,000 agencies to receive the SLEPS agency survey, stratified by the type of agency and the number of full-time SROs employed by the agency as reported on the 2018 CSLLEA.
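As a hypothetical sketch of what a stratified selection like this can look like in code, here is a minimal example. The SRO-count bands echo the categories mentioned later in this talk, but the exact cut points and allocations are illustrative assumptions, not the actual SLEPS design.

```python
import random

random.seed(2018)

def stratum(agency):
    """Assign an agency to a stratum: (agency type, SRO-count band)."""
    sros = agency["fulltime_sros"]
    if sros <= 4:
        band = "0-4"
    elif sros <= 9:
        band = "5-9"
    elif sros <= 24:
        band = "10-24"
    else:
        band = "25+"
    return (agency["type"], band)

def stratified_sample(frame, allocation):
    """Draw a simple random sample within each stratum of the frame."""
    by_stratum = {}
    for agency in frame:
        by_stratum.setdefault(stratum(agency), []).append(agency)
    sample = []
    for key, agencies in by_stratum.items():
        n = min(allocation.get(key, 0), len(agencies))
        sample.extend(random.sample(agencies, n))
    return sample

# Tiny usage example with made-up agencies:
frame = [
    {"id": 1, "type": "local police", "fulltime_sros": 2},
    {"id": 2, "type": "local police", "fulltime_sros": 30},
    {"id": 3, "type": "sheriff", "fulltime_sros": 12},
]
allocation = {("local police", "0-4"): 1, ("sheriff", "10-24"): 1}
print(stratified_sample(frame, allocation))
```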
Agency data collection began in September of 2019 and concluded in January of 2020.
SRO data collection ran from March to July of 2020.
The project team had decided to conduct two waves of the SRO survey: the first wave would go out to SROs from agencies that provided an officer roster by the end of October 2019.
The second wave would comprise those that provided a roster from November 2019 to the end of the agency collection.
The goal of this was to limit the amount of time between when the agency provided the roster and when the officer survey materials were mailed out, which, in turn, we hoped would limit the opportunity for officer turnover during that time.
So to give a sample of what we were asking our respondents: the LEA survey had 32 questions across a few sections, covering SRO program characteristics; SRO policies and responsibilities as defined by the agency; SRO recruitment, training, and supervision; SRO staffing; the training required for sworn SROs; the activities performed by sworn SROs; and lastly, as I mentioned, a request for the agency to provide a list of its sworn SROs that we could then use to select a sample of officers to receive the SRO survey.
The SRO survey had 31 questions covering the characteristics of the SRO; the training they had received throughout their career; the regular law enforcement, mentoring, and teaching activities they conducted as part of their SRO duties; and lastly, the characteristics of the school to which they were primarily assigned.
And here we can see the agency response rates broken down by the agency type and the number of SROs employed by the agency.
We achieved a response rate of 82 percent across all agencies, with a response rate of over 80 percent in all but one stratum, and in three of these, we saw a response rate of close to or over 90 percent.
Turning to look at the SRO survey, an important note here is that this is wave one only and that it is an overall response rate.
It does not account for agency nonresponse.
Here we see that, in wave one of the SRO survey, 81 percent of SROs responded to the officer survey.
Among local police departments, the SRO response rate was highest in agencies with 25 or more SROs while, for sheriff's offices, the SRO response rate was highest in agencies with 10 to 24 SROs.
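To make that caveat concrete, here is a minimal sketch of how the stage-specific rates could be combined into a cumulative rate. The 82 percent agency rate and 81 percent wave-one SRO rate are from the talk; multiplying them is a common back-of-the-envelope approximation and an assumption here, not BJS methodology.

```python
# Stage rates from the talk; the combination is an approximation that
# ignores roster nonresponse among responding agencies.
agency_rr = 0.82   # national agency-survey response rate
sro_rr = 0.81      # wave-one SRO rate, given a responding agency

cumulative_rr = agency_rr * sro_rr
print(f"Approximate cumulative SRO response rate: {cumulative_rr:.0%}")  # ~66%
```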
You may notice that the previous slide only had wave-one response rates; that is because we experienced a delay initially and then chose to cancel wave two of the SRO data collection.
In the spring of 2020, we initially decided to postpone wave two of the SRO data collection until the fall of 2020.
Mailout was scheduled to occur at the end of March 2020, and because of the shutdowns that began a couple of weeks prior to that mail date and the shifting priorities for LEAs as they implemented pandemic response plans, we decided to postpone.
In the summer and the fall of 2020, we decided to conduct an agency-verification effort.
The point of this was to assess the accuracy of the officer rosters that we had from these agencies. Because those rosters had been collected between November of 2019 and the winter of 2020, we wanted to make sure they were still accurate, and additionally we wanted to ask agencies about their plans at that time, again in summer and early fall, for going back to school in the midst of the pandemic.
So we included questions in the verification effort asking whether the plan was for schools to return in person, to use a virtual model, or to use a hybrid of the two.
Based on the responses that we received during this verification effort, we decided it was best not to conduct the second wave and therefore canceled the second wave of the survey.
So as it stands now, we are currently analyzing the agency-level data, and we are determining a weighting plan for the SRO data, given that we have a completed wave one and elected not to pursue wave two of the SRO data collection.
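For context, weighting plans for stratified designs like this one commonly combine a base design weight with a within-stratum nonresponse adjustment. The sketch below illustrates that generic approach with made-up numbers; it is not the specific SLEPS weighting plan, which the speakers say is still being determined.

```python
# Generic survey-weighting sketch, not the actual SLEPS methodology.
def base_weight(stratum_population, stratum_sample_size):
    """Inverse of the selection probability within a stratum."""
    return stratum_population / stratum_sample_size

def nonresponse_adjustment(n_sampled, n_responded):
    """Inflate respondent weights to cover stratum nonrespondents."""
    return n_sampled / n_responded

# Hypothetical stratum: 400 agencies on the frame, 100 sampled, 82 responding.
w = base_weight(400, 100) * nonresponse_adjustment(100, 82)
print(f"Adjusted weight per responding agency: {w:.2f}")  # ~4.88
```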
We anticipate publishing reports on these data later this year and also archiving the data for public use.
And so that concludes the overview of this project, and Kevin and I are happy to answer any questions that anybody may have on the life of this project.
>> So prior to this point in the session, people haven't been able to respond, but now, if you do have a question, feel free to unmute yourself and ask it, or, if you are more comfortable, you can type it into the chat, and we can answer questions from there as well.
>> Hi.
I'm wondering if you could talk a little bit more about the questions you asked about SROs' roles and activities in schools.
>> Sure.
So we asked about that on both the agency survey and the officer survey. The intent on the agency survey was to find out what may be included in an agency policy or within MOUs or other agreements that agencies may have with schools or school districts.
So in looking at the activities portion, we broke it down by typical types: we had law enforcement activities, mentoring activities, and teaching activities. Within law enforcement, we had quite a list of activities, including conducting searches, conducting security audits and assessments of the campuses, other law enforcement activities like confiscating drugs or weapons, making arrests, and issuing criminal citations, and then also patrol duties around the school campus.
As far as mentoring, that covered things such as advising staff, students, or families, whether one on one or in a group, and we also asked about coaching athletic programs or chaperoning field trips. Additionally, we asked about PBIS, Positive Behavioral Interventions and Supports, asking if the SROs and the schools had any sort of participation in positive school discipline.
And as far as teaching activities, we asked if the SROs, again under policy or by agreement with schools, had as part of their duties administering special safety programs, teaching conflict resolution, or providing presentations, whether to the faculty and staff or to parents.
And then, on the SRO survey, we asked about these activities and included a reference period for whether they had done them.
Let's see.
Let me double-check.
Okay, so it says, "Please indicate whether each law enforcement activity is required," so again, the agency survey goes back to the policies of the agencies, while the SRO survey gets more at these particular day-to-day responsibilities, and having the activities on both surveys allows us to look between the policy and the practice and see how those line up.
Do we have any other questions? >> I'll jump in with one real quick. First of all, I really enjoyed you sharing this, and it looks like a fantastic survey that y'all have put together.
I'm wondering if you can speak a little bit about longevity and what this might look like in the future.
Obviously COVID, it seems, disrupted the planned second iteration of it, but are there any plans for integrating these questions into ongoing longitudinal data collections, thinking of things like the School Survey on Crime and Safety or efforts within education to collect data on SRO presence through, say, the Civil Rights Data Collection? I can really see the value of this if it was continued over a long time period.
>> So this was intended as a one-time data collection, with the idea that we could possibly repeat it in the future.
At this point, it is planned to be only a one-time data collection, but as you mentioned those other data collections, we consulted them as we developed the survey so that anything we were asking would be complementary rather than a duplication of effort, especially since duplication could be burdensome to respondents. If we can get information elsewhere, lowering the burden for our respondents is obviously a bonus, and it also helps get respondents to provide their data if they don't feel like they are being asked the same thing multiple times.
So again, we consulted with some people from those projects and also reviewed their survey instruments to see what they were asking. Additionally, the last question on the SRO survey asked the SRO to provide the name of the school where they primarily work, so that we have the potential in the future to link these data with other data collections based on the geographic information we have and the name of the school.
We don't currently have a plan for what exactly that looks like, but that was information that we wanted to collect just in case it seemed like it would be useful in the future for data-linking purposes.
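As an illustration of what such data linking might look like mechanically, here is a hypothetical sketch. The field names and matching rule are assumptions for illustration, not a planned BJS procedure; real linkage would need additional keys (state, district) and fuzzier matching to handle naming variation.

```python
# Hypothetical linkage of SRO records to a school-level file by name.
def normalize(name):
    """Lowercase and collapse whitespace for a crude name match."""
    return " ".join(name.lower().split())

def link_by_school(sro_records, school_records):
    schools = {normalize(s["school_name"]): s for s in school_records}
    linked = []
    for rec in sro_records:
        match = schools.get(normalize(rec["school_name"]))
        linked.append((rec, match))  # match is None if no school found
    return linked
```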
Any other questions? >> Thanks for those details.
I'll actually ask another question if there's nobody else that's got one at the moment.
So I think this is super useful.
If I'm understanding correctly, it's providing a very robust national perspective on what SROs are doing and so forth, but it strikes me that, in the current context or perhaps in any context, there's also this importance of local conversations and local understanding of what SROs do, right? I'm continually compelled, or see in my work, that there are wide variations, so I can say something about SROs generally, but then people might say, "Yes, but the SROs in our schools are different, or they do things in this way," right? And communities implement differently, with great heterogeneity. So I guess the first question is: could the data you're using be used to actually look at individual schools and get responses there, or are you only reporting things in the aggregate, averaged up? Or do we have a mechanism for getting some of this granular, interesting information on process in a way that could identify schools and allow local stakeholders to take it into account, say, when they're having debates about defunding school police, as many districts were in the past few months? >> So we will be reporting on a national level.
I mean, that was truly the intent of this: the CSSI report, again, noted the lack of national-level data, so our primary goal was to address it at the national level.
That said, you are correct.
We heard and saw the same things as we were developing all of this, and that was one of the many challenges in designing it: there are so many differences and such variety based on locale and across programs, so we did our best to come up with a survey that was generalizable to as many programs as possible. That said, we will be archiving the data once we've completed our analyses, and we'll have it available for public use, so you'll be able to go and see all of the data that we have.
That said, some of it will be deidentified to obviously protect information.
You won't be able to identify, you know, specific SROs. We haven't worked through the details of that, but there are concerns, of course, if we know of an agency that has a single SRO.
There are confidentiality concerns there, so we will be deidentifying things so that that single SRO's responses cannot be traced back to him or her, but we will be, as I mentioned, publishing the data for both the agency and officer surveys.
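To illustrate the kind of rule being described, here is a hypothetical sketch of single-SRO suppression. The threshold and field names are placeholders, not BJS's actual disclosure-control procedure.

```python
# Hypothetical small-cell suppression before public release.
MIN_SROS_FOR_RELEASE = 2

def deidentify(officer_records):
    released = []
    for rec in officer_records:
        rec = dict(rec)  # copy so the source data is left untouched
        if rec["agency_sro_count"] < MIN_SROS_FOR_RELEASE:
            rec["agency_id"] = None     # suppress agency identifier
            rec["school_name"] = None   # suppress school identifier
        released.append(rec)
    return released
```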
>> Samantha, I wondered if you could elaborate on your question in the chat a little bit.
Typically, what BJS does when they provide data is provide the data publicly.
They may require IRB approval to access the data, but once those data are accessed and the deidentified version, a public-use file that is, has been created, you can download it and store it on your machine.
>> Yeah, so I'm asking about a restricted version of the data. It's common, at least with education data sets, for NCES to make a restricted version available, so I'm wondering if there will be a restricted version that identifies the schools and agencies for researchers who have the capabilities and apply to use that restricted data.
>> Certainly, I think that's something we're willing to consider, in part because we recognize that linking the agency data with the officer data, and then linking those data out to either the schools in which an SRO serves or more broadly to community characteristics, would require identification in some cases. So I think that's something we're considering and something that we'd like to be able to do.
>> All right.
Do we have any other questions? >> I know you said you were planning on releasing it later this year.
Do you have, like, an idea of when the reports are going to be released or when that information will be available, or is it all still pretty much in flux right now? >> Do you want that one, Kevin? >> Sure.
We don't have a firm date; it would probably be in the final quarter of the year.
BJS has a lot of reports it wants to get out this year, so in terms of finding the resources to complete the report and get it all the way to publication, we're going to see what we can do.
>> All right.
Anybody else? >> If there aren't other questions, I know Kevin put a link in the chat for the questionnaires, so I would encourage people to go there if they are interested in looking at the surveys and seeing what they ask.
Okay.
Well, maybe one last call for questions: is there anybody else who has one? If not, we can let people go and prepare for the next session, but we do appreciate everyone coming here, and we thank Kevin and Elizabeth for sharing this information with us.
Okay.
Well, thank you all for coming.
If, Kevin or Elizabeth, you have anything else you want to add, please let me know.
If not, the next sessions do start at 3:30, and so we encourage everyone to attend those, and we appreciate everyone tuning in.
>> Thank you.
>> Yes, thank you, everyone, for attending.
Disclaimer:
Opinions or points of view expressed in these recordings represent those of the speakers and do not necessarily represent the official position or policies of the U.S. Department of Justice. Any commercial products and manufacturers discussed in these recordings are presented for informational purposes only and do not constitute product approval or endorsement by the U.S. Department of Justice.