Taking Stock: An Overview of NIJ's Reentry Research Portfolio and Assessing the Impact of the Pandemic on Reentry Research
Over several decades, the National Institute of Justice (NIJ) has made significant contributions to the field of reentry, specifically what works for whom and when. In recent years, however, the global pandemic has made it increasingly difficult to conduct research on and with populations involved with the justice system. During this time, many researchers assessing various justice-related outcomes were unable to continue their inquiries as planned due to a lack of access to their populations of interest, forcing many to pivot and rethink their research designs.
During this panel, an overview of NIJ's reentry research portfolio will be provided and researchers will discuss their NIJ-funded reentry projects, the impact of the pandemic on the ability to conduct research, and lessons learned.
ERIC D. MARTIN: Welcome, everybody. I think the last of the participants are starting to trickle in, so I think it's safe to go ahead and get started. Welcome to a webinar presentation titled An Overview of NIJ's Reentry Research Portfolio and Assessing the Impact of the Pandemic on Reentry Research. Hello, everybody, my name is Eric Martin. I'm a Social Science Analyst with the National Institute of Justice, and I work to support corrections research; a main part of that, actually, is our investments in reentry research and program evaluation. Before I begin, I want to thank my colleagues at the Bureau of Justice Assistance for hosting this event and giving us some space to share with you all some exciting research, our researchers' experiences of how they've navigated the pandemic, and some important updates on that front as well.
Before we get started, I would like to let you know that this webinar is being recorded. We are set to go through presentations until 4:00 PM, and we allotted around 15 minutes of questions and answers for the panelists. We always get some really great questions from the folks who listen in and watch the presentation, and I'll be facilitating that; I'll direct the questions. One note on that front: if you have questions for the panelists, we're going to take them at the end. Please put them in the Q&A feature. You can type them out, and I'll direct them where I think they should go. And, of course, the more information you include as to who your question is for, or whether it's general, the more helpful it is as we go through the questions at the end. If you're having any technical issues while our webinar is going on, please put that in the chat feature; we have staff who are monitoring it and can help you with those issues. As I just said, this session is being recorded, and it is actually hosted by the National Reentry Resource Center, which is supported by BJA. We're here as part of an ongoing series of exciting presentations, publications, and information all around Second Chance Month. This series is focused largely on reentry and really highlights a lot of good work by BJA and its sister agencies, NIJ and OJJDP included, and, of course, all of you in the field who are working every day to promote the successful reintegration of individuals as they transition back to the community from incarceration.
Today, we're going to talk about some important NIJ research that we have ongoing. NIJ is committed to understanding the factors that promote successful reentry, and we have a dedicated research portfolio just on evaluating innovative programs in the field. With my colleague, Marie Garcia, who's attending the presentation right now, we hosted four years of seeding the field with evaluations of innovative programs and practices. We have 16 ongoing, and the researchers who are here today are part of that cadre of folks doing cutting-edge, gold-standard RCT research in the areas where it matters most: how to successfully intervene with these individuals, and how we can help them reintegrate into the community so they can, in turn, help their communities back and make them stronger. And as we learned in the last couple of years, none of this happens in a vacuum. There are things such as global pandemics. Right in the middle of all the work we were doing, COVID-19 came and, of course, disrupted so many lives, how we work, and how we interact with each other. If we set aside all the pain and hardship that COVID-19 has caused, and I acknowledge that, it is kind of fascinating, from a scientist's perspective, how something like research could still carry on. What's exciting, and what we're pleased at NIJ to talk about, is that research has carried on, and the people here want to share their experiences with you and help us understand how that happened and what we can expect in the near future.
So the lineup today: we have Dr. Christine Lindquist, from RTI International, starting us out. She will be followed by Drs. Holly Swan and Walter Campbell from Abt Associates. And a presentation from Dr. Matthew Smith from the University of Michigan will conclude our webinar today. Again, just to reiterate, if you have any questions for our panelists, we're going to take questions at the end; please put them in the Q&A box. If something happens to your experience while you're listening in and you need quick technical assistance, please put it in the chat feature. And definitely check out these links right here from the National Reentry Resource Center; they're going to be putting out a lot of useful information, and there's a whole schedule of exciting events coming up for this month. So with that, I will turn it over to Dr. Lindquist. Thank you.
CHRISTINE LINDQUIST: Thanks, Eric. And thank you, everyone, for joining us today. I'll be presenting on our grant to evaluate the role of technology in reentry planning: a randomized controlled trial of a web-based reentry planning tool called Pokket. I'd like to acknowledge the contributions of my co-authors, Sam Scaggs and Megan Nyce from RTI, as well as our partners at the North Carolina Department of Public Safety and Acivilate, the developer of the Pokket tool. In this presentation, I will provide a brief overview of the technology being evaluated in our study, summarize our key research questions and study design, update everyone on where we are with study implementation and the impact of the pandemic on our research, and then share some preliminary findings on Pokket usage and the impact of the pandemic on reentry planning.
So, first, a few slides summarizing the Pokket tool. This is a web-based reentry planning and management tool. It was developed by Acivilate specifically for people who are incarcerated and reentering. Importantly, it's not meant to replace in-person reentry planning. It's actually designed to encourage clients to take more ownership of, and engagement in, their reentry planning. It's not only used by people who are incarcerated, who we're calling clients, but also by all of the service providers, facility case managers, and community supervision officers who work with them. So it's designed to promote collaboration among the various agencies that are supporting clients as they reenter.
In the next couple of slides, I have a few screenshots that show what the Pokket tool looks like on an iPhone, but it can be accessed on any internet-connected device. It has a number of security and confidentiality features, which are important for sharing HIPAA-protected data among various service providers. One of the screenshots here shows the Dashboard feature, which is what clients see when they log in to Pokket. It shows immediate tasks that are coming up, anything that's overdue, and any appointments that they have on their calendar. In the Tasks panel on the far right, clients can create tasks for themselves and share them with providers. And, important for reentry, this is actually a place where they can upload and store secure electronic documents. This is really valuable when clients are working to develop documents while they're incarcerated and need a safe place to keep them and then be able to access them upon release, which you can do in the Pokket tool.
One of the key features is messaging. This allows clients to message service providers or supervision officers, who are also given Pokket subscriptions to support the clients they're working with. Clients also have a Treatment Plan segment where they can set goals for themselves and develop a reentry plan. This also lets them track their progress toward those goals, and they can actually share their reentry plan with service providers. A final feature I wanted to show is Services and Referrals. The Pokket app allows clients to search for services in their communities of return. It has a geographic search feature and is filled with providers that have been vetted by the partnering DOC and uploaded by Acivilate. Clients can find service providers and actually request appointments through the tool.
Several DOCs are using Pokket, as well as individual jails and prisons, but it hasn't been empirically tested; this study is the first empirical test of Pokket. We're implementing a randomized controlled trial with fidelity, process, outcome, and cost components, in partnership with Acivilate and the North Carolina Department of Public Safety. Our study is taking place in six adult DPS facilities, five male and one female. Specifically, we are focusing on individuals who are participating in specialized reentry units called R-STEP units, because these units meet the technological requirements for our study: they are wired for Wi-Fi access and they allow the use of tablets for programming. Within the R-STEP units, we are focusing on individuals who are about five to eight months from release and who have signed an APDS, or American Prison Data System, tablet agreement indicating that they are willing to take care of the tablets and will use them in the authorized manner. I'll talk a little bit more about recruitment and randomization on the next slide.
So one reason individuals have to have a tablet account to be eligible for our study is that we actually use electronic messaging through the tablets to recruit individuals who have been identified as eligible. They click on a link that takes them to a web-based consent form, which they read, and if they agree to participate in the study, we randomly assign half to receive standard reentry planning and the other half to receive reentry planning plus the Pokket subscription. Both the treatment and control groups get individual reentry planning from their case manager. They also participate in core and elective classes offered to R-STEP participants, and they're able to access educational programming, such as workforce preparation programming, on the tablets. But only the treatment group is assigned to receive a Pokket subscription. This is a 12-month subscription, which they access through the tablets prior to release; post-release, they don't take the tablets with them, but their subscription stays with them, and they can log on to the Pokket site using any internet-enabled device. During incarceration, the facility case managers set up clients' accounts and teach them how to use the Pokket tool, and they also work in the Pokket tool for the clients they're working with. Probation officers and service providers who work with the treatment group are also encouraged to use Pokket to support their reentry planning, and they have Pokket accounts as well.
So our study has, first, a fidelity assessment and process component, where we're looking at the extent to which Pokket is used as intended, both pre- and post-release and for both clients and the providers supporting them. We're also looking at whether Pokket does in fact increase client engagement in reentry planning as intended, and whether Pokket is in fact associated with greater collaboration among staff and service providers who are supporting clients. And we're looking at what the benefits and drawbacks of the Pokket tool are from the perspective of clients as well as staff and providers. In terms of data sources, we are receiving very detailed administrative data from Acivilate on Pokket usage, and I'll show you a sample of this in just a minute. We're also able to field two waves of web-based surveys to all treatment and control group members through the APDS tablets; these surveys primarily focus on clients' engagement with their reentry planning and their preparation for release. We're doing in-depth stakeholder interviews, ideally through in-person site visits, where we will get perspectives from Pokket users, both pre- and post-release, as well as the other staff and service providers listed here. And then we're also fielding two very brief satisfaction surveys to individuals in the treatment group; again, we're able to do this through the APDS tablets.
For the outcome study, we are assessing whether the treatment group had better outcomes than the control group based on administrative data that we're able to obtain from DPS, including recidivism, probation outcomes, and employment. Then, among the treatment group, we're looking at whether those who were more engaged with Pokket (based on how often they logged on, how many features they used, whether they completed the goals they set for themselves, et cetera) had better outcomes than those in the treatment group who did not use Pokket. For the cost study, we are documenting the cost of Pokket above and beyond standard reentry programming. We're looking at the cost of the outcomes achieved by both the treatment and control groups, and then assessing whether the monetized benefits of Pokket outweigh the cost of implementing it. For this component, we'll be doing cost interviews with the Acivilate and DPS staff listed here, as well as using the data that we collect for the outcome and process studies.
In terms of our study progress to date, we began this study right around the time when many of us went into lockdown because of COVID; this was April of 2020. During that time, our partners in North Carolina completed the wireless infrastructure in all six study facilities. We refined all of our study design and data collection procedures and obtained our IRB and human subjects protection approvals. DPS and Acivilate got all of the facility case managers trained in Pokket, and we began enrollment around November. But we almost immediately had to put the study on pause due to the continued effects of the pandemic, which I'll get to on the next slide. We restarted the study in August of 2021 and reenrolled our first clients in the fall, which meant retraining all of the case managers who had been trained in Pokket, because so much time had passed. Each month since then, we receive a list of all R-STEP participants from DPS, screen for eligibility, recruit those who are eligible, and randomize those who consent. That group then receives their Pokket subscriptions and begins working in Pokket. Every month, we obtain administrative data on Pokket usage from Acivilate, and we just began doing virtual interviews with facility case managers to document their preliminary thoughts and reactions to Pokket.
So we've encountered two main study implementation challenges, which I'm sure many of you who do reentry research are familiar with. The first is fewer than expected study participants, and this is largely because of the impact of COVID. Many court operations were suspended for quite some time, and that affected the pipeline of people entering DPS facilities; there were also huge issues with transfers. Even when people began to get sentenced, they often were not transferred to the facility where they would be serving their sentence. And even within the study facilities, there were difficulties getting people transferred into R-STEP units because of pandemic-related protections. We also had a North Carolina-specific challenge, where a number of individuals who would have been eligible for our study were released early because of the terms of a lawsuit settlement. And then, also, we found that in the R-STEP units, which already have a diminished number of eligible clients, some people who would be eligible for the study do not have tablet accounts, either because they chose not to sign the tablet agreement or because case managers or other staff just have not been able to work with them on getting it signed, again because of the pandemic.
The second main challenge is that even among the individuals who have been enrolled in our study and assigned to the treatment group, we're seeing a lower than expected intensity of the Pokket intervention. We've had two of our study facilities close temporarily because of staff shortages, so those clients had to get transferred to other facilities where they didn't have tablets and didn't have access to Pokket. Even in the remaining facilities, whenever COVID cases reach a certain threshold, facilities have to take measures, and a lot of times these prevent in-person contact between clients and staff, so they can't work in Pokket. Also, tablets can only be accessed in limited locations, the locations that are wired for Wi-Fi access, and they do require staff time to distribute, inspect, clean, charge, and store them. In North Carolina, as in other states, staff shortages are very significant; these predated the pandemic but were certainly exacerbated by it. And then a few individuals, more than we expected, who were in our study and in the R-STEP units have gotten transferred out because of disciplinary infractions or changes to their home plan, and because of this, they haven't been able to use the Pokket tool.
So we recently did virtual interviews with facility-based case managers to understand their preliminary perceptions of Pokket, and the responses can be grouped into four broad themes. The first is overall satisfaction with Pokket. Staff are optimistic. They found the training to be straightforward and the tool very easy to navigate and use, and they are particularly excited about the messaging feature. There's a quote from a case manager to the right, talking about the value of clients being able to message their probation officer through Pokket. But at the same time, in terms of Pokket usage, staff are not seeing many clients using Pokket; staff have seldom logged in to Pokket themselves and have not been able to use a lot of the Pokket features. The quote on the right illustrates the fact that staff shortages directly impact clients' ability to work with Pokket, because staff time is required to distribute the tablets. Also, facility case managers are not seeing many community-based probation and parole officers using Pokket, but this is partially because very few of our study participants have been released so far. We've only had one.
The third theme is pandemic-related challenges to reentry planning, which also affect the use of Pokket. Staff noted the staffing shortages and the facility lockdowns and shutdowns, and how those have limited the ability to do in-person reentry planning, and that time for clients to use the tablets is limited; this includes the other software, not just Pokket but the educational software as well. They also observed that very few service providers in the clients' communities of return had been uploaded into Pokket, which limits its utility. And thinking ahead to when clients get released, they expressed concerns about clients' ability to access Pokket on their cellphones, noting that if clients could not, it would be a waste of resources, and that this could hopefully be addressed through community corrections.
So, finally, case managers had a number of recommendations that directly address these challenges: getting community corrections more involved in Pokket, which should happen as more clients get released (there's a quote to the right that illustrates that); having more local service providers uploaded into Pokket; setting up cellphones with the Pokket app upon release so that clients don't have that delay; and providing refresher trainings for staff in times of shutdowns or lockdowns when they're not able to use Pokket.
So, finally, I just wanted to share some quantitative data that backs up these perceptions. As I mentioned, we get very detailed administrative data from Acivilate, and this slide, at the bottom right, shows the timestamps. We literally get, for each of our study participants, timestamp data showing every single time they have logged in. We have plotted this by month in terms of client logins, case manager logins, referral requests, and service provider connections. What we're seeing from the time we resumed the study is that each study month, we do have some clients connecting with service providers in Pokket, which is great, but when you look at logins, we had several logins in November and then no additional logins until February, which coincided with the time when several study facilities had to go on lockdown because of COVID conditions, so people were not able to use it. The same thing happened with case managers: some initial logins but then very few logins in the next few months, little referral request activity, and little activity on some of the other features of Pokket, such as goal-setting and messaging.
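To illustrate how timestamped login records like these can be rolled up into the monthly counts shown on the slide, here is a minimal sketch. The data format, field values, and example dates are assumptions for illustration, not Acivilate's actual schema:

```python
from collections import Counter
from datetime import datetime

def logins_per_month(timestamps):
    """Roll raw ISO-format login timestamps up into counts per calendar month."""
    counts = Counter()
    for ts in timestamps:
        dt = datetime.fromisoformat(ts)
        counts[(dt.year, dt.month)] += 1
    return dict(counts)

# Hypothetical login events for one study participant
events = ["2021-11-03T09:15:00", "2021-11-20T14:02:00", "2022-02-07T10:30:00"]
print(logins_per_month(events))  # {(2021, 11): 2, (2022, 2): 1}
```

The same grouping, applied per participant and per feature (logins, referral requests, messages), yields the monthly activity series plotted on the slide.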
So in terms of next steps, we are continuing to plug away in North Carolina. We're monitoring the pipeline of cases very closely. Our partners at DPS are monitoring this and doing their best to get people who are eligible into R-STEP and to identify eligible people for the study. They're also working on increasing Pokket usage. In terms of our activities, we are getting ready to start data collection for the cost study. We hope to be able to do in-person site visits to document Pokket implementation and to keep doing what we're doing. But at the same time, because it doesn't appear that we'll be able to meet our enrollment targets in North Carolina, we're also talking with additional DOCs about their interest in participating. So if any of you are aware of correctional agencies that would be interested in using this type of technology to support reentry, please do feel free to share my email address at the bottom of the screen here. We're also working on an interim brief to document some of these challenges, because technology that can be client-driven seems ideal at a time when many DOCs have staff shortages and contact between staff and clients is limited because of public health conditions, but at the same time, it does require staff bandwidth. We hope to be able to share some recommendations and lessons learned based on our experiences in North Carolina. And with that, I will turn it over to the next presenter, Holly Swan from Abt.
HOLLY SWAN: Thank you so much, Christine. And thank you also to Eric for his introduction. My name is Holly Swan, and I will be presenting with my colleague, Walter Campbell, on our study of the impact of different community supervision field practices, and on the impact of the COVID-19 pandemic on our study, among other things.
First, I would like to acknowledge our funder, the National Institute of Justice, and also our partner, the American Probation and Parole Association. The APPA has been an integral partner throughout the body of work that we've been doing for the past several years, so we're really grateful to them and for their partnership on this research.
I will first provide an overview of the study and then turn it over to Walter to talk about the impact of the pandemic. Quickly, the study that we're doing is an experimental test of the impact of field contacts that incorporate evidence-based practices, or rehabilitative techniques, on recidivism. When we say field contacts, what we mean is essentially any contact between a community supervision officer and the people they supervise someplace other than their office. Typically, this is a supervisee's home or residence, but it can also take place at an employer, a treatment provider, or basically any other public space; essentially, any contact that's not happening in the office or virtually. Evidence-based practices is also, intentionally, a broad term in our design, but it incorporates a lot of things. Some examples from the sites participating in our study include skill building and practice, use of carry guides, use of thinking reports, motivational interviewing, and basically any other evidence-based practice that falls under the risk-need-responsivity framework and shows responsiveness to the identified risks and needs of the supervisees.
To provide a little more detail: this study builds on an exploratory study that we completed at the end of 2018, a descriptive exploratory study with a quasi-experimental aspect to it. In that study, we identified that fieldwork, in general, is associated with reduced recidivism: we found reductions in recidivism as a result of being involved in field contacts. And, specifically, we found a strong association between the use of evidence-based practices and recidivism; there is a preliminary finding that there seems to be a relationship there. We also heard from the officers and agencies that we worked with that they perceived a positive impact of using EBPs in the field, but that EBPs came at a cost, both in terms of time and because logistical and safety concerns were a factor. To build on those findings, we designed this study as a more explicit experimental test of the use of EBPs during home and field contacts. Similar to the study we just heard about from RTI, our design incorporates a randomized controlled trial, or RCT, a complementary process evaluation, and a cost assessment. The two states participating are Ohio and Minnesota; these are the two states that participated in our initial study. After that study was complete, we worked with the agency leaders in those states to identify specific sites to implement the RCT, and we landed on two regions in Ohio and one county in Minnesota.
A little bit more about the RCT. Our two conditions in the trial are the treatment condition, where all cases randomized to that condition will receive an EBP during every field contact, and the control condition, where cases will not receive any EBPs during any of their field contacts. So we're not testing the impact of a particular EBP, or even of EBPs in general; what we're testing is the impact of conducting EBPs in the field. That piece is important. And, importantly, any participant in the study can receive EBPs in other contacts. For example, if you're in the control condition, you can still receive EBPs in an office contact or a virtual contact, and similarly for the treatment condition. The only thing varying between them is receiving EBPs in the field or not. Randomization occurs as soon as a new case is put on community supervision, has received an assessment, and has been identified as moderate or high risk; that's the eligibility criterion for our study. Cases are then randomized into one of the two conditions. We are randomizing within an officer's caseload, and the reason for that is twofold. One is to reduce burden and disperse cases more evenly between the officers, so that the burden doesn't fall, for example, on one officer having all treatment-condition cases and another having all control-condition cases. The other is to mitigate any biases introduced by an officer effect, in terms of their ability to administer EBPs.
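The within-caseload randomization described above can be sketched in a few lines. This is an illustrative sketch under the assumptions stated here, not the study's actual randomization software, and the officer and case identifiers are made up:

```python
import random

def randomize_within_caseload(cases_by_officer, seed=0):
    """Assign each officer's eligible cases to treatment or control,
    keeping the two arms balanced within every officer's caseload."""
    rng = random.Random(seed)
    assignment = {}
    for officer, cases in cases_by_officer.items():
        shuffled = list(cases)
        rng.shuffle(shuffled)  # random order within this officer's caseload
        # Alternating down the shuffled list keeps the arms within one case
        # of each other, even when a caseload has an odd number of cases.
        for i, case in enumerate(shuffled):
            assignment[case] = "treatment" if i % 2 == 0 else "control"
    return assignment

# Hypothetical caseloads for two officers
caseloads = {"officer_A": ["c1", "c2", "c3", "c4"], "officer_B": ["c5", "c6"]}
arms = randomize_within_caseload(caseloads)
```

Because assignment happens inside each officer's caseload, no officer ends up with only treatment or only control cases, which is exactly the burden-sharing and officer-effect rationale given above.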
We are also doing an intent-to-treat analysis, because we anticipate that we will not have perfect compliance with conditions. As Eric said in his introduction, there are things that impact research that are out of our control, and we wanted our study to be flexible enough to allow for that. For example, if somebody in the treatment condition who is supposed to receive an EBP is not able to receive it at that time, because of a safety issue or some other concern, that's okay. For that reason, we're doing an intent-to-treat analysis.
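To make the intent-to-treat idea concrete, here is a small sketch: outcomes are compared by assigned condition regardless of whether the EBP was actually delivered. The record format, field names, and numbers are hypothetical illustrations, not study data:

```python
def itt_recidivism_difference(records):
    """Intent-to-treat contrast: compare recidivism rates by *assigned* arm,
    ignoring whether each case actually received the EBP as planned."""
    def rate(arm):
        group = [r for r in records if r["assigned"] == arm]
        return sum(r["recidivated"] for r in group) / len(group)
    return rate("treatment") - rate("control")

# Hypothetical cases; note that the treatment case that never received an EBP
# (e.g., due to a safety concern in the field) still counts in the treatment arm.
cases = [
    {"assigned": "treatment", "ebp_delivered": True,  "recidivated": False},
    {"assigned": "treatment", "ebp_delivered": False, "recidivated": True},
    {"assigned": "control",   "ebp_delivered": False, "recidivated": True},
    {"assigned": "control",   "ebp_delivered": False, "recidivated": True},
]
print(itt_recidivism_difference(cases))  # 0.5 - 1.0 = -0.5
```

Analyzing by assignment rather than by delivery preserves the comparability created by randomization, which is why imperfect compliance is "okay" for this design.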
For our outcomes of interest, our primary outcome is recidivism, with two measures: re-arrest and re-incarceration. We are looking at one- and three-year follow-up rates for those. Our secondary outcome of interest is program completion. In addition to looking at recidivism, we wanted to look at success on community supervision, so this is an exploratory outcome to determine whether there's a relationship between receiving an EBP during field contacts and successfully completing programming as part of community supervision.
For the process evaluation, we have several different data collection points. This is intended to provide contextual information for the findings of the RCT and to really understand more about what this looks like in the field. We have some of this from our earlier study, but we wanted to unpack it even more for this one. The first data collection point is what we're calling Contact Checklists. Essentially, we're asking officers to complete a checklist for every contact they have with the people they supervise, regardless of the condition and regardless of where the contact occurs. The components we're capturing on those checklists are listed here. We had incorporated these in our initial study and had great success implementing them; they were well-received and we got a lot of really good information, so we're looking forward to collecting more of that data. We also had initially planned on doing some observations and ride-alongs during fieldwork and other contacts that officers make with the people they supervise. We have since shifted this a little bit as a result of the pandemic, and I will let Walter talk about that. We are planning to do those around quarterly over two years, so eight observations over the course of two years. Similarly, we're going to be doing focus groups with community supervision officers, again to try to get more nuance and more information than what we can capture on a checklist; these are also going to be done quarterly over the course of two years. Then, at the end of the study, we're going to be doing interviews with agency leadership to get their reflections on the study. In addition to collecting this qualitative data, we are also using each of these as a means to check on fidelity to the experiment and adjust as needed.
Our third component is the Cost Assessment. And we have two measures of costs that we're looking at. One is a measure of the fixed costs of field contacts with EBPs, and this includes things like trainings for officers and any equipment or other materials that are required. And then we also have a measure of the variable costs of field contacts with EBPs, and this is essentially a combination of time and salary information. On the checklists, we have a field to capture time spent on the contact but also time spent between contacts, because we learned in the first study that it just takes a lot of time to do field work. And so we wanted to try to capture that on the checklist, and we will also be discussing that in the other qualitative data collection components as well.
We anticipate having several products for the field, both interim and final: presentations at professional meetings, publications in peer-reviewed journals, but also policy briefs and white papers along the way. And then we're really excited to work with the APPA on developing a set of fieldwork guidelines that we're actually going to develop from both studies that we've completed in this area, to try to use the evidence that we found to provide guidance to the field. And, finally, this is a five-year grant. So we are planning one year of randomization and enrollment into the RCT, and then we will be following the participants for three years in order to have enough time to observe our outcomes. We've been very much delayed because of the coronavirus pandemic. Similar to the RTI study, we were awarded this grant in the fall of 2019, and we were due to launch it in the spring of 2020 when everything shut down. So we are now anticipating starting in the next couple of months, and really excited to kind of get off the ground. And with that, I will turn it over to Walter to talk about some of our delays and adjustments that we've made.
WALTER CAMPBELL: Thanks, Holly. Hi, everyone. I'm Walter Campbell. So, as Holly mentioned, we are delayed a bit because of the pandemic, and I want to talk a bit about some of the reasons we're delayed, just to kind of give a sense of the scale and why we had to wait a little bit to actually get going. So the pandemic affected a few aspects of our study. The first is it affected the actual intervention. So, as Holly mentioned, we're studying field contacts, we're studying in-person contacts between an officer and the person they're supervising. These--for both of the agencies we worked with in Ohio and Minnesota, at the height of the pandemic, they were doing no field contacts. They were doing no in-person contacts. They'd stopped field and office contacts. That eventually changed and they started to do field contacts with cases that were a higher risk. And that has changed even more recently, which has allowed us to kind of--to revisit the study and get it going again. But, obviously, we can't study field contacts if there are no field contacts. So that was a pretty critical element for--waiting for that to change within both agencies.
As I mentioned too, they also stopped office contacts, which is sort of the other primary form of in-person contact. That's also changed, and office contacts have come back a bit, although it seems as though within both sites, field contacts are kind of the preferred in-person contact now post-pandemic, and they're doing fewer office contacts. In place of both, they started doing virtual and telephone contacts, both of which were fairly rare before the pandemic and are now quite common. And so, you know, in addition to sort of changing our treatment condition, it also changes the ways in which people in the comparison condition are receiving EBPs, with many more receiving them through virtual contacts than office contacts, at least more than they would pre-pandemic.
The pandemic also affected our ability to have a sample size, within the one-year enrollment period, that would actually support the study, simply because people were not being enrolled onto community supervision at either of our sites, at least not at a normal rate. This is because at both sites, the courts were pretty much shut down for a while. And then when they resumed processing, it was mostly online courts, and it was a slower system. And it's only fairly recently that they've kind of processed cases more quickly. And what that means practically for the agencies we're working with is that one of the agencies gets a lot of their cases from probation, and so if nobody's being sentenced to probation, they're getting no new cases to work with. In addition, both of our sites have supervisees that come out of incarceration and are on a post-incarceration supervision sentence. And for a period of time throughout the pandemic, very few people were being released in either of the states we worked with, right? So while there was a big reduction in the prison population throughout the pandemic, a lot of that--especially in the two states we worked in--was driven by reduced admissions, not by increased releases. And so we weren't seeing--they weren't seeing, I should say, many people starting supervision. That has also changed. So that's allowed us to sort of continue the study at this point.
Another thing that was affected was referral to treatment services. So, you know, probation and parole officers--our community supervision officers--can provide treatment themselves, but they can also refer someone to treatment. That's one of the EBPs that both sites elect to use, which is, in certain cases, to make an effective referral. But many of those treatment services weren't occurring, or when they were occurring, they were in a very different format. They were online services. And there was sort of less evidence about how effective an online form of that treatment was, whereas there might have been, you know, fairly good evidence about how effective that treatment was in person. Again, that's also changed; those treatment services are functioning a bit more normally. And then finally, our primary outcomes were affected. So as Holly mentioned, our two primary outcomes were measures of recidivism, and in general, but specifically within our sites, all of the agencies associated with what gets recorded related to recidivism weren't functioning normally, right? There was a period of time early in the pandemic where law enforcement in both sites were not making many arrests, so that sort of initial stage of recidivism wasn't occurring. As I mentioned, the courts were really slowed down. And then prisons and jails were admitting few people. So, you know, sort of any form of recidivism's really not getting counted there.
So that delayed us for about two years. And the reason we are able to start up now is because a lot of those things have changed. A lot of those things are sort of back up and functioning. The sites are doing field contacts, but it's a different form--you know, all of these things have still kind of permanently changed--but they are back to some sense of normalcy, at least in that they're functioning in a way that we can measure the impact of field contacts. One of the ways we know this is because during those two years, we continually checked in with the sites about every two months. And this was both to assess, you know, in the immediate, do we think we can re-launch--and for much of the period, that answer was "no," and recently it's been "yes"--but part of that was also just to keep track of what was changing, or to keep track of how all these changes occurred, right? And, for example, one of the progressions that we saw was the rapid increase in virtual and telephone contacts, and those remaining in place even after in-person contacts came back, and sort of the greater reliance on field contacts over office contacts as the primary form of in-person contact, was something that we saw build over time within each agency.
But even with our sites now kind of functioning at a more normal level, something that we can actually assess, there were still a few practical IRB and safety complications we had to overcome. So the first is simply, you know, research staff comfort with traveling, and also Abt's--our company's--comfort with people traveling. Until fall of last year, you had to sort of get special permission to travel for a project, and it was pretty uncommon. And so we were really restricted in sort of any travel we could do. That was lifted, as I've said, in the fall of last year. In addition, though, any IRB packet that involves in-person data collection requires a pretty rigorous COVID safety plan. And there's also just the comfort of staff members with traveling and how they feel about pandemic-related risks. But in addition, there are risks to the study sites and participants, meaning the community supervision officers and their supervisees. We don't want our staff coming in and putting them at risk to do in-person data collection. And this is especially the case with ride-alongs, right? So Holly mentioned focus groups and interviews, and, you know, social distancing and things like that can occur in a focus group and interview, but in a way, they really can't in a ride-along. And specifically, in both of our sites, they often did two-person field contacts before, where two officers would go on a field contact, and they'd discontinued that for COVID safety reasons. And so it seemed a little odd for us to then sort of insert ourselves into the car with them if they weren't allowing their officers to go out together.
So, to launch this spring, summer--the next basically month or so--we had to make one final change, which was to move a lot of our data collection online, or at least all of our qualitative data collection online. So our focus groups and interviews, which we planned to be in person, are going to be virtual. This is probably a shift that, you know, many other researchers on the call are familiar with. This has been fairly common throughout the pandemic, to kind of move to virtual focus groups or interviews. One of the changes that took a little more time to think through was how we handled the ride-alongs. And the goal there, as Holly mentioned--the goal with these observations was to get more detail on the types of items we collected on the checklist, and to get richer detail on what goes into a contact, both one with EBPs and one without.
And so to do that, instead of being there in person and being on the ride-along, we opted to have these sort of daily check-ins with an officer over the course of a week where essentially they'll--at the end of each day or throughout the day, depending on whatever the officer is comfortable with, they'll recount what they're doing. So they'll talk to one of our staff members and there's a series of questions we'll go through and they'll sort of go into detail about everything they did. So, unlike sort of the, you know, focus groups where we're asking them to sort of think back on the last couple months, we're asking them to just tell us exactly what they did throughout the day. Both the activities, the time spent on them, and all of that to get just sort of richer detail about--and hopefully, a cleaner, clearer memory of what each contact looks like.
One last thing I want to note is that we've sort of left open both of these to potentially be in-person data collection if it becomes safe and feasible. But, you know, for now, we are comfortable, you know, or have plans to complete the study with these as virtual data collection efforts if needed. But obviously, there's, you know, always a little bit of added benefit if we can be in person at some point. And with that, that's the gist of our presentation. So I'll pass things off to Matt to present.
MATTHEW SMITH: Just saying hi, everybody. I'll be presenting on our randomized controlled trial, a study of a virtual job interview tool with the Michigan Department of Corrections Vocational Villages. And, of course, I'd like to first and foremost thank the Michigan Department of Corrections for their partnership and strong support throughout this project.
So our study is an intent-to-treat RCT to evaluate the effectiveness and concurrent implementation of this virtual interview training tool in 18- to 24-year-old returning citizens. The setting for this study was two prison-based prerelease employment readiness training programs preparing returning citizens for a trades-focused job after their release. As I mentioned, this program is called Vocational Villages, and it has a strong historical employment rate within 12 months of release at 65%. Our first aim is focused on the effectiveness of virtual job interview training at increasing employment beyond that 65%, as well as looking at the potential relationship with reducing recidivism. The second aim focuses on more of a statistical approach to the mechanisms of employment, as well as whether employment might be a mechanism for reduced recidivism. And then our third aim is focused on evaluating the implementation process outcomes of the virtual interview training, such as barriers to delivering the tool within the prison employment readiness program, and factors that may help promote its use within the program, as well as cost-effectiveness and the potential scalability of the program.
And so you might be asking yourself, what is virtual reality job interview training? It's a program that we actually call Molly for short, because virtual reality job interview training is a mouthful to keep saying over and over again. And we call it Molly because Molly Porter is kind of the star of the show. She's a virtual hiring manager--actually an actress who recorded thousands of lines of dialogue--and returning citizens get to repeatedly interview with her, where they get feedback on their responses to her questions. And they get a summary of performance feedback on their job interviewing skills. There's also a real-time job coach in the corner of the screen that gives nonverbal feedback to trainees as they're using the tool. Molly even has nine different personalities to provide variation in the hiring manager. She can be friendly and helpful throughout the interview, or she can even be inappropriate or mean or dismissive during the interview. And although Molly is typically accessed through the internet, we were able to adapt the program and install it on local laptops that were used within the prison.
So concurrent with our NIJ-funded RCT, we actually conducted an internally-funded pilot study focusing on returning citizens who were older than those in the NIJ study--25 years and older--where we were able to successfully recruit about 10 to 12 participants per month. A major takeaway from this pilot came from a breach in confidentiality: one of our outcomes is job interview skills, which is assessed with a video-recorded role play, and a deputy warden required that research staff have that video data reviewed prior to exiting the prison. And so we worked with MDOC administration to figure out a way to overcome that, where the staff carried a hard copy of a letter from an MDOC administrator to prevent that from happening in the future. And so a breach like this actually delayed both the pilot and the NIJ trial by about two months, as we needed to go through the process of obtaining the personalized letter, and then go back through the IRB and NIJ approval process to change the study procedure.
And knowing that the pandemic created a lot of delays across the NIJ-funded study, we were able to finish our pilot study, so I'm just happy to share a few results here. The brief summary of the results from the pilot is that we enrolled and randomized 44 returning citizens and found a significantly higher employment rate, reductions in interview anxiety, and increases in job interview skills, as well as increases in motivation to work, among those who practiced virtual interviewing as an enhancement to their already strong employment readiness training.
So that being said, the main NIJ-funded study started off a bit slow, with recruitment between September of 2019 and March 2020. Basically, we noticed that there was an unanticipated lower level of interest in the research by younger eligible returning citizens in that 18- to 24-year-old age range. And as I mentioned, with the pilot with the older returning citizens, we were able to recruit about 10 to 12 per month. And so to facilitate improved recruitment, we actually increased the upper age limit after the completion of the pilot so that we could feasibly recruit more participants. And then COVID-19 hit before we could implement this change, and after it hit, we tried to be forward-thinking about how we were going to conduct the trial, and started preparing for remote data collection and partnering with the prison programs to do the study remotely. And then further along the timeline, due to ongoing pandemic-related staffing shortages and reductions in the sampling frame due to infected staff and infected potential participants, in April and May of 2020 and in June, while the prison programs were still accessible before they were fully shut down, we actually only recruited one participant. And so July 2020 was met with a lack of enrollment as the programs started closing--they were, you know, temporarily closed because everybody was trying to figure out what was going on with this pandemic, but they weren't fully closed. And between August and September 2020, we were able to recruit a pool of participants at both of our prison sites that was consistent with the pre-pandemic recruitment results of about 10 to 12 per month. And then we finished off some cohorts in October 2020, but that's really when the prisons were mostly, you know, shut down fully, and we weren't able to continue the study.
And so, you know, during this time, to try and help facilitate stronger recruitment, NIJ supported our removal of the upper age limit. The age limit was due to the request for applications that we first submitted the grant to, where the studies needed to be focused on 18 to 24 year olds. And so part of this change in the age range was making sure we were aligned with the NIJ requirements. And then in May 2021, one prison did reopen for recruitment, but the other remained closed due to ongoing staffing shortages and issues related to the pandemic. And so then, in July 2021, we did enroll a cohort in one prison with about 50% fewer eligible participants compared to the pre-COVID rates. The second prison actually requested that we move to in-person visits because of the staffing shortages. So then we started the process of going through IRB and NIJ revisions to actually go onsite to do that work during the pandemic. And so one takeaway is that the one prison with the staffing shortages, they really wanted the study to work, you know, they were really strong with their communication. And we spent, you know, a fair amount of time brainstorming with them how we could make this happen. And so, through the NIJ review and with their permission, we did agree to remove some of the onsite prison staff responsibilities for the training and provide more of a remote training orientation for returning citizens to learn how to engage with the interview tool and start using it more independently. And then prison access did reopen again in October 2021 for a small cohort, but then was reclosed pretty quickly after the newer COVID-19 variant emerged around that time and infection numbers went back up.
And so at this stage, we are really preparing to recruit our next cohorts in mid to late April. You know, we've been in touch with the prison administrators, and we are set to go live at the end of April. So we're really excited about getting the study up and running again. During the pandemic, the local employment readiness program had lost all of its vocational students, and so they had to repopulate that program in order to get students through their coursework before they would be eligible for enrolling in the trial.
And so then, you know, one of the highlights of our webinar today is lessons learned from trying to do this research during the pandemic. And so, some of these lessons learned: although 18 to 24 year olds were in the program, they tended to say no, and part of this was they just wanted the choice. Having a choice to say no to something was empowering for them, since they typically don't have that choice to make--that's something we've learned through just talking with some of the participants who heard about the study through the grapevine. In terms of other participants who said no to enrollment--because we're only enrolling around 50% of those that we tried to recruit--they were worried that they would be tardy to later daily activities after their study participation, which would then result in a possible infraction for their file. And so they declined to participate. It took us a while to learn this. And once we learned it, we did, you know, have some conversations with the MDOC administrators for the program and were able to get an agreement that the participants could potentially show up late to other activities and not get in trouble for it. So that was part of that strong partnership with MDOC and them having a great understanding of what we needed to do for the study.
And so, you know, not surprisingly, retention in this study post-release has been a challenge. Thankfully, the Department of Corrections has been a great partner, and they're supporting our access to administrative employment and recidivism records. And we also consulted with returning citizens on our advisory board and implemented several strategies to try to increase retention, such as, you know, sending postcards, birthday cards--just trying to make more of a personal connection with them after release, so that they would be more likely to talk to us afterwards, trying to keep that separation between research and corrections. Also, increasing incentives for completing study follow-up. The amount of the incentives was something we learned about as well.
And so, you know, that being said, additional lessons that we've learned: involving state-level administrators and prison leadership in correspondence helped facilitate ongoing communication and responsiveness. You know, staffing shortages when the prisons reopened triggered the need for our team to develop a remote implementation procedure that required IRB and NIJ review and approval. And the review and approval process, as I mentioned, takes, you know, two to three months to get through, with back-and-forths between the IRB and NIJ. And though we budgeted for support in our study for graduate students to go onsite and collect data, one of our lessons was that pandemic-related educational and research guidelines changed a lot. And these barriers actually prevented the proposed graduate students from doing this data collection and required us to use full-time research staff instead, which we didn't budget for. And so that was, you know, one of the challenges we had to face and figure out how to overcome.
And so, although we've only enrolled 27 participants due to the pandemic interference, we know that we could reliably recruit 10 to 12 participants per month prior to the pandemic. And, you know, that's our target as we emerge from the pandemic. And so with two years of funding left, our goal is to optimize our enrollment and randomization over the next 12 months, after which we will transition to the follow-up and data processing and analysis. That being said, though, you know, we were forward-thinking about the budget, and we were able to minimize our spending during the pandemic, which, you know, allows us to have additional resources for a potential no-cost extension if it's approved by NIJ.
And then I'm not going to go through this slide in detail, but it represents that, although the pandemic stalled opportunities to recruit, there were still several ongoing roles and responsibilities that required staffing in order to try and keep up with the pandemic-related research guidelines and the uncertainties surrounding access to recruitment from the prison programs, where, you know, we were on a week-to-week basis--we were not sure, and we kind of had to wait it out, and it was pretty unpredictable. So you kind of always had to have staff ready to go if there was going to be an opportunity to engage in recruitment. But that being said, you know, we did take away a lot of lessons learned that I'm, you know, happy we were able to get a chance to share with you today. And so what I would say from there is, you know, thank you for the opportunity to present on our study and where it stands right now. And, again, we want to acknowledge all of our partners at the Michigan Department of Corrections, the University of Michigan, and our other various university partners.
ERIC D. MARTIN: Thank you. And thank you to all the presenters this afternoon. These were very informative presentations. I learned a lot, and I've been familiar with all the projects that were featured today. So again, thank you. And for the participants listening in, I want to remind you, please use the Q&A feature if you have any questions. It looks like we have about 10 minutes left over for questions, so it worked out perfectly. I do see one question in the queue already, and this is for RTI, Dr. Lindquist. "Must clients have internet access to use the Pokket tool? And if yes, is this realistic for people who are currently incarcerated?"
CHRISTINE LINDQUIST: Yes, the answer is yes. Pokket can only be accessed via the internet and so it does require some commitment and infrastructure related to technology for a DOC partner or an individual facility. Many jails and prisons today do use tablets regularly. And even if they don't use tablets, many do have computers that are set up for educational programming that allow internet access in certain areas of the facility. So there are, you know, many correctional facilities that already do have this infrastructure or can make it happen. Pokket doesn't have to be used on a tablet. It could be used just on a regular computer that has, you know, Internet access that's already being used for educational programming. And, you know, what we had to do in North Carolina was get the Pokket website whitelisted, so that it could be securely accessed. And I believe that's how it works in other states as well, just like other educational programming. If it's internet-based, then you have to get that site whitelisted. And most DOCs have a very extensive security assessment that, you know, you would have to submit, like Acivilate would have to work with the agency to submit all of their like security details, and it would have to like pass a security assessment before it's approved to be whitelisted.
ERIC D. MARTIN: Thank you. We just had another question pop in. I believe this was to everybody. "Has anyone submitted for a no-cost extension? If so, what was your experience and outcome?" It doesn't sound like that happened yet. Good question.
MATTHEW SMITH: Eric, I don't mind mentioning it real quick. Typically, the no-cost extension is something you'd apply for in your final year of the funding and I think most of us aren't there yet. So I know for myself, I'm just starting year four. And, you know, it's not something that I'll be thinking about until--in terms of the actual, you know, submission. I'm already thinking about it but, you know, we won't submit that until probably the last six months of year five.
HOLLY SWAN: Yeah, we're in the same boat at Abt. Similar, we'll consider that towards the end of the time. You know, we haven't even launched yet, so we're definitely hoping that that will--that will happen. But we have been in routine contact with the science officer to make sure, you know, NIJ is aware of the delays and why. And, you know, we've been documenting everything in our progress reports so that when the time comes, hopefully it will be a smooth--a smooth process.
CHRISTINE LINDQUIST: That’s true for us as well…
ERIC D. MARTIN: Yeah, I think...
CHRISTINE LINDQUIST: Ours was a five-year grant, so it gives us a little bit of wiggle room. But hopefully there is a possibility for a no-cost extension, as long as conditions improve.
ERIC D. MARTIN: And I would just say, you know, we treat every request on a case by case basis obviously, but the extent of the interference due to the COVID pandemic on the research portfolio, especially the corrections portfolio in particular, was extensive. So, you know, we're figuring that out as well, you know, and that's part of why we gathered you all here today just to gather our thoughts and lessons learned.
While we wait for another question to pop up, I actually had one, and this was kind of brought to my mind by something Dr. Campbell said, but it's for everybody. In his presentation, he talked about how, you know, certain practices in community corrections that emphasize face-to-face visitation had to be modified, obviously, due to the COVID pandemic. And it appears that they may not go back to pre-pandemic levels in terms of the number of face-to-face contacts versus virtual or telephonic contacts. And I was just wondering, for everybody, are there any practices that really stand out that have been changed indefinitely because of COVID? Like, maybe community corrections has found a better way to operate on certain things and the pandemic drove innovation? I was just wondering if anything stood out to you on that front.
HOLLY SWAN: I can respond to that and say, I think one thing that we noticed--and I'd like Walter to chime in, too, if he has other thoughts on this--but one thing we did notice in the community corrections agencies, and I think Walter might have touched on this briefly, was a shift toward fewer office contacts. I mean, you know, community supervision before was almost--not exclusively, but it was a lot of office contacts, you know, having people come in and do their meetings with their officers. And of course, fieldwork was a part of it. But office contacts were a big part of community supervision, and, at least in the sites that we're working in, that has been significantly reduced and doesn't look like it will be going back up to normal volume. And I think, sort of relatedly, there was a serious increase in virtual contacts, which were really not a thing before, but now it's looking like those will be, you know, a big focus. And that was something that we tossed around a lot while we've been kind of waiting for things to shake out: whether we wanted to explore looking at virtual contacts, and to what extent we could work that into our design. And we have incorporated that now, to try to capture more of that. It's not our primary design, because we're still very much interested in the field piece of it, because that's going to be increasingly, I think, what is in place of office contacts. But we did want to incorporate virtual contacts. So we added that as a component to our data collection as well, to capture when it's occurring virtually. But I don't know, Walter, if you had anything to add to that before we turn it to our colleagues.
WALTER CAMPBELL: No, I think that covers the two big changes we saw that seem to be sticking. I suppose the only other change, and I don't know how long this will stick, is in the field contacts: it seems that both sites, when possible, when they don't need to go into the home, have been talking with people outside, basically meeting people and talking with them on the porch and things like that. I don't know that that will stay if it becomes safer and safer to go inside, but that seems to be a change that's stuck, at least for now.
HOLLY SWAN: And one thing to note about that, too: one of our sites is Minnesota, so there are several months of the year where they can't do things outside, or strongly prefer not to, so it will be interesting to see. And it sounded like, over the past couple of years, they really focused their field contacts on those highest-risk people who really need the regular supervision. So the use of risk assessments, and stratifying that way, has been an interesting approach as well.
CHRISTINE LINDQUIST: And, Eric, we haven't yet interviewed any community probation or parole officers in our study, because very few of our clients have gotten released, but it is something we will certainly ask about when we do our site visits, which hopefully will take place this fall.
ERIC D. MARTIN: Excellent. We did have another question come in; I think this will probably be the only question we have time for. This is, again, for Abt Associates. You mentioned one of the measurements in the study was probation outcomes, in addition to recidivism. Could you discuss that a little bit? How are you operationalizing or defining these probation outcomes?
HOLLY SWAN: Do you want to take that one, Walter, or do you want me to...
WALTER CAMPBELL: Sure. Sure. Yeah.
HOLLY SWAN: Go for it.
WALTER CAMPBELL: So, again, these are secondary outcomes; recidivism is our primary focus, and these will vary a bit by site. We're letting the sites define them based on what they expect out of supervision, and also, we have specifically higher-risk cases, so the expectations there may be different from other cases. But essentially it is completion of the expected programming for each individual. The idea is to look at some positive outcomes in addition to the lack of the negative outcome that is recidivism. So can we see some positive outcomes? For example, for those people who are assigned to, let's say, substance use and misuse treatment, how many of them stick with that programming throughout supervision or complete it? Outcomes like that. And I think the exact programs will vary between the two sites based on the types of programming that they either require or recommend the supervisees undertake. But, yeah, it's those sorts of outcomes.
HOLLY SWAN: And to clarify, those will be outcomes collected by the probation and community supervision agencies we're working with. We're not going to be collecting any data from any community treatment provider. So whatever our agencies are collecting to monitor and track success in supervision is what we'll be collecting as well.
ERIC D. MARTIN: Well, thank you. Seeing no other questions and noting the time, I want to thank all our presenters for spending some time with us and sharing your thoughts and insight on what was a truly monumental experience we've been through the last couple of years. I want to thank my colleagues at BJA and their partners who run the National Reentry Resource Center. And again, please go to those websites you see on your screen right now, subscribe for updates, and check out the schedule of events we have coming up for Second Chance Month. It should be a very informative time. And I want to thank all the listeners who are with us today. I hope you learned something you're able to take back to your jurisdiction or to your own research. And with that, I wish you all a great afternoon, and thank you.
Opinions or points of view expressed in these recordings represent those of the speakers and do not necessarily represent the official position or policies of the U.S. Department of Justice. Any commercial products and manufacturers discussed in these recordings are presented for informational purposes only and do not constitute product approval or endorsement by the U.S. Department of Justice.