
Webinar Transcript: Fiscal Year 2024 Evaluating Strategies to Advance the Implementation of Evidence-Based Policies and Practices Solicitation

This webinar provided an overview of the NIJ FY 2024 Evaluating Strategies to Advance the Implementation of Evidence-Based Policies and Practices solicitation, in which NIJ seeks rigorous research and demonstration projects to increase the impact of existing crime and justice research evidence. Specifically, NIJ seeks to fund research that leverages implementation science knowledge and promotes evidence-based policy and practice—most directly by developing, supporting, and evaluating efforts to improve research evidence use by policymakers, agency leaders, intermediaries, and other decision-makers who shape justice outcomes.

Transcript

STACY LEE: Good afternoon, everyone, and welcome to the Fiscal Year 2024 Evaluating Strategies to Advance the Implementation of Evidence-Based Policies and Practices Solicitation Webinar. It is my pleasure to introduce Dr. Tamara Herold, a Senior Advisor at the National Institute of Justice.

DR. TAMARA HEROLD: Good afternoon and welcome, everyone. My name is Tamara Herold. I am a Senior Advisor at the National Institute of Justice. And we're joined today by NIJ Director Dr. Nancy La Vigne and Senior Advisor Liz Groff. Today's webinar will offer an overview of our exciting new evidence-to-action solicitation, which we've titled Evaluating Strategies to Advance the Implementation of Evidence-Based Policies and Practices. Here's just a quick run of show for everyone. We're going to briefly cover the goals, topic areas, and deliverables. We're going to provide information related to the submission of applications and the review process, as well as available support and resources for applicants. We're really providing this overview to facilitate the question-and-answer opportunity for attendees that will close out this event. But before we dive in, we are fortunate in that Director La Vigne has joined us, and she's going to kick off our exchange with introductory remarks. Director La Vigne, the floor is yours.

DR. NANCY LA VIGNE: Thank you very much, Dr. Herold. Thanks to all who have joined this webinar. There's a robust attendance for this, which I'm very excited about. Whenever we release new solicitations that have never been introduced by NIJ before, we're always curious about what kind of interest we can garner with the field. This one in particular is a huge priority for us at NIJ, and it's actually one of the key priorities I brought to NIJ when I joined the Institute as Director almost two years ago now. That is: What is the science behind how we can better ensure that research evidence doesn't just get in the hands of the people who can use it to make positive changes to promote safety and justice, but that they're inspired to do so and have the resources and the capacity to make those changes? That's a thorny question, and it's one that's grounded in this thing called implementation science, which is not very far along for the criminal justice discipline. But in other contexts, in medicine and public health and a lot of other spaces, scholars have been toiling away at developing models for various ways that implementation science works - various mechanisms through which practitioners and policymakers learn about evidence and trust the evidence and are inspired to make changes because of it. As Dr. Herold - Tamara - will tell you, she's identified many, many models out there, but very few have been tested. And so what we want to do with this solicitation is invite proposals to study the impact of different evidence-to-action mechanisms. This can be any strategy under the sun. Ideally, it's a strategy that has some kind of documentation in the literature, some kind of grounding in theory.

I'm just going to give you a few examples, but please do not interpret me as saying that these are the ones we want to hear from you about. It could be anything. But one example would be translation. And we do a lot of this at NIJ. So there are great findings but they're buried in journal articles. Maybe it's a test for how you summarize those findings in ways that make sense for the people who are in a position to make changes as a result of that evidence. Another test could be around shoring up what they call “credible messengers” or tapping into thought leaders in a field. We know something about how people are more likely to listen to facts or strategies if they're delivered by someone who's in their orbit, in their discipline and is highly regarded or respected or has walked in their shoes. Or you could test the value of building capacity within an agency to be able to implement evidence. Or you could test a multipronged strategy. We tried to make it as broad as possible. 

I don't want to steal Tamara's thunder because she'll get into the details after I finish up, but suffice it to say that we think this is really important. Of course, here at NIJ, we're dedicated to building evidence on issues of safety and justice writ large, but that evidence is useless if it isn't used to make changes on the ground. So we're trying to build that research evidence so that we can see how other evidence actually takes shape and form in the field. As you think about your proposals, I do want to underscore some other priorities that we have at NIJ. 

One is that we are proponents of what I call inclusive research. We give priority to proposals that take the time to include, at least in some component of the research methodology and the sharing back of findings, engagement with the people who are closest to the issue or problem under study. And that's going to vary based on the topic. As I hope you've noticed, this solicitation is agnostic on what topic or discipline or agency you might care to work with. But building that into the research process, ensuring the findings come back to the people who are best positioned to understand them, provide context around them, and ultimately implement the findings — that's paramount to evidence to action. We also invite people to be aware of and engaged in these issues of the disparities in the justice system, racial and other, that we know are baked into data and tend to be institutionalized in a lot of different ways. So depending on the topic you study and the way you're studying it, we invite you to think about those issues in ways that you can study and mitigate them. 

We are big fans of interdisciplinary research teams. And this is a great topic and a great solicitation to invite partnerships from various disciplines, so we hope you'll consider that. Because we care so deeply about evidence to action, we also prioritize proposals that include a meaningful share of their budget towards dissemination activities. 

With that, I will turn it back to Tamara. I'm going to go off camera, but I'll be back on for the Q&A. Again, thank you for joining the webinar. And thanks to Stacy Lee for organizing it and others on her team, because we've been doing a lot of these, maybe once a week, and it's a lot of work and we're really thankful for their support. Thank you.

DR. TAMARA HEROLD: Thank you so much, Director. We really do have the very best teams at NIJ. And this really is a project of passion for us. Ever since the Director announced evidence to action as one of her key priorities, we've been working toward developing and releasing this solicitation, so this is really an exciting day. The way that we'll work through this is we're going to start with a succinct summary of the purpose and goals of this solicitation. I think it's the most important part of my exchange with you. I would caution, please remember that there are additional details and other important information that's contained in the solicitation itself. So this webinar is not meant to replace a very, very careful read of the solicitation, but the overarching goal of this solicitation is to strengthen the impact of existing crime and justice research evidence. 

We seek to fund projects that will achieve this directive through one of two topic areas. The first involves conducting research that will help us better understand how to improve evidence use across the justice spectrum. So as the Director mentioned, we want to know more about strategies that can effectively reach and influence individuals who develop and implement justice policies. We need to know how to best message evidence to the field in ways that will actually impact practices. For example, the Director mentioned translation. We need to know what types of narratives resonate with our intended audiences, and what communication mediums best promote understanding and successfully reach our audiences. What techniques help us to promote trust in the evidence that's being shared and then subsequently increase practitioner uptake and use of disseminated information? These are all really critical questions, and I think there's much to learn in this space. We're also interested in how to nudge behaviors and decision-making through existing and emerging technologies in ways that infuse evidence-based knowledge into everyday decision-making and practices, with the ultimate goal of improving justice outcomes. So this first topic area is really focused on learning more about effective broad mechanisms for reaching and influencing our justice policy change agents. We want to promote understanding, promote trust, promote use of evidence across our discipline.

The second topic area dives more deeply into a science that, again the Director mentioned, has been developing for decades across various disciplines and it's known as implementation science. I suspect that some of you on this webinar are familiar with implementation science. But for those who are perhaps less familiar, implementation science generally is the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice, again, with the goal of improving outcomes. 

The difference between this topic area and the first is that we're looking to fund research that specifically leverages existing knowledge, the TMFs, which stands for theories, models, and frameworks, that have been developed and introduced as part of the implementation science literature over time. Over the past several decades, and again the Director mentioned this, there's been this proliferation of theories, models, and frameworks, hundreds across various disciplines, including public health, medicine, agriculture, social work, education, and so on. We want to know which of these TMFs, which of these theories, models, and frameworks, hold promise for particular evidence-based justice programs and practices. So we're seeking to fund evidence-to-action demonstration projects. In this solicitation, we provide some ideas about how existing implementation TMFs, or theories, models, and frameworks, might be used and tested to improve our understanding of their utility. So, for example, when we use a specific TMF, are we better at communicating, helping to implement, and helping to sustain particular evidence-based policies and practices? And are some of these theories, models, and frameworks more effective than others? And, relatedly, will we find that implementation fidelity, or the degree to which an intervention is delivered as intended, is higher when we structure our implementation attempts around particular implementation science TMFs? And what are the impacts on entities of introducing and using implementation science theories, models, and frameworks when working with justice entities to implement evidence-based programs and practices? When we use TMFs, how does that influence the entity's larger approach to learning about and using evidence? How do these models impact the capacity for entities to use and conduct research? So we're really thinking about capacity-building, long-term sustainability, and some of the other features that we know are critical to true implementation that will last long-term.

As the Director pointed out, these are relatively broad questions and categories, and we believe this provides great opportunities for promising projects and exciting, innovative approaches. We really have a great deal to learn in this space, and this is our opportunity with this solicitation. But we do have specific parameters around proposals that will not be considered. So please note that we are not looking for researchers to develop new implementation science theories, models, and frameworks to add to the overwhelming body that already exists; we are looking to test this knowledge. So that's sort of the summary, the overview, the purpose, the goals, the objectives of this solicitation.

We'll move a little bit more quickly through the additional details of this solicitation, but I think some of these are important to point out. In terms of deliverables, applications will have standard grant reporting requirements such as semi-annual progress reports and quarterly financial reports. Award recipients will be expected to submit a final research report, with a draft due 90 days before the end of the award period. They'll also be expected to submit to the National Archive of Criminal Justice Data all data sets that result from funded work, along with associated files, and this includes any technology or software developed. NIJ also expects scholarly products to result from each award in the form of peer-reviewed scientific journal articles or other appropriate products that are listed in the solicitation. Of course, given the nature of this solicitation, NIJ expects that there'll be equal efforts made to make research findings accessible to practitioner and policymaker audiences. Applications must contain particular elements to pass the initial review and be moved to the peer review phase for consideration. These are outlined in detail in the solicitation. You'll find them starting on page 18. These include some of the content listed on this slide (Slide 6), including particular forms, narratives, budget details, questionnaires, and CVs or resumes for key personnel. In addition to minimum requirements, you'll find other required materials listed in that same section.

For those of you who are unfamiliar with the process, NIJ uses a two-step application process. It requires submitting different information in two different systems, each of which has its own deadline. So, first, you'll need to submit two particular forms in Grants.gov. The deadline for this is May 14, 2024, at 11:59 PM Eastern Time. Second, you'll submit your full application, including all applicable attachments, in JustGrants, and the deadline for this is May 28, 2024. But please note the deadline on this is 8:59 PM Eastern Time, not the end of day. Just a couple of additional important notes. Applicants must register with Grants.gov and with JustGrants prior to submitting an application, so please be aware of processing delays. We strongly encourage applicants to register several weeks before the application submission deadline. So even if you're not yet ready to submit your application, please register early. We also urge you to submit applications at least 72 hours in advance of the deadline. This will allow time to verify that all materials were correctly uploaded to JustGrants. Although this is a little thing, it's one with major implications in some contexts. Please label your documents and attachments with filenames that reflect the contents of those documents. It's going to make it much easier to upload the documents to the corresponding sections in JustGrants and avoid any unnecessary delays or confusion over your application's submission. Should you encounter any issues, please reach out for help from our support services. Numbers and email addresses are listed here (Slide 8), but they can also be found in the solicitation itself.

As mentioned previously, applications will first be subject to a BMR, or basic minimum requirements, review to ensure that all required materials have been submitted. Applications that meet BMR are moved to external expert review. Peer reviewers will evaluate applications using the criteria that are outlined starting on page 29. Applications are also reviewed by NIJ science staff and leadership and other federal SMEs as appropriate. All funding decisions are made at the discretion of the NIJ Director, so it might be useful to read the insights that she shared in her Dear Colleague letter that was posted back in February. The link is posted. You'll have access to these slides, but you can also Google and find this.

Here is the weighting scheme that will be used, along with some general recommendations that address concerns commonly raised during the peer review process. So just keep in mind, in the Statement of the Problem section, be sure to identify gaps in the literature in a way that demonstrates a comprehensive understanding of the current research, and this includes an awareness of previously funded NIJ projects. In the Research Design section, ensure that your research strategy is clearly articulated. The research questions should be clearly derived from the literature review, and the research design should flow logically from the literature and the questions. It should be appropriately ambitious, meaning that it is feasible, and it should propose the most rigorous design possible. Under the Capabilities and Competencies section, it should be clear that the staff involved and listed in the proposal possess the necessary proficiency to conduct the proposed analysis. Under Potential Impact, we're looking for strong, innovative, and specific dissemination plans, including efforts to reach non-academic audiences. As mentioned previously, it should also be clear that budgets are sufficient to support the proposed activities.

As you'll see in this solicitation, there's approximately $3 million allocated for awards within this solicitation. The performance period will begin January 1st of 2025. Applicants will be expected to complete the work proposed within a three-year, or thirty-six-month, period of performance. The peer review process will take place close to the solicitation closing and award notifications are generally made in the Fall. Just as a reminder, foreign governments, foreign organizations, and foreign colleges and universities are not eligible to apply and new awardees cannot start work until forms are submitted and approved by NIJ's Human Subjects Protection Officer and the awardee's institution's IRB. 

We'll end by reminding attendees that there are supports and resources for applicants. A few are listed here (Slide 12). There are more listed throughout the solicitation, including on the final page, page 38 of the solicitation. So please, as you're moving through the application process, if you identify more questions or need assistance with anything, take advantage of these resources. At this point, I think we can open this up for Q&A with our audience.

DR. ELIZABETH GROFF: Question number one. “Will you accept projects that touch on both priority areas?”

DR. TAMARA HEROLD: As long as the research design and the research itself is clearly articulated, as I mentioned before, I think that there's natural overlap between the two areas and so I think the answer to this is yes, but I will defer to the Director. We've worked closely [together] on designing this.

DR. NANCY LA VIGNE: Wait. You might need to name which one [which priority area] because, didn't this come up with the community perceptions one last week where you said, “Yeah, it can overlap but essentially choose a lane just so we can get it in the right bucket for peer review?”

DR. ELIZABETH GROFF: Right. So does the preponderance of the proposal deal with one or the other? And that determines which one it goes into.

DR. NANCY LA VIGNE: Does that make sense in this context?

DR. TAMARA HEROLD: I think so. And I think if a demonstration project is involved and you're really working with specific agencies regarding this specific evidence-based program practice, you'd probably go [apply] under Area 2. Understanding that you're going to touch on some of those elements that we identify in Area 1. Again, it's a natural fit. They kind of go together. But, yeah, I think pick a lane, and that's good advice.

DR. NANCY LA VIGNE: And there's some strategy involved too, but I can't really tell you what I would do in that regard. But having spent a long time seeking grants, I might think, “I bet they’re going to get more under Category 1 so I’ll put it under Category 2.” But I don't think that'll make a difference because we parked the money under the entire solicitation, so we're really going to focus on just the top-ranked proposals, regardless.

DR. TAMARA HEROLD: Correct. Yeah, I don't think it would help to think too deeply into how to game the system in this regard because of the way that we set it up. So it's perfect, yes.

DR. NANCY LA VIGNE: There is one way to game the system though. Submit a really strong proposal.

DR. TAMARA HEROLD: Yes. Perfect. Great advice.

DR. ELIZABETH GROFF: Okay. We have another question. “Curious, why no new implementation TMFs?”

DR. TAMARA HEROLD: I have to just share this. I spent my first many months working with the Director of NIJ reading through the implementation science literature across every field imaginable, and it was such a gift and such a privilege because I have not done that since graduate school, just really sitting and learning and reading. But at the end of that, I found myself completely overwhelmed because there were literally, and I mentioned this in the description, hundreds and hundreds of theories and models and frameworks. And in working with other experts in implementation science across various disciplines, I think the Director and I, in our conversations, decided the focus should really be figuring out what works in this space at this point. Even though we seem to be a bit behind in the justice space, with some exceptions, there have been efforts to work in the implementation science arena in our space. But compared to other disciplines, we're kind of behind.

DR. NANCY LA VIGNE: Woefully.

DR. TAMARA HEROLD: Yes. So just looking across what had been produced in terms of implementation science generically across all these different disciplines, my favorite article described the state of implementation science as being in a state of disarray, conceptual disarray because there were just simply so many theories, models, and frameworks. And so the idea isn't to create more of those. We didn't want to continue to build upon that. But, Director, please.

DR. NANCY LA VIGNE: Let me just clarify. We're looking at testing models. We're not looking at developing a new theory of change or a new model. So it could be that you have a proposal that's competitive where you've spliced together a couple different theories or strategies and you're going to test those and you might call it a new framework - call it what you like. What we didn't want to do was invite a whole other "this is how it works" kind of effort because we think there's been a lot of those efforts. We want to test things rigorously.

DR. TAMARA HEROLD: Yes. And to that point, I think the work that has been done in the justice space has done that, has sort of adapted or combined models and then tested those.

DR. ELIZABETH GROFF: Next question. “Do you have an idea of the overall scale of projects, in terms of budget, that you're likely to fund?”

DR. NANCY LA VIGNE: It's the most commonly asked question, and it reminds me of when you're given an assignment in school, the teacher wants you to do a paper, and you ask, “How long should it be, teacher?” The teacher says, “As long as it takes to cover the information concisely and thoroughly.” That’s my way of answering that I don’t know. We don’t know what we’re going to get in terms of applications; all we know is how much money we've parked in the solicitation. We could get a dozen very small, discrete but compelling proposals for evaluations that test different aspects of implementation, or we could get one big demonstration that eats up a fair amount of the budget but is so strong that we're like, “We got to do it.” We don't get very prescriptive at this stage. We don't want you to hold back, nor do we want you to pinch your pennies. Just focus on the topic, the research question, and what it will take to do it well.

DR. ELIZABETH GROFF: Okay. Great. Thank you. I think you just answered the next question. So let's go to the one after that. “For Area 2, are we testing the evidence-based strategy or the selected implementation TMF?”

DR. TAMARA HEROLD: Great question, because they're related. Obviously, we're looking at testing the TMF. But part of that test will involve looking at the outcomes. Do outcomes improve as it relates to using or applying these frameworks to help guide implementation? So it would be impossible to answer the first without looking at outcomes, but the primary goal is not to test, again, one of the evidence-based practices that we know work necessarily. That's already part of the existing crime and justice evidence base. It's more about pulling from that evidence base and then saying, when we're looking at implementation practices, what works and which frameworks hold the most promise for helping us to implement these in ways where we have implementation fidelity and we improve the quality of outcomes. Director, did you want to add to that?

DR. NANCY LA VIGNE: No, I thought that was very well said.

DR. TAMARA HEROLD: Thank you.

DR. NANCY LA VIGNE: You're welcome.

DR. ELIZABETH GROFF: Terrific. All right. “If a proposal submitted under a prior NIJ/DOJ administration was rejected, I assume there should be a mention of that prior rejection, but would you consider this as a completely new proposal? This would have been in 2019.”

DR. NANCY LA VIGNE: It feels like ancient history to me, but you can address it any way you wish. There's an opportunity for you to include a cover memo and say, “This is a resubmission. It was submitted in 2019 under thus and such solicitation, and we received the following feedback (I always include the positive when doing these): the reviewers really liked this and that, but they took issue with these two things. We've addressed those in this current proposal.” That is a generally good strategy to use when you're resubmitting. It's a little questionable when it's a completely different solicitation. So what you know, and I have no way of knowing, is what the content of that proposal was and how it squares with what we're asking for here. I would implore you not to use the solicitation as a mechanism to shoehorn a resubmission that isn't responsive to what we're seeking in this particular solicitation.

DR. ELIZABETH GROFF: Okay. I think that was a very good answer. If there are additional questions, they can always go to the OJP Response Center. All right. Let's see. Do you have to be…

DR. NANCY LA VIGNE: While you're reading, Tamara, what does TMF stand for again?

DR. TAMARA HEROLD: Theories, Models, and Frameworks.

DR. NANCY LA VIGNE: Right. So that's shorthand to say if you can test an entire framework or you can test one strategy or anything in between, right?

DR. TAMARA HEROLD: Correct. 

DR. TAMARA HEROLD: Some of these are subjective distinctions between the theories, models, and frameworks, but that's a whole other discussion, and if you know the implementation science literature, this will all make sense.

DR. ELIZABETH GROFF: Okay. Do you have to be affiliated with a university to conduct the research, or can you work with a law enforcement agency?

DR. NANCY LA VIGNE: I don't remember what the solicitation says. I think you might be allowed to submit as an agency, but we normally see partnerships with research entities, universities, research institutes. It should be in the eligibility at the very beginning.

DR. TAMARA HEROLD: I'm looking right now.

DR. ELIZABETH GROFF: That's right. There's also that brand new one-page summary of all of our solicitations that's up on the solicitation page that you can look at, and it lists all the people who are eligible on there.

DR. TAMARA HEROLD: Under eligibility, we have pretty much everybody except those exceptions that I noted at the end of my discussion: state governments, city or township governments, county governments, and so on and so forth, as well as nonprofits. So that should include pretty much anyone. Obviously, having a research partner strengthens the submission, as the Director points out, but, yes, you can work with a law enforcement agency.

DR. NANCY LA VIGNE: Or the agency could be prime. I will caution that because of the requirements we have to ensure human subjects protections, universities and research institutes are in a much better position to serve as the prime applicant because they know how to navigate those issues to ensure that human subjects' rights and data confidentiality are protected.

DR. ELIZABETH GROFF: Great. Well, this is close to a previous question, but I just want to make sure there isn't a nuance that makes it slightly different. “Will you accept projects that include testing existing capacity building frameworks, which include designing new or adopting existing implementation strategies to address capacity needs/gaps?”

DR. NANCY LA VIGNE: I guess what I heard makes me want to say, as long as you're testing the impact of it. But while you're doing that, I see that Tanya Vandyke, forgive me if I mispronounced your last name, Tanya, wrote in the chat just reinforcing that the solicitation allows for a co-PI. We absolutely welcome partnerships where one Principal Investigator is at a university and another is at a law enforcement agency or any other non-academic entity. That's perfectly fine.

DR. TAMARA HEROLD: And in regard to the question, I just found it. We accept projects that include testing existing capacity building frameworks. Yes. And in fact, capacity building is one of our primary objectives as it relates to leveraging implementation science to help improve outcomes. So the answer to that is yes.

DR. ELIZABETH GROFF: All right, great. “In regards to the new investigator/early career opportunity section on page 10, does the proposed early researcher need to meet all three bulleted qualifications or just one or two?”

DR. TAMARA HEROLD: I believe it's all three. Director, you can correct me if I'm wrong, but the criteria are: a non-tenured assistant professor appointment at an accredited institution, having completed a terminal degree or postgraduate clinical training within the six years prior, and never having previously received NIJ funding as a PI. So I believe it's all three.

DR. NANCY LA VIGNE: All three? I feel like I need to look at it.

DR. ELIZABETH GROFF: Yeah, I'm not sure.

DR. TAMARA HEROLD: We can respond as we check.

DR. NANCY LA VIGNE: Yeah, we'll respond.

DR. TAMARA HEROLD: We can respond in writing.

DR. NANCY LA VIGNE: I just have to read it. Sometimes I have to look and listening isn't good enough. [Note: This question is answered later in the webinar.]

DR. ELIZABETH GROFF: I understand. So let's move on. We can always come back to that. “Regarding topic area number one, are there particular audiences for translation and dissemination of evidence that are of interest to NIJ?” And then they give some examples, high-level policymakers, local policymakers, agencies, community-based organizations.

DR. TAMARA HEROLD: We are interested in all of them. We kept it broad. We are interested in what reaches the change agents. Those who help to craft and implement policies. We're looking for change agents in the field and we are not particularly interested in any specific audience.

DR. NANCY LA VIGNE: Right. And the only thing I would add to that is that depending on the topic, you're going to have a different array of those audiences or stakeholders. So we don't want to prescribe one over another, not knowing what the topic is.

DR. ELIZABETH GROFF: Okay. “Are there limitations on foreign consultants, partners, et cetera if the prime is US?”

DR. NANCY LA VIGNE: Not to my knowledge, as long as it goes through a US entity for the funding and they have an ability to issue a sub-award or consulting agreement to that person or entity, then it should be fine.

DR. ELIZABETH GROFF: But remember, every situation is unique and so you might want to contact the OJP Response Center with your specific situation, right?

DR. TAMARA HEROLD: Yes, for clarification.

DR. ELIZABETH GROFF: All right. Next question. “Within topic area two, does the evidence-based practice/program being implemented need to be independently funded?”

DR. TAMARA HEROLD: I'm imagining that that's sort of like, just thinking in the policing sandbox, would we pay for officer overtime to implement the strategy that then is being evaluated in that regard? I'm sorry, go ahead, Liz.

DR. ELIZABETH GROFF: Caleb, please add to that, but I think that sounds plausible. Tamara, if you want to continue your answer?

DR. TAMARA HEROLD: No, no, please, please, please. I think it's better if we understand what Caleb's asking.

DR. NANCY LA VIGNE: I just put the text of the question into the chat, and just to underscore Tamara's interpretation, it looks like it's all three of those. [Applicants must meet all three requirements to qualify for the early career investigator award.]

DR. TAMARA HEROLD: Great.

DR. ELIZABETH GROFF: Okay, great. This is the early career investigator answer. Thank you. All right. And Caleb says, yes, Tamara's frame of the question was correct. So please continue with that, Tamara.

DR. TAMARA HEROLD: Yes. So I could imagine that that would eat up a massive portion of the budget, right, Director? So what would be your response to that?

DR. NANCY LA VIGNE: We're not at liberty to say. It's yours to decide. Implementation expenses are allowable costs, but it's how you choose to use your budget, understanding that the main point of the solicitation is to invite rigorous tests of various methods, various strategies, whatever you're calling those things.

DR. ELIZABETH GROFF: All right, great. Next question. “Are we encouraged to have multiple demonstration sites to test the framework or focus on just one?”

DR. NANCY LA VIGNE: The choice is yours.

DR. ELIZABETH GROFF: All right. “When understanding uptake/engagement of a particular model, what stage of the program would be most relevant for the solicitation? For example, could this support stakeholder engagement in a pilot?”

DR. TAMARA HEROLD: Yeah. I think from A to Z, thinking through how we get evidence into practice, so stakeholder engagement would certainly be appropriate. Director?

DR. NANCY LA VIGNE: But yes, I just want to caution that when we say that would be appropriate, we're not saying that we prefer that over other stages.

DR. ELIZABETH GROFF: Okay. Great. “You prioritize interdisciplinary research. How do you define interdisciplinary? Would an anthropologist, sociologist, and political scientist be considered interdisciplinary, or are you looking for not only social scientists?”

DR. NANCY LA VIGNE: I would say that example sounds like the beginning of a joke: all three of them walk into a bar. And it would be a really geeky joke that we would find funny and most people wouldn't. That example would be interdisciplinary, and an example that combines a social science perspective with a hard science perspective and discipline would also qualify. So we're pretty loose when it comes to how you describe interdisciplinary.

DR. ELIZABETH GROFF: Okay. “How many letters of support are too many within our application? While I want to show stakeholder support, this proposed project would affect quite a few departments/entities.”

DR. NANCY LA VIGNE: There's no such thing as too many. Put them in an appendix. They don't count toward the page limit.

DR. ELIZABETH GROFF: Okay.  “Is this for one award up to $3 million?”

DR. TAMARA HEROLD: I think the Director answered that previously. This is our total. These are the funds that we have available for the solicitation. We're willing to entertain several smaller, really rigorous proposals or if there was one that took up a significant proportion of that money and it was just spectacular, we're willing to entertain that as well. We have a lot of leeway here and discretion and we're just looking for the strongest proposals possible that will have the greatest impact in the field and generate the most knowledge.

DR. ELIZABETH GROFF: All right. “Is there any extension on the six years requirement for the new investigator due to some reasons such as pregnancy or childbirth?”

DR. NANCY LA VIGNE: Yeah, I'm not aware of any exceptions for that. We can bring it back to others in the office. I'm going to guess no, but we'll track it down and it might lead to revisiting the policy in the future. But that won’t happen quickly enough to be applicable to this solicitation.

DR. ELIZABETH GROFF: Right. But also, if we get any more information, we'll put it in brackets in the transcript.

DR. NANCY LA VIGNE: Right. We'll update it. [Note: This question is answered later in the webinar.]

DR. TAMARA HEROLD: There's a question. “Is it $3 million total available per year or for the three-year performance period?” And that would be not per year. This is the total performance period.

DR. ELIZABETH GROFF: Is there anything that you two didn't say that you wish you had said? 

DR. NANCY LA VIGNE: I really don't want to see one proposal for $3 million. There, I said it.

STACY LEE: Okay. Do you want to wrap up?

DR. NANCY LA VIGNE: Sometimes there are some questions that trickle in. So we'll give it another minute.

DR. ELIZABETH GROFF: Yeah, the poor person's on the fence. Should I really ask this?

DR. TAMARA HEROLD: Do it. If you're on the fence, please ask the question.

DR. ELIZABETH GROFF: You are getting some thanks for your time and support.

DR. NANCY LA VIGNE: That's nice.

DR. ELIZABETH GROFF: We have a question. “Do you have a summary paper on implementation frameworks?”

DR. TAMARA HEROLD: There are citations in the actual solicitation that will guide you towards some really useful resources that will give you insight into the implementation science literature. Obviously, there’s an enormous amount of literature that's out there, but we did try to be strategic in terms of pointing you to relatively clear, relatively comprehensive resources. I hope that that's helpful.

DR. NANCY LA VIGNE: Understanding the literature is an important part of the proposal, demonstrating that whatever it is that you're testing has a grounding in the literature and why, at least theoretically, it should be an effective model or strategy.

DR. TAMARA HEROLD: I think we also cite some helpful websites in the solicitation if you’re looking for what might be the most promising approach for whatever evidence-based policy or practice you’re looking at. That should help guide you and give you some ideas, some direction.

DR. ELIZABETH GROFF: I’ll read it to you so you can see it here too. “You all are providing such insightful information in a clear and collaborative manner, which is lovely. Thank you. Will we get access to you during the process through the resources provided or is that another team of folks?”

DR. NANCY LA VIGNE: During the open solicitation period, we cannot engage with you on your ideas, as much as we would like to. Doing so would not ensure a level playing field: some people know that they could ask, and some people don’t. Also, NIJ is not like NSF and other funders that have lots of staff to go around; the same people who develop the solicitations manage the process of reviewing the applications. So we don’t have any way to maintain a firewall, and unfortunately, we can’t provide guidance to you on specific proposal ideas.

DR. ELIZABETH GROFF: Thank you. Okay. We do have two other questions.

“What grading systems will you accept for what constitutes an evidence-based practice? For example, if you’re looking at juvenile justice and child welfare systems, do you accept practices rated on the California or Title IV-E clearinghouses?”

DR. TAMARA HEROLD: Yeah, the proliferation of theories, models, and frameworks is very real, and so is the proliferation of clearinghouses; they’re everywhere and there are lots of them. The idea is not that we would prioritize one evidence-based practice over another, or say that there’s not enough evidence around something to suggest that it would be useful. The idea is just that the practice has been tested previously and rigorously, and has been found to work in at least some particular context. That is enough to suggest that the practice is worthy of assisting with this particular exploration of what helps to implement it. Director, would you agree?

DR. NANCY LA VIGNE: Perfectly said, yeah. Thank you.

DR. ELIZABETH GROFF: All right. Great. And here’s another question. “You’ve mentioned before that the current field has hundreds of models and frameworks across various disciplines in implementation science, especially in fields like public health. To what extent should we be prepared to discuss this literature when discussing current gaps in the statement of the problem?”

DR. TAMARA HEROLD: I’d love the Director to also add to this. But from my perspective, it’s understanding what’s been done around the particular theory, model, or framework: where it’s been applied, what other disciplines have found so far in leveraging that framework, and why you think it might be appropriate for the particular evidence-based program or practice. I don’t think any of us anticipate, given the page limitations, that you’ll have a complete summary of all the implementation science literature that’s out there. But strategically, as you’re thinking through how you want to describe why you’re approaching something in a particular way using that model or framework, being familiar with the literature around that theory or model is going to be absolutely critical, and so is explaining why you believe it can now be applied in this new context, along with the literature around that evidence-based practice. Director?

DR. NANCY LA VIGNE: Yeah, I mean, Tamara, perfect. I don’t know what I would add other than: be strategic about how you use the scarce resource called your page limit. I would remind you also to look in the solicitation at what percentage of weight goes to different components of the proposal, from the problem statement to the methodology, the project's capabilities, et cetera. That can help guide you so that you’re not spending too many pages on that and not talking enough about methodology or whatever it is. I often share that the lit review should really be a story that makes the case that what you’re proposing to do makes sense, where it fits into the existing knowledge base, and how it will contribute further to that knowledge base.

DR. TAMARA HEROLD: It’s a really good question because there’s so much out there. So, thank you for that question.

DR. ELIZABETH GROFF: One of our senior colleagues weighed in on the question of whether we can extend the six-year window for a new investigator. And they said we cannot under current NIJ policy. [This answers the previously posed question concerning extensions to the new investigator eligibility window.]

DR. NANCY LA VIGNE: Thank you for confirming, whoever that was.

DR. ELIZABETH GROFF: I'll tell you later. Okay. Another question from an attendee. “Does research on systems and organizational processes that support the delivery of evidence-based action fit in this solicitation?”

DR. TAMARA HEROLD: I don't know which systems or organizational processes are meant, but my immediate response to this is yes.

DR. NANCY LA VIGNE: Sounds like a yes. Not knowing any more detail.

DR. TAMARA HEROLD: And it's probably aligned with having a large-scale impact, which is good.

DR. ELIZABETH GROFF: Let's see. “I've been waffling on this since I think it might be an embarrassing question. This is my first grant application. Here's the actual question. Are local government entities still competitive applicants? Our scale of impact as a result of the project may not be as large as a city or state agency's, and I'm not sure if that's a factor during the review process.”

DR. NANCY LA VIGNE: This is an excellent question. I'm so glad you waffled on the side of posing it, because we at NIJ have waffled over the years, for good reason, on whether we want to fund one-site studies, because one-site studies have what's known as limited generalizability. We know it works in one place, but how do we really know it works in another, because it's going to be a different place, different context, different populations, et cetera. I think that the field has backed away a bit from the notion that we need the three to five-site studies for it to be credible. Certainly in the case of this solicitation, we're really interested in just seeding tests of different strategies, and a lot of those strategies play out at the local agency level at a single site. So I think it's absolutely responsive to focus on agency-level intervention.

DR. ELIZABETH GROFF: We've gotten to the end again.

DR. TAMARA HEROLD: I know we received some thanks. Thank you to the attendees. This has been great.

DR. NANCY LA VIGNE: But I do want to thank you, Tamara, again, and Liz and Stacy and, who else, Daryl, who has been supporting us behind the scenes. It takes a lot to put these webinars on. It takes even more to craft these solicitations and shepherd them through the process. So thanks to team NIJ.

DR. TAMARA HEROLD: Yes. And again, thank you to all the attendees and those of you who have interest in this solicitation. We're really excited to see the applications.

DR. TAMARA HEROLD: Thank you for the thank you. Everybody, thank you.

DR. ELIZABETH GROFF: Yes, thanks for coming.

DR. NANCY LA VIGNE: Thank you.

STACY LEE: On behalf of the National Institute of Justice, thank you for joining today's webinar. This will end our presentation.



Date Published: April 19, 2024