Archival Notice
This is an archive page that is no longer being updated. It may contain outdated information and links may no longer function as originally intended.
This webinar provided an overview of the NIJ FY 2024 Measurement of Community Perceptions solicitation, in which NIJ seeks applications for research on, and demonstration projects of, improved approaches to measuring community perceptions of policing and public safety. Specifically, NIJ seeks to fund research that develops and tests innovative methods to provide representative samples that are accurate across microgeographic areas, cost-effective, able to be implemented frequently, and scalable for use in municipalities of various sizes.
Transcript
STACY LEE: Good afternoon and welcome to the Fiscal Year 2024 Measurement of Community Perceptions Solicitation Webinar. It is my pleasure to introduce Dr. Elizabeth Groff, a Senior Advisor at the National Institute of Justice.
DR. ELIZABETH GROFF: Good afternoon and welcome. I'm Elizabeth Groff, a Senior Advisor at NIJ. And I'm joined by our Director, Nancy La Vigne, and Senior Computer Scientist, Dr. Joel Hunt. The agenda for today's webinar is as follows. We're going to begin the webinar with some introductory remarks from Director La Vigne. Then we'll discuss the motivation for the solicitation and describe the set of desired characteristics we have for any proposed method. From there, we'll explain the structure of the solicitation and provide some information about how your proposal will be evaluated. We'll end with answers to the written questions received in advance and provide the opportunity for you to ask questions right now. At this point, I'm going to turn it over to Director La Vigne for her introductory remarks.
DR. NANCY LA VIGNE: Thank you, Liz. And good afternoon, everyone, or, I guess, still morning for some of you on the West Coast. I am Nancy La Vigne, Director of the National Institute of Justice. I'm so pleased that we're delivering this webinar today about this particular solicitation. We're aware that when we release new solicitations, there are often more questions from the field than for those we repeat year after year. This one is a little bit different, and I know that Liz will be tracing the history and describing the impetus behind why we thought to invite proposals around measuring community perceptions. So, Liz, I hope I'm not stealing your thunder, but I do want to share a little bit more about this one because it's a topic I care very deeply about.
It’s related to some research I've done in the past and some challenges I've experienced, and then just overall observations of where the field is and where we need to go to have better measures of community perceptions around crime and safety generally but, more specifically, how they're viewing the police, their trust in police, their belief that police are interacting with members of the public in ways that are objective and unbiased and fair and procedurally just. All these issues that we know comport with the kind of policing that we want to see.
Yet, if you look at ways to measure community perceptions of policing and public safety, they tend to be very coarse in nature and imprecise as well. I think the most common measurement of public perceptions of policing is through police satisfaction surveys. These are surveys that are administered typically by the police agency itself. Given resources, it's not surprising that they take the path of least resistance, which is basically to send a survey in the mail or by email, but that reaches only the people who already subscribe to follow the law enforcement agency or perhaps have a link to a place where they can respond online. We know that all those methodologies underrepresent people who are in the highest crime, most heavily policed communities.
Those same biases exist even in very rigorous surveys. Of course, we can go to great pains to look at responses and weight them based on differences in response rates by demographic characteristics, but we're still not getting at variations by local geographies. And this actually connects to some of the research I've done in the past, as well as that of Dr. Groff: we know that all crime is local, and we also know that a lot of police responses are very concentrated in small geographic areas. So experiences are going to vary, certainly within jurisdictions, but even within places that are called “communities,” which are still very large. We know that because those patterns are different, likely responses are different too, but we're not capturing them. I think it's important to capture this, and capture it over time, in a way that's affordable, because we want to see how changes in police policies and practices relate, or don't, to changes in public sentiment.
I think it's also important to look at whether changes in public sentiment are maybe immune to local police practices but very influenced by national events that might not take place anywhere near the community at all. You have an event like the one in Ferguson or the killing of George Floyd and you'll see public sentiment change regardless of the location. So I think there's a lot more we can learn, both in getting more precise about representing people who are often not heard but are arguably the most important stakeholders when it comes to policing high-crime areas, which largely tend to be communities of color, and in getting at more microgeographic levels, collecting this data with a sense of confidence and rigor. That's the goal of this solicitation, very broadly. Again, Liz will get into a lot more detail on that. I will be lingering to answer questions along the way.
But I did want to signal some of the priorities that I brought to NIJ as Director, almost two years ago now, that we've embedded into all of our solicitations, this one included. One is focused on what we call inclusive research. This is making sure that whatever the methodology is, whatever the research question is, there's some way that we're taking time to engage with the people who are closest to the issue or problem under study. I think that's really important. It doesn't mean that we're talking about full-on qualitative or full-on community-based participatory research. In fact, we like to recommend more of a mixed methods approach. But we do think it's important to include those perspectives in research undertakings.
We also like to invite people to focus on research that takes a racial equity lens - and really an equity lens writ large across all manner of demographics and identities. We know that a lot of issues of racial disparities are kind of baked into the criminal justice system, dating back to this country's history with slavery and Jim Crow, mass incarceration, heavy police presence, all of these things that feed into the data that we use and how we interpret it, and even our own methodologies. We encourage you to keep those in mind as you prepare your applications.
We're also very interested in what we call “evidence to action”: proposals that really take the time to focus on creative dissemination activities that don't just bring the evidence to light - the evidence that's generated through the research - but also ensure that that evidence gets into the hands of people who can make a difference on the ground. And in the case of this solicitation, for example, you can combine that goal of evidence to action with the goal of inclusive research. So if you're proposing a way to collect information about community sentiment in small geographies in ways that are credible and useful and affordable, and you can do that measurement over time, how are you bringing that information back to the community? How are you bringing it back to city leaders, to stakeholders, to police executives, and, importantly, to the residents of those communities?
I'll stop there and turn it back to Liz. I will stay on to take part in Q&A at the end. Thank you.
DR. ELIZABETH GROFF: I want to start by saying that this webinar is providing an overview and a chance for you to ask questions, but all the details are in the solicitation. So, I urge you to read it carefully as you are thinking about developing a proposal and as you are submitting that proposal. Let's begin with some information about our motivation for offering this solicitation. First, consistent rigorous measurement of community perceptions can provide critical feedback to police, city managers, and community members. Such feedback can support the development of new policies and programs, as well as the evaluation of current practices.
Although probability surveys have long been the gold standard, they've traditionally been expensive and time-consuming. Unfortunately, this meant that they were not deployed frequently, if at all. In addition, technological and cultural shifts are leading to lower response rates and making probability surveys more vulnerable to nonresponse error. This has led to the greater use of non-probability surveys, but they also have significant drawbacks when determining sample representativeness. At the same time, new methods have emerged that leverage the increased accessibility of big datasets to gauge community perceptions. NIJ's goal with this solicitation is to develop methods and analysis techniques that lead to more accurate measurement of community perceptions. Next, let's discuss the core characteristics we are looking for in the proposed methods.
New methods proposed under this solicitation should have the following characteristics. They should accurately represent the characteristics of the community on the key dimensions of race, ethnicity, age, and gender. They should be cost-effective to allow for frequent measurement. They should produce estimates at small geographies to reveal differences in the patterns of perceptions within a city. They should be capable of administration on a systematic basis to produce a longitudinal record of changes in perception. They should be scalable for use in jurisdictions of various sizes.
Now, let's turn to the structure of this solicitation. Proposals will be accepted under two topic areas: survey methods and existing data analysis. Within the survey category, the focus is on developing innovative survey methods, sampling strategies, and survey technologies. All proposals under this topic should use a survey instrument. Within the data category, topic two, the emphasis is on gathering information without directly contacting members of the community. Entries for this might use social media, administrative data, or other proxy data. They could use a variety of different tools and techniques, such as natural language processing. We encourage you to bring us your best ideas. Within the survey category, all proposals should use a survey instrument that has already undergone pilot and cognitive testing. Two such instruments are from the Bureau of Justice Statistics: one is the Local-Area Crime Survey and the other is the redesigned NCVS instrument. There are questions on both that have to do with community perceptions of public safety and demographics.
Examples of responsive proposals include ones that develop new approaches or adapt approaches from other fields, or that create new methods or analyses with the goal of generating representative estimates. These include techniques for weighting, imputation, and nonresponse bias analysis. Responsive proposals might pair the BJS Local-Area Crime Survey questions 10 through 20 and demographics with the new methods they're proposing. Responsive proposals might also use technology solutions that do one or more of the following: test survey administration techniques, create new sampling strategies, generate more complete sampling frames, or increase survey response rates.
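[To make the weighting techniques mentioned above concrete, here is a minimal sketch, not from the solicitation and using invented data, of raking (iterative proportional fitting), one common way to adjust survey respondents' weights so that sample margins match known population targets such as census demographics:]

```python
# A minimal sketch of raking (iterative proportional fitting).
# Respondent records and population margins below are hypothetical.
respondents = [
    {"race": "black", "age": "18-34"},
    {"race": "white", "age": "18-34"},
    {"race": "white", "age": "35+"},
    {"race": "black", "age": "35+"},
    {"race": "white", "age": "35+"},
]
# Target shares for each dimension, e.g., drawn from census data.
margins = {
    "race": {"black": 0.45, "white": 0.55},
    "age": {"18-34": 0.40, "35+": 0.60},
}

weights = [1.0] * len(respondents)
for _ in range(50):  # iterate until the weighted margins stabilize
    for dim, targets in margins.items():
        # Current weighted total of each category on this dimension.
        totals = {}
        for r, w in zip(respondents, weights):
            totals[r[dim]] = totals.get(r[dim], 0.0) + w
        n = sum(weights)
        # Scale each respondent's weight toward the target share.
        for i, r in enumerate(respondents):
            weights[i] *= targets[r[dim]] / (totals[r[dim]] / n)

print([round(w, 2) for w in weights])
```

[The printed weights up-weight underrepresented groups and down-weight overrepresented ones; production code would add a convergence check and trimming of extreme weights.]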
Successful proposals will include the deployment of the instrument and testing to quantify the level of representativeness achieved. They are not exclusively qualitative but can include qualitative elements. They are encouraged to include community engagement when appropriate. Recall that the point of this solicitation is to improve the quality of community perception measures with the goal of deployment by local governments.
Turning now to topic two, existing data. We expect all proposals under this topic to pilot the methodological innovation they're proposing and provide rigorous testing of the results. Proposals should put forward methods that accurately measure key constructs of community perceptions, such as, but not limited to, fear of crime, police accountability, police bias, trust in police, and community policing. They should be accurate across microgeographies no larger than census block groups, be cost-effective and scalable, and be capable of frequent implementation. We are interested in proposals that identify novel data sources or that develop methods and technologies for processing data to enable the application of new forms of data mining. At the same time, applicants should discuss potential harms and risks to individuals from the use of existing data and provide a plan to mitigate such risks.
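[As one hypothetical illustration of the kind of non-survey technique topic two invites, the sketch below applies an off-the-shelf natural language processing tool, VADER sentiment scoring from the NLTK library, to invented social media posts about local policing. A real proposal would still need to tie posts to small geographies, assess representativeness, and address the risks discussed above:]

```python
# A hypothetical sketch: scoring the sentiment of public posts about
# local policing with NLTK's VADER analyzer. The posts are invented.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
analyzer = SentimentIntensityAnalyzer()

posts = [
    "Officers at the community meeting actually listened to us.",
    "Another traffic stop on my block for no reason. Tired of this.",
]
for post in posts:
    # Compound score ranges from -1 (most negative) to +1 (most positive).
    score = analyzer.polarity_scores(post)["compound"]
    print(f"{score:+.2f}  {post}")
```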
Shifting gears from the substantive elements of this solicitation to the scenario in which your proposal is funded: there are several required deliverables. You will have the standard grant reporting requirements, such as semi-annual progress reports and quarterly financial reports. Additionally, any recipient of an award under this solicitation will be expected to submit a final research report and provide a draft 90 days before the end of the award period. Award recipients will also be expected to submit to the National Archive of Criminal Justice Data all datasets that result, in whole or in part, from the work funded by the award, along with the associated files. This includes any technology or software developed. In addition to these deliverables, NIJ expects scholarly products to result from each award under this solicitation, taking the form of one or more peer-reviewed scientific journal articles or other appropriate products as listed in the solicitation. NIJ expects that there will be an equal effort to make such research findings accessible to practitioner and policymaker audiences.
Now, on to the nitty-gritty of submitting. There are certain elements of an application that must be submitted for the application to be considered. If even one of these elements is missing, the application will not move on to the review process. The following elements must be included in the application to meet the basic minimum requirements and advance to peer review and receive consideration for funding. These are on page 19 of the solicitation and include forms SF-424 and SF-LLL — and these go to Grants.gov — the Program Narrative, budget details and narrative, and the CVs or resumes for key personnel. In addition to the minimum requirements, there are many other requirements that need to accompany an application for full consideration. I've listed a few here, but they are all listed on page 20, where a full explanation and additional application checklist can be found.
NIJ has moved to a two-part application process that requires submitting different information in two different systems, each of which has its own deadline. First, you'll need to submit the SF-424 and SF-LLL in Grants.gov by May 2 at 11:59 PM. Second, you'll submit your full application, including all applicable attachments, in JustGrants. The deadline for that is May 16 at 8:59 PM, not the end of the day like it was for Grants.gov. So please be aware of that and make sure that you submit ahead of that time. A couple of important additional notes: applicants have to register with Grants.gov and JustGrants prior to submitting an application. There have been processing delays of up to several weeks, so make sure that you register for these in the next couple of weeks just so it's done and off your plate. If you do not submit your forms in Grants.gov by that May 2 deadline, your application in JustGrants will not be accepted. When submitting your application, we urge you to submit at least 72 hours prior to the deadline to allow time for you to receive a validation message, yay, or a rejection notification from Grants.gov. This way, you'll be able to correct in a timely fashion any problems that may have caused a rejection notification. This will also allow you time to verify all materials were correctly uploaded to JustGrants. We recommend that you label your documents and attachments appropriately, with the filename indicating what the document is, and upload each document to the corresponding section in JustGrants.
There are many ways to get support and they're listed up here on this slide (Slide 12). They're also in the solicitation, as well as the contact information for each of these ways of getting support.
Let's turn to the application review process. So what happens after you turn in your proposal? In the application review, we first determine whether the applications meet Basic Minimum Requirements or BMR. That's just making sure that the Program Narrative, Budget Detail and Narrative, and CVs have been submitted and they have content. All applications that meet BMR are moved to external expert review. Peer reviewers provide us with scores and comments. Applications are also reviewed by NIJ science staff and leadership and other federal SMEs as appropriate. All funding decisions are made at the discretion of the NIJ director, who you met earlier. For additional insights beyond what she said this morning, or this afternoon, see her Dear Colleague letter, and it's easy to Google if you don't want to type in the URL that I've provided here. (https://nij.ojp.gov/sites/g/files/xyckuh171/files/media/document/nij-dear-colleage-letter-to-potential-applicants-2024.pdf)
The next two slides flag some common substantive critiques raised during the peer review process across the hundreds of applications submitted to NIJ each year. In the Statement of the Problem section, sometimes applicants don't identify gaps in the current literature or demonstrate an understanding of current research, the literature review is sometimes insufficient, and the scope of the proposed research is extremely limited or, conversely, too ambitious. In the Research Design section, sometimes we see that the research questions are not derived from the literature review or are inadequately specified. Sometimes the research design is too ambitious and too complex to be feasible, or fails to clearly lay out the proposed research and analysis plan.
Under the Capabilities and Competencies section, sometimes the proposed staff does not demonstrate familiarity or proficiency with the proposed analysis. In the Potential Impact section, we see instances where the dissemination plan lacks specificity, is not innovative, or includes no plan to reach non-academic audiences. For this solicitation, it's important to make a case for how the research results will meaningfully contribute to measuring community perceptions and eventually to improving safety and justice. As well, the dissemination plan should detail how the research results can be made actionable by local governments.
Overall, some key takeaways from these critiques are that research proposals need to be well-written, feasible, impactful, timely, innovative, and clear. Proposals should demonstrate an understanding of the current needs, the existing literature, and the work that NIJ has already funded. The application itself should be easy to read and explicit, leaving no mystery around what's being proposed or how it'll be achieved. The research design should be as rigorous as possible, and the sampling strategy should be backed with demonstrated relationships or letters of support that will make it attainable. The application should articulate the extent and importance of the project's impact on the field. That is, applicants should ensure that the findings from the proposed research will have the potential for high external validity beyond the focus of the study itself. Applications should also demonstrate how research independence will be maintained.
Now, I'll address some frequently asked questions. The total amount to be awarded under this solicitation is $1 million. The period of performance will start January 1, 2025. Applicants will be expected to complete the work proposed within a three-year (36-month) period of performance. The peer review will start close to the solicitation closing, and award notifications are generally made in the fall. As is typical, foreign governments, foreign organizations, and foreign colleges and universities are not eligible to apply. Also, new awardees cannot start work until the human subjects forms are submitted and approved by NIJ's human subjects protection officer and the awardee institution's IRB. If you have any questions about these types of issues, you should go to our OJP Response Center, and here's that information again. A list of resources is included on page 40 of the solicitation, but I'm also emphasizing them here.
To ensure costs that you're proposing are allowable, we strongly encourage applicants to review the Funding Resource Center for additional information and helpful guidance. We also encourage applicants to review the department's Grant Financial Guide and take the online training. Finally, project descriptions and an overview of the portfolio are available on the NIJ website. Review this information for an idea of the types of projects and awardees that NIJ has funded in the past.
At this point, we're going to turn to questions, and I'm going to start with the questions that were submitted in writing ahead of the webinar. The first question is, “Do surveys utilizing only survey questions from the LACS or redesigned NCVS need to be piloted?” No. We chose them to allow investigators to concentrate on testing their proposed methodological improvements. However, they still need to be administered.
Question two, “Can projects intending to use LACS or redesigned NCVS items include additional demographic questions from the NCVS or LACS questionnaire? Or is it limited to only those items listed in the RFP?” Yes. Demographic questions can be included from those questionnaires and should be included. The purpose of specifying questions was to focus applicants' attention away from question development and toward survey sampling and analysis, not to limit projects to those specific questions.
Question three, “There are two questions labeled as number 30 on the NCVS questionnaire, BJS is not going to like that, but we should confirm that both questions could be used and count as questions 21 to 33?” Yes. That's absolutely right. They can.
Another question, “Could NIJ clarify what they mean by testing representativeness? For instance, would using a technique like multilevel regression and poststratification be considered a test of representativeness? How might researchers ensure representativeness for populations for whom nationally available population estimates are limited?” Answer: primarily, tests of representativeness concern the extent to which the sample respondents are like the target population, for example, across dimensions like race, ethnicity, age, and gender. The types of tests will vary based on whether it's a probability or nonprobability survey. References are included in the solicitation for each type. This solicitation seeks survey designs targeting within-municipality variation in community perceptions. A national-level survey would not be responsive.
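[As a simple, hypothetical illustration of such a test, the sketch below compares the demographic composition of respondents in one small area against census target shares using a chi-square goodness-of-fit test; all figures are invented, and model-based approaches such as multilevel regression and poststratification would be more sophisticated alternatives:]

```python
# A hypothetical sketch of one basic representativeness check: does the
# sample's demographic breakdown in a block group match census targets?
from scipy.stats import chisquare

observed = [62, 18, 12, 8]                   # respondents by demographic group
population_share = [0.50, 0.25, 0.15, 0.10]  # census shares for the same area
expected = [sum(observed) * p for p in population_share]

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")
# A small p-value suggests the sample deviates from the target population
# on this dimension; a fuller analysis would repeat this across race,
# ethnicity, age, and gender, and across block groups.
```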
I think we're up to question five. “Regarding topic one, we see a clear focus on small geographic units, but our thoughts jump to sampling and measurement of one or more groups that are not first defined by geography. While we would like to include a geographic component, city or county being the most logical and likely, we worry that focusing on such a small geographic area would preclude what we see as an important line of study that is otherwise responsive to this solicitation. We hope you can advise whether such a proposal would be considered responsive or nonresponsive.” Answer: proposals that produce estimates for cities and other larger geographic areas would not be responsive to the solicitation.
Six, “I'm wondering if there's any openness at the NIJ to funding scalable qualitative data collection initiatives. My guess is that this opportunity is more focused on quantitative approaches but thought I'd inquire before assuming.” Good strategy. Always good to inquire. Scalable qualitative or mixed method approaches may be responsive if they are also cost-effective.
Question seven, another multi-part, “We have a project plan that would involve software development and community engagement and we are concerned about the feasibility given the budgetary constraints. Are we correct in interpreting that the $1 million is for the whole program, not per individual effort? And if it is for the whole program, how many awards do you anticipate?” A total dollar amount of $1 million is available for the solicitation. We do not have a specific number of awards in mind. We are seeking the best return on investment. If a single proposal provides that, we'll fund one. If it takes multiple proposals, we'll fund multiple proposals.
Question eight, “How much focus is on projects specifically targeting policing perceptions versus public safety and/or other aspects of the system such as courts, parole, and safety?” The focus is on perceptions of public safety generally (see the suggested questions in the solicitation). And again, we're interested in methodological innovations with this solicitation.
Which leads me to the final question, “Is this specifically focused on survey methodology?” and the answer is yes. So that concludes the questions that we received in advance, and I'm going to get some help from my dear colleague, Dr. Joel Hunt, at this point, to do the rest of the questions you're asking in the session.
DR. JOEL HUNT: Yes. I've been trying to keep up with all of them and put them in the order that we've received them. So the first one is, “How does this solicitation align with new bills passed, like the one in Utah, State Senate Bill 124, which for people not familiar with it is predominantly around law enforcement early intervention systems?”
DR. ELIZABETH GROFF: That bill is predominantly looking at trying to better measure individual officers, whereas in this solicitation, we are measuring community perceptions of public safety and police organizations. So while there might be some relationship between those, we are not trying to measure perceptions of individual officers.
DR. NANCY LA VIGNE: Correct, but to be clear and to emphasize Liz's point, we're trying to see how we can measure community perceptions at a local level. My understanding of that piece of legislation is that it is officer specific. Relatedly, I think our challenge in communicating our goals for this solicitation is that we don't want to invite proposals that say, “We're going to create a survey.” You might have a survey in hand, in which case, fantastic [as long as it’s a survey that has already been piloted and undergone cognitive testing]. If you don't, we offered up one for you to use. But we don't want to focus on survey instrument development. We really want to focus more on innovative methodologies to get at some key concepts, key perceptions of safety, key perceptions of general concepts of procedural justice. Those don't seem to be aligned with that particular legislation, which is really more officer focused.
DR. JOEL HUNT: The next one is, “Is this an appropriate place to seek funding if I'm conducting research on individuals' perceptions of judges who show empathy towards offenders, with offender characteristics manipulated?”
DR. ELIZABETH GROFF: That does not sound responsive.
DR. NANCY LA VIGNE: It doesn't sound responsive to me either.
DR. JOEL HUNT: I agree.
DR. JOEL HUNT: “For the Topic Two ‘Existing Data,’ I assume we should use some existing datasets, but then what counts as identifying novel data sources? Does that mean we can find new datasets not commonly used in current practice?”
DR. ELIZABETH GROFF: Yes. That's what I was thinking when I wrote that. It shouldn't preclude data collection if you have a way to do it that doesn't involve a survey instrument. I know I call it existing data because it was hard to come up with a good label for that. But it's really ways of collecting information about community perceptions without asking them via survey. [The intent of this solicitation is to identify data sets (most likely but not only existing) that either have not been used effectively in prior research or have not been used in prior research at all. That is what makes them novel.]
DR. NANCY LA VIGNE: That's right. And I was actually going to clarify, and in the process of clarifying, I saw this exact question come in, which is: can a proposal include both original data collection through surveys and the use of existing data? The answer is yes.
DR. ELIZABETH GROFF: Yes. And I believe that we specify that applicants are supposed to apply separately to those two different topics.
DR. NANCY LA VIGNE: But there is an option to do a combo; the applicant just has to choose which lane, right, Liz?
DR. ELIZABETH GROFF: There's a statement in the solicitation, I know there is because we went back and forth on this so many times. While we're thinking about another question, let me pull up the solicitation.
DR. NANCY LA VIGNE: Okay. Thank you.
DR. JOEL HUNT: “Does NIJ have an estimate on the number of awardees that it anticipates granting on the solicitation?” And Liz went into this at the end that it could be one, it could be two, it could be three, right? Depends on what is the best way to spend our million dollars, essentially.
DR. ELIZABETH GROFF: Excellent answer.
DR. JOEL HUNT: Which kind of answers the next question, “Is the award $1 million across all accepted applications or up to $1 million per application?” So again, you can come in at $1 million, and if that's the best investment from NIJ's perspective, it is a possibility.
“Could you elaborate on how this solicitation is different from the Community Perceptions Challenge last year?”
DR. ELIZABETH GROFF: Sure. With the Community Perceptions Challenge, we were just looking for ideas. Here, we're looking for an idea and a demonstration of that idea.
DR. NANCY LA VIGNE: A pilot.
DR. ELIZABETH GROFF: Yes. [A pilot of the idea in the field or using real data].
DR. JOEL HUNT: “How do you propose a community's cultural views and values are represented in a survey?”
DR. NANCY LA VIGNE: That's not an area that we would go into. Again, just to reiterate the point, we are not inviting survey development per se, but if you choose to include some way of accounting for cultural views and values in your proposal, you're welcome to do so.
DR. JOEL HUNT: Does a firm, and I believe by “firm” they mean applicant, have to be a nonprofit organization, or does it have to be a government organization? I don't think there were any qualifiers on who's an eligible applicant other than being inside the United States, correct?
DR. ELIZABETH GROFF: I think federal government agencies cannot apply.
DR. JOEL HUNT: Okay. “Are you looking for a proposal for a single site, like a pilot, which could then be rolled out to the nation as a model once implemented, or a proposal which addresses the vast majority of US communities?”
DR. NANCY LA VIGNE: There's no preference. Both would be welcomed.
DR. JOEL HUNT: Okay. To clarify your response to previous questions, “Can an already existing local survey that includes questions from the Local-Area Crime Survey and NCVS be used?”
DR. ELIZABETH GROFF: Yes, as long as, in one of your appendices, you demonstrate how it's been tested in the past, so we know that you have a strong instrument.
DR. JOEL HUNT: “What is the expectation around partnering directly with the justice system around data collection?”
DR. NANCY LA VIGNE: It's entirely up to you.
DR. ELIZABETH GROFF: I don't think it's necessary to partner with a criminal justice agency in order to respond to the solicitation.
DR. NANCY LA VIGNE: But you might choose to.
DR. ELIZABETH GROFF: You might choose to. I'm just saying this is about survey innovation and other data innovation.
DR. JOEL HUNT: “Are for-profit startups strong candidates for consideration?”
I don't think applicant type plays a factor in this solicitation.
DR. ELIZABETH GROFF: Correct.
DR. JOEL HUNT: “Has anyone discussed phone or text surveys? These can cross all boundaries.”
DR. ELIZABETH GROFF: If you can't tie it to a small level of geography, then it's not appropriate for this solicitation.
DR. JOEL HUNT: All right. Follow-up from a written comment, to clarify response to my question, “Would a proposal focused on individuals who are homeless within a city be considered responsive?”
DR. ELIZABETH GROFF: No. This solicitation is targeting community perceptions of public safety.
DR. NANCY LA VIGNE: The homeless population is a part of a community. It's just too narrow a slice. So we're looking at representativeness at small geographies. If you proposed a methodology that was aiming to be representative and included the homeless population, terrific.
DR. ELIZABETH GROFF: Better answer. Thank you.
DR. JOEL HUNT: “If we have some unique questions that would be central to innovative methods, are they allowable even if they are not identified in existing tools?”
DR. ELIZABETH GROFF: I understand you may not want to say too much about your unique contributions here on this general webinar, but all I can say to that is, we are very interested in unique, innovative ideas [about survey methods and use of existing data to measure community perceptions of public safety].
DR. NANCY LA VIGNE: That's what we're trying to do, is spur innovation. But without knowing more details, we can't tell you how viable they are. And even if we knew more details, we couldn't tell you how viable they are until we see it in the proposal.
DR. JOEL HUNT: “I'm still a bit confused on the geographic area. What I'm hearing is that a city-level measure is too large, but Nancy spoke about national-level items as well. It seems to me city-level measures are at the agency level and would be responsive, but perhaps I'm wrong.”
DR. NANCY LA VIGNE: Let me clarify. When I made reference to national level, I was talking about national incidents that might influence local community perception, which is just a question I have in my head. That's one of these research questions that I wonder, “How could we answer that?” But we are pretty explicit with this solicitation that we want geographies well below the entire jurisdictional level. Because we know that the population is not even across an entire jurisdiction, that crime isn't, that policing isn't. So we're trying to get smaller geographies here.
DR. JOEL HUNT: So, Liz, I don't know if you've had an opportunity to look up that answer, about whether a proposal can include both survey methods and the use of existing data?
DR. ELIZABETH GROFF: Yes. The sentences that I was referring to don't exactly address that issue. So let me provide additional guidance. If your innovations are primarily in the area of surveys, then you should put your proposal under topic one, even if it includes some existing data or some novel data. Put it under the topic that encompasses most of your proposal.
DR. JOEL HUNT: Okay. “I'm wondering how you plan to use the findings from the grant and/or does it make sense to approach this grant as a partnership where we can improve methods for the NCVS and similar surveys?”
DR. ELIZABETH GROFF: NIJ is hoping that we're going to end up with really game-changing ideas out of this that will allow local governments to survey their populations periodically, so let's say every year. That's the main focus.
DR. NANCY LA VIGNE: Again, we're trying to spur innovation. I think an important thing to emphasize is we want you all to propose strong methodologies to assess whether these innovations are yielding their intended impact. That's a really important component of what we're seeking in proposals. Whether that leads us to take one or more of the funded pilots and bring it to our partners at BJS and include it in that methodology, that remains to be seen. It would be a lovely outcome but we're not explicitly focusing on that outcome.
DR. JOEL HUNT: “If an innovative technology has been piloted with a different population or different problem set, will it be responsive to the solicitation?”
DR. ELIZABETH GROFF: I don’t see why not. There's text in the solicitation that explicitly says you can bring in novel methods from other disciplines and apply them here. I think what you’re asking falls under that.
DR. JOEL HUNT: “If we use an existing survey, can we also use a survey that is developed with the input of stakeholders and community members?”
DR. NANCY LA VIGNE: Yes, but that's not the point of this solicitation. We don't want the focus on the survey questions. We want the focus on how we elicit those answers. We're talking about domains that are represented in the sample survey we shared in the solicitation or that might be covered in other pre-existing surveys that have already been developed and fielded in some way. We also would welcome proposals where you engage with members of the community. I have some ideas on how that could be achieved, but I don't think I'm allowed to share them. I just want to make sure the focus is not on developing a whole new instrument.
DR. JOEL HUNT: “In the solicitation on page 11, there's a section on evaluation research. How much weight do you place on this optional component especially in a submission for topic two?”
DR. ELIZABETH GROFF: I looked at that and I was like, “Hmm, it's confusing.” The only evaluation that we're asking for is an evaluation of whether or not your method can produce representative results.
DR. NANCY LA VIGNE: Right.
DR. ELIZABETH GROFF: At low cost and at small geographies, right. I hope that's a helpful answer.
DR. ELIZABETH GROFF: Some of this language is…
DR. NANCY LA VIGNE: It's boilerplate.
DR. ELIZABETH GROFF: Thank you. I don't know if we were allowed to say it was boilerplate.
DR. JOEL HUNT: “Can you say more about the non-survey type of innovations you would be interested in?”
DR. NANCY LA VIGNE: I don’t want us to signal some and then mislead you to think that those are the ones that we think would be the most winning. I would invite people to look at the results of the prize challenge that we did just to get a sense of some of the innovative ideas that other people came up with.
DR. JOEL HUNT: “How do you define community?”
DR. ELIZABETH GROFF: For this, we're defining it geographically. We're not going to get into a debate of what a community is. I think we used the term municipality. Areas within a municipality, so that varies by state. They all have different sizes and things but that's what we're defining as a general community. So a city you live in, that's typically the community.
DR. NANCY LA VIGNE: Right. So the city is the community but we're trying to get at micro-geographies that more accurately represent community when it comes to the types of dynamics.
DR. ELIZABETH GROFF: I'm sorry, I left that part off. Think about census block groups. People have been fighting over that in academia for 50 years whether that actually represents a community but that's the unit we're looking for.
DR. JOEL HUNT: Now there's been some discussion in the chat, and I'm not sure what is meant. They've asked, “What might the laws be on cell-phone text-based surveys?” I'm not sure what they mean by the laws here. If the person who's asking can clarify: are you asking what the policies are within our solicitation in regard to phone and text-based surveys, or what exactly?
DR. NANCY LA VIGNE: That's up for you to decide whether you're proposing something that's within the law.
DR. JOEL HUNT: They're asking about the FCC rules and laws now regarding the phone and text surveys.
DR. ELIZABETH GROFF: Since we didn't specifically request phone and text surveys, we didn't do any research on that. So that's really on the applicants.
DR. ELIZABETH GROFF: We all do really, really appreciate everyone who came and attended the webinar to hear more about this solicitation. We're all very passionate about this solicitation and what we hope that it will provide.
DR. NANCY LA VIGNE: That's right. And I will just say, I know it's a tough one. It was hard for us to write. Having spent many years being on the grant-seeking side of things, I recognize that there's a lot of questions about, “What are they really looking for?” So I hope that this webinar was helpful in getting a better sense of that. And I wish you all the best of luck.
DR. JOEL HUNT: Yeah. We did have an additional one come in: “Can we send drafts of our proposals to get your comments?” Unlike other agencies like NIH, where the scientists are not also the solicitation developers, we develop the solicitations as scientists. We help with the review process, including the internal review and briefing the Director, so we are unable to provide comment. Once our solicitations open, we cannot provide any direct substantive feedback like that.
DR. NANCY LA VIGNE: Again, I would just invite you to look at the “Dear Colleague” letter that I sent out just a few weeks ago.
I know it has my name on it, but it actually comes from the collective wisdom and observations of so many NIJ scientists who have made note of what makes for a winning proposal and where some of the traps and flaws are. I really encourage you to look at that carefully and think about it as you craft your proposal.
DR. JOEL HUNT: This webinar will not itself be posted, but the transcript and the Q&A will be posted on the NIJ website. There is a little bit of a turnaround time. I don't know if Stacy wants to answer.
STACY LEE: It usually takes about two to three weeks to post.
DR. ELIZABETH GROFF: You can register with Grants.gov and JustGrants before you see the transcript. [It is very important to register early to avoid delays that might keep you from submitting your proposal.]
DR. JOEL HUNT: The slides and the transcript, including the Q&A, are all posted.
STACY LEE: That's correct. We're going to wrap it up. On behalf of the National Institute of Justice, thank you for joining today's webinar. This will end our presentation.
DR. NANCY LA VIGNE: Bye all. Thank you.